Together. Free your energies

Implementing a level 5 metrics programme @Capgemini Netherlands
Selecting The Right Set...
Niteen Kumar - 26/11/2013
AGENDA: MEASUREMENT OBJECTIVE | SCOPE | LEADING & LAGGING INDICATORS | REGRESSION EQUATION

MEASUREMENT OBJECTIVE – SMART CRITERIA

SPECIFIC – Is the measurement target oriented?
MEASURABLE – Can it be measured?
RELEVANT – What story will it tell?
ATTAINABLE – Does it cost too much?
TIME BOUND – By when?
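
The SMART questions above can double as a screening checklist for candidate metrics. A minimal Python sketch of that idea; the deck prescribes no schema, so the class and every field name here are assumptions for illustration:

from dataclasses import dataclass

@dataclass
class CandidateMetric:
    # Each field answers one of the SMART questions from the slide.
    name: str
    target_oriented: bool   # SPECIFIC: is the measurement target oriented?
    measurable: bool        # MEASURABLE: can it be measured?
    story: str              # RELEVANT: what story will it tell?
    affordable: bool        # ATTAINABLE: does it cost too much to collect?
    deadline: str           # TIME BOUND: by when?

    def is_smart(self) -> bool:
        return (self.target_oriented and self.measurable
                and bool(self.story) and self.affordable and bool(self.deadline))

# Example screening of one candidate KPI:
ddd = CandidateMetric(
    name="Delivered Defect Density",
    target_oriented=True,
    measurable=True,
    story="How many defects escape to the customer per unit of size",
    affordable=True,
    deadline="end of each release",
)
assert ddd.is_smart()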
KPI PORTFOLIOS

APPLICATION DEVELOPMENT KPI PORTFOLIO – COST | QUALITY | SCHEDULE
APPLICATION MAINTENANCE KPI PORTFOLIO – COST | QUALITY | SCHEDULE
LIST OF X FACTORS

Code complexity, encapsulation, programming language & tools, code review checklist, coding skills and experience with the programming languages and tools used, code review skills and experience, % of tickets having an existing solution in the KEDB, quality of reused source code, requirements volatility, integration test methods and tools, integration test skills and experience with the methods and tools used, quality of reused test cases, domain, domain experience, quality attributes, readability of documents, architecture measures, requirements methods and tools, Rightshore ratio, requirements inspection checklist, requirements inspection skills and experience, requirements skills and experience with the methods and tools used, high-level design methods and tools, high-level design inspection checklist, high-level design skills and experience with the methods and tools used, high-level design inspection skills and experience, quality of reused high-level design documents, detailed design methods and tools, detailed design review/inspection checklist, detailed design skills and experience with the methods and tools used, detailed design review/inspection skills and experience, # of CRs rolled back, rework effort.
DEVELOPMENT PROJECT – INDICATORS
APPLICATION DEVELOPMENT KPI PORTFOLIO

COST
Y - FACTORS
 % EFFORT VARIANCE
 CONTRIBUTION MARGIN
X - FACTORS
 Requirements volatility
 Skill index
 Reusability
 Effort by SDLC phase
 Review and rework effort
 Resource cost
 Code complexity / quality
 Overrun / underrun
 # of times resource changed during build

QUALITY
Y - FACTORS
 DEFECT REMOVAL EFFICIENCY
 DELIVERED DEFECT DENSITY
 COST OF QUALITY
X - FACTORS
 Rework effort
 Test coverage
 Testing rate
 Review effort
 Skill level
 Code complexity / quality
 Test preparation effort

SCHEDULE
Y - FACTORS
 % SCHEDULE VARIANCE
X - FACTORS
 Resource availability
 Requirements volatility
 Skill index
 Reusability
 Rework effort
 # of times resource changed during build

The "X" factors influencing the outcome of each "Y" were identified during the workshops. The identified "X" factors are logical in nature and may change during statistical validation; see the regression sketch below.
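
The statistical validation mentioned above amounts to regressing a lagging "Y" factor on the candidate leading "X" factors and keeping only the factors that prove significant. A minimal sketch on synthetic data; the chosen factors, coefficients, and noise level are invented for illustration and are not the engagement's actual model:

import numpy as np

rng = np.random.default_rng(42)

# Synthetic project history: one row per project, one column per candidate X factor.
n = 60
X = np.column_stack([
    rng.uniform(0, 1, n),   # requirements volatility
    rng.uniform(0, 1, n),   # skill index
    rng.uniform(0, 1, n),   # reusability
])
# Invented "true" relationship for a lagging indicator (% effort variance).
y = 10 + 25 * X[:, 0] - 8 * X[:, 1] - 5 * X[:, 2] + rng.normal(0, 2, n)

# Ordinary least squares fit: y ~ b0 + b1*x1 + b2*x2 + b3*x3.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", np.round(coef, 2))

# Factors whose coefficients are negligible (or that fail a significance test in
# a fuller analysis) would be dropped, which is exactly how the workshop list
# may change during statistical validation.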
DEVELOPMENT PROJECT – INDICATORS
APPLICATION DEVELOPMENT KPI PORTFOLIO – SPRINT INDICATORS

[Table: sprint-by-sprint dashboard with engagement-level totals. Scope & cost columns: sprint number; total # of features / use cases planned; estimated size (SP); actual size (SP); productivity factor; % completion; total planned effort (P.Hrs); total actual effort (P.Hrs); overall effort variance in %; features / use cases modified during the iteration; features / use cases completed; user stories accepted; actual effort per activity (person hours) for DSGN, MODL, COD, TST-P, TST-E, REFTR, SCRUM MASTER, REV, and REW. Quality columns: total number of planned test cases; total number of test cases executed; total number of internal defects; total number of external defects; DoD performed (yes/no); % DoD steps performed; total # of impediments reported; total # of impediments removed; defect removal efficiency (%).]
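
Two of the derived columns on this dashboard are simple ratios. A sketch of how they may be computed per sprint; the function and parameter names are assumptions, the effort figures are invented, and the defect counts merely echo one of the slide's example rows (17 internal, 9 external, DRE 65%):

def effort_variance_pct(planned_hrs: float, actual_hrs: float) -> float:
    # Overall effort variance in %, relative to plan.
    return (actual_hrs - planned_hrs) / planned_hrs * 100.0

def defect_removal_efficiency_pct(internal: int, external: int) -> float:
    # DRE: share of all defects that were caught before delivery.
    total = internal + external
    return internal / total * 100.0 if total else 100.0

print(f"{effort_variance_pct(planned_hrs=223.0, actual_hrs=243.0):.1f}%")  # 9.0%
print(f"{defect_removal_efficiency_pct(internal=17, external=9):.1f}%")    # 65.4%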
MAINTENANCE ENGAGEMENT INDICATORS
APPLICATION MAINTENANCE KPI PORTFOLIO

COST
Y - FACTORS
 % EFFORT VARIANCE FOR KT AND RELEASE
 PRODUCTIVITY (AET)
 % BACKLOG OF TICKETS
 CONTRIBUTION MARGIN
X - FACTORS
 Idle time (under discussion)
 Resource cost
 Rightshore ratio
 Skill index
 Effort spent on KT
 % of tickets having an existing solution in the KEDB
 # of recurring modules impacted
 System downtime
 % additional work

QUALITY
Y - FACTORS
 % INCIDENT REDUCTION
 % FIRST TIME PASS
 % OF SYSTEMS SUCCESSFULLY TRANSITIONED DURING THE KT STAGE
 DEFECT REMOVAL EFFICIENCY FOR RELEASE
 DELIVERED DEFECT DENSITY FOR RELEASE
 COST OF QUALITY
X - FACTORS
 Rework effort
 Test coverage
 Testing rate
 Test preparation effort
 System downtime
 # of CRs rolled back
 % RCA compliance
 # of recurring modules impacted

SCHEDULE
Y - FACTORS
 % SCHEDULE VARIANCE FOR THE KT PHASE
 % RESPONSE & RESOLUTION COMPLIANCE
 % SCHEDULE VARIANCE FOR RELEASE
X - FACTORS
 Resource availability
 Skill index
 Reusability
 Rework effort
 % of tickets having an existing solution in the KEDB
 Elapsed time to assign / investigate / test / implement per ticket
 # of times an incident / service request is reassigned within and between teams

The "X" factors influencing the outcome of each "Y" were identified during the workshops. The identified "X" factors are logical in nature and may change during statistical validation.
MAINTENANCE PROJECT INDICATORS
APPLICATION MAINTENANCE KPI PORTFOLIO
INCIDENT / PROBLEM MANAGEMENT – LEADING & LAGGING INDICATORS

[Table: monthly incident / problem management dashboard, example reporting month Jun-13, with one row per priority class (incidents P0–P5, service requests S0–S5) plus engagement-level totals. Columns: tickets received in the current month; backlog of tickets from the previous month; number of tickets resolved; number of backlog tickets; effort spent in closing the tickets (person hours); average effort per ticket; # of response SLA breaches; # of resolution SLA breaches; % SLA response compliance; % SLA resolution compliance; % first time right (FTR); average elapsed time to closure (h:mm:ss); average elapsed time to assign a ticket (h:mm:ss).]
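
The lagging columns of this dashboard are aggregates of the monthly ticket log. A sketch under assumed field names; in particular, the SLA-compliance denominator (all tickets worked in the month) is an assumption, since the deck does not define it, and the example numbers are invented:

from dataclasses import dataclass

@dataclass
class PriorityRow:
    received: int            # tickets received in the current month
    backlog_prev: int        # backlog carried over from the previous month
    resolved: int            # tickets resolved this month
    effort_hrs: float        # person hours spent closing tickets
    response_breaches: int   # response SLA breaches
    resolution_breaches: int # resolution SLA breaches

def derive(row: PriorityRow) -> dict:
    workload = row.received + row.backlog_prev
    return {
        "backlog": workload - row.resolved,
        "avg_effort_per_ticket": row.effort_hrs / row.resolved if row.resolved else 0.0,
        "sla_response_pct": (1 - row.response_breaches / workload) * 100 if workload else 100.0,
        "sla_resolution_pct": (1 - row.resolution_breaches / workload) * 100 if workload else 100.0,
    }

print(derive(PriorityRow(received=807, backlog_prev=44, resolved=845,
                         effort_hrs=2552.0, response_breaches=5,
                         resolution_breaches=45)))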
REGRESSION EQUATION

Example: Delivered Defect Density*

DDD = 3.0 - 0.05*RAE - 0.06*TPE - 0.025*CRE

RAE = Requirement Analysis Effort
TPE = Test Preparation Effort
CRE = Code Review Effort

* Example equation, for illustration.
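
Evaluating the example equation is then a matter of plugging in the three effort drivers. A short sketch; the coefficients come from the slide's illustrative equation, while the effort units are whatever the model was calibrated in, which the slide does not state:

def delivered_defect_density(rae: float, tpe: float, cre: float) -> float:
    # Illustrative regression model from the slide:
    #   DDD = 3.0 - 0.05*RAE - 0.06*TPE - 0.025*CRE
    return 3.0 - 0.05 * rae - 0.06 * tpe - 0.025 * cre

# More effort on analysis, test preparation, and code review predicts
# fewer delivered defects:
print(round(delivered_defect_density(rae=10, tpe=10, cre=10), 2))  # 1.65
print(round(delivered_defect_density(rae=20, tpe=15, cre=20), 2))  # 0.6

The negative coefficients are what make these X factors useful leading indicators: investing effort upstream is predicted to lower the delivered defect density downstream.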
About Capgemini
With more than 120,000 people in 40 countries, Capgemini
is one of the world's foremost providers of consulting,
technology and outsourcing services. The Group reported
2011 global revenues of EUR 9.7 billion.
Together with its clients, Capgemini creates and delivers
business and technology solutions that fit their needs and
drive the results they want. A deeply multicultural
organization, Capgemini has developed its own way of
working, the Collaborative Business Experience™, and
draws on Rightshore®, its worldwide delivery model.
Rightshore® is a trademark belonging to Capgemini

www.capgemini.com

The information contained in this presentation is proprietary.
© 2012 Capgemini. All rights reserved.

