Using Benchmarks to Accelerate Process Improvement

1st International Conference on IT Data Collection, Analysis and Benchmarking
IT Confidence 2013 – Rio de Janeiro (Brazil) – October 3, 2013
http://itconfidence2013.wordpress.com

Joe Schofield
joescho@joejr.com
Goals of the Presentation

1. Establish a brief framework for benchmarking: definition, benefits, approach
2. Using a real-life story, explore some of the motivation for benchmarking
3. Present several examples of why cost and schedule data, without some notion of product quality, can be misleading
4. Provide compelling evidence for the need for meaningful size attributes
5. Provide insights regarding the need for benchmark data to support process improvement
6. Identify what makes measurements useful (5 Cs)
Abstract
Using Benchmarks to Accelerate Process Improvement
Organizations are constantly pressured to prove their value to their leadership and
customers. A relative comparison to “peer groups” is often seen as useful and
objective, thus benchmarking becomes an apparent alternative. Unfortunately,
organizations new to benchmarking may have limited internal data for making valid
comparisons. Feedback and subsequent “action” can quickly lead to the wrong
results as organizations focus on improving their comparisons instead of improving
their capability and consistency.
Adding to the challenge of improving results, software organizations may rely on
more readily available schedule and financial data rather than indicators of product
quality and process consistency. This presentation provides measurement program
lessons learned and insights to accelerate benchmark and quantification activities
relevant to both new and mature measurement programs.
Benchmarking Defined
Benchmarking is the process of comparing one's business processes and performance metrics to
industry bests or best practices from other industries. Dimensions typically measured are quality,
time and cost. In the process of best practice benchmarking, management identifies the best firms
in their industry, or in another industry where similar processes exist, and compares the results and
processes of those studied (the "targets") to one's own results and processes. In this way, they learn
how well the targets perform and, more importantly, the business processes that explain why these
firms are successful.
Benchmarking is used to measure performance using a specific indicator (cost per unit of measure,
productivity per unit of measure, cycle time of x per unit of measure or defects per unit of
measure) resulting in a metric of performance that is then compared to others.
Also referred to as "best practice benchmarking" or "process benchmarking", this process is used in
management and particularly strategic management, in which organizations evaluate various
aspects of their processes in relation to best practice companies' processes, usually within a peer
group defined for the purposes of comparison. This then allows organizations to develop plans on
how to make improvements or adapt specific best practices, usually with the aim of increasing some
aspect of performance. Benchmarking may be a one-off event, but is often treated as a continuous
process in which organizations continually seek to improve their practices.
Wikipedia: 5/31/2013
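To make the "specific indicator" idea concrete, here is a minimal sketch in Python. All figures below (project cost, size, and peer median) are hypothetical; real peer values would come from a repository such as ISBSG.

```python
# Illustrative sketch: compute a benchmarking indicator and compare it
# to a peer group. All figures below are hypothetical.

def cost_per_unit(total_cost: float, size_units: float) -> float:
    """Indicator: cost per unit of measure (here, per function point)."""
    return total_cost / size_units

our_indicator = cost_per_unit(250_000, 500)   # hypothetical project: $500/FP
peer_median = 400.0                           # hypothetical peer median ($/FP)

gap_pct = (our_indicator / peer_median - 1) * 100
print(f"Our cost/FP: ${our_indicator:.0f} vs. peer median ${peer_median:.0f} "
      f"({gap_pct:+.0f}%)")                   # -> +25%
```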
Benchmarking and Other Tools
In 2008, a comprehensive survey on benchmarking was commissioned by The Global
Benchmarking Network, a network of benchmarking centers representing 22 countries.
Over 450 organizations responded from over 40 countries. The results, covering 20
improvement tools, showed that:
• 77% use Mission and Vision Statements and Customer (Client) Surveys
• 72% use SWOT Analysis
• 68% use Informal Benchmarking
• 49% use Performance Benchmarking
• 39% use Best Practice Benchmarking
The tools that are likely to increase in popularity the most over the next three years are:
• Performance Benchmarking
• Informal Benchmarking
• SWOT Analysis
• Best Practice Benchmarking
Over 60% of organizations that are not currently using these tools indicated they are
likely to use them in the next three years.
Wikipedia: 5/31/2013
Benchmarking Approaches
Identify problem areas: Because benchmarking can be applied to any business process or function, a range of
research techniques may be required. They include informal conversations with customers, employees, or
suppliers; exploratory research techniques such as focus groups; or in-depth marketing research,
quantitative research, surveys, questionnaires, re-engineering analysis, process mapping, quality control variance
reports, financial ratio analysis, or simply reviewing cycle times or other performance indicators. Before
embarking on comparison with other organizations, it is essential to know the organization's function and
processes; baselining performance provides a point against which improvement effort can be measured.
Identify other industries that have similar processes: For instance, if one were interested in improving hand-offs
in addiction treatment, one would identify other fields that also have hand-off challenges. These could include air
traffic control, cell-phone switching between towers, and the transfer of patients from surgery to recovery rooms.
Identify organizations that are leaders in these areas: Look for the very best in any industry and in any country.
Consult customers, suppliers, financial analysts, trade associations, and magazines to determine which
companies are worthy of study.
Survey companies for measures and practices: Companies target specific business processes using detailed
surveys of measures and practices to identify business process alternatives and leading companies. Surveys
are typically masked by neutral associations and consultants to protect confidential data.
Visit the "best practice" companies to identify leading edge practices: Companies typically agree to mutually
exchange information beneficial to all parties in a benchmarking group and share the results within the group.
Implement new and improved business practices: Take the leading edge practices and develop implementation
plans which include identification of specific opportunities, funding the project and selling the ideas to the
organization for the purpose of gaining demonstrated value from the process.
Wikipedia: 5/31/2013
Selected Benchmarking Types (1 of 2)
Process benchmarking - the initiating firm focuses its observation and
investigation of business processes with a goal of identifying and observing the
best practices from one or more benchmark firms. Activity analysis will be
required where the objective is to benchmark cost and efficiency; increasingly
applied to back-office processes where outsourcing may be a consideration.
Financial benchmarking - performing a financial analysis and comparing the
results in an effort to assess your overall competitiveness and productivity.
Performance benchmarking - allows the initiator firm to assess their competitive
position by comparing products and services with those of target firms.
Product benchmarking - the process of designing new products or upgrades to
current ones. This process can sometimes involve reverse engineering, which is
taking apart competitors' products to find strengths and weaknesses.
Wikipedia: 5/31/2013
Benchmarking Types (2 of 2)
Strategic benchmarking - involves observing how others compete. This type is
usually not industry specific, meaning it is best to look at other industries.
Functional benchmarking - a company will focus its benchmarking on a single
function to improve the operation of that particular function. Complex functions
such as Human Resources; Finance and Accounting; and Information and
Communication Technology are unlikely to be directly comparable in cost and
efficiency terms and may need to be disaggregated into processes to make valid
comparisons.
Best-in-class benchmarking - involves studying the leading competitor or the
company that best carries out a specific function.
Operational benchmarking - embraces everything from staffing and productivity to
office flow and analysis of procedures performed.
Wikipedia: 5/31/2013
Benchmarking Comments
• Organizations are reluctant to publicly share their data; therefore, purchase data from ISBSG.
  Deeper Reading: Effective Applications Development and Maintenance; The IFPUG Guide to IT and Software Measurement; Pam Morris; 2012
• It is much more difficult to obtain correct and reliable information with regard to external organizations. Benchmark defect turnaround times and defect age.
  Deeper Reading: Benchmarking Techniques and Their Applications in IT; The IFPUG Guide to IT and Software Measurement; Nishant Pandey; 2012
• Improvement Needs Measurement: Measurement Needs Improvement (me, 6-13-2013)
If Published Data is True . . .

Barry Boehm – requirements defects that made their way into the field could cost 50 to 200 times as much to correct as defects that were corrected close to the point of creation.
Boehm, Barry W. and Philip N. Papaccio. "Understanding and Controlling Software Costs," IEEE Transactions on Software Engineering, v. 14, no. 10, October 1988, pp. 1462-1477.

An example: these four projects (A, B, C, D) produced the same product using the same technology, but with different verification processes.

Project   | Cost     | Person Months | Reviews                                     | Variation
Project A | $250K    | 10            | Often and disciplined (rigorous, CRM, . . .) | (baseline)
Project B | $500K    | 20            | Often but not disciplined                   | 100%
Project C | $1,000K  | 40            | Not often and not disciplined               | 400%
Project D | $50,000K | 2000          | Worst case per Boehm (100x)                 | 2000%
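A minimal sketch of the arithmetic behind Boehm's multiplier; the per-defect cost below is a hypothetical input, and only the 50-200x range comes from the citation above.

```python
# Illustrative sketch: Boehm's range says a requirements defect that
# escapes to the field costs 50-200x what it costs to fix near creation.

def field_repair_cost(cost_at_creation: float,
                      low: float = 50.0, high: float = 200.0) -> tuple:
    """Projected (low, high) field-repair cost for one escaped defect."""
    return cost_at_creation * low, cost_at_creation * high

# Hypothetical: a defect that costs $100 to fix during requirements review.
lo, hi = field_repair_cost(100.0)
print(f"Escaped to the field, the same defect costs ${lo:,.0f}-${hi:,.0f}")
# -> $5,000-$20,000
```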
If Published Data is True . . .

Capers Jones – reworking defective requirements, design, and code typically consumes 40 to 50 percent or more of the total cost of most software projects and is the single largest cost driver.
Jones, Capers. Estimating Software Costs. New York: McGraw-Hill, 1998.

An example: these two projects produced the same product using the same technology, but used different requirements elicitation techniques.

Project   | Cost  | Person Months | Requirements                                  | Variation
Project A | $250K | 10            | Captured and understood early                 | (baseline)
Project B | $500K | 20            | Requirements volatility at the high end (50%) | 100%
If Research is True . . .

Capers Jones – as a rule of thumb, every hour you spend on technical reviews upstream will reduce your total defect repair time by three to ten hours.
Jones, Capers. Assessment and Control of Software Risks. Englewood Cliffs, N.J.: Yourdon Press, 1994.

An example: these three projects produced the same product using the same technology, but only one used technical reviews.

Project   | Cost  | Person Months | Reviews                                            | Variation
Project A | $25K  | 1             | Spent 1 person-month in reviews                    | (baseline)
Project B | $75K  | 3             | Did not conduct reviews (3 months of repair time)  | 300%
Project C | $250K | 10            | Worst case with data                               | 1000%
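A small sketch of Jones's rule of thumb; the 160-hour investment is a hypothetical figure (roughly one person-month), and only the 3-10x range comes from the citation.

```python
# Illustrative sketch: Jones's rule of thumb says each upstream review
# hour saves three to ten hours of total defect repair time.

def repair_hours_saved(review_hours: float,
                       low: float = 3.0, high: float = 10.0) -> tuple:
    """Projected (low, high) defect-repair hours avoided."""
    return review_hours * low, review_hours * high

# Hypothetical: one person-month (~160 hours) invested in reviews.
lo, hi = repair_hours_saved(160)
print(f"160 review hours -> {lo:.0f} to {hi:.0f} repair hours avoided")
# -> 480 to 1600 repair hours avoided
```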
If Research is True . . .

Don O'Neill – calculated the ROI for software inspections at between four and eight to one.
O'Neill, Don; National Software Quality Experiment: Results 1992–1999; Software Technology Conference, Salt Lake City; 1995, 1996, 2000.

An example: these three projects produced the same product using the same technology, but only one used inspections.

Project   | Cost  | Person Months | Inspections                                        | Variation
Project A | $25K  | 1             | Spent 1 person-month in inspections                | (baseline)
Project B | $100K | 4             | Did not conduct inspections (4 months repair time) | 400%
Project C | $200K | 8             | Worst case with data (8 months repair)             | 800%
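The same arithmetic applies to O'Neill's inspection ROI; a brief sketch, with the $25K inspection spend as a hypothetical input.

```python
# Illustrative sketch: O'Neill's experiment puts inspection ROI at 4:1-8:1.

def inspection_payback(inspection_cost: float,
                       roi_low: float = 4.0, roi_high: float = 8.0) -> tuple:
    """Projected (low, high) repair cost avoided for an inspection spend."""
    return inspection_cost * roi_low, inspection_cost * roi_high

# Hypothetical: $25K invested in inspections.
lo, hi = inspection_payback(25_000)
print(f"$25K of inspections -> ${lo:,.0f}-${hi:,.0f} avoided")
# -> $100,000-$200,000 avoided
```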
Summary of Defect-Related Data

"You don't know the status of your project (D, I) until you know the fidelity of your process (K, W)."
Ref: Fidelity & Defect Metrics; Information Technology Measurement and Governance – International Strategies; Ottawa, Canada; May 1, 2013; Joe Schofield

Assertion – In the absence of defect data:
• Productivity metrics are misleading
• Quality metrics are inadequate
• Value is impossible to ascertain

Opportunity – Reduce development and support costs
• Industry data has demonstrated an ROI for peer reviews of 2:1 to 3:1
• 30 to 60 percent of all development work is rework from changing or misunderstood requirements
• Instead of removing defects early in product development, organizations often rely on more testing to improve the quality of their products. But it is the other 50 percent of defects, those originating in requirements and design, that aren't found by testing and that are the most expensive to resolve.
Fidelity – Quantifying how often we do what we say (see the sketch below) . . .
• We have a policy for product development; how often do we follow it?
• We have a process for product development; how often do we use it?
• We have criteria for tailoring our work; how often do we apply it?
• During a crisis, do we rely on process or abandon it?
• Is it useful or possible to benchmark with other organizations unless we first characterize our own capability?
"7" Types of Waste – Toyota
1. Overproduction
2. Inventory
3. Wait Time
4. Transportation
5. Processing
6. Motion
7. Defects
8. Underutilized People
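Fidelity as described above reduces to a simple ratio: occasions a practice was followed divided by occasions it applied. A minimal sketch with a hypothetical audit log:

```python
# Illustrative sketch: fidelity = occasions a practice was followed
# divided by occasions it applied. The audit log below is hypothetical.

audit_log = {
    "Followed the product-development policy": [True, True, False, True],
    "Used the product-development process":    [True, False, True, True],
    "Applied the tailoring criteria":          [True, True, True, False],
}

for practice, observations in audit_log.items():
    fidelity = sum(observations) / len(observations)
    print(f"{practice}: {fidelity:.0%} fidelity")   # each prints 75%
```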
Inappropriate size measures will distort your data: lines of code, story points, page size

Deeper Reading: The Statistically Unreliable Nature of Lines of Code; CrossTalk; April, 2005 – NIST Citation
Inappropriate size measures will distort your data: lines of code, story points, page size

Deeper Reading: Function Points, Use Case Points, Story Points: Observations from a Case Study; CrossTalk; May/June, 2013

Characteristic | Function Points | Use Case Points | Story Points
Useful at the project level for estimating or planning | With historical FP data | With historical UCP data | With historical SP data
ISO / Standards based | ISO 20926 | No | No
Captures customer view | Expected | Expected | Definitely
Useful for benchmarking outside the company | Could be | Could be | Less so
Easy to calculate | Less so | More so | Yes
Easy to validate for repeatability / consistency | More so | More so | Less so
Objectivity | More so | More so | Less so (team / team-member variability)
Technologically independent | Yes | Yes | Maybe
Functional measurement to customer | Yes | Yes | Not exclusively (may include refactoring, design, and other work)
Inappropriate size measures will distort your data: lines of code, story points, page size

Margins | Font | Font Size | Spacing | Bolding | Char. per Page | % Content Loss
Initial settings (.3 top & bottom; .4 sides) | Times New Roman | 10 | Single | None | 7584 | 0
1" | | | | | 5450 | 28
 | Verdana | | | | 5686 | 25
 | | 12 | | | 5177 | 32
 | | | Double | | 4353 | 43
 | | | | ON | 7185 | 5
1" | Verdana | 12 | Double | ON | 1403 | 83

(Blank cells retain the initial settings; each middle row changes one setting, and the last row combines them all.)

• Read "% Content Loss" (last column) as variation!
• Cumulative difference of one page to almost six
• Consider still larger fonts, font sizes, spacing, charts, diagrams, pictures, etc.
• Impact on PMC SP 1.1 – Monitor actual values of project planning parameters against the project plan.

Deeper Reading: Size - The Forgotten Measure; SEPG North America; Albuquerque, N.M.; March 15, 2012
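The "% Content Loss" column is plain arithmetic against the baseline characters per page; a short sketch, assuming loss is computed relative to the 7584-character baseline:

```python
# Illustrative sketch: % content loss relative to the baseline layout,
# i.e. loss = (1 - chars_per_page / baseline_chars) * 100.

BASELINE_CHARS = 7584  # chars/page at the initial settings (first table row)

def content_loss_pct(chars_per_page: int, baseline: int = BASELINE_CHARS) -> float:
    """Percent of content lost per page versus the baseline layout."""
    return (1 - chars_per_page / baseline) * 100

for chars in (5450, 5686, 5177, 4353, 7185, 1403):
    print(f"{chars} chars/page -> {content_loss_pct(chars):.0f}% content loss")
# -> 28, 25, 32, 43, 5, and ~82 (the table shows 83) percent
```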
5 C's of Sizing Measures

Deeper Reading: Size - The Forgotten Measure; SEPG North America; Albuquerque, N.M.; March 15, 2012
Takeaways
• You won’t understand your benchmark data until
you understand your process fidelity
• Inappropriate size measures will distort your data:
story points, lines of code, page size
• The absence of quality-related data will distort
your benchmark results
• Allowing teams to retain their own measures and
report them as needed to a measurement group
will add a layer of inconsistency to your data
• Benchmark to improve, not to impress
Questions
Further Readings

Joe Schofield is the President of the International Function Point Users Group. He retired from Sandia National Laboratories as a Distinguished Member of the Technical Staff after a 31-year career. During twelve of those years he served as the SEPG Chair for an organization of about 400 personnel, which was awarded SW-CMM® Level 3 in 2005. He continued as the migration lead to CMMI® Level 4 until his departure.

Joe has facilitated over 100 teams in the areas of software specification, team building, and organizational planning using lean six sigma and business process reengineering. He has taught graduate courses since 1990. He was a licensed girls' mid-school basketball coach for 21 seasons--the last five undefeated, over a span of 50 games.

He has over 80 published books, papers, conference presentations, and keynotes--including contributions to the books The IFPUG Guide to IT and Software Measurement (2012), IT Measurement, Certified Function Point Specialist Exam Guide, and The Economics of Software Quality. He is a CMMI Institute-certified Instructor for the Introduction to the CMMI® and two other CMMI Institute courses, a Certified Software Quality Analyst, a Certified Function Point Specialist, and a Certified Software Measurement Specialist.

Joe is a frequent presenter in the Software Best Practices Webinar Series sponsored by Computer Aid, Inc. Joe completed his Master's degree in MIS at the University of Arizona in 1980. By "others" he is known as a husband, father, and grandfather.

http://joejr.com/presentd.htm (~55)
http://joejr.com/publishd.htm (~36)