SOFTWARE QUALITY METRICS
BENCHMARK STUDY
How Software Metrics and Dashboards Are Applied in High Technology Companies
EXECUTIVE SUMMARY
The purpose of the benchmark study was to capture best
practices in the application of SW metrics dashboards.
Ten technology companies were benchmarked against these
questions:
• What metrics on software quality are reported to
management?
• Internal quality metrics, external field-detected metrics?
• How are they normalized? Customers in field, LOC?
• What are the most important?
• Are they tabular, graphical? How many? Are target values
shown?
• How frequently are they reported? How many do you
report on?
• What are key target values you look at for key metrics?
List of participants (3 highly regulated companies; 7 networking/computer/storage):
• Alcatel
• Boston Scientific
• Cisco
• Ericsson
• General Dynamics
• IBM
• Saint Jude Medical
• Palo Alto Networks
• Riverbed
• VMware
Key Highlights:
• There is no standard for the number of metrics, the type of metrics, or the frequency of reporting
• However, there are best practices around Software Quality Metrics – we can look at what separates the best from the rest
• The BEST have
1. Automated metrics tracking and analysis systems that
allow drill down and reporting by product, release,
customer
2. Normalization that ensures that the metrics are meaningful as the number of customers or the complexity of code increases (see the sketch after this list)
3. Root Cause Analysis system that systematically
analyzes defects that escape the company and are
found in the field
4. Quality metrics that go beyond product defects, and
include release predictability and feature expectations
5. External benchmarks that are used to set goals
(created by third parties to establish databases or
perform surveys)
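To make the normalization in item 2 concrete, here is a minimal sketch; the function and field names are hypothetical, not drawn from any participant's system.

```python
# Hypothetical sketch: raw field-defect counts become comparable across
# releases by dividing by code size (KLOC) and by the installed base.

def normalized_defect_metrics(field_defects, kloc, active_customers):
    """Return defect density (per KLOC) and the per-customer defect rate."""
    return {
        "defects_per_kloc": field_defects / kloc,
        "defects_per_customer": field_defects / active_customers,
    }

# Example: 120 field defects against 800 KLOC and 400 active customers
print(normalized_defect_metrics(120, kloc=800, active_customers=400))
# -> {'defects_per_kloc': 0.15, 'defects_per_customer': 0.3}
```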
HOW WE APPROACHED THE ANALYSIS
• The Capability Maturity Model (CMM) defines five levels of process maturity:
• Level 1 (Initial, Chaotic)
• Level 2 (Repeatable)
• Level 3 (Defined)
• Level 4 (Managed, Measured)
• Level 5 (Optimizing)
• Metrics are a key part of the CMM model, and Level 4 indicates mastery of metrics
• SW metrics are well characterized, and are often divided into Product Quality Metrics, In-Process Metrics, and Metrics for SW Maintenance*
• From our survey of ten companies, we have derived a sense of metrics maturity and created our own rating of SW Metrics Maturity using five factors (a scoring sketch follows below)
• Automated, Root Cause Analysis, Normalized, External Benchmarks, and Total Quality (not just defects)
• The Best tend to have excellent scores on all five dimensions; the rest lag behind in one or more areas
• The best tend to have measures in the three areas defined above (Product, In-Process, and Maintenance)
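As one illustration of how such a five-factor rating could be computed, here is a minimal sketch; the 0-5 scores below are invented for illustration, not survey data.

```python
# Sketch of the five-factor SW Metrics Maturity rating (scores are invented).
FACTORS = ("Automated", "Root Cause Analysis", "Normalized",
           "External Benchmarks", "Total Quality")

def maturity_score(ratings):
    """Average the five 0-5 factor ratings; the Best score well on all five."""
    assert set(ratings) == set(FACTORS), "rate every factor exactly once"
    return sum(ratings.values()) / len(ratings)

best = dict(zip(FACTORS, (5, 4, 5, 4, 5)))
rest = dict(zip(FACTORS, (3, 1, 2, 1, 2)))
print(maturity_score(best), maturity_score(rest))  # -> 4.6 1.8
```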
EXAMPLE SW METRICS MATURITY
1. Automated metrics tracking and analysis
systems that allow drill down and
reporting by product, release, customer
2. Normalization that ensures that the
metrics are meaningful as the number of
customers or the complexity of code
increases
3. Root Cause Analysis system that
systematically analyzes defects that
escape the company and are found in the
field
4. Quality metrics that go beyond product
defects, and include release predictability
and feature expectations
5. External benchmarks that are used to set
goals (created by third parties to establish
databases or perform surveys)
[Radar chart: the five axes are Root Cause Analysis, Automated Metrics System, Normalization, Total Quality (Predictability/Features), and Uses External Benchmarks, with one polygon each for Best and Rest]
The nature of the survey did not allow us to complete this chart for each participant, but this treatment would be very useful to
evaluate where you are today and where you should focus in the future to close gaps between the best and the rest.
Hypothetical Radar Chart: a 5-point scale, where mastery is indicated as a 5 (outermost) and absence as a 0 (innermost)
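A chart like the one described can be drawn in a few lines of matplotlib; the scores below are illustrative placeholders matching the hypothetical Best-vs-Rest framing, not measured data.

```python
# Hypothetical radar chart: Best vs. Rest on the five maturity factors.
import numpy as np
import matplotlib.pyplot as plt

factors = ["Root Cause\nAnalysis", "Automated\nMetrics System",
           "Normalization", "Total Quality", "External\nBenchmarks"]
scores = {"Best": [5, 5, 4, 4, 5], "Rest": [2, 2, 1, 1, 1]}  # invented

angles = np.linspace(0, 2 * np.pi, len(factors), endpoint=False).tolist()
angles += angles[:1]                          # close the polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, vals in scores.items():
    data = vals + vals[:1]
    ax.plot(angles, data, label=label)
    ax.fill(angles, data, alpha=0.15)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(factors)
ax.set_ylim(0, 5)                             # mastery = 5 (outermost)
ax.legend(loc="lower right")
plt.show()
```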
DASHBOARD – DRAWN FROM BENCHMARKING
Guiding Principles: Title & Description • So What • Consistent Design • Labeled Axes • Target Curves • Narrative
Each metric should be linked to your overall quality objectives, which were derived from your overall
strategy
From the Benchmark Sample, the goals might be:
• Increasing Net Promoter Score (how highly you are recommended)
• Increasing Release Predictability
• Increasing Customer Satisfaction
• Increasing Reported Quality (Field Quality)
• Reducing time to repair
• Reducing the number of Critical Accounts
Each chart has the following graphical properties:
• The charts are composed so that the ‘so what’ is very clear and repeated on each chart, so that managers who see them only once a quarter know why the metric is there and what significance, if any, the data carries
• Targets should be on all graphs
• Where benchmark data exists, it will also be shown on the chart
• The dashboard layout should have the following properties:
• Between 4 and 8 metrics in total
• Two related metrics per screen
• Text describing & analyzing the data represented
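For instance, a single dashboard chart honoring these principles (title, ‘so what’, labeled axes, target curve, benchmark line) could be sketched as follows; every data point below is invented for illustration.

```python
# Sketch of one dashboard chart with title, "so what", axes, target, benchmark.
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
actual = [22, 19, 17, 14]        # % release slip, invented
target = [22, 17, 11, 5]         # straight-line glide path to the goal

fig, ax = plt.subplots()
ax.plot(quarters, actual, marker="o", label="Actual")
ax.plot(quarters, target, linestyle="--", label="Target")
ax.axhline(15, color="gray", linestyle=":", label="Benchmark")
ax.set_title("Percent of Release Slips\nSo what: on track, still above benchmark")
ax.set_xlabel("Quarter")                 # labeled axes
ax.set_ylabel("% slip vs. plan")
ax.legend()
plt.show()
```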
Percent of Release Slips
This chart plots the percentage of actual versus planned schedule for major
and minor releases.
• The target is derived to get to less than 5% slip by 2014, closing the gap in a straight line from the 22% where we are today (sketched below)
• The increase shown in November 2011 is driven by the A.2a release, which had to go through two alphas
• We expect a steeper drop in July 2012 because of our new “Darken the Sky” program to provide requirements stability
• Benchmarking indicates that the best-in-class number is a slip rate of less than 15% (for 9-month release cycles).
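The straight-line glide path in the first bullet reduces to simple arithmetic; this sketch assumes eight quarterly checkpoints between today's 22% and the 2014 goal of 5% (the checkpoint count is an assumption for illustration).

```python
# Straight-line target curve from 22% slip today down to 5% by 2014.
start, goal, steps = 22.0, 5.0, 8            # 8 quarterly checkpoints, assumed
decrement = (start - goal) / steps
targets = [round(start - decrement * i, 1) for i in range(steps + 1)]
print(targets)
# -> [22.0, 19.9, 17.8, 15.6, 13.5, 11.4, 9.2, 7.1, 5.0]
```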
Mean Time to Repair
This chart plots the average time, in weeks, that customers had to wait for resolution. Measured in weekly intervals; data captured per release.
• The target is derived to get to the fastest resolution (and to reduce the number outstanding)
• The increase shown in January 2012 is driven by the A.x release
• The new methods for engineering releases should improve this in 2013 (a computation sketch follows below)
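Computing the plotted quantity is straightforward; this sketch uses invented resolution records keyed by release, echoing the per-release capture described above.

```python
# Sketch: mean time to repair (weeks) per release, from resolution records.
from collections import defaultdict
from statistics import mean

# (release, weeks_to_resolve) for closed customer issues -- invented data
resolved = [("A.1", 3.0), ("A.1", 5.5), ("A.x", 9.0), ("A.x", 7.5), ("A.2a", 4.0)]

by_release = defaultdict(list)
for release, weeks in resolved:
    by_release[release].append(weeks)

print({release: mean(weeks) for release, weeks in by_release.items()})
# -> {'A.1': 4.25, 'A.x': 8.25, 'A.2a': 4.0}
```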
[Two example chart skeletons with labeled vertical and horizontal axes: the release-slip chart carries a Benchmark line; the mean-time-to-repair chart plots Major Release 2 and Major Release 3]
BEST PRACTICES
1. Use of third party firms to assess where your software defect performance stacks up against the competition & use of industry standard databases for
software quality
2. Test Escapes Analysis Process to perform root cause analysis on all significant escapes to the field
3. SW defects reported on the dashboard include broader measures like predictability and expectations
4. Automated, integrated system for real-time metrics analysis, so that presentation to management is simply pulling up current data and reviewing it formally
5. Normalization for complexity and/or accounts in the field to ensure that proper comparisons are made
6. Create a compound metric that pulls together several important factors for the business
7. Institute metrics that show (unit and integration) statement coverage, branch coverage, all tests passing, and for functional testing, show requirements
coverage and all tests passing
8. Institute metrics that show defect backlog, number of test cases planned, and Upgrade/Update failure rate, Early Return Index, Fault Slip Through
9. Bug tool kit that goes to the field with exhaustive and searchable data to help customers avoid reporting defects, learn about workarounds, and search with Google-like strength
10. If external benchmark targets are not known, track improvement release over release
11. Focus on what is important. One participant only tracks release predictability and customer satisfaction
12. Use parametric estimation metrics – for example, 4 days per test case – to ensure high-quality, data-driven schedule estimates (this also helps demonstrate improvements over time; see the sketch below)
In benchmarking studies like this, we often see some exemplary practices that demonstrate creative and effective ways to stay ahead.
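Practice 12's parametric estimation is easy to sketch; the 4-days-per-test-case figure comes from the study, while the team size and case count below are invented.

```python
# Parametric test-schedule estimate: days of effort from the test-case count.
DAYS_PER_TEST_CASE = 4  # figure cited in the study

def test_effort_days(num_test_cases, engineers=1):
    """Data-driven estimate of calendar days needed for test execution."""
    return num_test_cases * DAYS_PER_TEST_CASE / engineers

print(test_effort_days(60, engineers=5))  # -> 48.0 days for 60 cases
```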
(On the original slide, these practices are grouped under two callouts: “Top 5 Metrics to Consider” and “Other Tips”.)
SUMMARY STATISTICS
Key Highlights:
• 8 report customer-found defects to management (the remaining 2 report customer satisfaction at a high level)
• 6 report on the order of 4 metrics to management; the remaining 4 report more or fewer
• 5 include time to market as a metric in their quality dashboard
• 4 report escapes or customer-found defects caused by bad fixes
• 4 companies have real-time visibility of metrics, automatically updated on a daily basis
• 3 companies reported on compound metrics that combine reliability, availability, and time to fix
• 3 do not use targets for metrics reported to management, but only report the improvement release to release
• 3 normalize metrics (LOC for internal metrics, units in the field for external metrics)
IMPLICATIONS
• Root cause analysis should be performed on defects from the field that are either critical or from regressions
• Many companies have special processes for doing this effectively
• It appears that some participants have higher levels of automation and coverage across unit, integration, and functional test
• And that coverage is measured
• Planning metrics, such as the number of days per test case should be used for prediction and improvement
• If you are growing, some normalization should be used
• It should be coarse (for example, Lines of Code estimated by conversion from Function Points)
• Walker Survey, Quest Database, and Manager-Tools.com are three recommended vendors for metrics and management
• Walker Survey can determine how you stack up against your competitors regarding quality and satisfaction
• Quest is a TL 9000 database
• Manager-Tools is helpful for developing QA managers
• Where absolute targets don’t exist, a target curve based on prior improvement should be used to answer ‘are we getting better?’
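Where no absolute benchmark exists, such a target curve can be extrapolated from prior releases; this sketch fits a constant improvement ratio to invented defect counts.

```python
# Sketch: derive the next release's target from the trend of prior releases.
defects = [140, 121, 105, 93]                  # prior field defects, invented
ratio = (defects[-1] / defects[0]) ** (1 / (len(defects) - 1))
target_next = defects[-1] * ratio              # extend the improvement trend
print(f"Are we getting better? Next release target: <= {target_next:.0f}")
# -> Are we getting better? Next release target: <= 81
```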
TCGen Inc.
Menlo Park, CA 94025
info@tcgen.com
+12 3456 789