OHT 21.1
Galin, SQA from theory to implementation © Pearson Education Limited 2004
Software Quality Metrics
Reading
OHT 21.2
If you can’t measure it,
you can’t manage it
Tom DeMarco, 1982
OHT 21.3
Measurement, Measures, Metrics
 Measurement
◦ The act of obtaining a measure.
 Measure
◦ A quantitative indication of the size of some product or process attribute, e.g., the number of errors.
 Metric
◦ A quantitative measure of the degree to which a system, component, or process possesses a given attribute (IEEE Software Engineering Standards, 1993), e.g., the number of errors found per person-hour expended.
OHT 21.4
What to measure
• Process
Measure the efficacy of processes: what works, what doesn't.
• Project
Assess the status of projects. Track risk. Identify problem areas. Adjust workflow.
• Product
Measure predefined product attributes.
OHT 21.5
What to measure
• Process
Measure the efficacy of processes: what works, what doesn't.
• Code quality
• Programmer productivity
• Software engineer productivity
– Requirements, design, testing, and all other tasks done by software engineers
• Software
– Maintainability, usability, and all other quality factors
• Management
– Cost estimation
– Schedule estimation, duration, time
– Staffing
OHT 21.6
Process Metrics
• Process metrics are measures of the
software development process, such as
– Overall development time
– Type of methodology used
• Process metrics are collected across all
projects and over long periods of time.
• Their intent is to provide indicators that
lead to long-term software process
improvement.
OHT 21.7
Project Metrics
• Project metrics are measures of a software project and are used to monitor and control the project. Project metrics usually show how well the project manager is able to estimate schedule and cost.
• They enable a software project manager to:
 Minimize the development time by making the adjustments necessary to avoid delays and potential problems and risks.
 Assess product cost on an ongoing basis and modify the technical approach to improve cost estimation.
OHT 21.8
Product metrics
• Product metrics are measures of the software product at any stage of its development, from requirements to installed system. Product metrics may measure:
– How easy the software is to use
– How easy the software is to maintain
– The quality of the software documentation
– And more…
OHT 21.9
Why do we measure?
• To determine the quality of a piece of software or documentation
• To determine the quality of the work of people such as software engineers, programmers, database administrators, and, most importantly, managers
• To improve the quality of a product/project/process
OHT 21.10
Why Do We Measure?
• To assess the benefits derived from new software engineering methods and tools
• To close the gap on any problems identified (e.g., through training)
• To help justify requests for new tools or additional training
OHT 21.11
Examples of Metrics Usage
• Measure the estimation skills of project managers (schedule/budget)
• Measure software engineers' requirements/analysis/design skills
• Measure programmers' work quality
• Measure testing quality
And much more…
OHT 21.12
Software quality metric – two definitions:
(1) A quantitative measure of the degree to which an item possesses a given quality attribute.
(2) A function whose inputs are software data and whose output is a single numerical value that can be interpreted as the degree to which the software possesses a given quality attribute.
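To make definition (2) concrete, here is a minimal sketch (my illustration, not from the slides) of a metric as a function from software data to a single numerical value:

```python
def errors_per_kloc(error_count: int, lines_of_code: int) -> float:
    """A metric in the sense of definition (2): the inputs are software data
    (an error count and a size measure) and the output is a single number,
    here errors per thousand lines of code."""
    return error_count / (lines_of_code / 1000)

# Example: 45 errors found in 15,000 lines of code -> 3.0 errors/KLOC
print(errors_per_kloc(45, 15_000))
```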
OHT 21.13
Objectives of software quality metrics:
1. Facilitate management control, planning and managerial intervention.
Based on:
· Deviations of actual performance from planned performance.
· Deviations of actual timetable and budget performance from planned.
2. Identify situations for development or maintenance process improvement (preventive or corrective actions).
Based on:
· Accumulation of metrics information regarding the performance of teams, units, etc.
OHT 21.14
Two classic measures of software size and development resources:
• KLOC — a classic metric that measures the size of software in thousands of lines of code.
• Number of function points (NFP) — a measure of the development resources (human resources) required to develop a program, based on the functionality specified for the software system.
OHT 21.15
Process metrics categories
• Software process quality metrics
– Error density metrics
– Error severity metrics
• Software process timetable metrics
• Software process error removal effectiveness
metrics
• Software process productivity metrics
OHT 21.16
Is KLOC enough?
• What about the number of errors (error density)?
• What about the types of errors (error severity)?
• A combination of KLOC, error density, and error severity makes a far better metric of programmers' quality of work and performance.
OHT 21.17
An example
• Two programmers on the same project are working on two similar modules.
• Programmer A produced 342 errors during the software process, before release.
• Programmer B produced 184 errors.
• Which programmer do you think is better?
OHT 21.18
An example
• It really depends on the types of errors found (severity), not only on the number of errors (density).
• One high-severity error might be more important than hundreds of errors of other types.
OHT 21.19
Code   Name                                              Calculation formula
CED    Code Error Density                                CED = NCE / KLOC
DED    Development Error Density                         DED = NDE / KLOC
WCED   Weighted Code Error Density                       WCED = WCE / KLOC
WDED   Weighted Development Error Density                WDED = WDE / KLOC
WCEF   Weighted Code Errors per Function Point           WCEF = WCE / NFP
WDEF   Weighted Development Errors per Function Point    WDEF = WDE / NFP

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.
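A minimal Python sketch of these formulas (my illustration, not the book's code). The severity weighting used to derive WCE/WDE is an assumption here, foreshadowing the weights (1, 3, 15) used in the example on OHT 21.30:

```python
# Hypothetical severity weights; OHT 21.30 later uses 1/3/15 for low/medium/high.
SEVERITY_WEIGHTS = {"low": 1, "medium": 3, "high": 15}

def weighted_errors(errors_by_severity: dict[str, int]) -> int:
    """WCE (or WDE): weighted total of errors, given a count per severity class."""
    return sum(SEVERITY_WEIGHTS[sev] * n for sev, n in errors_by_severity.items())

def error_density(n_errors: float, kloc: float) -> float:
    """CED = NCE/KLOC, DED = NDE/KLOC; with weighted counts, WCED or WDED."""
    return n_errors / kloc

def errors_per_function_point(weighted: float, nfp: int) -> float:
    """WCEF = WCE/NFP, WDEF = WDE/NFP."""
    return weighted / nfp

# Example: 130 raw code errors in 20 KLOC -> CED = 6.5 errors/KLOC
print(error_density(130, 20))
```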
OHT 21.20
Code   Name                                      Calculation formula
ASCE   Average Severity of Code Errors           ASCE = WCE / NCE
ASDE   Average Severity of Development Errors    ASDE = WDE / NDE

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.
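The same idea as a sketch (mine, not from the source), using the toy numbers from the density example above:

```python
def average_severity(weighted_total: float, n_errors: int) -> float:
    """ASCE = WCE/NCE; ASDE = WDE/NDE."""
    return weighted_total / n_errors

# Example: WCE = 310 from NCE = 130 raw errors -> ASCE ~ 2.38
print(average_severity(310, 130))
```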
OHT 21.21
Code   Name                                     Calculation formula
TTO    Time Table Observance                    TTO = MSOT / MS
ADMC   Average Delay of Milestone Completion    ADMC = TCDAM / MS

MSOT = number of milestones completed on time.
MS = total number of milestones.
TCDAM = total completion delay (days, weeks, etc.) for all milestones.
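A sketch of both timetable metrics (mine, not from the source), representing each milestone by its completion delay:

```python
def tto(delays: list[float]) -> float:
    """Time Table Observance: MSOT / MS, the fraction of milestones on time."""
    msot = sum(1 for d in delays if d == 0)
    return msot / len(delays)

def admc(delays: list[float]) -> float:
    """Average Delay of Milestone Completion: TCDAM / MS."""
    return sum(delays) / len(delays)
```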
OHT 21.22
Time Table Metric Example
• TTO
– Milestones are Requirements, Analysis, Design, Implementation, and Testing
– Milestones completed on time are Requirements and Analysis only
– TTO = 2/5 = 0.4
OHT 21.23
Time Table Metric Example
• ADMC
– Requirements (one week delay), Analysis (three weeks delay), Design (two weeks delay), Implementation (six weeks delay), and Testing (two weeks delay)
– Total delay is 14 weeks
– ADMC = 14/5 = 2.8 weeks
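Running the sketch from OHT 21.21 on these two examples. Note the TTO slide and the ADMC slide describe different data (the TTO slide treats Requirements and Analysis as on time), so the TTO delays below are hypothetical:

```python
# Hypothetical delays with Requirements and Analysis on time, as in the TTO example.
tto_delays = [0, 0, 3, 6, 2]
print(tto(tto_delays))      # 2/5 = 0.4

# Delays from the ADMC example: 1 + 3 + 2 + 6 + 2 = 14 weeks in total.
admc_delays = [1, 3, 2, 6, 2]
print(admc(admc_delays))    # 14/5 = 2.8 weeks
```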
OHT 21.24
Limitations of software quality metrics:
* Budget constraints in allocating the necessary resources.
* Human factors, especially opposition of employees to the evaluation of their activities.
* Validity: uncertainty regarding the data, due to partial and biased reporting.
OHT 21.25
Examples of metrics
Requirements
• Number of requirements that change during the rest of the software development process
– If a large number changed during specification, design, etc., something is wrong in the requirements phase and with the quality of the requirements engineers' work.
– The fewer the requirement changes, the better the quality of the requirements document.
OHT 21.26
Examples of metrics
Inspection
The number of faults found during inspection can be used as a metric for the quality of the inspection process.
OHT 21.27
Examples of metrics
Testing
– Number of test cases executed
– Number of bugs found per thousand lines of code
– And more possible metrics
OHT 21.28
Examples of metrics
Maintainability metrics
– Total number of faults reported
– Classification by severity and fault type
– Status of fault reports (reported/fixed)
– Detection and correction times
OHT 21.29
Examples of metrics
Reliability quality factor
– Count of number of system failure (System
down)
– Total of minutes/hours per week or month
OHT 21.30
An example
Programmers' productivity using error severity and density
• We will consider both error density and error severity.
• Assume three levels of severity:
– Low severity error: weight is 1
– Medium severity error: weight is 3
– High severity error: weight is 15
• Two programmers each produced 3 KLOC per day.
• After inspection, we found 100 low severity errors, 20 medium severity errors, and 10 high severity errors in the code of Programmer 1,
• and 300 low severity errors, 10 medium severity errors, and 2 high severity errors in the code of Programmer 2.
Which programmer has the higher code quality?
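One way to answer, using the weighted code error density (WCED) from OHT 21.19; the arithmetic below is mine, while the weights and error counts come from the slide:

```python
weights = {"low": 1, "medium": 3, "high": 15}

def wce(errors: dict[str, int]) -> int:
    """Weighted total of code errors for one programmer."""
    return sum(weights[s] * n for s, n in errors.items())

p1 = {"low": 100, "medium": 20, "high": 10}   # WCE = 100 + 60 + 150 = 310
p2 = {"low": 300, "medium": 10, "high": 2}    # WCE = 300 + 30 + 30 = 360

kloc = 3  # each programmer produced 3 KLOC
print(wce(p1) / kloc)   # WCED ~ 103.3 weighted errors per KLOC
print(wce(p2) / kloc)   # WCED = 120.0 weighted errors per KLOC
```

By this measure Programmer 1's code quality is higher (lower weighted error density), even though Programmer 1 produced more high-severity errors; a different weighting scheme could flip the conclusion.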
OHT 21.31
Other Metrics
 Understanding
 Learning time: time for a new user to gain a basic understanding of the features of the software
 Ease of learning
 Learning time: time for a new user to learn how to perform the basic functions of the software
 Operability
 Operation time: time required for a user to perform operation(s) of the software
 Usability
 Human factors: number of negative comments from new users regarding ergonomics, human factors, etc.
OHT 21.32
Other Metrics
 Number and type of defects found during requirements, design, code, and test inspections
 Number of pages of documentation delivered
 Number of new source lines of code created
 Number of source lines of code delivered
 Total number of source lines of code delivered
 Average complexity of all modules delivered
 Average size of modules
 Total number of modules
 Total number of bugs found as a result of unit testing
 Total number of bugs found as a result of integration testing
 Total number of bugs found as a result of validation testing
 Productivity, as measured by KLOC per person-hour
OHT 21.33
Examples (continued)
• Metrics for the analysis model
• Metrics for the design model
• Metrics for the source code
• Average find-fix cycle time
• Number of person-hours per inspection
• Number of person-hours per KLOC
• Average number of defects found per inspection
OHT 21.34
Examples (continued)
 Average find-fix cycle time
 Number of person-hours per inspection
 Number of person-hours per KLOC
 Average number of defects found per inspection
 Number of defects found during inspections in each defect category
 Average amount of rework time
 Percentage of modules that were inspected
OHT 21.35
Exercise
How can you measure the quality of:
• Project manager skills
• Requirements document
• Software development plan
• User manuals
• Programmer coding skills
• Design specification
OHT 21.36
ADMC Exercise
• An SDLC that consists of six phases:
• Requirements (3 weeks delay)
• Analysis (2 weeks delay)
• Logical design (0 weeks delay)
• Physical design (4 weeks delay)
• Implementation (10 weeks delay)
• Testing (6 weeks delay)
What is the ADMC?
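A worked solution (mine, not from the slides), reusing the admc() sketch from OHT 21.21:

```python
delays = [3, 2, 0, 4, 10, 6]   # weeks of delay for the six SDLC phases above
print(admc(delays))            # (3 + 2 + 0 + 4 + 10 + 6) / 6 = 25/6 ~ 4.17 weeks
```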