Kiran Lata Gangwar
Tanmi Kapoor
M.Tech (S.E.)

IEEE Standard for a Software Quality Metrics Methodology
• Sponsor: Software Engineering Standards Committee of the IEEE Computer Society
• Approved 8 December 1998 by the IEEE-SA Standards Board
1.1 Identify a list of possible quality requirements:
Use:
• organizational experience
• required standards
• regulations
• laws
Consider:
• contractual requirements
• cost or schedule constraints
• warranties
• customer metrics requirements
• organizational self-interest
1.2 Determine the list of quality requirements:
• Survey all involved parties
• Create the list of quality requirements
1.3 Quantify each quality factor:
• For each quality factor, assign one or more direct metrics to represent the quality factor
• Assign direct metric values to serve as quantitative requirements for that quality factor
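As a rough illustration of 1.3 (not part of the standard; the quality factors, metric names, target values, and units below are invented), assigning direct metrics with target values to quality factors might look like the following sketch:

```python
# Hypothetical illustration only: quality factors quantified by direct metrics
# with target values. Factor names, metric names, targets, and units are invented.
from dataclasses import dataclass

@dataclass
class DirectMetric:
    name: str      # name of the direct metric representing the factor
    target: float  # direct metric value serving as the quantitative requirement
    unit: str      # unit of measure

quality_requirements = {
    "reliability":     [DirectMetric("mean_time_to_failure", 500.0, "hours")],
    "usability":       [DirectMetric("mean_time_to_learn", 8.0, "hours")],
    "maintainability": [DirectMetric("mean_time_to_repair", 4.0, "hours")],
}

for factor, metrics in quality_requirements.items():
    for m in metrics:
        print(f"{factor}: {m.name} target = {m.target} {m.unit}")
```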
• Identify tools
• Describe data storage procedures
• Establish a traceability matrix
• Identify the organizational entities that will participate in data collection and the entity responsible for monitoring it
• Describe the training and experience required for data collection and the training process for the personnel involved
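A minimal sketch of how such a plan might be recorded; the metric, data items, tool, and organizational entities named here are hypothetical, not prescribed by the standard:

```python
# Hypothetical sketch of a traceability matrix for the data collection plan.
# Tool names, data items, and organizational entities are invented examples.
traceability_matrix = [
    {
        "metric": "mean_time_to_failure",
        "data_items": ["failure_timestamps", "execution_hours"],
        "collection_tool": "test-log parser",          # assumed tool
        "storage": "project metrics database",
        "collecting_entity": "test team",
        "monitoring_entity": "SQA group",
        "training_required": "tool usage, failure classification",
    },
]

for row in traceability_matrix:
    print(f"{row['metric']} <- {', '.join(row['data_items'])} "
          f"(collected by {row['collecting_entity']}, "
          f"monitored by {row['monitoring_entity']})")
```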
• Test the data collection and metrics computation procedures on selected software that will act as a prototype
• Select samples that are similar to the project(s) on which the metrics will be used
• Examine the cost of the measurement process for the prototype to verify or improve the cost analysis
• Use the results collected from the prototype to improve the metric descriptions and the descriptions of data items
• Using the formats in the table, collect and store data in the project metrics database at the appropriate time in the life cycle
• Check the data for accuracy and proper unit of measure
• Monitor the data collection
• Check for uniformity of data if more than one person is collecting it
• Compute the metric values from the collected data
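A minimal sketch of these collection and computation steps, assuming a simple in-memory store; the metric (defect density), the components, and the unit of measure are invented for illustration:

```python
# Hypothetical sketch: store collected data items, check units, and compute
# a metric value. The metric (defect density) and its unit are example choices.
records = [
    {"component": "parser",  "defects": 12, "ksloc": 4.0, "unit": "defects/KSLOC"},
    {"component": "planner", "defects": 3,  "ksloc": 2.5, "unit": "defects/KSLOC"},
]

EXPECTED_UNIT = "defects/KSLOC"

def check(record):
    # Check accuracy and proper unit of measure before the data is used.
    if record["unit"] != EXPECTED_UNIT:
        raise ValueError(f"unexpected unit for {record['component']}: {record['unit']}")
    if record["ksloc"] <= 0:
        raise ValueError(f"invalid size for {record['component']}")

def defect_density(record):
    return record["defects"] / record["ksloc"]

for r in records:
    check(r)
    print(f"{r['component']}: defect density = {defect_density(r):.2f} {EXPECTED_UNIT}")
```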
• Interpret and record the results
• Analyze the differences between the collected metric data and the target values
• Investigate significant differences
• Unacceptable quality may be manifested as:
  - excessive complexity
  - inadequate documentation
  - lack of traceability
  - other undesirable attributes
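A minimal sketch of the comparison, with invented metric values, targets, and a hypothetical 10% tolerance for what counts as a significant difference worth investigating:

```python
# Hypothetical sketch: compare collected metric values with target values and
# flag significant deviations for investigation. Values and the 10% tolerance
# are invented for illustration.
targets   = {"cyclomatic_complexity": 10.0, "comment_ratio": 0.20}
collected = {"cyclomatic_complexity": 14.0, "comment_ratio": 0.19}

TOLERANCE = 0.10  # assumed acceptable relative deviation from the target

for name, target in targets.items():
    value = collected[name]
    deviation = (value - target) / target
    status = "investigate" if abs(deviation) > TOLERANCE else "acceptable"
    print(f"{name}: collected={value}, target={target}, "
          f"deviation={deviation:+.0%} -> {status}")
```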
• Use validated metrics during development to make predictions of direct metric values
• Make predictions for software components and process steps
• Analyze in detail the software components and process steps whose predicted direct metric values deviate from the target values
• Use direct metrics to ensure compliance of software products with quality requirements during system and acceptance testing
• Use direct metrics for software components and process steps, and compare these metric values with the target values of the direct metrics
• Classify software components and process steps whose measurements deviate from the target values as noncompliant
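A minimal sketch of this use of a previously validated predictive metric: predict a direct metric value for each component, then classify deviations from the target as noncompliant. The linear relationship, coefficients, target value, and component data are all invented:

```python
# Hypothetical sketch: use a validated predictive metric to predict a direct
# metric value per component, then classify each component as compliant or
# noncompliant against the target. Coefficients and data are invented.
def predict_mttf(complexity):
    # Assumed linear relationship obtained from a prior validation exercise.
    return 900.0 - 40.0 * complexity   # predicted mean time to failure (hours)

TARGET_MTTF = 500.0  # hours, the quantitative requirement for reliability

components = {"parser": 7.0, "planner": 12.0, "ui": 15.0}  # measured complexity

for name, complexity in components.items():
    predicted = predict_mttf(complexity)
    compliant = predicted >= TARGET_MTTF
    label = "compliant" if compliant else "noncompliant - analyze in detail"
    print(f"{name}: predicted MTTF = {predicted:.0f} h -> {label}")
```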
• The purpose of metrics validation is to identify both product and process metrics that can predict specified quality factor values, which are quantitative representations of quality requirements
• Metrics shall indicate whether quality requirements have been achieved or are likely to be achieved in the future
• For the purpose of assessing whether a metric is valid, the following thresholds shall be designated:
  - V: square of the linear correlation coefficient
  - B: rank correlation coefficient
  - A: prediction error
  - α: confidence level
  - P: success rate
• These thresholds are applied when testing a metric against the validity criteria:
  a) Correlation
  b) Tracking
  c) Consistency
  d) Predictability
  e) Discriminative power
  f) Reliability
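A minimal sketch of two of these checks, correlation (square of the linear correlation coefficient against V) and predictability (prediction error against A), using only the Python standard library. The sample data, predictions, and threshold values are invented, and the remaining criteria (tracking, consistency, discriminative power, and reliability at confidence level α) are omitted for brevity:

```python
# Hypothetical sketch of two validity tests: correlation (R^2 vs. threshold V)
# and predictability (relative prediction error vs. threshold A).
# Sample data, predictions, and threshold values are invented for illustration.
import math

factor_values = [0.95, 0.90, 0.80, 0.75, 0.60]   # quality factor sample
metric_values = [0.92, 0.88, 0.83, 0.70, 0.65]   # metric sample, same domain

V = 0.7   # threshold for the square of the linear correlation coefficient
A = 0.2   # threshold for the prediction error

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r_squared = pearson(metric_values, factor_values) ** 2
print(f"correlation: R^2 = {r_squared:.3f} "
      f"({'passes' if r_squared > V else 'fails'} threshold V = {V})")

# Predictability: compare predicted factor values with the actual ones.
predicted = [0.93, 0.87, 0.78, 0.72, 0.63]        # invented predictions
errors = [abs(p - a) / a for p, a in zip(predicted, factor_values)]
max_error = max(errors)
print(f"predictability: max relative error = {max_error:.3f} "
      f"({'passes' if max_error < A else 'fails'} threshold A = {A})")
```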
5.3.1 Identify the quality factors sample
• A sample of quality factors shall be drawn from the metrics database
5.3.2 Identify the metrics sample
• A sample from the same domain (e.g., the same software components) as used in 5.3.1 shall be drawn from the metrics database
5.3.3 Perform a statistical analysis
• The analysis described in 5.2 shall be performed
• Before a metric is used to evaluate the quality of a product or process, it shall be validated against the criteria described in 5.2. If a metric does not pass all of the validity tests, it shall only be used according to the criteria prescribed by those tests
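A minimal sketch of 5.3.1 through 5.3.3, assuming a flat in-memory metrics database (invented): draw matched quality-factor and metric samples for the same components, then apply the analysis of 5.2 (for example, the correlation and prediction-error checks sketched earlier):

```python
# Hypothetical sketch: draw matched quality-factor and metric samples for the
# same components from an invented metrics database, so the statistical
# analysis of 5.2 can be applied to paired observations.
metrics_db = [
    {"component": "parser",  "quality_factor": 0.95, "metric": 0.92},
    {"component": "planner", "quality_factor": 0.80, "metric": 0.83},
    {"component": "ui",      "quality_factor": 0.60, "metric": 0.65},
    {"component": "driver",  "quality_factor": None, "metric": 0.71},  # no factor value
]

# Keep only components for which both samples exist (same domain).
pairs = [(row["quality_factor"], row["metric"])
         for row in metrics_db
         if row["quality_factor"] is not None and row["metric"] is not None]

factor_sample = [f for f, _ in pairs]
metric_sample = [m for _, m in pairs]
print("paired observations for validation:", list(zip(factor_sample, metric_sample)))
```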
5.3.4 Document the results
• Documented results shall include, as a minimum, the direct metric, the predictive metric, the validation criteria, and the numerical results
5.3.5 Revalidate the metrics
• A validated metric may not necessarily be valid in other environments or future applications. Therefore, a predictive metric shall be revalidated before it is used for another environment or application
5.3.6 Evaluate the stability of the environment
• Metrics validation shall be undertaken in a stable development environment (i.e., one in which the design language, implementation language, and project development tools do not change over the life of the project in which validation is performed)
