CONCORDIA UNIVERSITY
DEPARTMENT OF COMPUTER SCIENCE AND SOFTWARE ENGINEERING
SOFTWARE PROJECT MANAGEMENT (SOEN 6841)
Fall 2015
INSTRUCTOR: DR. OLGA ORMANDJIEVA
IMPLEMENTING GOAL DRIVEN MEASUREMENT
CREATED AND DEVELOPED BY
Bhawna Sharma 27568789
Ehab Amar 26080421
Hammad Ali 27147074
Mudasser Akbar Muneer Ahmed 27654774
Muhammad Usman 27253443
Shahroze Jamil 27240570
Contents
Contents
List of Figures
List of Tables
1. Abstract
2. Introduction
3. Business Goal
4. Step 1 - Brainstorming Additional Sub-goal Statements
4.1 Grouping of Entities Managed by Project Manager by Theme
4.2 Prioritized Sub-goals
5. Step 2 - Operationalized Sub-goals
6. Step 3 - Success Criteria and Indicators
6.1 Defect Propagation Indicator
6.2 Increase in Mean Time Between Failures (MTBF) Indicator
6.3 Maintainability Indicator
6.4 Test Case Progress Indicator
7. Step 4 - Sub-goals to Strategies and Activities
Strategy 1
Strategy 2
Strategy 3
Strategy 4
Summary
8. Step 5 - Worksheet: Mapping of Required Data Elements to Indicators
8.1 Worksheet: Required Data Elements, Availability, and Source
9. Step 6 - Planning Tasks
10. Conclusion
11. Appendix A: Definitions
12. Glossary
13. References
List of Figures
Figure 1: Defects in different phases of the project
Figure 2: Mean time between failures
Figure 3: Quality indicator
Figure 4: Test cases progress chart
List of Tables
Table 1: Quality indicators
Table 2: Maintainability indicator
Table 3: Test case execution status sheet
Table 4: Test cases progress chart
1. Abstract
Quality is characterized in terms of the relevant properties of a particular project and any related product(s), possibly in both quantitative and qualitative terms. The purpose of this report is to identify steps to improve the quality of completed projects within twelve weeks. Commonwealth Software, Inc. (CSI) is a software company that has been in existence for ten years; its primary business is building and maintaining e-commerce websites. This report gives a brief overview of how CSI can improve the quality of its completed projects. The document is organized around the six steps of the goal-driven measurement method, beginning with brainstorming additional sub-goal statements, followed by identifying the object of interest, purpose, quality focus and perspective, and environment and constraints for two sub-goals selected in the initial step. Continuing with the success criteria and indicators, the document then focuses on mapping the sub-goals to strategies and activities.
2. Introduction
This report details a Goal-Driven Measurement analysis performed for Commonwealth Software Inc. (CSI), carried out to help the development project manager implement the organizational business goal of demonstrating a measurable improvement in the quality of completed products within twelve weeks.
The analysis was carried out in six steps. The first step examines the entities managed by the development project manager and identifies potential sub-goals that support the organizational business goal. The second step formalizes the two highest-priority sub-goals. The third step identifies the success criteria and indicators for the sub-goal that concentrates on improving the reliability of the product. The fourth step details strategies and activities to achieve this sub-goal. The fifth step applies the Goal-Question-Indicator-Metric (GQIM) analysis process to identify quantifiable questions associated with the activity of integrating Test-Driven Development (TDD) into the software life cycle, together with indicators that help address these questions. Finally, the sixth step identifies the data elements that must be gathered to construct the success criteria, progress, and assessment indicators.
3. Business Goal
Show a measurable improvement (a quantifiable change) in the quality of completed products within twelve weeks.
4. Step 1 - Brainstorming Additional Sub-goal Statements
The objective of this step is to translate the organizational business goal into sub-goals related to the entities and activities that are managed from a particular management perspective.
Inputs and Resources
Entities managed by the project manager and the questions related to improvement of quality:

Developers:
● How manageable are the developers as a team?
● Do developers follow programming practices and organization standards?
● Do the teams have the specific skills needed to perform the tasks?
● Are developers aware of business goals?
● Is a sufficient number of developers working on the assigned task(s)?
● Are the developers experienced enough to accomplish the task(s) assigned?

Quality Assurance team:
● Is the team following CMMI process standards?
● Are the test cases sufficient to test all functions of the software?
● Are the test cases themselves reliable (defect-free)?
● Are test cases executed automatically or manually?
● What testing tools are used?
● What types and levels of testing are applied?
Products and by-products
Documentation:
● Are documents reviewed to ensure quality?
● Are document audits effective?
● Who performs document audits?
● Are checklists used, and are they effective?
● Is a formal review procedure defined?
● Are documented features traceable?

Pre-releases and source code:
● Is source code audited?
● How effective are code audits?
● Who performs code audits?
● Are code audit checklists used, and are they effective?
● Are implemented features traceable to requirements, design, and test case artifacts?
● Is the compiler used effectively?
● Does the compiler detect language syntax and variable-definition defects in the code?
● Is the source code error-free?
● Does the source code follow organizational programming standards?
● Are any automated checks run on the source code?
Project Plan:
● Are goal-driven methods used to meet cost variance limits?
● How effective are the estimation techniques chosen for the project?
● What estimation techniques are being used?
● What estimation tools can be used?
● Are milestones achievable within the deadlines?
● Is the project scope realistic?
● Is the estimated effort achievable?
Test Plan and strategy:
● Is the test scope well defined?
● Is the test scope sufficient?
● Are test cases effective in detecting defects?
● Are test cases cost-effective?
● Is test case execution automated?
● Are test tools used in testing?
● What is the test plan, and what quality considerations does it address?
● What are the test types and levels?
● Are the test cases reviewed?
● What basis/design is used to create the test cases?
● Who writes the test cases?
● Are test cases efficient and effective?
● Are the test cases traceable back to design and requirements?
Activities and Flow paths
Effort estimation and schedule estimation of software:
● What estimation methods are we using?
● What are the tools to be used for effort estimation?
● How realistic is the project schedule?
● Are estimated risks considered when allocating resources?
● How effective are the techniques that have been selected?
Product Development:
● What development life cycle is used?
● Is the selected development life cycle different from the previously used life
cycle?
● Will test driven development be considered as part of the life cycle?
● Are the project requirements and specification reviewed by the Quality Assurance team?
● How well has the requirement been documented?
● What tools will be used for debugging?
● Is the customer involved during the development process?
Software Process Testing:
● What tools are used to manage the testing process?
● What are the testing environments that would be tested in?
● How will the defects be tracked?
● How will the detected defects be handled?
● What testing mechanism will be used for each process step?
● How will the severity of defects be handled?
● Are the defects categorized?
● Will the fixed defects be tested again to check if they are fixed?
● To what extent will the process be tested?
● How much time would it take to test a process before its release?
Risk Management:
● Who will be responsible for identifying, modifying, and controlling the product?
● How will the risks be identified?
● How will the severity of each risk be ranked?
● Are there any pre-defined risks that might be associated with the product?
● How will the identified risks be handled?
● What techniques or methodologies are available or inherited to address a particular type of risk?
● Is there a time limit for resolving a risk?
● Will the identified risks be shared with the stakeholders?
● Who will manage the identified risks?
Internal Artifacts
Customer change requests (work in process):
● How large is our backlog of customer change requests?
● Where are the backlogs occurring?
4.1 Grouping of Entities managed by project manager by Theme
Questions related to improvement of quality
Group #1
Documents
● Are documents reviewed to ensure quality?
● Are document audits effective?
● Who performs document audits?
● Are checklists used, and are they effective?
● Is a formal review procedure defined?
● Are documented features traceable?
● How well have the requirements been documented?
Group #2
Project Management
● Are goal-driven methods used to meet cost variance limits?
● How effective are the estimation techniques chosen for the project?
● What estimation techniques are being used?
● What estimation tools can be used?
● Are milestones achievable within the deadlines?
● Is the project scope realistic?
● Is the estimated effort achievable?
● What are the tools to be used for effort estimation?
● How realistic is the project schedule?
● How effective are the techniques that have been selected?
Group #3
Quality Management
● Is the test scope well defined?
● Is the test scope sufficient?
● Are test cases effective in detecting defects?
● Are test cases cost-effective?
● Is test case execution automated?
● Are test tools used in testing?
● What is the test plan, and what quality considerations does it address?
● What are the test types and levels?
● Are the test cases reviewed?
● What basis/design is used to create the test cases?
● Who writes the test cases?
● Are test cases efficient and effective?
● Are the test cases traceable back to design and requirements?
● Are the test cases sufficient to test all functions of the software?
● Are the test cases themselves reliable (defect-free)?
● Are test cases executed automatically or manually?
● What testing tools are used?
● What types and levels of testing are applied?
● Are the project requirements and specification reviewed by the Quality Assurance team?
● What tools are used to manage the testing process?
● What are the testing environments that would be tested in?
● How will the defects be tracked?
● How will the detected defects be handled?
● What testing mechanism will be used for each process step?
● How will the severity of defects be handled?
● Are the defects categorized?
● Will the fixed defects be tested again to check if they are fixed?
● To what extent will the process be tested?
● How much time would it take to test a process before its release?
Group #4
Software Product
● Is source code audited?
● How effective are code audits?
● Who performs code audits?
● Are code audit checklists used, and are they effective?
● Are implemented features traceable to requirements, design, and test case artifacts?
● Is the compiler used effectively?
● Does the compiler detect language syntax and variable-definition defects in the code?
● Is the source code error-free?
● Does the source code follow organizational programming standards?
● Are any automated checks run on the source code?
Group #5
Development
● Are documents reviewed to ensure quality?
● Are document audits effective?
● Who performs document audits?
● Are checklists used, and are they effective?
● Is a formal review procedure defined?
● Are documented features traceable?
● What development life cycle is used?
● Is the selected development life cycle different from the previously used life cycle?
● Will test-driven development be considered as part of the life cycle?
Group #6
Risk Management
● Who will be responsible for identifying, modifying, and controlling the product?
● How will the risks be identified?
● How will the severity of each risk be ranked?
● Are there any pre-defined risks that might be associated with the product?
● How will the identified risks be handled?
● What techniques or methodologies are available or inherited to address a particular type of risk?
● Is there a time limit for resolving a risk?
● Will the identified risks be shared with the stakeholders?
● Who will manage the identified risks?
● Are estimated risks considered when allocating resources?
Group #7
Change Management
● How large is our backlog of customer change requests?
● Where are the backlogs occurring?
4.2 Prioritized Sub-goals
1. Improve maintainability of the code.
2. Improve the reliability of software releases.
3. Improve the portability and interoperability of software releases.
4. Reduce the defect density of software releases.
5. Step 2 - Operationalized Sub-goals
From the sub-goals listed above, we have chosen the two below to expand into structured statements that highlight their key components (object of interest, purpose, quality focus and perspective, environment and constraints).
Sub Goal 1: Improve the reliability of software releases
Perspective: Software development manager
Object of interest: Software release
Purpose: Check the probability of failure over a specific period of time in current software releases in order to improve it.
Quality focus and perspective: Examine the reasons for failure from the perspective of the development project manager.
Environment and constraints: Organizational standards and structure, development, project plan, available resources and tools, criteria for reliability certification.

Sub Goal 2: Improve maintainability of the code
Perspective: Software development manager
Object of interest: Code quality
Purpose: Observe the changeability, analyzability, testability, and stability of the code.
Quality focus and perspective: Examine the causes of poor code structure that make it hard to add a new feature or to change functionality.
Environment and constraints: Activities of development team members, organization structure, software development, available resources, number of developers and testers, cost of making changes, required skills.
6. Step 3 - Success Criteria and Indicators
The main objective in this step is to develop success criteria and indicators for one of the
operationalized goals outlined in step 2.
Operationalized Measurement Goal selected: Improve the reliability of software releases.
Q1: What is the rate of reduction in defect propagation throughout the development of the software product?
Q2: What is the rate of increase in mean time between failures for the product under development?
Q3: What percentage of the classes are in the excellent, good, fair, and poor zones?
Q4: What is the rate of increase in planned coverage, actual coverage, and planned and actual OK rate per test case?
The success criteria and indicators outlined in the sections below depict the outputs of the goal-driven measurement analysis, which are evaluated quantitatively to improve the reliability of the released software.
6.1 Defect Propagation Indicator
Success Criteria: A 60% reduction in defect propagation throughout the development of the software product.
Note: All defects are extracted from the Data Defect Tracker (DDT).
Indicator: The defect propagation indicator provides a means to assess the number and type of high- and medium-severity defects detected at every phase of product development. The assumption is that a 60% reduction is achieved from each phase to the next, and that steps are taken to increase the reliability of the product under development.
Figure 1: Defects in different phases of the project
The pie chart shows four project phases and the severity of the defects found during each phase.
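To make this criterion concrete, the following minimal sketch (a hypothetical class; the defect counts are illustrative rather than actual DDT data) shows how the phase-over-phase reduction could be computed and compared against the 60% target.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Minimal sketch: checks the 60% phase-over-phase defect reduction criterion. */
public class DefectPropagationCheck {

    public static void main(String[] args) {
        // High- plus medium-severity defects detected per phase (illustrative counts).
        Map<String, Integer> defectsByPhase = new LinkedHashMap<>();
        defectsByPhase.put("Requirements", 100);
        defectsByPhase.put("Design", 40);
        defectsByPhase.put("Implementation", 15);
        defectsByPhase.put("Testing", 5);

        String previousPhase = null;
        int previousCount = 0;
        for (Map.Entry<String, Integer> e : defectsByPhase.entrySet()) {
            if (previousPhase != null) {
                double reduction = 100.0 * (previousCount - e.getValue()) / previousCount;
                System.out.printf("%s -> %s: %.1f%% reduction (target 60%%): %s%n",
                        previousPhase, e.getKey(), reduction,
                        reduction >= 60.0 ? "met" : "not met");
            }
            previousPhase = e.getKey();
            previousCount = e.getValue();
        }
    }
}
```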
6.2 Increase in mean time between failures (MTBF) Indicator
Success Criteria: A 15% improvement in the MTBF of the product under development.
Defect Indicator        HRD  MRD  HDD  MDD  HID  MID  HTD  MTD
Requirements phase       60   40  N/A  N/A  N/A  N/A  N/A  N/A
Design phase             20   12  160   65  N/A  N/A  N/A  N/A
Implementation phase      8    0   65   12   45   25  N/A  N/A
Testing phase             0    0    0    0   20    7  N/A  N/A
Delivery                  0    0    0    0    0    0    0    0
Indicator: The increase in mean time between failures indicator provides management with a means to assess the current mean time between failures against the 15% improvement objective throughout the development life cycle, as a function of the recorded MTBF and test time of the software product under development.
This indicator enables management to determine whether corrective actions need to be implemented in order to increase the MTBF and achieve the departmental objective of increasing the reliability of the product under development.
Figure 2: Mean time between failures (current MTBF and the 15% target, in hours, plotted against testing time in hours)
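The following minimal sketch shows how the MTBF comparison behind this indicator could be computed, using the formula noted in Appendix A and Section 8.1 (total up time divided by the number of breakdowns); the class name and the hour and breakdown figures are illustrative assumptions.

```java
/** Minimal sketch: computes MTBF and checks the 15% improvement target. */
public class MtbfIndicator {

    /** MTBF = total up time / number of breakdowns (see Appendix A). */
    static double mtbf(double totalUpTimeHours, int breakdowns) {
        return breakdowns == 0 ? Double.POSITIVE_INFINITY
                               : totalUpTimeHours / breakdowns;
    }

    public static void main(String[] args) {
        double initialMtbf = mtbf(3000.0, 12);   // illustrative baseline
        double currentMtbf = mtbf(3500.0, 11);   // illustrative current release
        double target = initialMtbf * 1.15;      // 15% improvement objective

        System.out.printf("Initial MTBF: %.1f h, current: %.1f h, target: %.1f h%n",
                initialMtbf, currentMtbf, target);
        System.out.println(currentMtbf >= target ? "Target met"
                                                 : "Corrective action needed");
    }
}
```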
6.3 Maintainability Indicator
Success Criteria: Increase the maintainability of the software under development by improving factors such as changeability, testability, stability, and analyzability. 75% of the classes must be in the excellent zone, 20% in good, 5% in fair, and 0% in poor.
Indicator: The values obtained from the Logiscope tool serve as an indicator that helps analyze these factors and generate the corresponding analysis sub-indicators and indicators. This indicator helps assess the code produced at the class level. McCabe is another tool that can be used to pursue reliability and maintainability goals. The ratings are as defined in the success criteria. This helps improve the testability, changeability, and analyzability of the code produced over the next milestones.
Maintainability is defined as the probability of performing a successful repair action within a given time. In other words, maintainability measures the ease and speed with which a system can be restored to operational status after a failure occurs.
Maintainability = Testability + Stability + Analyzability + Changeability
(Definition of maintainability)
Assumed values give the following indicators to be shown to management.
Class Criteria (M1)   Excellent (%)  Good (%)  Fair (%)  Poor (%)
Analyzability                     0        65        35         0
Changeability                    95         2         0         0
Stability                        65        35         0         0
Testability                      80        20         0         0
Table 1: Quality indicators
Class Factor (M1)   Excellent  Good  Fair  Poor
Maintainability            90    10     0     0
Table 2: Maintainability indicator
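The following minimal sketch (hypothetical class name; the measured values are taken from Table 2 and the targets from the success criteria above) shows how the zone distribution could be checked programmatically.

```java
import java.util.Map;

/** Minimal sketch: compares measured class-zone percentages with the success criteria. */
public class MaintainabilityZones {

    public static void main(String[] args) {
        // Success criteria from Step 3: 75% excellent, 20% good, 5% fair, 0% poor.
        Map<String, Double> required = Map.of(
                "Excellent", 75.0, "Good", 20.0, "Fair", 5.0, "Poor", 0.0);
        // Measured maintainability distribution (Table 2).
        Map<String, Double> measured = Map.of(
                "Excellent", 90.0, "Good", 10.0, "Fair", 0.0, "Poor", 0.0);

        // At least the required share must be excellent; no more than allowed may be poor.
        boolean excellentOk = measured.get("Excellent") >= required.get("Excellent");
        boolean poorOk = measured.get("Poor") <= required.get("Poor");
        System.out.println("Excellent zone criterion met: " + excellentOk);
        System.out.println("Poor zone criterion met: " + poorOk);
    }
}
```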
Figure 3: Quality indicator (Excellent/Good/Fair/Poor distributions for analyzability, changeability, stability, testability, and maintainability)
6.4 Test Case Progress Indicator
Success Criteria: The success criterion for the test case progress indicator is the number of defects that have been recorded in a document or database and whose status after testing is Closed. For example, if 100 defects were recorded in an Excel sheet, with 45 open and 55 closed, then the number of closed defects is the success criterion, at a rate of 55%.
The test case progress indicator gives insight into testing progress and defect tracking and makes improvements visible. It tracks test coverage and a key performance indicator against the planned values so that decisive changes can be made in case of deviations.
Indicator: The indicator helps determine the level of confidence in the delivered quality of the product. The measurement is based on the test case execution status recorded by the testing team; the recorded data include the status of each test case, which ensures that a resolution is reached for every defect found. A defect tracking system is required to keep track of the status of the test plans, defect statuses, and test case progress.
Module  Complexity  Responsibility  Date of Execution  Status (pass/fail/not executed)  Defect id  Severity  Status (open/close)
1.1     High        Umar            25 Nov             Pass                             #1102      Medium    Close
Table 3: Test case execution status sheet
Figure 4: Test cases progress chart (percentage vs. days)
Planned Coverage (%)  Actual Coverage (%)  Planned Threshold (%)  Actual Threshold (%)
                  5                    5                      4                     3
                 10                   10                      8                     6
                 15                   14                     12                     9
                 21                   21                     18                    16
Table 4: Test cases progress chart
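The following minimal sketch reuses the 100-defect example above and the day-3 row of Table 4; the class name is hypothetical and the check is only one plausible way to flag deviations from plan.

```java
/** Minimal sketch: defect closure rate and a coverage-vs-threshold deviation check. */
public class TestProgressIndicator {

    public static void main(String[] args) {
        // Example from the success criteria: 100 recorded defects, 55 with status Closed.
        int recorded = 100;
        int closed = 55;
        System.out.printf("Defect closure rate: %.0f%%%n", 100.0 * closed / recorded);

        // Day 3 values from Table 4 (all percentages).
        double plannedCoverage = 15.0, actualCoverage = 14.0;
        double plannedThreshold = 12.0, actualThreshold = 9.0;
        if (actualCoverage < plannedCoverage || actualThreshold < plannedThreshold) {
            System.out.println("Deviation from plan: decisive changes may be needed.");
        }
    }
}
```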
7. Step 4 - Sub-goals to Strategies and Activities
In this step we present four strategies for the sub-goal identified in Step 2, from the perspective of the project development manager, each describing a set of activities and its impact.
Strategy 1
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Introduce inspection of the implemented code in order to find and fix defects in the early phases of development and to reduce time to market. Inspection will also find and correct defects at their point of origin.
Impact of strategy:
● It will cause a slight increase in the total cost of the project due to the introduction of inspections, but overall it will help find and fix a significant number of defects before production.
● Correcting defects during analysis is many times cheaper than fixing the same defects at later stages.
Activities
● During inspection we will attempt to identify internal inconsistencies in all documents, including the code.
● Verify results against the requirements and update the checklist.
● Inspection will detect a wide range of errors, for example errors related to the GUI, business logic, database, security, logic, server responses, compatibility, error handling, and performance optimization.
Strategy 2
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Implement an Agile development process within the organization by providing lean and test-driven development training to teams.
Impact of strategy: Improvement in overall code quality; the software becomes much faster and uses less memory and processing capacity; and the code structure is improved for maintainability through early detection and removal of errors.
Activities
● Use of Test-Driven Development (TDD) ensures that only the code necessary to pass the test criteria is written (a minimal JUnit sketch follows this list).
● Use of automated testing ensures that the released software product is tested against every flow path and scenario that a human might miss; it also reduces the probability of new errors arising after changes.
● Form separate scrum development teams that integrate quality members; this integration helps them deliver a reliable, high-quality software release at the end of each iteration.
● Implementation of single-batch processing allows developers to pick user stories from a single backlog, enabling delivery of a working set of features earlier, so that testing starts earlier and the final version is released earlier.
● Use better development environments and tools with faster machines, which will improve developers' code quality through proper debugging, correction, and testing of the developed software version.
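To illustrate the red-green TDD cycle referenced in the first activity, here is a minimal JUnit 4 sketch; DiscountCalculator and its pricing rule are invented for illustration and are not part of the CSI code base. The test is written first and fails until just enough production code is added to make it pass.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

/** TDD sketch: the tests are written first and drive the implementation below. */
public class DiscountCalculatorTest {

    @Test
    public void ordersOfAtLeast100GetTenPercentOff() {
        assertEquals(90.0, new DiscountCalculator().finalPrice(100.0), 1e-9);
    }

    @Test
    public void smallerOrdersPayFullPrice() {
        assertEquals(60.0, new DiscountCalculator().finalPrice(60.0), 1e-9);
    }
}

/** Just enough production code to pass the tests, as TDD prescribes. */
class DiscountCalculator {
    double finalPrice(double amount) {
        return amount >= 100.0 ? amount * 0.9 : amount;
    }
}
```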
Strategy 3
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Increase the effectiveness of the validation and verification process.
Impact of strategy: Identification of defects in early stages, preventing them from propagating.
Activities
● Use of automated testing.
● Coverage of all possible scenarios in test cases.
● Improvement in test coverage.
● Application of formal inspections and technical reviews.
● Improvement in the effectiveness of the test process.
Strategy 4
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Introduce an automated testing tool to avoid human error and to reduce the number of resources tied up in a single task. The automation tool will be a third-party tool used to save person-hours.
Impact of strategy:
● Increase in project cost, as money must be spent to purchase the automation tool from outside.
● It supports large test coverage in a short span of time.
● Less error-prone compared with manual testing.
● Reduces human effort.
Activities
● Enter input values and validate them against the system test case.
● Fix the defects and re-run the tool.
● Analyze progress by plotting charts of the results obtained.
● Update the defect tracker when defects are closed.
Summary
The above strategies are listed in order of priority. The highest priority goes to the inspection process: inspection improves the internal consistency of a specialist's work and checks it against best practices, and when one person's work is reviewed by another, efficient feedback is given, which is a solid way to improve the quality of the deliverable. The next priority goes to implementing an Agile development process, which improves overall code quality: the software becomes faster, uses less memory and processing capacity, and the code structure is improved for maintainability through early detection and removal of errors. The next priority is increasing the effectiveness of the validation and verification process, which identifies defects in early stages and prevents them from propagating. The last priority goes to automated testing, in which special software (separate from the software being tested) is used to control the execution of tests and compare actual results with expected results.
8. Step 5 - Worksheet: Mapping of Required Data Elements to Indicators
Data Element Required | Indicator(s)
Planned Coverage | Test case progress indicator
Actual Coverage | Test case progress indicator
Planned Threshold | Test case progress indicator
Actual Threshold | Test case progress indicator
Target Automation Factor | Maintainability and test case progress indicators
Lines of code/packages per sprint | Maintainability indicator
Coverage of unit tests over the code/packages | Maintainability indicator
Defect severity | Defect propagation indicator
Required MTBF | Mean time between failures indicator
Initial MTBF | Mean time between failures indicator
Cumulative test hours | Mean time between failures indicator
Initial test time | Mean time between failures indicator
Data Element Required | Indicator
Cl_wmc | Analyzability
Cl_comf | Analyzability
In_bases | Analyzability
Cu_cdused | Analyzability
Cl_stat | Changeability
Cl_func | Changeability
Cl_data | Changeability
Cl_data_publ | Stability
Cu_cdusers | Stability
In_noc | Stability
Cl_func_publ | Stability
Cl_wmc | Testability
Cl_func, Cu_cdused | Testability
Data Element Required | Indicator(s)
Time taken for developing unit tests | % time for developing tests
Time taken for development | % time for developing tests
Number of features | Feature test coverage; test coverage (successful features); failed test coverage (features)
Number of features covered by unit tests | Feature test coverage
Number of features covered by successful unit tests | Test coverage (successful features)
Number of integration testing defects | Defects at integration and acceptance testing (in %)
Number of acceptance testing defects | Defects at integration and acceptance testing (in %)
Number of unit tests developed for small modules | Average number of unit tests
Number of unit tests developed for medium modules | Average number of unit tests
Number of unit tests developed for large modules | Average number of unit tests
Number of features covered by failed unit tests | Failed test coverage (features)
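As a small illustration of how two of the derived measures above could be computed from the raw data elements, consider the following sketch; the class name and the numbers are illustrative assumptions, not project data.

```java
/** Minimal sketch: deriving two Step 5 measures from raw data elements. */
public class DerivedMeasures {

    public static void main(String[] args) {
        // Illustrative raw data elements (see the worksheet above).
        double unitTestHours = 30.0;
        double developmentHours = 120.0;
        int features = 40;
        int featuresCoveredByUnitTests = 32;

        double pctTimeOnTests = 100.0 * unitTestHours / developmentHours;
        double featureTestCoverage = 100.0 * featuresCoveredByUnitTests / features;

        System.out.printf("%% time developing tests: %.1f%%%n", pctTimeOnTests);
        System.out.printf("Feature test coverage: %.1f%%%n", featureTestCoverage);
    }
}
```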
8.1 Worksheet: Required Data Elements, Availability, and Source
Data Element Required | Avail | Source | Tools
Planned Coverage | 00 | | Excel / Microsoft Project
Actual Coverage | + | From the test case execution reports found in the tracking system for test plans during the SAT period; every test case has to be updated by the tester. | Microsoft Project
Planned Threshold | 00 | Set by the quality engineer and product owner. The planned threshold can be < 100% for some projects, but the actual threshold part must not contain any open medium- or high-severity defects. | Tracking system (DDT)
Actual OK-Rate | + | Defect reporting tool. | Defect reporting
Target Automation Factor | + | Six Sigma tolerance factor. | Six Sigma
Lines of code/packages per sprint | 00 | Eclipse, EDK. | Eclipse
Coverage of unit tests over the code/packages | 00 | Unit test tools such as the Eclipse JLIN plug-in. | JLIN
Total number of unit tests | + | Eclipse JUnit. | JUnit
Total team effort in person-days | + | Every team member creates a task in JIRA for the unit test, with the estimated time and actual time. | Excel sheet / JIRA
Defect severity | 0 | Defect reporting tool; all closed defects can be displayed and a developer is assigned to resolve each one. | Excel sheet
Total person-days | 0 | JIRA is used to find logged time; also from defect tracking data and JIRA. | JIRA
Growth rate | 0 | Standard requirements document. | RAMC
Required MTBF | + | The initial MTBF of every process is recorded before the improvement process. | RAMC
Initial MTBF | 0 | The initial MTBF of every product is recorded before the improvement of the testing process begins; it can be calculated and logged using a simple formula: (total up time) / (number of breakdowns). | RAMC
Cumulative test hours | 00 | Timesheets, by adding up the number of hours the team worked testing the product. | Timesheet
Initial test time | + | The initial time is always logged when starting an activity and is available on the spot. | Microsoft Project
Cl_wmc | 0 | Source code | Logiscope
Cl_comf | 0 | Source code | Logiscope
In_bases | + | Source code | Logiscope
Cu_cdused | + | Source code | Logiscope
Cl_stat | + | Source code | Logiscope
Cl_func | + | Source code | Logiscope
Cl_data | + | Source code | Logiscope
Cl_data_publ | + | Source code | Logiscope
Cu_cdusers | + | Source code | Logiscope
In_noc | + | Source code | Logiscope
Cl_func_publ | + | Source code | Logiscope
Analyzability | 0 | Source code | Logiscope
Changeability | 0 | Source code | Logiscope
Stability | 0 | Source code | Logiscope
Testability | 0 | Source code | Logiscope
Time spent developing unit tests | + | JIRA: time spent developing unit tests per developer, iteration, and project. | Eclipse
Development time | + | Timesheets. | Excel
Number of features | + | Iteration backlog. | Word
Number of features covered by unit tests | + | Unit test cases. | JUnit
Feature successful test coverage | + | Unit test cases. | JUnit
Feature test coverage (failed) | + | Unit test cases. | JUnit
Number of integration testing defects | + | Integration testing defects report. | JUnit
Number of acceptance testing defects | + | Acceptance testing defects report. | Excel
Number of unit tests developed for small modules | + | JUnit: total number of unit tests for small modules (LOC 100-499). | JUnit
Number of unit tests developed for medium modules | + | JUnit (LOC 500-900). | JUnit
Number of unit tests developed for large modules | + | JUnit (LOC > 1000). | JUnit
Note: The sources could be explained in more detail, but due to the page limit for this report we have kept them in a shorter form.
Code  Meaning
+     Available
0     Can be derived from other data
00    Can be obtained with minor effort
-     Not available
- -   Impossible or extremely difficult to obtain
9. Step 6 - Planning Tasks
This step identifies the processes that must be in place and the activities that must be accomplished in order to collect, store, process, and report the measures required to construct the indicators outlined previously.
The table below follows the checklist of processes and activities for managing the data required to construct the analysis indicators.
New or Modified Planning Task | Rationale
Staff training | Training is required for the use of certain tools, depending on the knowledge and skill set of the staff.
Process of extracting data | Done by a QA specialist. Plan data is extracted at the end of every milestone.
Process of evaluating data | Performed by managers. Data is evaluated twice in each milestone, which helps maintain good quality and reliability from the beginning.
Process of execution | Done by programmers.
Storing data | Done by a QA specialist. Data will be stored in the form of charts, with all details stored in Excel; this makes the data easy to evaluate.
Perform the risk assessment for every feature and function | Performed by the solution manager to get the criticality of the project.
Execute the test cases during SAT | Done by the testers. Should be defined in the test tracking system.
Log any errors found in the test cases | Performed by the testers. Any defects found should be entered in the defect tracking system.
Check the quality board strategy | Performed by the quality lead to get the targeted rate at the end of the sprint.
Run the unit test coverage rate on the dev system | Performed by the quality engineer to get the automation coverage for a certain package at the end of the sprint.
Inspect the LOC developed in the dev system | Done by the quality engineer to get the LOC written at the end of the sprint; this helps the team evaluate the amount of code written.
Inspect the number of unit tests developed in the dev system | Done by the quality engineer to get the number of unit tests completed at the end of the sprint, helping the manager evaluate the learning curve of TDD.
Enter the task and time spent on unit test writing | Done by the developer during TDD.
Enter the task and time spent on resolving a defect | Done by the developer during daily activities. Helps keep track of the tasks worked on by each team member.
Enter the defect severity | Done by the tester during SAT, to differentiate between the defects raised and identify the ones that are more critical to close.
10. Conclusion
The goal-driven measurement process is designed to support the organizational business goal by deriving indicators for analysis, along with specifications for all the data that will be used to construct these indicators.
Finally, the output of applying the goal-driven measurement process to Commonwealth Software, Inc.'s development department is: A. Mature processes produce good-quality products. B. Quality is the responsibility of the whole development team. C. The cost of quality is balanced throughout the whole development process.
11. Appendix A: Definitions
Planned threshold: The minimum number of green (passed) test cases required.
Actual threshold: The execution status of the test cases.
RAMC: Development environment tool.
Maintainability: The ease with which a software product can be modified to correct defects, modified to meet new requirements, modified to make future maintenance easier, or adapted to a changed environment.
Reliability: The capability of the software product to execute its required functions under stated conditions for a specified time period, or for a specified number of operations.
Analyzability: The capacity to identify the root cause of a failure within the product.
Stability: The sensitivity of a system to change, that is, the negative impact that system changes may cause.
Changeability: The amount of effort required to change a system.
Testability: The effort needed to verify (test) a system change.
Appendix B: Classes Criteria of Maintainability
Metric | Definition | Criteria
Cl_comf | Ratio between the number of comment lines in the module and the total number of lines: cl_comf = cl_comm / (cl_line MAX 1), where cl_comm is the number of comment lines in the package and cl_line is the total number of lines in the package. | Size, understandability
Cl_comm | Number of comment lines in a class. | Size, understandability
Cl_data | Total number of attributes declared inside the class declaration. | Cohesion
Cl_data_publ | Number of attributes declared in the public section or in the public interface of a Java class. | Encapsulation, cohesion
Cl_func | Total number of methods declared inside the class declaration. | Complexity, size
Cl_func_publ | Number of methods declared in the public section. | Encapsulation, complexity
Cl_line | Total number of lines in a class or an interface. | Size
Cl_stat | Number of executable statements in all methods and initialization code of a class. | Size
Cl_wmc | Sum of the static complexities of the class methods, where static complexity is given by the cyclomatic number of the function. | Complexity
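As a small worked example of the cl_comf formula above, the following sketch implements the ratio with the MAX 1 guard against empty packages; the class name and the counts passed in main are illustrative.

```java
/** Minimal sketch of the cl_comf metric: comment lines over total lines, guarded by MAX 1. */
public class CommentRatio {

    /** cl_comf = cl_comm / (cl_line MAX 1), as defined in Appendix B. */
    static double clComf(int commentLines, int totalLines) {
        return (double) commentLines / Math.max(totalLines, 1);
    }

    public static void main(String[] args) {
        System.out.printf("cl_comf = %.2f%n", clComf(40, 200)); // illustrative counts
    }
}
```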
12. Glossary
Abbreviations Meanings
CSI Commonwealth Software Inc.
SAT Software Acceptance Testing
EDK Eclipse Development Kit
JIRA Work Flow Management System Tool
JLIN Java code check plug-in for Eclipse
GQM Goal Question Metric
GQIM Goal Question Indicator Metric
MTBF Mean Time Between Failures
QA Quality Assurance
HRD High Requirement Phase Defect
MRD Medium Requirements Phase Defect
HDD High Design Phase Defect
MDD Medium Design Phase Defect
HID High Implementation Phase Defect
MID Medium Implementation Phase Defect
HTD High Testing Phase Defect
MTD Medium Testing Phase Defect
TDD Test Driven Development
LOC Lines Of Code
13. References
[1] R. E. Park, W. B. Goethert, and W. A. Florac. Goal-Driven Software Measurement: A Guidebook. 1996.
[2] W. A. Florac, R. E. Park, and A. D. Carleton. Practical Software Measurement: Measuring for Process Management and Improvement. 1997.
[3] S. H. Kan. "Software quality metrics overview," in Metrics and Models in Software Quality Engineering, pp. 85-120. 2002.
[4] IEEE standard adoption of ISO/IEC 15939:2007, Systems and software engineering: Measurement process.
More Related Content

PDF
Performance Methodology It Project Metrics Workbook
PDF
Software Project Management: Business Case
PDF
Quality Engineering Material
PPTX
Advanced quality control
PDF
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT
PDF
Software Quality Assurance Model for Software Excellence with Its Requirements
PDF
csc 510 Project
PDF
M3-Review-SLOs-13NOV13
Performance Methodology It Project Metrics Workbook
Software Project Management: Business Case
Quality Engineering Material
Advanced quality control
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT
Software Quality Assurance Model for Software Excellence with Its Requirements
csc 510 Project
M3-Review-SLOs-13NOV13

What's hot (20)

PPTX
ISTQB Technical Test Analyst 2012 Training - The Technical Test Analyst's Tas...
PDF
Test Plan Template
PDF
M3 Reviewing Trainer Notes
PDF
iSQI Certification Days ISTQB Advanced Kari Kakkonen
DOC
Test Plan Template
PDF
sample-test-plan-template.pdf
PPTX
Chapter 4 - Test Design Techniques
PDF
7.significance of software layered technology on size of projects (2)
PPTX
Chapter 3 - Static Testing
PDF
Scrum an agile process
PDF
Ieee829mtp
PDF
Not Just Numericals Values_ByDrSanjayGupta
PDF
Techical Data on Typologies of Interventions in Knowledge Exchange and Enterp...
PDF
White paper quality at the speed of digital
PDF
Software process &amp; product quality
PPTX
Factors affecting Quality of construction projects in India region
PDF
AN INTEGRATED MANAGEMENT AND MEASUREMENT OF CUSTOMER FEASIBLILTY IN CONSTRUCT...
PPTX
Chapter 3 - Test Techniques
PPTX
ISTQB Technical Test Analyst 2012 Training - Structure-Based Testing
PDF
IRJET- Study on Quality Control of Project Management System
ISTQB Technical Test Analyst 2012 Training - The Technical Test Analyst's Tas...
Test Plan Template
M3 Reviewing Trainer Notes
iSQI Certification Days ISTQB Advanced Kari Kakkonen
Test Plan Template
sample-test-plan-template.pdf
Chapter 4 - Test Design Techniques
7.significance of software layered technology on size of projects (2)
Chapter 3 - Static Testing
Scrum an agile process
Ieee829mtp
Not Just Numericals Values_ByDrSanjayGupta
Techical Data on Typologies of Interventions in Knowledge Exchange and Enterp...
White paper quality at the speed of digital
Software process &amp; product quality
Factors affecting Quality of construction projects in India region
AN INTEGRATED MANAGEMENT AND MEASUREMENT OF CUSTOMER FEASIBLILTY IN CONSTRUCT...
Chapter 3 - Test Techniques
ISTQB Technical Test Analyst 2012 Training - Structure-Based Testing
IRJET- Study on Quality Control of Project Management System
Ad

Similar to FINAL_SPM_document (20)

PDF
Integrated Project Management Measures in CMMI
PDF
Jurnal an example of using key performance indicators for software development
PDF
Project monitoring and control measures in cmmi
PPTX
Software Metrics & Measurement-Sharbani Bhattacharya
PDF
PROJECT PLANNINGMEASURES IN CMMI
PDF
4213ijsea08
PDF
Process and product quality assurance
DOCX
Quality project management
PPT
Capability Maturity Model
PPT
Cmm Level5
PDF
Blending Agile with CMMI®
PDF
CMMI v 1.2 Basics
 
PDF
CMMI Version 1.2
 
DOC
CMMI Implementation Guide
PPT
Spm unit 1
DOCX
Project planning , Productivity metrics,Cost estimation - COCOMO & COCOMO II,...
PPTX
UNIT V - 1 SPM.pptx
PPTX
Lgyt6ttftnjihuhunjnnjnrd6tf tfv ytgyuguy-8.pptx
PDF
LRAFB_Project Profile
PPT
Jason uyderv pmi 2 16 12
Integrated Project Management Measures in CMMI
Jurnal an example of using key performance indicators for software development
Project monitoring and control measures in cmmi
Software Metrics & Measurement-Sharbani Bhattacharya
PROJECT PLANNINGMEASURES IN CMMI
4213ijsea08
Process and product quality assurance
Quality project management
Capability Maturity Model
Cmm Level5
Blending Agile with CMMI®
CMMI v 1.2 Basics
 
CMMI Version 1.2
 
CMMI Implementation Guide
Spm unit 1
Project planning , Productivity metrics,Cost estimation - COCOMO & COCOMO II,...
UNIT V - 1 SPM.pptx
Lgyt6ttftnjihuhunjnnjnrd6tf tfv ytgyuguy-8.pptx
LRAFB_Project Profile
Jason uyderv pmi 2 16 12
Ad

FINAL_SPM_document

  • 1. CONCORDIA UNIVERSITY DEPARTMENT OF COMPUTER SCIENCE AND SOFTWARE ENGINEERING SOFTWARE PROJECT MANAGEMENT (SOEN 6841) Fall 2015 INSTRUCTOR: DR. OLGA ORMANDJIEVA IMPLEMENTING GOAL DRIVEN MEASUREMENT CREATED AND DEVELOPED BY Bhawna Sharma 27568789 Ehab Amar 26080421 Hammad Ali 27147074 Mudasser Akbar Muneer Ahmed 27654774 Muhammad Usman 27253443 Shahroze Jamil 27240570
  • 2. Contents Contents ........................................................................................................................................................2 List of Figures.................................................................................................................................................3 List of Tables..................................................................................................................................................3 1. Abstract .................................................................................................................................................4 2. Introduction...........................................................................................................................................4 3. Business Goal.........................................................................................................................................4 4. Step 1 - Brainstorming Additional Sub goal Statements .......................................................................5 4.1 Grouping of Entities managed by project manager by Theme .....................................................7 4.2 Prioritized Sub-goals......................................................................................................................9 5. Step 2 - Operationalized Sub Goals.......................................................................................................9 6. Step 3 - Success Criteria and Indicators.................................................................................................9 6.1 Defect Propagation indicator ..................................................................................................... 10 6.2 Increase in mean time between failures (MTBF) Indicator........................................................ 10 6.3 Maintainability Indicator............................................................................................................ 11 6.4 Test Case Progress Indicator ...................................................................................................... 13 7. Step 4 - Sub goals to Strategies and Activities ................................................................................... 14 Strategy 1................................................................................................................................................ 14 Strategy 2................................................................................................................................................ 15 Strategy 3................................................................................................................................................ 15 Strategy 4................................................................................................................................................ 16 Summary................................................................................................................................................. 16 8. Step 5 - Worksheet: Mapping of Required Data Elements to Indicators........................................... 16 8.1 Worksheet: Required Data Elements, Availability, and Source ....................................................... 18 9. Step 6 - Planning Tasks....................................................................................................................... 20 10. 
Conclusion ...................................................................................................................................... 21 11. Appendix A: (Definitions) ............................................................................................................... 21 12. Glossary .......................................................................................................................................... 22 13. References...................................................................................................................................... 22
  • 3. List of Figures Figure 1: Defect in different phases of project............................................................................................10 Figure 2: Mean time between failure...........................................................................................................11 Figure 3: Quality Indicator..........................................................................................................................13 Figure 4: Test cases progress chart..............................................................................................................14 List of Tables Table 1: Quality Indicators..........................................................................................................................12 Table 2: Maintainability Indicator...............................................................................................................12 Table 3 : Test Case Execution Status Sheet ................................................................................................13 Table 4 : Test Cases Progress Chart............................................................................................................14
  • 4. 1. Abstract Quality is characterized regarding germane properties of the particular project and any related product(s), conceivably in both quantitative and subjective terms. The purpose behind this report is to recognize steps to enhance quality of finished project inside of twelve weeks. Commonwealth Software, Inc. (CSI) is Software Company and it has been in presence for ten years. Its essential business is building and maintaining ecommerce websites. This report gives brief idea regarding how CSI can enhance quality of completed project. Whole document is sorted out in seven stages beginning with Brainstorming Additional Sub goal Statements, taking after by finding object of interest, purpose, Quality Focus & Perspective and environment & constraints for two recognized sub goals from initial step. Carrying on further with the Success Criteria and Indicators, this document emphases on the Sub objectives to Strategies and Activities. 2. Introduction This report points of interest the analysis implementing Goal-Driven Measurement performed on Commonwealth Software Inc. (CSI) completed to help the project manager of development in implementing the organizational business objective of demonstrating a quantifiable change in the quality of completed products inside of twelve weeks. This analysis was actualized in six stages. The initial step examines the entities managed by the project manager of development and perceives potential sub-goals that help the organizational business goal. The second step formalizes the two most noteworthy need sub-goals. The third step distinguishes the success criteria and indicators for the sub-goal concentrating on the improvement of the reliability of the product item. The fourth step details strategies and activities to achieve this sub-goal. The fifth step executes the Goal-Question-Indicator-Metric (GQIM) analysis process to identify quantifiable questions with the movement of integrating Test Driven Development (TDD) into the software life-cycle and indicators that guide in the tending to of these questions. Furthermore, the sixth step recognizes the data elements that should be gathered to construct the success criteria, progress, and assessment indicators. 3. Business Goal Show a measurable improvement (quantifiable change) in the quality of completed products inside of twelve weeks.
  • 5. 4. Step 1 - Brainstorming Additional Sub goal Statements The target of this step is to translate the organizational business objective into sub-objectives related to the entities and activities that are overseen from a particular administrative point of view. Inputs and Resources Entities Managed by Project Manager Questions related to Improvement of Quality Developers  How manageable are the developers as a team?  Do developers follow programming practices and organization standards?  Do the teams have specific skills to persuade the tasks?  Are developers aware of business goals?  Are sufficient number of developers working on assigned task(s)?  Are the developers experienced enough to achieve task(s) assigned? Quality Assurance team  Is the team following CMMI process standards?  Are test cases enough to test all functions of the software?  Are test cases reliable “No Defects”?  Are test cases executed automatically or manually?  What are testing tools used for testing?  What type and level of testing applied? Products and by-products Entities Managed by Project Manager Questions Related to Improvement of Quality Documentation  Are documents evaluated to guarantee quality?  Are document audits capable?  Who performs document audits?  Are checklists utilized and successful?  Is there a formal review procedure characterized?  Are documented structure features traceable? Pre-releases and source code  Is source code audited?  How powerful are code audits?  Who performs code audits?  Are code audit agendas utilized and compelling?  Are executed features traceable to traceable to necessities, outline, and test case artifacts?  Is the compiler utilized specifically?  Does the compiler identify dialect structure and variable definition deformities in the code?  Is the source code blunder free?  Does source code take after organizational programming models?  Is there any programmed keep an eye on the source code runs? Project Plan  A goal driven methods to meet cost variance limits?  How effective are estimation techniques chosen for the project?  What estimation techniques being addressed?
  • 6.  What estimation tools can be used?  Are milestones achievable meeting the deadlines?  Is the scope of the project persuaded?  Is estimated effort achievable? Test Plan and strategy  Is the test scope all around characterized?  Is the test scope sufficient?  Are test cases successful in distinguishing imperfections?  Are test cases financially savvy?  Is test case execution computerized?  Are test tools utilized as a part of testing?  What is the test course of action and quality thought?  What are the test sorts and levels?  Are the test cases looked into?  What is the premise/outline used to make the test cases?  Who composes the test cases?  Are test cases productive and powerful?  Are the test cases traceable back to outline and necessities? Activities and Flow paths Entities managed by Project Manager Questions related to Improvement of quality Effort estimation and schedule estimation of software ● What estimations methods are we using? ● What are the tools to be used for effort estimation? ● How realistic is the project schedule? ● Are estimated risks considered for resources? ● How effective are the techniques that have been selected? Product Development ● What is the development life cycle used? ● Is the selected development life cycle different from the previously used life cycle? ● Will test driven development be considered as part of the life cycle? ● Is the project requirement and specification reviewed by the Quality Assurance team? ● How well has the requirement been documented? ● What tools will be used for debugging? ● Is the customer involved during the development process? Software Process Testing ● What tools are used to manage the testing process? ● What are the testing environments that would be tested in? ● How will the defects be tracked? ● How will the detected defects be handled? ● What testing mechanism will be used for every process steps? ● How will the severity of defects be handled? ● Are the defects categorized? ● Will the fixed defects be tested again to check if they are fixed? ● Up to what extent will the process be tested? ● How much time would it require to test a process before it release?
Entity managed by project manager: Risk Management
Questions related to improvement of quality:
● Who will be responsible for risk identification, modification, and control for the product?
● How will risks be identified?
● How will the severity of each risk be ranked?
● Are there any pre-defined risks associated with the product?
● How will the identified risks be handled?
● What techniques or methodologies are available, or inherited, to address a particular type of risk?
● Is there a time limit for resolving a risk?
● Will identified risks be shared with the stakeholders?
● Who will manage the identified risks?

Internal Artifacts

Entity managed by project manager: Customer change requests (work in process)
Questions related to improvement of quality:
● How large is our backlog of customer change requests?
● Where are the backlogs occurring?

4.1 Grouping of Entities managed by project manager by Theme

Group #1: Documents
● Are documents reviewed to ensure quality?
● Are document audits effective?
● Who performs document audits?
● Are checklists used, and are they effective?
● Is a formal review procedure defined?
● Are documented design features traceable?
● How well have the requirements been documented?

Group #2: Project Management
● Are goal-driven methods used to meet cost variance limits?
● How effective are the estimation techniques chosen for the project?
● Which estimation techniques are being used?
● Which estimation tools can be used?
● Are the milestones achievable within the deadlines?
● Is the scope of the project well defined?
● Is the estimated effort achievable?
● What tools are to be used for effort estimation?
● How realistic is the project schedule?
● How effective are the techniques that have been selected?

Group #3: Quality Management
● Is the test scope well defined?
● Is the test scope sufficient?
● Are the test cases effective in detecting defects?
● Are the test cases cost-effective?
● Is test case execution automated?
● Are test tools used as part of testing?
● What are the test plan and the quality considerations?
● What are the test types and levels?
● Are the test cases reviewed?
● What basis or design is used to create the test cases?
● Who writes the test cases?
● Are the test cases efficient and effective?
● Are the test cases traceable back to design and requirements?
● Are the test cases sufficient to test all functions of the software?
● Are the test cases themselves reliable ("no defects")?
● Are test cases executed automatically or manually?
● Which testing tools are used?
● What types and levels of testing are applied?
● Are the project requirements and specifications reviewed by the Quality Assurance team?
● What tools are used to manage the testing process?
● In which testing environments will the software be tested?
● How will defects be tracked?
● How will detected defects be handled?
● What testing mechanism will be used for each process step?
● How will the severity of defects be handled?
● Are defects categorized?
● Will fixed defects be retested to confirm they are fixed?
● To what extent will the process be tested?
● How much time is required to test a process before its release?

Group #4: Software Product
● Is the source code audited?
● How effective are code audits?
● Who performs code audits?
● Are code audit checklists used, and are they effective?
● Are implemented features traceable to requirements, design, and test case artifacts?
● Is the compiler used effectively?
● Does the compiler detect language syntax and variable definition defects in the code?
● Is the source code error free?
● Does the source code follow organizational programming standards?
● Are any automated checks run on the source code?

Group #5: Development
● Are documents reviewed to ensure quality?
● Are document audits effective?
● Who performs document audits?
● Are checklists used, and are they effective?
● Is a formal review procedure defined?
● Are documented design features traceable?
● What development life cycle is used?
● Is the selected development life cycle different from the previously used one?
● Will test-driven development be considered as part of the life cycle?

Group #6: Risk Management
● Who will be responsible for risk identification, modification, and control for the product?
● How will risks be identified?
● How will the severity of each risk be ranked?
● Are there any pre-defined risks associated with the product?
● How will the identified risks be handled?
● What techniques or methodologies are available, or inherited, to address a particular type of risk?
● Is there a time limit for resolving a risk?
● Will identified risks be shared with the stakeholders?
● Who will manage the identified risks?
● Are estimated risks considered when planning resources?

Group #7: Change Management
● How large is our backlog of customer change requests?
● Where are the backlogs occurring?

4.2 Prioritized Sub-goals
1. Improve the maintainability of the code.
2. Improve the reliability of software releases.
3. Improve the portability and interoperability of software releases.
4. Reduce the defect density of software releases.

5. Step 2 - Operationalized Sub Goals

From the sub-goals listed above, we selected the following two to expand into structured statements that highlight their key components (object of interest, purpose, quality focus and perspective, environment and constraints).

Sub Goal 1: Improve the reliability of software releases
Perspective: Software development manager
Object of interest: Software release
Purpose: Determine the probability of failure over a specific period of time in current software releases, in order to improve it.
Quality focus and perspective: Examine the reasons for failure from the perspective of the development project manager.
Environment and constraints: Organizational standards and structure, development process, project plan, available resources and tools, reliability certification criteria.

Sub Goal 2: Improve the maintainability of the code
Perspective: Software development manager
Object of interest: Code quality
Purpose: Observe the changeability, analyzability, testability, and stability of the code.
Quality focus and perspective: Examine the causes of poor code structure that make it hard to add a new feature or change existing functionality.
Environment and constraints: Activities of development team members, organizational structure, software development process, available resources, number of developers and testers, cost of making changes, required skills.

6. Step 3 - Success Criteria and Indicators

The main objective of this step is to develop success criteria and indicators for one of the operationalized goals outlined in Step 2.

Operationalized measurement goal selected: Improve the reliability of software releases.

Q1: What is the rate of reduction in defect propagation throughout the development of the software product?
Q2: What is the rate of increase in mean time between failures for the product under development?
Q3: What percentage of the classes is in the excellent, good, fair, and poor zones?
Q4: What is the rate of increase in planned coverage, actual coverage, and planned and actual OK rate per test case?
The success criteria and indicators outlined in the sections below present the outputs of the goal-driven measurement analysis, which are evaluated quantitatively to improve the reliability of the released software.

6.1 Defect Propagation indicator

Success criteria: A 60% reduction in defect propagation throughout the development of the software product.
Assumption: The 60% reduction is taken as the success criterion. Note: all defects are extracted from the Data Defect Tracker (DDT).

Indicator: The defect propagation indicator provides a means to assess the number and type of high- and medium-severity defects detected at every phase of product development. The assumption here is that a 60% reduction is achieved in every phase, and that steps are taken to increase the reliability of the product under development.

Defect counts by phase and severity (from the DDT):

Defect Indicator     | HRD | MRD | HDD | MDD | HID | MID | HTD | MTD
Requirements phase   | 60  | 40  | N/a | N/a | N/a | N/a | N/a | N/a
Design phase         | 20  | 12  | 160 | 65  | N/a | N/a | N/a | N/a
Implementation phase | 8   | 0   | 65  | 12  | 45  | 25  | N/a | N/a
Testing phase        | 0   | 0   | 0   | 0   | 20  | 7   | N/a | N/a
Delivery             | 0   | 0   | 0   | 0   | 0   | 0   | 0   | 0

Figure 1: Defects in different phases of the project. The pie charts show the four development phases and the severity of the defects found during each phase.
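As an illustration of how this indicator can be evaluated, the Java sketch below checks the 60% phase-to-phase reduction criterion for two of the defect categories in the table above. It is a minimal sketch under our own assumptions: the class name is ours, and in practice the counts would be read from the DDT export rather than hard-coded.

/** Sketch of the defect propagation indicator: for each defect category,
 *  checks that counts drop by at least 60% from one phase to the next.
 *  Counts are the Figure 1 values; the class layout is illustrative. */
public class DefectPropagationIndicator {

    private static final String[] PHASES =
            {"Requirements", "Design", "Implementation", "Testing"};

    public static void main(String[] args) {
        // High-severity requirements defects (HRD) detected in each phase.
        int[] hrd = {60, 20, 8, 0};
        // Medium-severity requirements defects (MRD) detected in each phase.
        int[] mrd = {40, 12, 0, 0};

        report("HRD", hrd);
        report("MRD", mrd);
    }

    private static void report(String category, int[] counts) {
        for (int i = 1; i < counts.length; i++) {
            if (counts[i - 1] == 0) continue; // nothing left to propagate
            double reduction = 100.0 * (counts[i - 1] - counts[i]) / counts[i - 1];
            System.out.printf("%s %s -> %s: %.1f%% reduction, target 60%% %s%n",
                    category, PHASES[i - 1], PHASES[i], reduction,
                    reduction >= 60.0 ? "met" : "NOT met");
        }
    }
}

The same loop, run over all eight categories, gives management a phase-by-phase view of whether corrective action is needed.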
6.2 Increase in mean time between failures (MTBF) Indicator

Success criteria: A 15% improvement in the MTBF of the product under development.

Indicator: The increase-in-MTBF indicator gives management a means to assess the current mean time between failures against the 15% improvement objective throughout the development life cycle, as a function of the recorded MTBF and the test time of the software product under development. This indicator enables management to determine whether corrective actions need to be implemented in order to increase the MTBF and achieve the departmental objective of improving the reliability of the product under development.

Figure 2: Mean time between failures. The chart plots the current MTBF and the 15% target against testing time (X-axis: testing time in hours; Y-axis: mean time between failures in hours; both axes range from 0 to 4000).
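The MTBF values behind this indicator can be derived with the simple formula recorded later in Step 5, MTBF = (total up time) / (number of failures). The sketch below is illustrative only: the class name, sample hours, and failure counts are ours, and a real implementation would read them from the cumulative test-hours log.

/** Sketch of the MTBF indicator: computes MTBF from logged up time and
 *  failure counts, then checks the 15% improvement objective. */
public class MtbfIndicator {

    static double mtbf(double totalUpTimeHours, int failures) {
        if (failures == 0) return Double.POSITIVE_INFINITY; // no failures observed yet
        return totalUpTimeHours / failures;
    }

    public static void main(String[] args) {
        double initialMtbf = mtbf(3000.0, 12); // recorded before improvement began
        double currentMtbf = mtbf(3500.0, 10); // from cumulative test hours and the failure log

        double target = initialMtbf * 1.15;    // departmental 15% improvement objective
        System.out.printf("initial=%.1f h, current=%.1f h, target=%.1f h -> %s%n",
                initialMtbf, currentMtbf, target,
                currentMtbf >= target ? "objective met" : "corrective action needed");
    }
}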
6.3 Maintainability Indicator

Success criteria: Increase the maintainability of the software under development by improving factors such as changeability, testability, stability, and analyzability. 75% of the classes must be in the excellent zone, 20% in good, 5% in fair, and 0% in poor.

Indicator: The values obtained from the LogiScope tool serve as the indicator: the tool analyzes the factors and generates the corresponding analysis sub-indicators and indicators. This indicator assesses the code produced at the class level. McCabe is another tool that can be used to pursue reliability and maintainability goals. The ratings are as defined in the success criteria. This helps improve the testability, changeability, and analyzability of the code produced over the coming milestones.

Maintainability is defined as the probability of performing a successful repair action within a given time; in other words, it measures the ease and speed with which a system can be restored to operational status after a failure occurs.

Maintainability = Testability + Stability + Analyzability + Changeability ("Definition of Maintainability")

Assuming values gives us the following indicators to be shown to management.

Class Criteria (M1) | Excellent (%) | Good (%) | Fair (%) | Poor (%)
Analyzability       | 0             | 65       | 35       | 0
Changeability       | 95            | 2        | 0        | 0
Stability           | 65            | 35       | 0        | 0
Testability         | 80            | 20       | 0        | 0
Table 1: Quality Indicators

Class Factor (M1) | Excellent | Good | Fair | Poor
Maintainability   | 90        | 10   | 0    | 0
Table 2: Maintainability Indicator

Figure 3: Quality Indicator. Pie charts show the excellent/good/fair/poor distribution for analyzability, changeability, stability, testability, and overall maintainability.
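The roll-up from the four criteria to a per-class maintainability zone is performed by LogiScope itself; the sketch below only illustrates the idea, under our own simplifying assumption (not LogiScope's documented model) that a class is rated by its weakest criterion. The sample ratings are invented.

import java.util.List;
import java.util.Map;
import java.util.TreeMap;

/** Sketch of rolling per-class ratings up to a maintainability distribution
 *  and checking it against the 75/20/5/0 success criterion. */
public class MaintainabilityIndicator {

    enum Rating { EXCELLENT, GOOD, FAIR, POOR } // ordered best to worst

    /** Assumption: a class is only as maintainable as its weakest criterion. */
    static Rating maintainability(Rating analyzability, Rating changeability,
                                  Rating stability, Rating testability) {
        Rating worst = analyzability;
        for (Rating r : new Rating[] {changeability, stability, testability}) {
            if (r.ordinal() > worst.ordinal()) worst = r;
        }
        return worst;
    }

    public static void main(String[] args) {
        // Illustrative per-class criterion ratings, e.g. parsed from a LogiScope report:
        // {analyzability, changeability, stability, testability} per class.
        List<Rating[]> classes = List.of(
                new Rating[] {Rating.EXCELLENT, Rating.EXCELLENT, Rating.EXCELLENT, Rating.EXCELLENT},
                new Rating[] {Rating.GOOD, Rating.EXCELLENT, Rating.EXCELLENT, Rating.GOOD},
                new Rating[] {Rating.EXCELLENT, Rating.GOOD, Rating.FAIR, Rating.GOOD});

        Map<Rating, Integer> zones = new TreeMap<>();
        for (Rating[] c : classes) {
            zones.merge(maintainability(c[0], c[1], c[2], c[3]), 1, Integer::sum);
        }
        // Success criterion: at least 75% EXCELLENT and 0% POOR.
        zones.forEach((zone, n) -> System.out.printf("%s: %.0f%% of classes%n",
                zone, 100.0 * n / classes.size()));
    }
}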
6.4 Test Case Progress Indicator

Success criteria: The success criterion for the test case progress indicator is the proportion of defects, recorded in a document or a database, whose status after testing is Closed. For example, if 100 defects were recorded in a spreadsheet, with 45 open and 55 closed, the success criterion would be the percentage of closed defects: 55%. The test case progress indicator gives insight into the testing progress and defect tracking, and makes improvements visible over time. It reports progress on test coverage and key performance indicators against the planned values, so that corrective changes can be made in case of deviations.

Indicator: The indicator helps determine the confidence level in the delivered quality of the product. The measurement is based on the test case execution status recorded by the testing team: the recorded data and the status of each test case. It also ensures that a resolution is captured for every defect found. A defect tracking system is required to keep track of the status of the test plans, the defect statuses, and the test case progress.

Module | Complexity | Responsibility | Date of Execution | Status (pass/fail/not executed) | Defect id | Severity | Status (open/close)
1.1    | High       | Umar           | 25-Nov            | Pass                            | #1102     | Medium   | Close
Table 3: Test Case Execution Status Sheet
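The closed-defect rate used as the success criterion is a simple proportion over tracked records such as the Table 3 rows. The sketch below (using a Java record, so Java 16+) illustrates the computation; the record fields and sample defects are ours, and real data would come from the defect tracking system. The same stream of records can also drive the coverage-versus-threshold comparison shown in Table 4 below.

import java.util.List;

/** Sketch of the test case progress indicator: the closed-defect rate from
 *  tracked execution records (the 45-open/55-closed example gives 55%). */
public class TestCaseProgressIndicator {

    // Mirrors the defect columns of Table 3; field names are illustrative.
    record DefectRecord(String defectId, String severity, boolean closed) {}

    public static void main(String[] args) {
        List<DefectRecord> defects = List.of(
                new DefectRecord("#1102", "Medium", true),
                new DefectRecord("#1103", "High", false),
                new DefectRecord("#1104", "Low", true));

        long closed = defects.stream().filter(DefectRecord::closed).count();
        double closedRate = 100.0 * closed / defects.size();
        System.out.printf("Closed-defect rate: %.1f%% (%d of %d)%n",
                closedRate, closed, defects.size());
    }
}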
Figure 4: Test cases progress chart. Percentage coverage is plotted against days, using the planned and actual values of Table 4.

Planned Coverage (%) | Actual Coverage (%) | Planned Threshold (%) | Actual Threshold (%)
5                    | 5                   | 4                     | 3
10                   | 10                  | 8                     | 6
15                   | 14                  | 12                    | 9
21                   | 21                  | 18                    | 16
Table 4: Test Cases Progress Chart

7. Step 4 - Sub goals to Strategies and Activities

In this step we present four strategies for the sub-goal identified in Step 2, from the perspective of the project development manager, describing each strategy's set of activities and its impact.

Strategy 1
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Introduce inspection of the implemented code in order to find and fix defects in the early phases of development and to reduce time to market. Inspection will also discover and correct defects at their point of origin.
Impact of strategy:
● It will cause a slight increase in the total cost of the project due to the introduction of inspections, but overall it will help find and fix a significant number of defects before production.
● It is cheaper to correct defects during analysis than during later stages, so inspection is many times less expensive than fixing the same defects later.
Activities:
● During inspection we will try to identify internal inconsistencies in all documents, including the code.
● Verify results against the requirements and update the checklist.
● Inspection will detect a wide range of errors, for example errors related to the GUI, business logic, database, security, logical errors, server response errors, compatibility, error handling, and execution optimization.

Strategy 2
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Implement an Agile development process within the organization by providing lean and test-driven development training to teams.
Impact of strategy: Improvement in overall code quality; the software becomes much faster, uses less memory and processing capacity, and the code structure is improved for maintainability through early detection and removal of errors.
Activities:
● Use of Test Driven Development (TDD) ensures that only the code necessary to pass the test criteria is written (see the JUnit sketch after Strategy 3).
● Automated testing ensures that the released software product is tested on every flow path and scenario that a human tester might miss; it also reduces the probability of new errors arising after any change.
● Form scrum development teams that integrate quality members; this integration helps the teams deliver a reliable, high-quality software release at the end of each iteration.
● Single-batch processing allows developers to pick user stories from a single backlog, enabling a working set of features to be delivered earlier, so that testing can start earlier and the final version can be released earlier.
● Use better development environments and tools, with faster machines, to improve developers' code quality through proper debugging, correction, and testing of each developed software version.

Strategy 3
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Increase the effectiveness of the validation and verification process.
Impact of strategy: Defects are identified in early stages and prevented from propagating.
Activities:
● Use of automated testing.
● Coverage of all possible scenarios in test cases.
● Improvement in test coverage.
● Application of formal inspections and technical reviews.
● Improvement in the effectiveness of the test process.
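The following JUnit 5 sketch illustrates the TDD activity listed under Strategy 2: the test is written first, fails, and drives the minimal production code needed to pass it. The DiscountCalculator feature is hypothetical and serves only to show the red-green cycle.

import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

/** TDD sketch for Strategy 2: written before DiscountCalculator exists,
 *  this test fails (red) and drives writing just enough code to pass (green). */
class DiscountCalculatorTest {

    @Test
    void tenPercentDiscountAppliedToOrdersOverHundred() {
        assertEquals(180.0, new DiscountCalculator().priceAfterDiscount(200.0), 0.001);
    }
}

/** Minimal production code: only what the test above requires. */
class DiscountCalculator {
    double priceAfterDiscount(double price) {
        return price > 100.0 ? price * 0.9 : price;
    }
}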
Strategy 4
Perspective: Project development manager.
Sub goal: Improve the reliability of software releases.
Strategy: Introduce an automation testing tool to avoid human errors and to reduce the number of resources needed on a single task. The automation tool will be a third-party tool used to save worker hours.
Impact of strategy:
● Increase in project cost, as money must be spent to purchase the automation tool externally.
● It supports large test scope over a short span of time.
● Less error-prone compared to manual testing.
● Reduces human effort.
Activities:
● Enter input values and validate them against the system test case.
● Fix the defects and re-run the tool.
● Analyze progress by plotting diagrams of the obtained results.
● Update the defect tracker when defects are closed.

Summary
The strategies above are listed in order of priority. The highest priority goes to the inspection process: inspection builds the internal consistency of the work done by a specialist and checks for conflicts with best practices. When the work done by one person is reviewed by another, efficient feedback is given on the work, which is a solid way to improve the quality of the deliverable. The next priority goes to implementing an Agile development process, which improves overall code quality: the software becomes faster, uses less memory and processing capacity, and the code structure is improved for maintainability through early detection and removal of errors. The next priority is increasing the effectiveness of the validation and verification process, which identifies defects in early stages and prevents them from propagating. The last priority goes to automated testing, in which special software (separate from the software being tested) controls the execution of tests and the comparison of actual results with expected results.

8. Step 5 - Worksheet: Mapping of Required Data Elements to Indicators

Data Element Required | Defect Propagation Indicator | MTBF Indicator | Maintainability Indicator | Test Case Progress Indicator
Planned Coverage      |                              |                |                           | x
Actual Coverage       |                              |                |                           | x
Planned Threshold     |                              |                |                           | x
Actual Threshold      |                              |                |                           | x
Target Automation Factor                      | x                            |                |                           | x
Lines of code/packages per sprint             |                              |                | x                         |
Coverage of unit tests over the code/packages |                              |                | x                         |
Defect Severity                               | x                            |                |                           |
Required MTBF                                 |                              | x              |                           |
Initial MTBF                                  |                              | x              |                           |
Cumulative Test Hours                         |                              | x              |                           |
Initial Test Time                             |                              | x              |                           |

Data Element Required | Analyzability | Changeability | Stability | Testability
Cl_wmc                | x             |               |           | x
Cl_comf               | x             |               |           |
In_bases              |               | x             |           |
Cu_cdused             |               | x             |           | x
Cl_stat               |               | x             |           |
Cl_func               |               | x             |           | x
Cl_data               |               | x             |           |
Cl_data_publ          |               |               | x         |
Cu_cdusers            |               |               | x         |
In_noc                |               |               | x         |
Cl_func_publ          |               |               | x         |

Data Element Required | % Time for developing tests | Feature test coverage | Test coverage (successful feature) | Failed test coverage (feature) | Defects at integration and acceptance testing (in %) | Average number of unit tests
Time taken for developing unit tests | x | | | | |
Time taken for development | x | | | | |
Number of features | | x | x | x | |
Number of features covered by unit tests | | x | | | |
Number of features covered by successful unit tests | | | x | | |
Number of integration testing defects | | | | | x |
Number of acceptance testing defects | | | | | x |
Number of unit tests developed for small modules | | | | | | x
Number of unit tests developed for medium modules | | | | | | x
Number of unit tests developed for large modules | | | | | | x
Number of features covered by failed unit tests | | | | x | |

8.1 Worksheet: Required Data Elements, Availability, and Source

Data Element Required | Avail | Source | Tools
Planned Coverage | 00 | | Excel/Project
Actual Coverage | + | From the test case execution reports found in the test plan tracking system during the SAT period; every test case has to be updated by the tester. | Microsoft Project
Planned Threshold | 00 | Performed by the quality engineer and product owner. The planned threshold can be < 100% for some projects, but the actual threshold part must not contain any open medium- or high-severity defects. | Tracking System (DDT)
Actual OK-Rate | + | Defect reporting tool. | Defect reporting
Target Automation Factor | + | Six sigma tolerance factor. | Six Sigma
Lines of code/packages per sprint | 00 | Eclipse, EDK. | Eclipse
Coverage of unit tests over the code/packages | 00 | Unit test tools such as the Eclipse JLIN plug-in. | JLIN
Total number of unit tests | + | Eclipse JUnit. | JUnit
Total team effort in person days | + | Every team member creates a task in JIRA for the unit test, with the estimated time and actual time. | Excel sheet/JIRA
Defect Severity | 0 | In the defect reporting tool we can display all closed defects and the developer assigned to resolve each. | Excel sheet
Total person days | 0 | JIRA is used to find logged time; also from defect tracking data and JIRA. | JIRA
Growth Rate | 0 | Standard requirements document. | RAMC
Required MTBF | + | The required MTBF of every product is recorded before the improvement process. | RAMC
Initial MTBF | 0 | The initial MTBF of every product is recorded before the improvement of the testing process begins; it can be calculated and logged using a simple formula: (total up time) / (number of breakdowns). | RAMC
Cumulative test hours | 00 | Timesheets: adding up the number of hours the team spent testing the product. | Time sheet
Initial Test Time | + | The initial time is always logged when starting an activity and is available on the spot. | Microsoft Project
Cl_wmc | 0 | Source code. | Logiscope
Cl_comf | 0 | Source code. | Logiscope
In_bases | + | Source code. | Logiscope
Cu_cdused | + | Source code. | Logiscope
Cl_stat | + | Source code. | Logiscope
Cl_func | + | Source code. | Logiscope
Cl_data | + | Source code. | Logiscope
Cl_data_publ | + | Source code. | Logiscope
Cu_cdusers | + | Source code. | Logiscope
In_noc | + | Source code. | Logiscope
Cl_func_publ | + | Source code. | Logiscope
Analyzability | 0 | Source code. | Logiscope
Changeability | 0 | Source code. | Logiscope
Stability | 0 | Source code. | Logiscope
Testability | 0 | Source code. | Logiscope
Time spent developing unit tests | + | JIRA: time spent developing unit tests per developer, iteration, and project. | Eclipse
Development time | + | Timesheets. | Excel
Number of features | + | Iteration backlog. | Word
Number of features covered by unit tests | + | Unit test cases. | JUnit
Feature successful test coverage | + | Unit test cases. | JUnit
Feature test coverage (failed) | + | Unit test cases. | JUnit
Number of integration testing defects | + | Integration testing defects report. | JUnit
Number of acceptance testing defects | + | Acceptance testing defects report. | Excel
Number of unit tests developed for small modules | + | JUnit: total number of unit tests for small modules (LOC 100-499). | JUnit
Number of unit tests developed for medium modules | + | JUnit (LOC 500-900). | JUnit
Number of unit tests developed for large modules | + | JUnit (LOC > 1000). | JUnit

Note: The sources could be explained in more detail, but due to the page limit for this report we have kept them in a shorter form.

Code | Meaning
+    | Available
0    | Can be derived from other data
00   | Can be obtained with minor effort
-    | Not available
--   | Impossible to obtain or extremely difficult

9. Step 6 - Planning Tasks

In this step we identify the processes that must be in place and the activities that must be accomplished in order to collect, store, process, and report the measures required to construct the indicators outlined previously. The table below follows the checklist of processes and activities for managing the data required to construct the analysis indicators.

New or modified planning task | Rationale
Staff training | Training is required for the use of certain tools, depending on the knowledge and skill set of the staff.
Process of extracting data | Done by the QA specialist. Plan data is extracted at the end of every milestone.
Process of evaluating data | Performed by managers. Data is evaluated twice in each milestone; this helps maintain good quality and reliability from the beginning.
Process of execution | Done by the programmers.
Storing data | Done by the QA specialist. Data is stored in the form of charts, and all details are stored in Excel; this makes the data easy to evaluate.
Perform the risk assessment for every feature and function | Performed by the solution manager to determine the criticality of the project.
Execute the test cases during SAT | Done by the testers. Should be defined in the test tracking system.
Log any errors found in the test cases | Performed by the testers. Any defects found are entered in the defect tracking system.
Check the quality board strategy | Performed by the quality lead to get the targeted rate at the end of the sprint.
Run the unit test coverage rate on the dev system | Performed by the quality engineer to get the automation coverage for a certain package at the end of the sprint.
Inspect the LOC developed in the dev system | Done by the quality engineer to get the LOC written at the end of the sprint; this helps the team evaluate the amount of code written.
Inspect the number of unit tests developed in the dev system | Done by the quality engineer to get the number of unit tests written at the end of the sprint, to help the manager evaluate the learning curve of TDD.
Enter the task and the time spent writing unit tests | Done by the developer during TDD.
Enter the task and the time spent resolving a defect | Done by the developer during daily activities. Helps keep track of the tasks worked on by each team member.
Enter the defect severity | Done by the tester during SAT, to differentiate between the defects raised and identify the ones most critical to close.

10. Conclusion

The goal-driven measurement process is designed to support the organizational business goal by deriving indicators for analysis, along with specifications for all the data that will be used to construct those indicators. The outcome of applying the goal-driven measurement process to the development department of Commonwealth Software, Inc. is:
A. Mature processes produce good-quality products.
B. Quality is the responsibility of the whole development team.
C. The cost of quality is balanced throughout the whole development process.

11. Appendix A: Definitions

Planned Threshold: The minimum number of green (passing) test cases required.
Actual Threshold: The execution status of the test cases.
RAMC: Development environment tool.
Maintainability: The ease with which a software product can be modified to correct defects, modified to meet new requirements, modified to make future maintenance easier, or adapted to a changed environment.
Reliability: The capability of the software product to perform its required functions under stated conditions for a specified period of time, or for a specified number of operations.
Analyzability: The capacity to identify the root cause of a failure within the product.
Stability: The sensitivity of a given system to change, that is, the negative impact that may be caused by system changes.
Changeability: The amount of effort needed to change a system.
Testability: The effort needed to verify (test) a system change.

Appendix B: Classes Criteria of Maintainability

Metric | Definition | Related attributes
Cl_comf | Comment frequency: ratio between the number of comment lines in the module and the total number of lines, cl_comf = cl_comm / (cl_line MAX 1), where cl_comm is the number of comment lines in the package and cl_line is the total number of lines in the package. | Size, Understandability
Cl_comm | Number of comment lines in a class. | Size, Understandability
Cl_data | Total number of attributes declared inside the class declaration. | Cohesion
Cl_data_publ | Number of attributes declared in the public section or in the public interface of a Java class. | Encapsulation, Cohesion
Cl_func | Total number of methods declared inside the class declaration. | Complexity, Size
Cl_func_publ | Number of methods declared in the public section. | Encapsulation, Complexity
Cl_line | Total number of lines in a class or an interface. | Size
Cl_stat | Number of executable statements in all methods and initialization code of a class. | Size
Cl_wmc | Sum of the static complexities of the class methods, where static complexity is represented by the cyclomatic number of each function. | Complexity

12. Glossary

Abbreviation | Meaning
CSI  | Commonwealth Software Inc.
SAT  | Software Acceptance Testing
EDK  | Eclipse Development Kit
JIRA | Workflow management system tool
JLIN | Java code-checking plug-in for Eclipse
GQM  | Goal Question Metric
GQIM | Goal Question Indicator Metric
MTBF | Mean Time Between Failures
QA   | Quality Assurance
HRD  | High-severity Requirements Phase Defect
MRD  | Medium-severity Requirements Phase Defect
HDD  | High-severity Design Phase Defect
MDD  | Medium-severity Design Phase Defect
HID  | High-severity Implementation Phase Defect
MID  | Medium-severity Implementation Phase Defect
HTD  | High-severity Testing Phase Defect
MTD  | Medium-severity Testing Phase Defect
TDD  | Test Driven Development
LOC  | Lines Of Code

13. References

[1] R. E. Park, W. B. Goethert, and W. A. Florac, "Goal-Driven Software Measurement - A Guidebook," Software Engineering Institute, 1996.
[2] W. A. Florac, R. E. Park, and A. D. Carleton, "Practical Software Measurement: Measuring for Process Management and Improvement," 1997.
[3] S. H. Kan, "Software quality metrics overview," in Metrics and Models in Software Quality Engineering, pp. 85-120, 2002.
[4] IEEE standard adoption of ISO/IEC 15939:2007, "Systems and Software Engineering - Measurement Process."