COMBINING QUALITATIVE AND QUANTITATIVE
SOFTWARE PROCESS EVALUATION:
A PROPOSED APPROACH
Sylvie Trudel
Dept. of Computer Science
UQAM
Alex Turcotte
CEFTI
Université de Sherbrooke
© TRUDEL & TURCOTTE 2015 — IWSM MENSURA 2015 - CRACOW

Content
•  Software process evaluation (SPE)
•  Quantifying the software process efficiency
•  Combining qualitative/quantitative SPE
•  Field trial: financial trading domain
•  Discussion
•  Future work
  
SOFTWARE PROCESS EVALUATION
(SPE)
INTRODUCTION: WHY, WHO, HOW & HOW MUCH
Understanding what needs to be improved
•  Why: competition, market demands, costs
   –  Symptoms: budget and schedule overruns (especially on larger projects), poor quality, unsatisfied customers/users
•  How: evaluate against best-practice models (e.g. CMMI)
•  Who: large organizations vs. small organizations: do the same methods apply?
  
CMMI Maturity Levels
1. Initial: unpredictable and poorly controlled process
2. Managed: reactive; project-defined process with structured project management
3. Defined: proactive; organizational defined process; standard and consistent process; integrated engineering process
4. Quantitatively Managed: controlled and measured process; predictable process and product quality
5. Optimizing: emphasis on continuous process improvement; process change management
CMMI-Dev Overview («Staged»)
•  1. Initial: heroes; results in risks and rework
•  2. Managed: basic project management. Process areas: Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management
•  3. Defined: processes standardization. Process areas: Requirements Development; Technical Solution; Product Integration; Verification; Validation; Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management for IPPD; Risk Management; Decision Analysis and Resolution
•  4. Quantitatively Managed: quantitative management. Process areas: Organizational Process Performance; Quantitative Project Management
•  5. Optimizing: continuously improving processes. Process areas: Organizational Innovation and Deployment; Causal Analysis and Resolution
Moving up the levels trades risks and rework for productivity and quality results.
  
Process Evaluation Methods

Method           | Targeted organizations | Certified lead appraiser | Rating | Relative cost
SCAMPI 'A'       | Large                  | Yes                      | Yes    | $$$$$
SCAMPI 'B'       | Large                  | Yes                      | -      | $$$
SCAMPI 'C'       | Large & medium         | -                        | -      | $$
ISO/IEC 29110-3  | Small                  | -                        | -      | $
PEM              | Small & medium         | -                        | -      | $
  
Original PEM Method (1 of 3), from ISO 14598-5

1- Analysis of Evaluation Requirements
•  Inputs: evaluation request; context (project requirements, evaluation requirements); statement of work template
•  Output: SOW (draft)

2- Specification of the Evaluation
•  Inputs: CMMI models; Process Area Selection Guide; context; statement of work (draft) covering evaluation objectives, confidentiality agreement, assumptions and constraints
•  Outputs: SOW (final) with scope (selected projects, selected model, selected process areas); client contract/agreement

3- Design of the Evaluation
•  Inputs: SOW; Evaluation Plan Template; evaluation method; list of typical questions
•  Outputs: Evaluation Plan; List of Selected Questions
Original PEM Method (2 of 3), from ISO 14598-5 (continued): Execution of the Evaluation

4- Interviewing Project Participants and Reviewing Project Documentation
•  Inputs: Evaluation Plan; List of Selected Questions; list of documents to evaluate; Evaluator's Checklist (new); Interview Guideline
•  Outputs: interview and document-review observations; Findings (draft); Evaluator's Checklist (started)

5- Reviewing and Reporting Observations
•  Inputs: interview and document-review observations; Wording of Findings Guideline; Findings (draft); Finding Selection Guide; Evaluation Report Template
•  Outputs: Evaluator's Checklist (updated); Evaluation Report (draft); Findings (final and complete)

6- Conclusion of the Evaluation
•  Inputs: Evaluation Report (draft); requester's comments
•  Outputs: Evaluation Report (final); Evaluator's Checklist (final); interview and document-review observations (destroyed)
Original PEM Method (3 of 3): Value-Added Step

7- Planning of Improvement Actions (optional)
•  Inputs: Evaluation Report (final); action plan template
•  Output: action plan, including related findings; activities, deliverables, and tools; estimates and schedule; stakeholder involvement; return on investment
QUANTIFYING THE SOFTWARE
PROCESS EFFICIENCY
HOW TO MEASURE ASPECTS OF THE SOFTWARE PROCESS?
COSMIC: Overview

The software to be measured lies within a boundary. Its functional users (human users, I/O hardware, or other systems) exchange data with the software's functional processes 1..n through Entry (E) and Exit (X) data movements, on the 'interfaces' side; each functional process can also Read (R) from and Write (W) to persistent storage, on the 'infrastructures' side.
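The COSMIC measurement principle can be sketched in a few lines: the functional size of a piece of software is the count of its data movements (E, X, R, W), one CFP each, summed over all functional processes. The functional processes and movements below are hypothetical, chosen only to illustrate the counting rule:

```python
from collections import Counter

# Hypothetical functional processes of a small trading application,
# each listed with its data movements:
# E = Entry, X = Exit, R = Read, W = Write.
functional_processes = {
    "place_order":  ["E", "R", "W", "X"],  # receive order, read rules, store it, confirm
    "cancel_order": ["E", "R", "W", "X"],  # receive id, read order, update it, confirm
    "list_orders":  ["E", "R", "X"],       # receive filter, read orders, return them
}

def cosmic_size_cfp(processes):
    """Functional size in CFP: one CFP per data movement."""
    return sum(len(moves) for moves in processes.values())

def movement_profile(processes):
    """How many movements of each type; a useful sanity check."""
    return Counter(m for moves in processes.values() for m in moves)

print(cosmic_size_cfp(functional_processes))  # 11 CFP
```

Walking the requirements process by process to list these movements is exactly where the "side effect" below comes from: a movement you cannot classify usually signals an ambiguous or incomplete requirement.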
Applying COSMIC: 3 Phases

1. Measurement Strategy (Chapter 2): from the measurement sponsor's input and the Software Context Model, define each piece of software to be measured and the required measurement, starting from the FUR (Functional User Requirements).
2. Mapping Phase (Chapter 3): express the FUR in the form of the Generic Software Model.
3. Measurement Phase (Chapter 4): obtain the functional size of the software in units of CFP.

Potential measurement purpose: quantify the SW process productivity rate.
Possible side effect: identify defects in the Functional Requirements!
  
COMBINING QUALITATIVE AND
QUANTITATIVE SPE
WHY AND HOW: A PROPOSED APPROACH…
Combining Qualitative and Quantitative SPE

•  Motivations:
   –  Bring broader insight into the SW process
   –  Surface overlooked issues related to requirements engineering
   –  Provide a SW process productivity rate
•  Hypothesis: there are mutual influences between measurement results and qualitative findings
   –  The two must therefore be combined during execution of the evaluation!
  
Proposed Approach: Execution of the Evaluation

4a- Interviewing Project Participants and Reviewing Project Documentation
•  Inputs: Evaluation Plan; List of Selected Questions; list of documents to evaluate; Evaluator's Checklist (new); Interview Guideline
•  Outputs: interview and document-review observations; Findings (draft); Evaluator's Checklist (started)

4b- Measuring Software Functional Size and Process Efficiency
•  Inputs: selected FSM method; FUR from the selected projects; effort from the selected projects; quality rating guidelines
•  Outputs: functional size; efficiency data (analysed); requirements defects (identified); Findings (updated)

5- Reviewing and Reporting Observations
•  Inputs: interview and document-review observations; Wording of Findings Guideline; Findings (draft); Finding Selection Guide; Evaluation Report Template
•  Outputs: Evaluator's Checklist (updated); Evaluation Report (draft); Findings (final and complete)
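Step 4b's "efficiency data" boils down to relating each selected project's functional size to its effort. A minimal sketch, with made-up project figures (not the field-trial data):

```python
# Hypothetical per-project data: functional size in CFP, effort in staff-hours.
projects = {
    "P1": {"size_cfp": 120, "effort_h": 960},
    "P2": {"size_cfp": 80,  "effort_h": 560},
    "P3": {"size_cfp": 200, "effort_h": 1700},
}

def unit_effort(projects):
    """Efficiency as unit effort (staff-hours per CFP); lower is better."""
    return {name: p["effort_h"] / p["size_cfp"] for name, p in projects.items()}

for name, h_per_cfp in sorted(unit_effort(projects).items()):
    print(f"{name}: {h_per_cfp:.1f} h/CFP")  # P1: 8.0, P2: 7.0, P3: 8.5
```

An outlier in this table is precisely the kind of quantitative result that can point the qualitative interviews toward a specific project, and vice versa.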
FIELD TRIAL:
FINANCIAL TRADING DOMAIN
WHAT WERE THE RESULTS?
Organization & Evaluation Characteristics (1/2)

1. Analysis: 100 staff members (most projects < 10); trading SW for derivative markets; documented process; clients perceive costs as high → budget overruns on larger projects
2. Specification: 1 business domain (2 major clients); CMMI levels 2 & 3 except SAM, OT, and DAR → 6 projects selected (3 large [500+ staff-days], 3 regular)
3. Design: plan to interview 21 participants (at least 2 per role) → questionnaire developed; a 2-page email instead of a kick-off meeting
  
Organization & Evaluation Characteristics (2/2)

4a. Interviews: 18 out of 21 participants interviewed; confidentiality assured → effort = 50 staff-hours
4b. Measurement: done while reviewing requirements and project documentation → ambiguities led to examining the SW code; results verified by a certified measurer, then analysed
5. Review & Report: validation of findings by participants
6. Conclusion: final report combines both kinds of results → RE inconsistencies raised; recommendations proposed for improvement
  
Initial productivity model with functional size and effort from all six projects (chart)
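Such a productivity model is typically a least-squares line relating effort to functional size, which then doubles as a preliminary estimation model. A sketch with invented data points (the six field-trial projects' figures are not reproduced here):

```python
def fit_line(points):
    """Ordinary least squares for effort = a * size + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Six hypothetical projects: (functional size in CFP, effort in staff-days).
data = [(50, 90), (80, 130), (120, 210), (300, 520), (450, 780), (600, 1050)]
a, b = fit_line(data)
print(f"effort ≈ {a:.2f} * size_cfp + {b:.1f} staff-days")

# Using the model to estimate a new 150 CFP project:
estimate = a * 150 + b
```

With only six points the fit is fragile, which is why the slide below re-examines size versus effort after setting aside outlying projects.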
  
Comparison of functional size and effort for the remaining four projects (chart)
  
DISCUSSION AND FUTURE WORK
WHAT DID WE LEARN AND WHAT’S NEXT?
Discussion

•  Adaptation of PEM to combine qualitative (CMMI) and quantitative (COSMIC) evaluation of the SW process and the SW requirements
•  Several benefits from the field trial:
   1. Measurement ↔ review of requirements quality (implicit)
   2. Relation established between measurement results and qualitative findings
   3. Preliminary estimation model obtained
   4. Acceptable level of effort for a small organization
  
Future Work

•  Adapt PEM from ISO/IEC 14598-5 to its updated SQuaRE version (i.e. ISO/IEC 25000)
•  Verify compliance of PEM with ISO/IEC 15504
•  Include a customer satisfaction survey to provide a 360° view of the SW process being evaluated
  
Thank you!