There is No Mystery (from “Methodologies as Swimsuits” by Alistair Cockburn): Projects come in different shapes and sizes. Methodologies come in different shapes and sizes. People come in different shapes and sizes. The people may not fit the methodology. The people are the most important.
Agile (Software) Development Process Improvement. By: Joshua Klein. (More about the lecturer: see the closing slides.)
You dream of being more efficient
You  must  be more efficient!
But you can’t afford the improvement methodologies (CMMI, ISO) and still need to stay competitive.
Process Improvement should be:
Evolutionary: step by step, check & go on.
Scalable: grows or shrinks per case.
Pragmatic: always refer to constraints.
Flexible: change methods / timescale.
Motivating: persistently feed the sponsorship.
Creative: invent original solutions.
Assertive: obstinately coach.
In a word: Agile.
The motivation cycle [cycle diagram linking: Sponsorship, Openness, Change, Poor process, Improved process, Success, Failure, Motivation].
Success criteria: Fewer complaints about stress. More focus on engineering, less on collecting and checking information. More effective software. Lower frequency of emergencies. Fewer misses (e.g. a last-minute missing or failing feature). Better compliance with plans. Better management control (e.g. more predictable outcomes). Less rework. More efficient testing (which may paradoxically result in more bugs detected). More consistency along the development chain. Reduced and easier turnover. Easier changes. Smoother communication with other Intel groups (e.g. US). The criteria above are used to determine how successful the SPI initiative has been; they address how effective the process used for SPI has been.
Conditions for Change: if D * V * F > R, then change will occur, where D = dissatisfaction with the status quo, V = vision of a future state, F = first steps towards the vision, and R = resistance to change. Actually, what this wins you is permission to try.
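The change condition above can be sketched in a few lines of code. This is a minimal illustration, not from the source; the scores (0-10) and scenarios are invented for the example. The key property is that the factors multiply: if any one of them is zero, change is blocked.

```python
# Hypothetical sketch of the D * V * F > R change condition.
# Scores on a 0-10 scale and the example values are assumptions.

def change_will_occur(d: float, v: float, f: float, r: float) -> bool:
    """Dissatisfaction * vision * first steps must outweigh resistance.
    Any factor at zero blocks change entirely."""
    return d * v * f > r

# A team with real dissatisfaction and vision, but no first steps taken:
print(change_will_occur(d=6, v=7, f=0, r=10))  # False: the zero factor blocks change
# The same team after a small pilot (permission to try):
print(change_will_occur(d=6, v=7, f=2, r=10))  # True
```

Even a modest first step (F = 2) is enough here, which is the slide's point: the formula yields permission to try, not full commitment.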
The “Improvement Leader”: The Improvement Leader’s role expands the Lead Assessor’s role from inspection to execution. The internal "assessment" is integrated with the improvement process. The Improvement Leader leads the improvement activities that derive from the assessment, reports to management, and shows improvement results at short intervals.
The Improvement Leader’s role spans: Assessment, Improvement Plan, Improvement Implementation.
Open Assessment: Internal assessment, not disclosed. No checklist, but flexible guidelines. An opportunity to hear the practitioners. Enlightens from numerous sides. Gets everyone involved. Reveals unexpected practice effects. Spreads motivation.
CMM (with some CMMI): maturity levels, their focus, and Key Process Areas (KPAs):
Level 1, Initial / Incomplete. Focus: individual heroes. KPAs: none.
Level 2, Repeatable / Managed. Focus: project management. KPAs: Requirements Management; Software Project Planning; SW Project Tracking & Oversight; Measurement and Analysis; Software Quality Assurance; Software Configuration Management.
Level 3, Defined. Focus: technical engineering processes. KPAs: Organization Process Focus; Organization Process Definition; Training Program; Integrated Software Management; Software Product Engineering; Risk Management; Decision Analysis and Resolution; Peer Reviews.
Level 4, Quantitatively Managed. Focus: measurement & process control. KPAs: Quantitative Process Management; Software Quality Management.
Level 5, Optimizing. Focus: continuous improvement. KPAs: Defect Prevention; Technology Change Management; Process Change Management.
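As a quick way to see the structure of the model, the levels above can be held in a simple lookup table. The level names and KPAs below come from the slide; the table layout and the helper function are illustrative assumptions, not part of the CMM itself.

```python
# Sketch only: the CMM levels from the slide as a lookup table.
# (name, focus, KPAs introduced at that level)
CMM_LEVELS = {
    1: ("Initial / Incomplete", "Individual Heroes", []),
    2: ("Repeatable / Managed", "Project Management",
        ["Requirements Management", "Software Project Planning",
         "SW Project Tracking & Oversight", "Measurement and Analysis",
         "Software Quality Assurance", "Software Configuration Management"]),
    3: ("Defined", "Technical Engineering Processes",
        ["Organization Process Focus", "Organization Process Definition",
         "Training Program", "Integrated Software Management",
         "Software Product Engineering", "Risk Management",
         "Decision Analysis and Resolution", "Peer Reviews"]),
    4: ("Quantitatively Managed", "Measurement & Process Control",
        ["Quantitative Process Management", "Software Quality Management"]),
    5: ("Optimizing", "Continuous Improvement",
        ["Defect Prevention", "Technology Change Management",
         "Process Change Management"]),
}

def kpas_for(level: int) -> list:
    """Return the Key Process Areas introduced at a given maturity level."""
    return CMM_LEVELS[level][2]

print(len(kpas_for(2)))  # 6 KPAs are introduced at level 2
```

A table like this is how an assessor might tick off which KPAs a focused assessment has already covered.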
Open benchmark: Where do we stand now? Where could we be? How do we get there?
Objectivity and normalization. Outputs of Open Assessment include: subjective feelings, personal frustrations, and actual process clues from one point of view. In order to reach objective findings: compare interview data and find similarities; ask clarification questions; analyze until the roots are found.
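The normalization step (comparing interview data and finding similarities) can be sketched as a simple recurrence count: the more interviews a finding independently recurs in, the more likely it is fact. The code and data below are an assumed illustration, not from the source.

```python
# Minimal sketch (assumed): turning subjective interview notes into
# objective findings by counting how many interviewees report each one.
from collections import Counter

interviews = {  # interviewee -> findings mentioned (illustrative data)
    "dev_a": {"late requirements changes", "no coding rules"},
    "dev_b": {"no coding rules", "slow build"},
    "qa_c":  {"late requirements changes", "no coding rules"},
}

recurrence = Counter(f for findings in interviews.values() for f in findings)
# Keep findings reported independently by at least two interviewees
objective = [f for f, n in recurrence.items() if n >= 2]
print(sorted(objective))  # ['late requirements changes', 'no coding rules']
```

Findings below the threshold ("slow build" here) are not discarded; they are the ones that prompt the clarification questions mentioned above.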
Plan. Prioritization by: broad acceptance, available solutions, short term, criticality. [Timeline chart, derived from the assessment findings: weekly milestones from 08/09 through early December (September, October, November, December).]
SW manufacturing processes (inspired by ISO 15288):
Production process: Concept, Design, Implementation, Validation, Maintenance, Disposal (yielding the Product).
Supporting process: Requirements, Documentation, Configuration management, Quality assurance.
Organizational process: Management, Planning, Infrastructure, Improvement, Human resources.
Improvement mini-cycle: KPA assessment, Methods & Procedures, Pilot, Training, Deployment, Assimilation, all resting on the Infrastructure.
Focused assessments
The Roadmap (July ’02 through March ’03). [Timeline chart: after SPI Planning, SPI Bodies Training, Tools selection and ramp-up, parallel improvement tracks (Coding Practices, Developer test, Requirements, Design) each pass through the mini-cycle phases Assessment, Pilot, Methods & Procedures, Training, Deployment and Assimilation, on the shared Infrastructure. Measurement, MSC Assessments, a Reassessment and a final Debriefing run across the tracks.]
Software Product Engineering The purpose of Software Product Engineering is to consistently perform a well-defined engineering process that integrates all the software engineering activities to produce correct, consistent software products effectively and efficiently. Software Product Engineering describes the technical activities of the project, e.g., requirements analysis, design, code, and test. Knowledge areas: Software Requirements Engineering Software Design Software Coding Software Testing Software Operation and Maintenance http://computing.db.erau.edu/SEERS/2-x.html#1
SEPG charter: The LAD SW Software Engineering Process Group (SEPG) is the focal point for software process improvement activities. The SEPG coordinates the implementation of improvement plans and tracks the effectiveness of these efforts. For example: the SEPG coordinates, tracks and promotes; it does not develop technical guidelines itself but calls on experts.
SPI Organization [org chart]: LAD SW GL (manages; provides policy + resources) · LAD SW Staff · LAD SW SPI / SEPG (controls & tracks; reports every ~5 weeks, with discussion) · Pilot(s) · Procedures & methods · SQA (consults, coaches, writes reports, etc.) · Intel Quality Bodies.
The Coding Rules Intranet page
SPI problem solving
Conclusion: You can do it. You must do it, engineer or manager! The Process Improvement methodologists did great work; we just have to adapt the models. Unfortunately, “Process Improvement” is still little known; it will soon be as commonplace as compilers. Start tomorrow!
Questions & Answers (Q & A). Joshua Klein, Consultant in Process Improvement of Software / Systems Development. 6 Michlin Street, Jerusalem 96430, Israel. Cell: 972-55-665247. Phone: 972-2-6434290. [email_address] http://www.yedidia.net/jklein
The End
The Lecturer: Joshua Klein, Consultant in Process Improvement of Software / Systems Development. 6 Michlin Street, Jerusalem 96430, Israel. Cell: 972-55-665247. Phone: 972-2-6434290. [email_address] http://www.yedidia.net/jklein
The Lecturer (ctd.)
Work:
2002–2003 Process Improvement Consultant (Intel)
1993–2001 NDS, Process Group Leader
1990–1992 Sapiens, Methodology Team Leader
1988–1990 Sapiens France, Directeur Technique
1980–1988 Ministry of Housing, DBA
Studies:
2002 Honourcode, Eng. of complex systems
1998 Standards Institute, Lead quality auditor
1987–1988 Inst. of Productivity, Project Manager
1972–1975 Hebrew Univ. of J-lem, Mathematics, Physics, Computers
1969–1971 Yeshivat Hanegev, Talmud studies
Born in Neuilly, France, 1951.
 
Backup
End of backup


Editor's Notes

  • #6: Today, the terms Software Process Improvement (SPI) and Software Engineering (SE) have an academic, rigid and complex ring to them. What's more, software developers have the idea that these methodologies are simply too time-consuming and expensive to implement for small, "pragmatic" enterprises. For SPI and SE to gain support in these settings, process improvement efforts must rapidly and continuously lead to tangible results in terms of software production.
  • #7: Contrary to common belief, small groups and small projects can and should benefit from Software Process Improvement (SPI). Agile and evolutionary approaches make SPI scalable to all projects, regardless of scope. This paper presents a pragmatic method of process improvement that has emerged from the author’s experience over the last five years in three software-development groups—each of a different size and in a different company. The methodology described is applicable to all aspects of SPI—technological, organizational and behavioral—thanks to its flexible and creative approach.
  • #8: The Agile [1] Software Process Improvement (ASPI) methodology brings about improvement in small cycles. The cycle starts with a minimal investment that yields measurable results, giving those involved confidence in the methodology, which, in turn, fuels the motivation for another improvement cycle. Every improvement achieved demonstrates the methodology's immediate usefulness to the project and inspires ongoing software process enhancement. This is an adaptive, pragmatic, fast-track approach to achieving process improvement in small steps. Gilb has shown the approach to be successful in project management.[2] Generally, after a few bad experiences, the threat of failure becomes stronger than the threat posed by change. Innumerable startups that initially underestimated software testing experienced painful failures, which consequently led to a change in their testing policy. A disk crash causing costly data loss is a sure trigger for the adoption of configuration management. Unfortunately, we are more prone to learn, and improve, after failure than after success.
  • #10: Process Improvement implies change, and change means resistance. To willingly accept change, people need to be convinced that it offers more benefits than risks. Alternatively, they may realize that the fear of change is preferable to the undesirable results obtained with existing work methods. The broad promise of "higher quality" is, unfortunately, not appealing enough; though many people say they are driven by quality, most are driven by schedule.[3] In addition to failure (the primary motivator for process improvement), discontent with the status quo can motivate the desire to change.[4] Let’s not mislead ourselves, however: this is “permission to try” rather than a whole-hearted commitment to process improvement. ASPI builds from this humble starting point. From cycle to cycle, the team receives additional doses of positive feedback, leading to a progressively greater commitment to process improvement, until improvement becomes part of the company culture.
  • #11: In this model, "assessment" is integrated with the improvement process. The improvement leader (formerly "lead assessor") directs and coaches all improvement-related tasks.[5] This person performs the internal assessment, presents findings to the staff, develops the improvement plan and manages it. The improvement leader can be an external SQA consultant, the QA manager or a company SQA engineer, but he is not "merely" an assessor; he has the responsibility to show results, like any other project manager.
  • #12: In this model, "assessment" is integrated with the improvement process. The improvement leader (formerly "lead assessor") directs and coaches all improvement-related tasks.[5] This person performs the internal assessment, presents findings to the staff, develops the improvement plan and manages it. The improvement leader can be an external SQA consultant, the QA manager or a company SQA engineer, but he is not "merely" an assessor; he has the responsibility to show results, like any other project manager.
  • #13: Open assessment Open assessment is the cornerstone of a motivation-driven improvement process. This is an internal assessment. The assessor does not build constraining checklists, which have a limiting effect on eliciting participation from interviewees. Kerr stated that an imposed development process is likely to miss the mark and result in failure.[6] In fact, if developers play a role in designing the process, it has an excellent chance for promoting improvement: not only does it target the correct issues, but it also raises the developers' motivation for implementing it. Dymond emphasized the decisive contribution engineers can bring to assessment findings.[7] Veteran engineers can offer valuable information about endemic problems in company processes as well as areas of strength. Newcomers are able to question established practices in light of their experience at a previous workplace. Administrative assistants can also recount process failures and achievements.
  • #14: The CMM / CMMI model serves as a framework from which the assessor looks for the roots and the team management select the areas to prioritize.
  • #15: The interview starts with a brief description of ASPI as an opportunity for better performance based on the information gathered from the interviews. The objective of the interview is to anonymously gather interviewees' opinions on both weaknesses and strengths. The assessor asks the interviewee about his work, about successes and failures, about smooth processes versus problematic ones. For every practice, the assessor asks three basic questions: What goes wrong now? Optimally, how should the process be performed? How should the process be changed? The assessor encourages the interviewee to speak freely, the only restriction being not to refer to persons but to processes. The rule is to avoid names of people; roles or titles may be referred to if absolutely necessary.
  • #16: “Open assessment” findings are a mix of subjective feelings, personal frustrations and actual process clues. Different people can see a given practice from quite different points of view. Determining objective findings is simple: the more a finding recurs in interviews, the greater the chance it is fact. The assessor must find a link between reports. To this end, additional clarification questions are presented. The assessor then collects the clues that point to one problematic practice. This method is typical of scientific research, where a new theory explains a group of apparently unrelated findings.
  • #17: Prioritization When prioritizing processes for improvement, the greatest weight should be given to those areas in which there is broad acceptance of assessment findings, even if these are not objectively the most critical. The assessor, therefore, submits his report to a mid-management forum, the level that prioritizes the KPAs. This ensures that at least initial ASPI activities benefit from broad managerial support. It is wise to postpone the most challenging tasks to subsequent phases in the improvement process—when confidence in the improvement process has increased and patience is greater.
  • #18: Plan 4.1 Quick and tangible results Changes in software development practice must make a real contribution to production before they are accepted in "pragmatic" companies. Management is more likely to focus on practices that directly affect software production—for example, coding (implementation). Initial improvement efforts in this area, therefore, have good chances of management's support. In the free interpretation of ISO 12207 shown in the illustration below, the software manufacturing process is divided into three distinct layers. Each layer sustains the layer above, up to the product itself. Managers typically choose to start with activity areas at the "production process" level, such as implementation , requirements or design .
  • #19: 3 Improvement cycles After the first assessment, KPA identification and prioritization, each KPA becomes its own independent improvement cycle.[8] This is a very short improvement process that aims to demonstrate added value in a limited area, within short time, and requiring only a small investment. The importance of this was shown by Boehm.[9]
  • #21: Case study LADj is part of ICG, which develops communications products at Intel. Two years ago, when the group had an SE team developing templates for software-design documents, it decided that process improvement was needed in additional areas. The LADj staff was aware that I was running a small process-improvement project for another group in the same building. Because of the group's informal exposure to process improvement, its members were not resistant to the concept. The first issue selected following first assessment and the ensuing brainstorming was “coding rules.” This is part of implementation , which is closest to production itself (see figure 3). It therefore sounds "productive" even in a culture unfamiliar with process-improvement methodologies. Not surprisingly, the second choice was "unit (developer) tests"—part of validation . The third issue selected was design . The illustration above shows how a new mini-improvement-cycle was started each month. We strove to have three improvement tracks running at the same time. The “unit test” track started when the “coding practices” track completed assessment. In practice, our progress was very flexible, in accordance with the resources available. Peak activity in other projects caused delay. We never claimed priority over software-development activities, so ongoing projects were only minimally affected.
  • #23: The Software Engineering Process Group (SEPG) created a sub-committee to define the coding rules. The members of the sub-committee were all prominent software engineers, professionally respected within LADj, so that the developers had confidence that the coding rules established had relevance to their work.
  • #25: The process of assimilation of the coding rules established continues to this day, long after the SEPG formally completed this phase, demonstrating a true commitment to SPI. One of the engineers, for example, recently wrote patterns for the rules and posted the page on the SPI intranet site. Today, process improvement is institutionalized for the entire LAD group, worldwide: there is a dedicated team that directs the process for all of the group's branches, around the world.