Benefits and limitations of current techniques for measuring the
            readiness of a system to proceed to production
                                            James W. Bilbro
                                      JB Consulting International
                                    www.jbconsultinginternational.com
                                      Executive Summary
       This paper summarizes a number of methodologies currently being used or developed for
determining the readiness of a system to be taken into production, addressing both strengths and
weaknesses. Many permutations and combinations of the tools and processes addressed here
could be successful, but a few key points should be considered.
    • An integrated total system approach should be used that addresses technology,
       manufacturing, integration, etc. from the system of systems level to the individual
       component level.

Current status should be augmented by a predictive assessment that addresses the
        combined risk, schedule, and cost to completion. While risk and cost assessments are
        currently undertaken, they appear to be only loosely connected to each other and not at
        all to Technology Readiness Assessments (TRAs).

    •   A system readiness process should be standardized through the use of tools – this
        facilitates independent assessments as well as comparisons between programs.

The tools and processes/methodologies should be mandatory: if they are not mandatory,
        they will never be used and therefore never improved. Program managers face too many
        competing demands to take up merely “helpful” processes and tools on their own.

In the end, whatever approach is chosen, there needs to be a balance between
        “comprehensiveness” and ease of use. It is easy to ask 10,000 questions to cover
        everything; however, the resources available make doing so impractical.

In this respect, one should rely on the system engineering process (perhaps supported by
        a consolidated set of checklists) and develop a concise augmentation, such as expanding
        the AD2 process in conjunction with a system-level TRA, or incorporating the RI3
        methodology into a cost assessment together with an integrated TRA and Manufacturing
        Readiness Assessment (MRA).

Introduction:
         Program failures and cost and schedule overruns over the last two decades have led to
pressure to focus on specific issues in order to improve performance. The initial focus was on
Technology Readiness, soon followed by Manufacturing Readiness and, most recently, by a
renewed emphasis on system engineering, specifically including integration. There is no question
that attempts to use immature technology (both hardware and software) and shortfalls in
manufacturing capability have contributed significantly to these problems, as have integration
issues, particularly in today’s complex, interrelated systems. Arguably, however, all of these
issues stem from failures in the system engineering process, and if a solution is to be
found it must be approached from a total system engineering perspective. A summary of tools
and processes currently available is provided in the following paragraphs.
                                    Available Tools & Processes


                                         JB Consulting International
                                 4017 Panorama Drive SE Huntsville, AL 35801
            Phone: 256-534-6245 Fax : 866-235-8953 Mobile : 256-655-6273 E-mail: jbci@bellsouth.net
                                Website : www.JBConsultingInternational.com
Advancement Degree of Difficulty (AD2):
         AD2 is designed to be done in conjunction with a Technology Readiness Assessment
(TRA). The AD2 process is focused on determining the “tall tent pole” issues in cost, schedule and
risk that are associated with the various elements (systems, subsystems, components) deemed
to be below the desired level of maturity by the TRA. It is structured around the Work Breakdown
Structure (WBS) and asks a series of questions regarding the element under investigation as to
whether or not it can be designed, manufactured, tested, and operated. For example: do you have
the necessary models with the requisite accuracy? If not, what cost and schedule are required to
obtain them, and what is the risk that they cannot be obtained? The results are portrayed by
identifying the weakest link, i.e., the greatest risk, highest cost, and longest time. AD2 is not
intended as a replacement for a formal cost or risk analysis; rather, it provides quantified
indicators for more detailed examination. The AD2 tool currently consists of 57
questions in 5 categories: Design & Analysis, Manufacturing, Software Development, Test, and
Operations. Categories may be changed and questions may be changed or additional questions
added. The strength of the AD2 process is that it is predictive and can (and should) be used from
the system-of-systems level down to the component level. The weakness is that it is only as good
as the questions asked, which must therefore be refined continually.
https://acc.dau.mil/CommunityBrowser.aspx?id=189733&lang=en-US
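The weakest-link portrayal described above can be sketched in code. This is an illustrative sketch only, not the official AD2 tool; the element names, categories, and numeric scales are invented for the example.

```python
# Illustrative sketch (not the official AD2 tool): collect questionnaire
# results per WBS element and surface the "weakest link" in each dimension.
from dataclasses import dataclass

@dataclass
class AD2Answer:
    element: str        # WBS element under investigation (hypothetical names)
    category: str       # e.g. "Design & Analysis", "Manufacturing"
    risk: int           # risk that the capability cannot be obtained (1-5)
    cost: float         # estimated cost to obtain it ($K)
    schedule: float     # estimated schedule to obtain it (months)

def weakest_links(answers):
    """Identify the greatest risk, highest cost, and longest time,
    mirroring how the AD2 results are portrayed."""
    return {
        "greatest_risk": max(answers, key=lambda a: a.risk).element,
        "highest_cost": max(answers, key=lambda a: a.cost).element,
        "longest_time": max(answers, key=lambda a: a.schedule).element,
    }

answers = [
    AD2Answer("Thermal model", "Design & Analysis", risk=4, cost=250, schedule=6),
    AD2Answer("Composite tank", "Manufacturing", risk=3, cost=900, schedule=14),
    AD2Answer("Flight software", "Software Development", risk=2, cost=400, schedule=9),
]
print(weakest_links(answers))
```

Note that, as the text says, these are quantified indicators for more detailed examination, not a substitute for formal cost or risk analysis.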
DOD Systems Engineering Checklists:
         There are 18 Systems Engineering Checklists covering all program phases, intended to
supplement the Services’ individual processes and methodologies. The TRA checklist, for example,
consists of 69 questions in 8 areas: Timing/Entry Level, Planning, Program Schedule, Program
Risk Assessment, Critical Technologies Identification, TRA Panel, TRA Preparation and Event,
and Completion/Exit Criteria. Each question is to be assessed with respect to risk categories of
Red, Yellow, Green, Unassigned, or Not Applicable. The strength of the checklist approach as a
whole is its comprehensiveness which, as with the UK approach (below), is also one of its
weaknesses because of the time required to apply it. The questions are also programmatic in
nature, asking whether a process has been completed without regard to how well it was done. The
checklist approach also provides only a status: while it characterizes risk, it does not quantify
what remains to be done. It appears that while these checklists are recommended for DOD as a
whole, only NAVAIR (the developer) makes much use of them.
https://acc.dau.mil/CommunityBrowser.aspx?id=144143&lang=en-US
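The Red/Yellow/Green tallying that such a checklist produces can be sketched as follows. The area names and sample ratings are hypothetical; the point is that the result is a status snapshot per area, with no quantification of remaining work.

```python
# Hypothetical sketch of tallying DOD SE checklist answers. Each question
# is assessed Red/Yellow/Green/Unassigned/Not Applicable; the tally gives
# a per-area status snapshot only.
from collections import Counter

VALID = {"Red", "Yellow", "Green", "Unassigned", "Not Applicable"}

def tally_by_area(assessments):
    """assessments: list of (area, rating) tuples -> per-area rating counts."""
    counts = {}
    for area, rating in assessments:
        if rating not in VALID:
            raise ValueError(f"unknown rating: {rating}")
        counts.setdefault(area, Counter())[rating] += 1
    return counts

sample = [
    ("Planning", "Green"), ("Planning", "Yellow"),
    ("TRA Panel", "Red"), ("TRA Panel", "Green"),
]
print(tally_by_area(sample))
```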
Risk Identification, Integration & Ilities (RI3):
         RI3 is a methodology designed to provide a concise set of questions that highlight key
areas that have historically been overlooked, particularly in areas related to the integration of new
technologies, and the “ilities.” It is intended to be used in conjunction with formal risk assessment
processes, not as a replacement. It consists of 105 questions in 9 categories: Design Maturity
and Stability, Scalability & Complexity, Integrability, Testability, Software, Reliability,
Maintainability, Human Factors, and People, Organization & Skills. The questions are formulated
in terms of “best practices”; if an answer is negative, the responder is asked to indicate the
likelihood of the best practice not occurring and the consequent impact to the program. The
results are captured in a conventional 5x5 risk matrix and in linearized likelihood and
consequence charts. The linearized charts are recommended because of the stigma associated with
“red” areas in a conventional 5x5 risk matrix, which often leads to risks being underreported.
The strength of the RI3 methodology is that it is applicable at any level, from system of
systems to components, and that it focuses on issues that have most frequently been missed. Its
primary weakness is that, when examined, the questions are frequently dismissed as common-sense
issues that are “of course” already being addressed, which can lead to the methodology being
deemed unnecessary and therefore not
utilized. The questions must continue to be refined, and if maximum benefit is to be achieved
from the methodology, it should be incorporated into formalized cost assessments.
http://www.afit.edu/cse/page.cfm?page=164&sub=95
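The 5x5 matrix and the linearized charts the methodology recommends can be sketched as data structures. The finding names and scores below are invented for illustration and do not come from the RI3 question set.

```python
# Illustrative sketch (assumed data layout) of capturing RI3 findings in a
# 5x5 likelihood x consequence matrix plus the linearized rank-ordered
# lists recommended to avoid the "red cell" stigma.
def risk_matrix(findings):
    """findings: list of (name, likelihood 1-5, consequence 1-5) tuples."""
    matrix = [[[] for _ in range(5)] for _ in range(5)]
    for name, lik, con in findings:
        matrix[lik - 1][con - 1].append(name)
    return matrix

def linearized(findings):
    """Separate rank-ordered likelihood and consequence charts."""
    by_likelihood = sorted(findings, key=lambda f: f[1], reverse=True)
    by_consequence = sorted(findings, key=lambda f: f[2], reverse=True)
    return by_likelihood, by_consequence

findings = [("Scalability of cooling loop", 4, 5), ("Operator workload", 2, 3)]
m = risk_matrix(findings)
print(m[3][4])  # the likelihood-4, consequence-5 cell
```

The linearized form keeps the same information but avoids forcing every finding into a colored cell, which is the reporting concern the text raises.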
System Readiness Level (SRL) (Stevens Institute):
         The SRL in this case is defined through the combination of the TRL of a given technology
with the Integration Readiness Level (IRL) of each of the elements with which it will be integrated.
The SRL is computed from a matrix of pairwise combinations of normalized TRL and IRL values.
The strength of this approach is that it recognizes the major role integration plays in
successful program completion and offers the opportunity to express system readiness as a single
number. One weakness is that representing system readiness by a single number risks masking
major problems. A more limiting weakness is that it requires the use of Integration Readiness
Levels, which at this point are too ill-defined for practical application. This approach
provides status only and offers no means of quantifying what is left to be done. Substantial
work remains on quantifying IRLs if this methodology is to be successful.
http://www.systemreadinesslevel.com/
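The pairwise TRL/IRL combination can be sketched numerically. This is a minimal sketch of the computation as commonly described, under assumed normalization (dividing each 1-9 level by 9 and averaging); the exact weighting in the Stevens formulation may differ, and the input values are invented.

```python
# Minimal sketch (assumed normalization, invented values) of an SRL
# computed from normalized TRLs and a pairwise IRL matrix.
def srl(trl, irl):
    """trl: list of n TRLs (1-9); irl: n x n matrix of IRLs (1-9), where
    irl[i][j] is the integration readiness between elements i and j.
    Returns a single normalized system readiness number in (0, 1]."""
    n = len(trl)
    t = [x / 9 for x in trl]                       # normalize TRLs
    component_srl = [
        sum(irl[i][j] / 9 * t[j] for j in range(n)) / n  # row-wise combination
        for i in range(n)
    ]
    return sum(component_srl) / n                  # collapse to one number

trl = [7, 5, 8]
irl = [[9, 4, 6],
       [4, 9, 3],
       [6, 3, 9]]
print(round(srl(trl, irl), 3))
```

Collapsing the matrix to a single number is exactly where the masking concern arises: two very different readiness signatures can produce the same composite value.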

System Readiness Level (SRL) (UK Ministry of Defence):
         The UK MOD SRL methodology is an attempt at a comprehensive look at all aspects of a
program. It is used in conjunction with TRL assessments. The SRL self-assessment tool has 9
top level categories: System Engineering Drivers, Training, Safety & Environmental, Reliability &
Maintainability, Human Factors, Software, Information Systems, Airworthiness, and Maritime.
Each of these 9 areas has a set of questions for each of the 9 levels of the SRL, for a total of
399 questions. Affirmative answers to the complete set of questions for a given level determine
the SRL for that area. The composite SRL is displayed as a matrix of areas against individual
SRLs, resulting in a particular signature at a given point in the program. The strength of this
methodology is its comprehensiveness, which is also its major weakness: the assessment is
extremely time consuming to perform, and there has been some indication (not confirmed) that
the UK MOD has discontinued use of this methodology. It is worth noting that the UK MOD
attempted to utilize Integration Readiness Levels (IRLs) and Design Readiness Levels (DRLs)
before settling on this approach. http://www.aof.mod.uk/aofcontent/tactical/techman/index.htm
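The per-area level determination described above can be sketched as follows. This is a hedged sketch of the stated logic (an area sits at the highest level whose question set, and every set below it, is fully affirmative); the question counts are invented.

```python
# Hedged sketch of the UK MOD SRL self-assessment logic: an area's SRL is
# the highest level for which it (and all lower levels) has a complete set
# of affirmative answers. Question counts per level are invented.
def area_srl(answers_by_level):
    """answers_by_level: dict mapping level (1-9) -> list of bool answers.
    Returns the highest level achieved (0 if level 1 is incomplete)."""
    achieved = 0
    for level in range(1, 10):
        if all(answers_by_level.get(level, [False])):
            achieved = level
        else:
            break  # a gap at this level caps the area's SRL
    return achieved

software = {1: [True] * 5, 2: [True] * 4, 3: [True, True, False]}
print(area_srl(software))  # level 3 has a "no", so the area sits at SRL 2
```

Running such a check per area yields the matrix signature the text describes; answering all 399 questions honestly is what makes the full assessment so time consuming.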
System Technology Readiness Assessments (STRA):
         Technology Readiness Assessments can (and arguably should) be performed at the system
or even system-of-systems level as well as at the subsystem and component levels. This is
particularly true given that using legacy hardware or software in a new system or environment
frequently results in technology development. When a TRA is conducted at the system level, all
issues, including integration, are addressed: if a system-level prototype has been tested in a
relevant environment, it must first have been integrated. Two tools exist for
standardizing the TRA assessment – the AFRL TRL Calculator and the NASA-AFRL TRL
Calculator. The latter calculator is specifically structured to perform assessments as a function of
the program Work Breakdown Structure (WBS). It has 259 questions for the 9 levels in 3
categories (hardware, software & manufacturing). A web-based version of the AFRL TRL
Calculator is under development which will also include a WBS capability. The strength of this
approach (when used with a calculator tool) is that it provides a standardized means of making a
comprehensive system assessment from the top down as well as from the bottom up. The
weakness is that it only provides a status at the time of the assessment and does not address
what is required for successful completion.
https://acc.dau.mil/Search.aspx?id=157372&m=6&tfp=1&tfk=1&tfd=1&q=trl
https://acc.dau.mil/CommunityBrowser.aspx?id=189733&lang=en-US
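A WBS-structured assessment supports the top-down and bottom-up roll-up the text describes. The sketch below is illustrative only (not the AFRL or NASA-AFRL calculator) and uses one common convention, that a parent element can be no more mature than its least mature child; the WBS names and TRL values are invented.

```python
# Illustrative sketch of a WBS-based TRL roll-up: a parent's effective TRL
# is capped by its least mature child (a common convention, assumed here).
def rollup_trl(node):
    """node: dict with 'name' and either 'trl' (leaf) or 'children' (branch)."""
    if "children" in node:
        return min(rollup_trl(child) for child in node["children"])
    return node["trl"]

wbs = {"name": "Spacecraft", "children": [
    {"name": "Propulsion", "children": [
        {"name": "Thruster", "trl": 6},
        {"name": "Feed system", "trl": 4},
    ]},
    {"name": "Avionics", "trl": 7},
]}
print(rollup_trl(wbs))  # → 4
```

As the text notes, such a roll-up is still only a status at the time of assessment; it says nothing about the cost, schedule, or risk of raising the weakest elements.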




