Automated Low Level Requirements Testing for DO-178C
Copyright 2018 – QA Systems GmbH – www.qa-systems.com
DO-178C SW Verification Process
Inputs to verification:
• System Requirements
• Software Requirements
• Software Architecture
• Source Code
• Executable Object Code
• Parameter Data
• Trace Data
• Software Verification Plan
Outputs of verification:
• Software Verification Cases and Procedures (DO-178C 11.13)
• Software Verification Results (DO-178C 11.14)
• Associated Trace Data (DO-178C 11.21)
Software Testing Activities 1
DO-178C Section 6.4
Requirements-Based Test Objectives
a. The Executable Object Code complies with the high-level requirements.
b. The Executable Object Code is robust with the high-level requirements.
c. The Executable Object Code complies with the low-level requirements.
d. The Executable Object Code is robust with the low-level requirements.
e. The Executable Object Code is compatible with the target computer.
Software Testing Stages
DO-178C Section 6.4, Table A-6
• Low Level (unit) Tests: verify the implementation of low-level requirements and derived low-level requirements.
• SW Integration Tests: verify the interrelationships between software requirements and components, and verify the implementation of the software requirements and software components within the software architecture.
• HSI (Hardware/Software Integration) Tests: verify the correct operation of the software in the target environment.
(Diagram: System Requirements → High Level Requirements → Low Level Requirements and Derived Low Level Requirements → Code, with Parameter Data alongside.)
Software Testing Activities 2
DO-178C Section 6.4.4.1
Test Coverage Analysis Objectives
a. Analysis, using the associated Trace Data, to confirm that test cases exist for each software requirement (a minimal sketch follows this list).
b. Analysis to confirm that the test cases satisfy the criteria for normal and robustness testing as defined in section 6.4.2.
c. Resolution of any deficiencies identified in the analysis. Possible solutions are adding or enhancing test cases.
d. Analysis to confirm that all the test cases, and thus all the test procedures, used to achieve structural coverage are traceable to requirements.
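To make objectives (a) and (d) concrete, here is a minimal sketch in plain C, not Cantata script syntax; the requirement ID LLR-042, the tagging scheme and the function are all hypothetical illustrations:

```c
#include <assert.h>

/* Hypothetical unit under test for low-level requirement LLR-042:
 * "The speed limiter shall clamp demanded speed to the range 0..250." */
static int clamp_speed(int demanded)
{
    if (demanded < 0)   return 0;
    if (demanded > 250) return 250;
    return demanded;
}

/* Test case tagged with its requirement ID, so trace analysis can confirm
 * that a test case exists for the requirement (objective a) and that the
 * test case is traceable back to a requirement (objective d). */
static void test_clamp_speed__LLR_042(void)
{
    assert(clamp_speed(100) == 100);  /* normal range */
    assert(clamp_speed(-5)  == 0);    /* robustness: below range */
    assert(clamp_speed(999) == 250);  /* robustness: above range */
}

int main(void)
{
    test_clamp_speed__LLR_042();
    return 0;
}
```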
Test Coverage Analysis
(Diagram: Low Level Requirement ↔ Low Level (unit) Test ↔ Code, plus Parameter Data.)
• Requirements Coverage: % of requirements verified by tests; <-> traceability to tests.
• Test Coverage: % of tests executed & passing; <-> traceability to requirements.
Software Testing Activities 3
DO-178C Section 6.4.4.2
Structural Coverage Analysis Objectives
a. Analysis of the structural coverage information collected during requirements-based testing, to confirm that the degree of structural coverage is appropriate to the software level.
b. Structural coverage analysis may be performed on the Source Code, object code or Executable Object Code. (NB: additional verification is needed for any additional code not directly traceable to Source Code.)
c. Analysis to confirm that the requirements-based testing has exercised the data and control coupling between code components (a minimal sketch follows this list).
d. Structural coverage analysis resolution.
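A minimal sketch of the data and control coupling named in item c, using hypothetical components in plain C (this is generic illustration, not a Cantata artefact):

```c
#include <assert.h>
#include <stdbool.h>

static bool heater_enabled;                 /* data coupling: state shared between components */

static void controller_set_mode(bool on)    /* component A: writes the shared state */
{
    heater_enabled = on;
}

static int actuator_duty_cycle(void)        /* component B: reads data written by A */
{
    return heater_enabled ? 80 : 0;
}

static int scheduler_step(bool demand_heat) /* component C: invokes A (control coupling) */
{
    controller_set_mode(demand_heat);       /* control coupling: C calls into A       */
    return actuator_duty_cycle();           /* data coupling: A's write reaches B     */
}

int main(void)
{
    /* A requirements-based test that exercises both couplings end-to-end,
     * which is what the structural coverage analysis must confirm. */
    assert(scheduler_step(true)  == 80);
    assert(scheduler_step(false) == 0);
    return 0;
}
```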
Software Testing Activities 4
DO-178C Section 6.4.4.3
Structural Coverage Analysis Resolution
Coverage gaps may result from:
a. Shortcomings in requirements-based test cases or procedures.
b. Inadequacies in software requirements.
c. Extraneous code, including dead code.
d. Deactivated code:
1. Category One
2. Category Two
Structural Coverage Analysis
(Diagram: Low Level Requirement ↔ Low Level (unit) Test ↔ Code, plus Parameter Data.)
• Code Coverage: % of code executed by tests; <-> traceability to requirements; <-> traceability to tests.
Requirements-Based Test Selection
DO-178C Section 6.4.2
1. Specific test cases should be developed to include normal range test cases and robustness (abnormal range) test cases (a minimal sketch follows below).
2. The specific test cases should be developed from the software requirements and the error sources inherent in the software development processes.
Note: Robustness test cases are requirements-based.
3. Test procedures are generated from the test cases.
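A minimal sketch of the selection in items 1 and 2, assuming a hypothetical low-level requirement for a rate() function; the robustness vectors target an error source (division by zero) inherent in the implementation:

```c
/* Assumed requirement (hypothetical): "rate() shall return distance/time;
 * time is specified to be greater than zero." */
#include <assert.h>

static long rate(long distance, long time)
{
    if (time <= 0)          /* error source in the implementation: division by zero */
        return 0;           /* documented safe default for abnormal input           */
    return distance / time;
}

int main(void)
{
    /* Normal range test cases, developed from the requirement. */
    assert(rate(100, 4) == 25);
    assert(rate(0, 7)   == 0);

    /* Robustness (abnormal range) test cases, also requirements-based:
     * they probe inputs the requirement excludes. */
    assert(rate(100, 0)  == 0);
    assert(rate(100, -3) == 0);
    return 0;
}
```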
Manual Test Generation
Test Cases crafted Manually from Requirements
Can be hard work, even with powerful test tools:
• Insufficiently validated requirements
– Decomposed, correct, complete, unambiguous, logically consistent
– High reliance on structural code coverage & reverse engineering
• Complexity of test vectors
– Pre-conditions, inputs, expected behaviours & outputs, post-conditions
• Boundary of Low Level Requirements ≠ usable test case vectors
– Test framework: drivers, dependencies & datasets (see the sketch below)
• Gaps & overlaps
– Defensive programming, private/protected code, etc.
– Equivalence classes
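A minimal sketch of why the test framework is the hard part, with hypothetical names in plain C: the hand-written driver has to stub a dependency just to establish the pre-conditions before it can check any post-conditions.

```c
#include <assert.h>

/* Dependency of the unit under test, replaced here by a hand-written stub. */
static int stub_sensor_value;
static int read_sensor(void) { return stub_sensor_value; }

/* Hypothetical unit under test: raises an alarm when the sensed value
 * exceeds a threshold. */
static int alarm_active;
static void check_alarm(int threshold)
{
    alarm_active = (read_sensor() > threshold);
}

int main(void)
{
    /* Pre-condition established through the stub: sensor reports 50. */
    stub_sensor_value = 50;
    check_alarm(40);                 /* input: threshold 40               */
    assert(alarm_active == 1);       /* expected output / post-condition  */

    stub_sensor_value = 50;
    check_alarm(60);                 /* equivalence class: below threshold */
    assert(alarm_active == 0);
    return 0;
}
```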
So… how to automate?
(Diagram: Requirements, Tests, Parameter Data, Code)
Test Case Generation: Generation from Requirements
Test Cases Generated from Requirements
• Very limited capability from: natural language (NL), structured natural language (SNL), program design languages (PDL), use case scenarios, mathematical specs
• More capability from: models (e.g. MBT with UML)
(Diagram: Requirements → Tests)
Generation from Code
Test Cases Generated from Code
• Test vectors from path solving (see the sketch below)
– Intelligent optimisation
• Full test framework
– Pre-conditions, inputs, expected behaviours, expected outputs & post-conditions
• Tests generated for maintainability & traceability
(Diagram: Requirements; Code → AutoTest → Tests)
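A minimal sketch of what path solving produces, with a hypothetical function (not actual AutoTest output): one concrete input vector per feasible path, each paired with its expected outcome.

```c
#include <assert.h>

/* Hypothetical unit under test with three feasible paths. */
static int classify(int x, int y)
{
    if (x > 0) {
        if (y > x)
            return 2;    /* path A: x > 0 and y > x  */
        return 1;        /* path B: x > 0 and y <= x */
    }
    return 0;            /* path C: x <= 0           */
}

int main(void)
{
    /* One solved input vector per path, with the observed result captured
     * as the expected output for regression checking. */
    assert(classify(1, 5)  == 2);   /* drives path A */
    assert(classify(3, 2)  == 1);   /* drives path B */
    assert(classify(-4, 0) == 0);   /* drives path C */
    return 0;
}
```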
AutoTest & Trace for DO-178C
AutoTest Generation
Flexible application
• GUI or CLI invocation
• Complete suite of passing unit tests
• Additional test cases to fill gaps
• Black-box cluster integration test through public functions
• White-box unit isolation test of static functions
• Uses Cantata workspace preferences
Test cases exercise all paths through the code
• Entry-point
• Statement
• Decision
• MC/DC (unique cause; see the sketch below)
Test Cases are complete & maintainable for full control
• All required inputs: parameters + accessible data
• All expected outputs: parameters + accessed data + call-order
• Each test case's path-solving purpose explained
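As a worked example of unique-cause MC/DC: a decision over three conditions needs N+1 = 4 vectors, with each pair differing in exactly one condition that alone flips the outcome (hypothetical function in plain C, not Cantata script syntax):

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical unit under test: one decision with three conditions. */
static bool permit(bool a, bool b, bool c)
{
    if ((a && b) || c)   /* MC/DC: each of a, b, c must independently flip the outcome */
        return true;
    return false;
}

int main(void)
{
    /* Unique-cause MC/DC set: 4 vectors for 3 conditions. */
    assert(permit(true,  true,  false) == true);   /* vector 1                               */
    assert(permit(false, true,  false) == false);  /* vs vector 1: only 'a' differs, flips it */
    assert(permit(true,  false, false) == false);  /* vs vector 1: only 'b' differs, flips it */
    assert(permit(true,  false, true)  == true);   /* vs vector 3: only 'c' differs, flips it */
    return 0;
}
```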
AutoTest Process
• Automatic Test Generation: AutoTest takes a copy of the Code and generates Tests plus a Generation Report.
• Automatic Test Execution: Makefiles build the instrumented Test Exe, which is run to produce Test Results.
AutoTest DO-178C Use Cases
Source Code Testability Assessment
• The AutoTest Generation Report may be used to identify difficulties in creating low-level Cantata test cases for the software, or potential run-time errors:
– Dynamically unreachable code
– Crash scenarios
– Compiler type truncation
– Data uninitialized or function static
– Implicit function declarations
Test Cases for Assignment as Requirements-Based Tests
• Generated test cases may be reviewed and used (unaltered or modified) to meet requirements-based verification objectives
• Test cases can be assigned to requirements in Trace once assessed as meeting the objectives
• NOTE: DO-178C 6.4.4.1.d requires that all test cases used to achieve structural coverage be traceable to requirements.
Targeted Test Case Generation
• Test cases can be generated for all functions in a source file
• Test cases can be added to test scripts for selected functions to help achieve structural code coverage requirements (e.g. MC/DC)
Requirements Trace Closes Loop
(Diagram: Requirements Management Tool ↔ Trace ↔ Test Tool)
• Requirements exchanged via .csv, ReqIF or Excel
• Drag and drop tracing of requirements (text, diagrams, links) with test cases
• Generate tests and link them to requirements
• Test information fed back: traced requirements, test status and code coverage
• Result: full bi-directional requirements traceability evidence
Easy Linking in Cantata Trace
Bi-directional drag and drop interface immediately creates links on a server
• Whole test scripts linked to requirements
• Individual test cases linked to requirements
3 Part Automation
1. Automatic Test Vector Generation
• Test case vectors from code exercising all paths (up to MC/DC coverage)
• Sets input parameters & data throughout test execution
• Checks expected vs actual data, input & output parameters, and call order
2. Automated Test Execution
• Continuous integration build, run and reporting
3. Automated Traceability & Coverage Data Production
• Complete requirements imported/exported for testing
• AutoTest cases generated with traceable descriptions
• Test status, requirements traceability & structural coverage evidence
Complete 3 Way Analysis
(Diagram: Low Level Requirement ↔ Low Level (unit) Test ↔ Code, plus Parameter Data.)
• Requirements Coverage: see requirements coverage in your requirements management & test tools; use the same tool for all trace data.
• Test Coverage: run tests when they have not been executed (continuous integration and testing helps a lot); fix tests when they fail.
• Code Coverage: when you have gaps, identify whether the code is dead/redundant, unreachable, or deactivated (not used in this context); if not, add a test, and that test needs to be traced to [new] requirements (a minimal sketch follows).
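A minimal sketch of that code coverage triage, with hypothetical code: a defensive branch that no requirement exercises shows up as a structural coverage gap and must be classified, not silently ignored.

```c
#include <assert.h>
#include <stddef.h>

static int checksum(const unsigned char *buf, size_t len)
{
    int sum = 0;
    if (buf == NULL)       /* defensive guard: no requirement drives a NULL input,    */
        return -1;         /* so requirements-based tests leave this branch uncovered */
    for (size_t i = 0; i < len; i++)
        sum += buf[i];
    return sum;
}

int main(void)
{
    const unsigned char msg[3] = { 1, 2, 3 };
    assert(checksum(msg, 3) == 6);   /* covers the normal path only */
    /* The uncovered NULL branch is not simply dead code to delete: either a
     * [new] derived requirement justifies the guard and a traced test is
     * added, or the code is classified (e.g. deactivated) per DO-178C 6.4.4.3. */
    return 0;
}
```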
Further Enhancements? – Robustness
DO-178C Section 6.4.3 identifies Normal & Robustness Test Cases.
Robustness test cases demonstrate the ability of the software to respond to abnormal inputs and conditions:
• Failure of an algorithm to satisfy a software requirement.
• Incorrect loop operations.
• Incorrect logic decisions.
• Failure to process correctly legitimate combinations of input conditions.
• Incorrect responses to missing or corrupt input data.
• Incorrect handling of exceptions, such as arithmetic faults or violations of array limits.
• Incorrect computation sequence.
• Inadequate algorithm precision, accuracy, or performance.
• Incorrect initialization of variables and constants.
• Parameter passing errors.
• Data corruption, especially global data.
• Inadequate end-to-end numerical resolution.
• Incorrect sequencing of events and operations.
If the code already handles these, then AutoTest generation is very helpful.
If the code does not, then traceability should catch them, as AutoTest will not.
But AutoTest could generate test cases for these scenarios too… (a minimal sketch follows)
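A minimal sketch for one listed error class, violations of array limits (hypothetical code, not generated output): when a guard exists, a test generator can reach and exercise it; when it does not, only requirements traceability reveals the untested abnormal condition.

```c
#include <assert.h>

#define TABLE_SIZE 8
static const int gain_table[TABLE_SIZE] = { 1, 2, 4, 8, 16, 32, 64, 128 };

/* Hypothetical unit under test: the guard makes the abnormal-input handler
 * a reachable path that generated tests can exercise. */
static int lookup_gain(int index)
{
    if (index < 0 || index >= TABLE_SIZE)
        return 0;                            /* safe default for abnormal indices */
    return gain_table[index];
}

int main(void)
{
    assert(lookup_gain(2) == 4);             /* normal input                       */
    assert(lookup_gain(-1) == 0);            /* robustness: below the array limit  */
    assert(lookup_gain(TABLE_SIZE) == 0);    /* robustness: at/above the array limit */
    return 0;
}
```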
Thank you
Editor's Notes
• #5: It sometimes happens that high-level requirements are used to generate source code directly, in which case those high-level requirements are also considered to be low-level requirements.
• #24: October 27, 2017 – The Federal Aviation Administration (FAA) and the Civil Aviation Administration of China (CAAC) today announced the signing of an implementing agreement under the U.S.–China Bilateral Aviation Safety Agreement (BASA), recognizing each other's regulatory systems with respect to the airworthiness of aviation products and articles. The Implementation Procedures for Airworthiness (IPA) document allows each authority to leverage approvals completed by the other with respect to design, production, and airworthiness, as well as continued airworthiness.