Modeling and Testing
Dovetail in MagicDraw
Gregory Solovey
MagicDraw - value added
MD facilitates the creation of system models. This presentation proposes to extend that functionality by adding the ability to monitor the model implementation:
• A test management plugin makes it possible to build tests that verify the models and to monitor the test implementation progress. Our ultimate goal is to deliver products with zero implementation defects (“shift left”).
• A project management plugin makes it possible to monitor new feature implementation. Our ultimate goal is to monitor the development-related KPIs for new features.
Proposal
When a modeling tool is in place, the “Model Driven Testing” approach has to be used to ensure test completeness.
Model Driven Testing (diagram):
• Project management (Feature Backlog): features are associated with the models.
• Model management (structural & behavioral models): the models describe the SYSTEM.
• Test management (Abstract Tests): the abstract tests are derived from the models.
• Code management (Executable Test Scripts): the scripts are mapped to the abstract tests and are run against the SYSTEM.
Model, test, and project management all live inside MagicDraw and feed the dashboards.
“Enriched” model management
The author of the Perl Expect module once said: “I took the 5% of the Expect language that is used 95% of the time.”
Similarly, my proposal is to augment MD with ~5% of traditional test management and project management functionality, to ensure test completeness against the models and to monitor new feature implementation.
For knowing…
Nikola Tesla visited Henry Ford at his factory, which was
having some kind of difficulty. Ford asked Tesla if he
could help identify the problem area. Tesla walked up
to a wall of boilerplate and made a small X in chalk on
one of the plates. Ford was thrilled, and told him to
send an invoice. The bill arrived, for $10,000. Ford
asked for a breakdown. Tesla sent another invoice,
indicating a $1 charge for marking the wall with an X,
and $9,999 for knowing where to put it.
E2E traceability
• Project management: releases → features
• Model management: REQs/AC → models
• Test management: tests → test scripts
Model management
• The system can be represented as two integrated parts: existing system features and features that are under development.
• A feature is a subset of models and/or their elements (see the sketch after this list).
• Thousands of features make up a system. Therefore, it is essential to identify the features that are under current development; only these features need to be monitored.
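Since a feature is just a subset of model elements, it can be captured with a very small data structure. A minimal sketch follows; the field names are illustrative assumptions, not MagicDraw's API.

```python
# A feature as a subset of model/element IDs plus a development flag.
# Field names are assumptions for illustration.
from dataclasses import dataclass, field

@dataclass
class Feature:
    feature_id: str                          # e.g. "XYZ-321"
    model_element_ids: set = field(default_factory=set)
    under_development: bool = False

def monitored(features: list) -> list:
    """Only the features under current development need to be monitored."""
    return [f for f in features if f.under_development]
```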
Model management - structural view (diagram):
• Application/Business layer: Application 1, Application 2, … Application 6
• Middleware/Platform layer: Service 1, Service 2, Service 3
• HW abstraction layer: Component 1, Component 2, Component 3
• Interface layer: API 1, API 2, … API 7
Model management - behavioral view (diagram): a linked web of behavioral models, e.g. use case diagrams 1 and 2, activity diagrams 3, 5, and 9, sequence diagrams 2, 5, and 7, and state machine 8.
Model management - system view
Application/
Business
layer
Application
1
Application
2
Application
6
Middleware/
Platform
layer
Service
1
Service
2
Service
3
HW
abstraction
layer
Component 1 Component 2 Component 3
Interface
layer API 1 API 2 API 7
Model management - feature view (diagram): the behavioral diagrams grouped per feature; Feature WWW-111 and Feature XYZ-321 each map to their own subset of use case, activity, and sequence diagrams and state machines.
A feature is represented by new and/or updated diagrams/diagram elements.
Test management
Preamble
The coverage of requirements and acceptance criteria (REQs/AC) is necessary, but not sufficient, to achieve complete testing.
Example: REQs/AC coverage
REQ RPREQ_1500: As Application_3 SW, I want Platform SW to control state LEDs during early HW start-up, warm and cold reset, so that the state HW physical LEDs indicate when the module is in the resetting phase and initialized.
REQ RPREQ_1747 (AC: RPAC_498, RPAC_523): As Application_3 SW, I want Platform SW to set color and pattern on state and interface LEDs according to received requests, so that I can see the state of a particular unit.
AC RPAC_498 (REQ: RPREQ_1747): Service_5 notifies Component_5 about a new state LED color and pattern; Component_5 requests Component_9 to set the state LED according to the Service_5 notification; Component_9 sets the state LED accordingly.
AC RPAC_523 (REQ: RPREQ_1747): Service_8 notifies Component_2 about a new state LED color and pattern; Component_2 requests Component_9 to set the LED according to the Service_8 notification; Component_9 sets the particular state LED color and pattern.
Coverage of REQs/AC by tests is required by most organizations. Often, the AC are just a rephrasing of the respective requirements.
In this example, testers can get away with just ~15 test cases to cover these REQs/AC. However …
Example: Specification coverage
• Use case diagram (1 diagram): start-up, cold/warm reset/OFF for various boards
• Activity diagrams (7 diagrams): algorithms/conditions of start-up, cold/warm reset/OFF
• Sequence diagrams (4 diagrams): message exchange for LED settings
The previous requirements are described by 12 UML models. These models require ~200 test cases (as opposed to 15).
MD TMS vs traditional TMS
A traditional TMS:
• Deals with requirements coverage. In contrast, the MD TMS maps tests to specification models, which makes it possible to control test completeness.
• Is a release-oriented tool: all feature test plans exist only temporarily and independently of the regression tests.
• Does not show how well the regression tests cover the existing system, because a traditional TMS is not linked to the overall system architecture/behavior.
Principles of model-based tests
• The main purpose of a TMS within MD is to associate tests with models, to ensure test completeness.
• Tests are built based on the model types.
• Tests are not generated automatically.
• Test completeness is verified during the review.
• Tests include a requirement ID, a model ID, and a unique tag for traceability purposes.
• Executable test scripts are not stored in the TMS, but they have to cover the tests in the TMS via the test tag (see the coverage-check sketch after this list).
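To illustrate the last bullet, here is a minimal sketch of such a tag-based coverage check. The TAG-<req>-<model>-<nnn> tag format and the Python file layout are assumptions; the slides only state that scripts must cover the TMS tests via the test tag.

```python
# Minimal sketch: report which abstract tests (in the TMS) have no
# executable script mentioning their tag. Tag format is hypothetical.
import re
from pathlib import Path

TAG_RE = re.compile(r"TAG-\w+-\w+-\d+")

def tags_in_scripts(script_dir: str) -> set:
    """Collect every test tag mentioned in the executable test scripts."""
    found = set()
    for path in Path(script_dir).rglob("*.py"):
        found.update(TAG_RE.findall(path.read_text(errors="ignore")))
    return found

def uncovered(abstract_test_tags: set, script_dir: str) -> set:
    """Abstract tests whose tag never appears in any executable script."""
    return abstract_test_tags - tags_in_scripts(script_dir)
```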
Test model (diagram):
Test Plan layer: TP 1
Test Suite layer: TS 1, TS 2, TS 3
Test Scenario layer: UC 1 and UC 2 (under TS 1); UC 7 (under TS 2); UC 5 (under TS 3)
Test Case layer: TC 1, TC 2, TC 3 (under UC 1); TC 5, TC 6 (under UC 2); TC 7 (under UC 7); TC 8, TC 9 (under UC 5)
Test hierarchy
• A Test Plan represents one of the traditional test types, such as application, feature, sanity, regression, performance, etc.
• A Test Suite reflects the structural view of the system.
• A Test Scenario mirrors the behavioral view, such as end-to-end scenarios or business functions.
• A Test Case is a set of actions, such as a message exchange, with one compare statement.
The sketch below renders this hierarchy as plain data classes.
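A minimal sketch of the four-layer hierarchy; the class names mirror the slide, while the fields are illustrative assumptions.

```python
# The four-layer test hierarchy from the slide, as plain data classes.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """A set of actions (e.g. a message exchange) with one compare statement."""
    tag: str              # unique tag for traceability
    requirement_id: str
    model_id: str

@dataclass
class TestScenario:
    """Mirrors the behavioral view: an e2e scenario or business function."""
    name: str
    cases: list = field(default_factory=list)       # of TestCase

@dataclass
class TestSuite:
    """Reflects the structural view of the system."""
    name: str
    scenarios: list = field(default_factory=list)   # of TestScenario

@dataclass
class TestPlan:
    """One of the traditional test types: feature, sanity, regression, ..."""
    name: str
    suites: list = field(default_factory=list)      # of TestSuite
```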
Model - Test (diagram): the behavioral view (use case, activity, and sequence diagrams, state machines) shown alongside the structural view (Application/Business, Middleware/Platform, HW abstraction, and Interface layers with their applications, services, components, and APIs), illustrating how test scenarios map to behavioral models and test suites to structural elements.
Test management: GUI (diagram): MagicDraw hosting the Model management, Test management, and Project management views.
Test management: Interfaces (diagram): dashboards; JSON test tags exchanged with the CI result repository; JSON test plans and test plan reviews on the export side; JSON test plans on the import side. A sketch of a possible test plan payload follows.
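The slide names the “JSON - Test Plan” interface but not its schema, so every field in this sketch is an assumption about what such a payload might carry.

```python
# Hypothetical shape of a "JSON - Test Plan" export payload.
import json

test_plan_export = {
    "test_plan": "feature",
    "suites": [{
        "suite": "TS 1",
        "scenarios": [{
            "scenario": "UC 1",
            "cases": [
                {"tag": "TAG-RPREQ_1747-SD5-001", "model_id": "SD5"},
            ],
        }],
    }],
}
print(json.dumps(test_plan_export, indent=2))
```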
Project management
• The main purpose is to monitor the quality of new feature development.
• The decomposition/refinement process produces the backlogs for the various system levels/components; these represent the initial data for MD project management.
• MD project management uses only the data necessary to monitor the development and verification of the models, such as the relationships between features, models, and tests.
• Most common project management artifacts, such as implementation tasks, schedules, builds, definition of done, etc., are not included in MD project management.
Project management (diagram):
Filter sources: products, releases, variants.
KPI sources: requirements/AC, references to models, references to Test Plans.
Relationships: a feature belongs to a release, is used in products, is applied to variants, is defined by requirements/AC, is implemented through the referenced models, and is verified by the referenced Test Plans.
Project management: Interfaces (diagram): MagicDraw (model, test, and project management) exports JSON release/feature test plans to the dashboards, and imports JSON requirements/AC data from the requirement repository and JSON feature data from the backlogs.
Solution: Report management
Reports, dashboards, and search pages:
• Select artifacts: release, product, component, feature, architectural layer, test plan
• Show KPIs: test plan coverage by automated tests; test plan requirements/AC coverage by automated tests; feature requirements/AC coverage by automated tests; model coverage by abstract tests
• Show relationships/traceability: release <-> requirements/AC <-> models <-> test plans <-> test scripts <-> test cases
A sketch of one such KPI computation follows.
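A minimal sketch of the last KPI, “model coverage by abstract tests”, assuming the data is available as a set of model IDs and a tag-to-model mapping (both shapes are assumptions).

```python
# "Model coverage by abstract tests": fraction of models referenced
# by at least one abstract test. Input shapes are assumptions.
def model_coverage(model_ids: set, tag_to_model: dict) -> float:
    covered = {m for m in tag_to_model.values() if m in model_ids}
    return len(covered) / len(model_ids) if model_ids else 1.0

# Example: three models, two of them covered -> 0.66...
print(model_coverage({"AD3", "SD5", "SM8"},
                     {"TAG-001": "AD3", "TAG-002": "SD5"}))
```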
Test Quality Dashboard
Quality Dashboards:
• system component coverage by test
• new feature coverage by test
E2E Process (diagram):
• JIRA backlogs (releases and features) and the doc repository (requirements and acceptance criteria) feed the modeling tool (specifications and design).
• The test management system holds the abstract testware; the source control system in the DevOps environment holds the test scripts, and the environment produces logs and reports.
• Daily extracts of feature data and of testware tags/results populate the feature/reqs/models/tests repository, which drives the quality dashboards.
A sketch of such a daily extract follows.
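As a sketch of the daily extract, here is one way to join testware tags with CI results into per-feature dashboard rows; the slide only names the two extracts, so the input shapes and field names are assumptions.

```python
# Join testware tags with CI results into per-feature dashboard rows.
# Sources and shapes are assumptions for illustration.
from collections import Counter

def feature_dashboard_rows(
    feature_tags: dict,   # feature ID -> list of its test tags
    ci_results: dict,     # test tag -> "pass" | "fail"
) -> list:
    rows = []
    for feature, tags in feature_tags.items():
        status = Counter(ci_results.get(tag, "missing") for tag in tags)
        rows.append({
            "feature": feature,
            "tests": len(tags),
            "pass": status["pass"],
            "fail": status["fail"],
            "not_automated": status["missing"],
        })
    return rows

# Example: one feature, two tags, one executed in CI.
print(feature_dashboard_rows({"XYZ-321": ["TAG-1", "TAG-2"]},
                             {"TAG-1": "pass"}))
```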
Conclusion
Test and project management, as additions to MagicDraw, make it possible to verify the system development and to monitor the implementation progress.
These “extensions” have the potential to make MagicDraw attractive to a broader customer base that is interested in the model implementation aspects.
Further reading
• Requirements coverage - a false sense of security, Professional Tester magazine, issue 42, 12-17, December 2017. The forerunner of this presentation.
• Tower of Babel insights, Professional Tester magazine, issue 35, 15-18, December 2015. Proposes standards that make requirements testable.
• From test techniques to test methods, Professional Tester magazine, issue 29, 4-14, November 2014. Presents test design methods for all UML software models.
• QA of testing, Professional Tester magazine, issue 28, 9-12, August 2014. Describes the process that guarantees test automation in parallel with code development.
Thank you
for attending this session
gregory.solovey@nokia.com
Editor's Notes
• Slide 3: Defect Detection Efficiency (DDE) is the number of defects injected and detected during a phase, divided by the total number of defects injected during that phase. ALU PLTF data: ~75%, but it can be 95%.