Software test automation success
A case study
Mike Snyman
Case study
Who: Large financial institution
Where: Johannesburg
When: 2006 to 2008
What: Adoption of test automation with the aim
of reducing cost and increasing
productivity
Contents
1. The challenges facing the organisation
2. The approach to solving the problem
3. How we view and classify automation tools (and why)
4. Our automation process
5. Our automation framework
6. Contemporary automation issues
7. Automation benefit calculations
8. What we have learned
March 2009
Introduction
“ On the island of testers they were forever doomed to search for
what should not and could not exist, knowing to succeed would
bring misery to the gods.”
Lessons Learned in Software Testing
Cem Kaner
James Bach
Bret Pettichord
“ A fool with a tool is still a fool.”
Challenges
1. Lack of an end-to-end defined automation process.
2. Insufficient re-use of tools, scenarios and data.
3. Automation involved late in the process.
4. No living documentation.
5. Slow uptake of test automation; purchased tools became shelfware.
6. No means of proving automation benefits and cost justification.
7. Dependency on key resources.
Solving the problem
1. Formal organisational project launched.
2. Project had to rest on a sound financial footing.
3. Funding for the project approved at the highest level.
4. Failure not an option.
5. Staffed with capable and skilled people.
6. Driven by a passionate person.
Project testing
1. Formal project launched in May 2006.
2. Project had a significant budget.
3. Detailed business case was documented to facilitate return on investment.
4. Steering committee tracked progress on a monthly basis.
5. Macro project broken down into workgroups.
6. Dedicated workgroup responsible for communication and change
management.
Project testing workstreams
• Test process workstream.
• Test governance workstream.
• Test mechanisation and tools workstream.
• Test repository workstream.
• Test environment workstream.
• Test change management, communication and training workstream.
Test automation workstream
• Automation process.
• Automation process integrated with SDLC.
• Tool selection, procurement and training.
• Automation framework.
• Tool classification.
• Quantification of automation benefits.
Automation process
Analysis and design phase
• Understand the client’s test
requirements.
• Automated solution is proposed.
• Prototyping.
Scripting/Configuration phase
• User requirements implemented.
• Activities in this phase could
include recording, scripting and
building of special utilities.
• Internal testing of automated test
solution.
Parameter identification
• Scripts assessed against user-defined scenarios.
• Components identified for parameterisation.
• Components categorised based on nature and function.
Parameter management
• Large amounts of data created in the previous step are managed.
• Customised spreadsheets are
created.
• Requirement exists that data in
these sheets can be maintained by
non-technical personnel.
• All scenarios described in both
technical and non-technical terms.
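The spreadsheet-driven approach described above can be sketched as a simple data-driven loop. The column names, scenario identifiers, and the `load_scenarios` helper below are illustrative assumptions, not the actual sheets the team used:

```python
import csv
import io

# Hypothetical scenario sheet: non-technical staff maintain the rows,
# each scenario described in plain language alongside its parameters.
SHEET = """\
scenario,description,account_type,amount,expected_result
S001,Deposit into a savings account,SAVINGS,100.00,ACCEPTED
S002,Withdraw more than the balance,CHEQUE,-999.99,REJECTED
"""

def load_scenarios(sheet_text):
    """Parse the scenario sheet into parameter dictionaries."""
    return list(csv.DictReader(io.StringIO(sheet_text)))

scenarios = load_scenarios(SHEET)
for row in scenarios:
    # A real runner would feed these parameters into the recorded script.
    print(row["scenario"], row["expected_result"])
```

Because the sheet is plain tabular data, non-technical personnel can maintain it without touching the scripts themselves.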
Scenario collection
• Spreadsheet populated with
scenarios provided by
stakeholders of the system.
• Manual test cases could provide
input.
• Actual production outages used as
input.
Validation
• Automated validation of results
important due to volume.
• Spreadsheet updated with
expected results.
• Validation done by script using
information from spreadsheet.
• Clear “Pass” or “Fail” indicators
provided.
• Summary of results provided.
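A minimal sketch of the validation step above: expected results come from the scenario sheet, actual results from the run, and the script emits per-scenario "Pass"/"Fail" verdicts plus a summary. All names and data are invented for illustration:

```python
# Script-driven validation: compare actual results against the
# expected results recorded in the scenario spreadsheet.
def validate(expected, actual):
    """Return per-scenario 'Pass'/'Fail' verdicts and a summary line."""
    verdicts = {
        sid: "Pass" if actual.get(sid) == exp else "Fail"
        for sid, exp in expected.items()
    }
    passed = sum(1 for v in verdicts.values() if v == "Pass")
    summary = f"{passed}/{len(verdicts)} passed"
    return verdicts, summary

expected = {"S001": "ACCEPTED", "S002": "REJECTED"}
actual = {"S001": "ACCEPTED", "S002": "ACCEPTED"}  # S002 deviates
verdicts, summary = validate(expected, actual)
print(summary)  # 1/2 passed
```

Automating this comparison is what makes high-volume runs reviewable: a human reads the summary, not thousands of raw results.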
Testing of scripts
• Like all new software, the scripts must be tested.
• When defects are reported during the actual test cycle, the tool itself must be above reproach.
• Scripts must react to anomalies in a predictable fashion.
• They must still report effectively on operational scenarios.
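One way to make scripts react to anomalies predictably, sketched here with an invented wrapper function: classify any failure instead of letting the run crash, and self-test the wrapper before the real cycle starts. This is an assumption about implementation, not the team's actual tooling:

```python
# Automation scripts are software: test them. This wrapper classifies
# anomalies so the cycle keeps running and reporting stays intact.
def run_step(step, *args):
    """Execute one script step; report anomalies rather than abort."""
    try:
        return ("Pass", step(*args))
    except Exception as exc:  # anomaly: record it, continue the cycle
        return ("Anomaly", f"{type(exc).__name__}: {exc}")

# Self-test of the wrapper, run before the actual test cycle.
status, result = run_step(lambda x: x + 1, 41)
assert status == "Pass" and result == 42
status, detail = run_step(lambda: 1 / 0)
assert status == "Anomaly" and detail.startswith("ZeroDivisionError")
```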
Script execution
• Scripts are run against target
application.
• Stability of system determines
support required.
• Strive to provide as much
meaningful data in the shortest
possible time.
Review of results
• Results are reviewed internally.
• Focus on abnormal failures due to
environmental or other conditions.
• Checks are done to ensure data
integrity and scope coverage.
Result and benefit communication
• Because of their size, results are saved in repositories.
• Screen dumps of all defects are
kept.
• Communicate benefit in terms of
automation lifecycle stage.
[Chart: investment versus cumulative benefit, plotted in rand (0 to 5 000 000) against regression cycles (0 to 15). The benefit curve overtakes the investment line as regression cycles accumulate, with lifecycle stages 2, 3 and 4 marked along the curve.]
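The break-even idea behind the chart can be sketched as a short calculation: a fixed up-front investment recouped over successive regression cycles. All figures below are invented for illustration and are not the project's actual numbers:

```python
# Break-even sketch: find the first regression cycle at which the
# cumulative saving matches or exceeds the up-front investment.
def break_even_cycle(investment, saving_per_cycle):
    """Return the first cycle where cumulative benefit >= investment."""
    cumulative, cycle = 0.0, 0
    while cumulative < investment:
        cycle += 1
        cumulative += saving_per_cycle
    return cycle

# e.g. R1 500 000 invested, R250 000 saved per regression cycle
print(break_even_cycle(1_500_000, 250_000))  # 6
```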
Categorising our toolset
Why?
The strength in categorising tools is the ability to provide
uniquely customised, automated testing solutions which, in
conjunction with manual testing, aim to mitigate both the
product and project risks associated with solution deployment.
Categorising our toolset
(continued)
Categorised toolset
• Front-end user interface (Functional)
• Back-end or network interaction (Functional)
• Front-end user interface (Non-functional)
• Back-end or network interaction (Non-functional)
• General automated testing utilities
Front-end user interface (Functional)
• Record and playback.
• Simulate user interaction with
applications.
• Provide support for unit,
integration, system and
acceptance testing.
• Regression testing particularly suits the nature of these tools.
Back-end or network interaction
(Functional)
• Ability to simulate user interaction
in the absence of a front-end.
• Supports bottom-up integration.
• Provides the ability to find defects
much sooner in the SDLC.
• We found it valuable in testing
SOA implementations.
Front-end user interface
(Non-functional)
• Ability to generate the required load for performance, load and stress testing.
• These types of tests typically cannot be performed manually.
• The tool depends on the functional quality of the application under test.
• Test environment compared with production.
Back-end or network interaction
(Non-functional)
• Due to the nature of implementation, it is not practical to use the front-end to generate the required load (ATM, point-of-sale devices).
• Typically these kinds of tests cannot be performed manually.
• Test environment compared with production.
• We found it valuable in testing the performance of services in SOA implementations.
General automated testing utilities
• Provides general support for manual and automated testing
activities.
• Compares large amounts of data.
• Generates unique data sets from production data.
• Generates vast amounts of test data.
• Provides support to all previously-mentioned tool categories.
Contemporary automation issues
Automation lag
• Refers to the time lost between the implementation of the
application, and the point at which automated functional
scripts can be used during testing.
Automation lag
[Diagram: while the application is still in design and build, automation work proceeds through scripting, parameter identification, spreadsheet preparation, testing and validation. When the application becomes available for testing, manual test execution can begin at once, but automated testing starts only after data preparation, script execution, analysis and feedback; the gap between those two start points is the automation lag.]
Contemporary automation issues
(continued)
User acceptance testing and automation
• To perform successful user acceptance testing, the involvement of the user is critical – in defining the tests, executing them, and validating the results.
• Users responsible for acceptance are asked to complete
the scenario spreadsheet.
• We ask the user to monitor a few initial script executions.
Automation benefit calculations
1. Historically problematic to get accurate measures.
2. Focus initially on quality.
3. Difficult to quantify quality in monetary terms.
4. Decision taken to focus on productivity improvement and
associated cost reduction.
Automation benefit calculations process
1. Identify the smallest common element.
2. In most cases it would be the parameters identified in the
scripting process.
3. Compare effort (duration) associated with manual testing with
that of automated testing per parameter.
4. Describe the improvement in monetary terms by using an
average resource cost.
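The calculation the steps above describe can be sketched in a few lines. The per-parameter durations and hourly rate below are invented for illustration; the slides do not disclose the project's actual inputs:

```python
# Benefit calculation sketch: compare manual vs automated effort per
# parameter, then price the saving with an average resource cost.
def saving(parameters, manual_min, auto_min, rate_per_hour):
    """Monetary saving for one regression cycle."""
    minutes_saved = parameters * (manual_min - auto_min)
    return (minutes_saved / 60) * rate_per_hour

# e.g. 5 000 parameters, 3 min manual vs 0.2 min automated, R400/hour
cycle_saving = saving(5_000, 3.0, 0.2, 400)
print(f"R{cycle_saving:,.2f} per cycle")
```

The parameter is the smallest common element, so the same formula scales from a single script to the whole portfolio.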
Automation benefit
1. Total cost saving for the period 2005 to mid-2008:
R86 183 171.00 (about US$8.6 million)
2. Benefit confirmed by system owners and users.
3. Calculation method reviewed by Finance.
4. Return of this nature justified the significant investment made
in both project testing and automation tools.
What have we learned?
1. Having a test tool is not a strategy.
2. Automation does not test in the same way that a manual tester does.
3. Record and playback is only the start.
4. Automated test scripts are software programs and must be treated as
such.
5. The value lies in maintenance.
6. Categorise your toolset, and design integrated solutions.
Thank you
Mikes@Nedbank.co.za

Editor's Notes

  • #22: Automation return on investment, lifecycle stages: the graph depicts the typical relationship between the investment made in, and the benefits achieved through, software test automation. Important milestones: the first time scripts are run with the aim of regression testing the application; the period during which the investment (cost) far outweighs the benefits being achieved; the point at which benefits achieved match the investment made; the period during which the benefit achieved far outweighs the investment (cost); the point at which the client or stakeholder implicitly trusts automation to such a degree that they no longer estimate for manual regression testing; the period during which the benefit stream ceases to exist and different ways are used to depict value-add. We use these milestones along the benefit curve to indicate lifecycle stages with unique characteristics. Stage 1- Stage 2- Stage 3- Stage 4-