7 DEADLY SINS
    OF
    AUTOMATED
    TESTING
    Dr Adrian Smith
    September 2012
                            Engineering Innovation.
Adrian Smith
     • Background in Engineering
     • Software development using Agile and Lean
     • Technical and Organisational Coach
     • Founded a startup product development and
       consulting business

     Diverse Experience
     Aerospace Engineering - Commercial and military engineering design,
       analysis and manufacturing experience on major programs including
       A380 and F35.
     Agile Software Development - Software development, architecture and
       management for engineering CAE, au..., scientific and digital media.
     Systems Integration - Integration of logistics, financial, engi...
Geeks hate repetition




Airbus A380 Wing




Envy
                            Flawed comparison
                             of manual testing
                              and automation




How management
                                 see testing




How management
 would like to see testing
Manual vs Automation
        • A flawed comparison
              • Assumes that automation can replace manual
                testing effort
              • Automation generally doesn’t find new defects


        • Testing is not merely a sequence of
          repeatable actions
        • Testing requires thought and learning


Ideal automation targets
        • Regression testing - assessing current state
        • Automation of test support activities
              • Data generation or sub-setting (see the sketch below)
              • Load generation
              • Non-functional testing

        • Deterministic problems
        • Big data problems
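A minimal sketch of the "data generation" support activity referred to above: a small, deterministic builder that creates repeatable test records instead of copying production data. The Customer type and its fields are invented for this illustration, not taken from the talk.

import java.util.ArrayList;
import java.util.List;

// Illustrative only: a tiny deterministic test-data generator.
public class TestDataGenerator {

    // Hypothetical record type used by the tests.
    public static class Customer {
        public final String id;
        public final String name;
        public Customer(String id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    // Generates a predictable set of customers so test runs are repeatable.
    public static List<Customer> customers(int count) {
        List<Customer> result = new ArrayList<Customer>();
        for (int i = 0; i < count; i++) {
            result.add(new Customer("CUST-" + i, "Test Customer " + i));
        }
        return result;
    }

    public static void main(String[] args) {
        // e.g. seed a test database or drive a load test with 1000 records
        System.out.println(customers(1000).size() + " customers generated");
    }
}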


Common symptoms
        • Relying on automation as the basis
          for all testing activities
        • All tests are built by developers
        • Absence of code reviews
        • Absence of exploratory testing
        • Absence of user testing



Suggested approach
        • Avoid comparison between manual and
          automated testing - both are needed
        • Distinguish between the automation and the
          process that is being automated
        • Use automation to provide a baseline
        • Use automation in conjunction with manual
          techniques



GLUTTONY




                            Over-indulging in commercial test tools
Promise of automation
                                 • Software vendors
                                   have sold automation
                                   as the capture-replay
                                   of manual testing
                                   processes




    • Miracle tools that solve
      all testing problems

License barrier
        • Commercial licenses restrict usage
        • Not everyone can run the tests
        • Typically, organisations create special
          groups or privileged individuals




Incompatible technology
        • Underlying technology of commercial tools is
          often not compatible with the development
          toolchain
              • Special file formats or databases
              • Lack of version control for tests
              • Tests cannot be versioned within the software
              • Continuous integration problems
              • Can’t be adapted or extended by the developers



Justifying the expense
        • Financial commitments
          distort judgement
        • Difficult to make objective
          decisions
        • Tendency to use the tool
          for every testing problem
        • People define their role by
          the tools they use


Common symptoms
        • A commercial tool forms the basis
          of the testing strategy
        • Only certain teams or individuals can
          access a tool or run tests
        • Developers have not been consulted in the
          selection of testing tools
        • “We always use <insert tool-name> for testing!”



Suggested approach
        • Use Open Source software tools wherever
          possible
        • Use tools that can easily be supported by the
          development team and play nicely with the
          existing development toolchain


        • Ensure any commercial tools can be executed
          in a command-line mode
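A sketch of what "command-line mode" buys you: if the suite can be launched from a plain main() (or a build tool), any Continuous Integration server can run it and fail the build on a red test. This uses JUnit's programmatic runner; the SmokeTest class is a trivial stand-in for your real suites.

import org.junit.Test;
import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import static org.junit.Assert.assertEquals;

// Runs a suite from the command line and exits non-zero on failure,
// so any CI server or build script can invoke it.
public class CommandLineTestRunner {

    // Trivial stand-in test class; in practice list your real test classes.
    public static class SmokeTest {
        @Test
        public void additionStillWorks() {
            assertEquals(4, 2 + 2);
        }
    }

    public static void main(String[] args) {
        Result result = JUnitCore.runClasses(SmokeTest.class);
        for (Failure failure : result.getFailures()) {
            System.err.println(failure.toString());
        }
        System.exit(result.wasSuccessful() ? 0 : 1);
    }
}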


Lust
                            User interface forms the basis for all testing




Testing through the GUI
        • Non-technical testers often approach testing
          through the user interface
        • Ignores the underlying system and application
          architecture
        • Resulting tests are slow and brittle
        • Difficult to set up test context - resulting in
          sequence-dependent scripts



Investment profile

[Test pyramid diagram: Unit/Component tests at the base, Integration above
("exercises components and systems"), then Acceptance ("collaboratively
built around system behaviour"), with Manual Exploratory and Interface
testing at the top. Axes: Confidence, Speed / Feedback, and
Investment / Importance. Unit/Component tests are developer built and
optimised for fast feedback.]
Architecture
      • Understanding
        application and
        system architecture
        improves test design
      • Creates opportunities
        to verify functionality
        at the right level




Test design

[Diagram: Test Intent (clearly identifies what the test is trying to
verify), Test Data, Test Implementation (implementation of the test,
including usage of test data), and the System Under Test.]
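A hedged sketch of that separation in code: the test name carries the intent, named constants carry the test data, and a small helper hides how the system under test is exercised. The DiscountCalculator class and discount rule are invented for the illustration.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class DiscountTest {

    // --- System under test (invented for this sketch) ---
    static class DiscountCalculator {
        double discountFor(double orderValue) {
            return orderValue >= 100.0 ? 0.10 : 0.0;
        }
    }

    // --- Test data: kept separate and named for what it represents ---
    private static final double LARGE_ORDER = 150.0;
    private static final double SMALL_ORDER = 20.0;

    // --- Test intent: carried by the method names ---
    @Test
    public void largeOrdersReceiveTenPercentDiscount() {
        assertEquals(0.10, discountFor(LARGE_ORDER), 0.0001);
    }

    @Test
    public void smallOrdersReceiveNoDiscount() {
        assertEquals(0.0, discountFor(SMALL_ORDER), 0.0001);
    }

    // --- Test implementation: how the system under test is exercised ---
    private double discountFor(double orderValue) {
        return new DiscountCalculator().discountFor(orderValue);
    }
}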
F.I.R.S.T. class tests

                        F - Fast
                        I - Independent
                        R - Reliable
                        S - Small
                        T - Transparent
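A minimal sketch of what F.I.R.S.T. can look like in practice: no shared state, no external dependencies, millisecond execution, and one obvious assertion. The PriceCalculator class is invented for this example.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class PriceCalculatorTest {

    // Invented system under test, kept in-process so the test stays Fast.
    static class PriceCalculator {
        int totalInCents(int unitPriceInCents, int quantity) {
            return unitPriceInCents * quantity;
        }
    }

    // Independent: builds its own calculator, relies on no other test.
    // Reliable: no network, clock or file system involved.
    // Small and Transparent: one behaviour, one obvious assertion.
    @Test
    public void totalIsUnitPriceTimesQuantity() {
        PriceCalculator calculator = new PriceCalculator();
        assertEquals(500, calculator.totalInCents(100, 5));
    }
}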
Common symptoms
        • Testers cannot draw the
          application or system architecture
        • Large proportion of tests are
          being run through the user interface
        • Testers have limited technical skills
        • No collaboration with developers
        • Intent of tests is unclear


Suggested approach
        • Limit the investment in automated tests that are
          executed through the user interface
        • Collaborate with developers
        • Focus investment in automation at the lowest
          possible level, with clear test intent (see the
          sketch below)
        • Ensure automation gives fast feedback
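As referenced above, a business rule can often be verified directly against the service or domain layer rather than by driving a browser through the GUI. The RegistrationService class and its email rule are assumptions made for this sketch.

import org.junit.Test;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

// Verifying a business rule one level below the user interface.
public class RegistrationRuleTest {

    // Invented system under test.
    static class RegistrationService {
        boolean isValidEmail(String email) {
            return email != null && email.indexOf('@') > 0;
        }
    }

    private final RegistrationService service = new RegistrationService();

    // Runs in milliseconds and does not break when the page layout changes,
    // unlike the equivalent test scripted through the GUI.
    @Test
    public void acceptsWellFormedEmailAddresses() {
        assertTrue(service.isValidEmail("user@example.com"));
    }

    @Test
    public void rejectsAddressesWithoutAnAtSign() {
        assertFalse(service.isValidEmail("user.example.com"));
    }
}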




Pride
                              Too proud to
                            collaborate when
                             creating tests




Poor collaboration
        • Organisations often create
          specialisations of roles
          and skills
        • Layers of management and
          control then develop
        • Collaboration becomes difficult
        • Poor collaboration = poor tests



Automating too much
        • Delegating test automation to a special group
          inhibits collaboration
        • Poor collaboration can result in duplicate test
          cases / coverage
        • Duplication wastes effort and creates
          maintenance issues
        • Setting performance goals based around the number
          of test cases automated leads to problems


No definition of quality
        • Automated testing effort should match the
          desired system quality
        • Risk that too much, too little or not the right
          things will be tested
        • Defining quality creates a shared
          understanding and can only be achieved
          through collaboration




Good collaboration
    • Cross-functional teams build better software
    • Collaboration improves definition and verification

[Diagram: Analyst, Tester and Developer collaborate on Specification
and Elaboration, producing Acceptance Criteria and Automation.]
Specification by Example
    • Recognises the value of
      collaboration in testing
    • More general than
      ATDD and/or BDD
    • Based around building a
      suite of Living Documentation
      that can be executed
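Living documentation is usually built with tools such as Cucumber, FitNesse or Concordion; the sketch below shows the underlying idea with plain JUnit so it stays self-contained. The free-delivery rule and the DeliveryPricer class are invented for the example.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// An executable specification: the test reads as a concrete business example
// agreed between analyst, tester and developer, and fails if the system drifts.
public class FreeDeliverySpecification {

    // Invented system under test for this sketch.
    static class DeliveryPricer {
        int deliveryChargeInCents(int orderTotalInCents) {
            return orderTotalInCents >= 5000 ? 0 : 499;
        }
    }

    @Test
    public void ordersOfFiftyDollarsOrMoreGetFreeDelivery() {
        // Given an order worth exactly $50.00
        int orderTotal = 5000;
        // When the delivery charge is calculated
        int charge = new DeliveryPricer().deliveryChargeInCents(orderTotal);
        // Then no delivery charge is applied
        assertEquals(0, charge);
    }

    @Test
    public void smallerOrdersPayTheStandardDeliveryCharge() {
        assertEquals(499, new DeliveryPricer().deliveryChargeInCents(1999));
    }
}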




Common symptoms
        • Automated tests are being built
          in isolation from team
        • Intent of tests is unclear
          or not matched to quality
        • Poor automation design (abstraction,
          encapsulation, ...)
        • Maintainability or compatibility issues



Suggested approach
        • Collaborate to create good tests and avoid
          duplication
        • Limit the investment in UI based automated
          tests
        • Collaborate with developers to ensure good
          technical practices (encapsulation, abstraction,
          reuse, ... )
        • Test code = Production code
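One way to honour "test code = production code" is to extract the repeated mechanics of a test into a reusable, intention-revealing helper, exactly as you would refactor duplication in production code. A minimal sketch with invented names:

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class AccountTransferTest {

    // Invented system under test.
    static class Account {
        private int balance;
        Account(int openingBalance) { this.balance = openingBalance; }
        void transferTo(Account other, int amount) {
            this.balance -= amount;
            other.balance += amount;
        }
        int balance() { return balance; }
    }

    // Reusable, intention-revealing test helper instead of copy-pasted setup.
    private Account accountWithBalance(int balance) {
        return new Account(balance);
    }

    @Test
    public void transferMovesMoneyBetweenAccounts() {
        Account source = accountWithBalance(100);
        Account target = accountWithBalance(0);

        source.transferTo(target, 40);

        assertEquals(60, source.balance());
        assertEquals(40, target.balance());
    }
}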


SLOTH
    Too lazy to properly
    maintain automated tests




                               Engineering Innovation.

Automated Test Failures
         • Many potential causes of failure
         • Unless maintained - value is slowly eroded

[Timeline diagram: failures accumulate over time from events such as a
system interface change, reference data changes, a new feature, or an
OS patch.]
Importance of maintenance

[Chart: Cost / Effort versus Time for manual test execution, maintained
automation and unmaintained automation, annotated with the value of an
unmaintained automated test suite and the potential value of a
maintained automated test suite.]
Continuous integration




Common symptoms
        • Test suite has not been recently
          run - state is unknown
        • Continuous Integration history
          shows consistent failures following
          development changes / release
        • Test suite requires manual intervention
        • Duplication within automation code
        • Small changes trigger a cascade of failures

Suggested Approach
        • Ensure automated tests are executed
          using a Continuous Integration
          environment
        • Ensure tests are always running - even if the
          system is not being actively developed
        • Make test results visible - create
          transparency of system health
        • Ensure collaboration between developers
          and testers
Rage
                            Frustration with slow, brittle
                            or unreliable automated tests

Slow automation
        • Large datasets
        • Unnecessary integrations (see the sketch below)
        • Inadequate hardware/environments
        • Too many tests
        • Reliance on GUI based tests
        • Manual intervention
        • ... many others
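For the "unnecessary integrations" cause flagged above, one common remedy is to substitute a fast in-memory double for a slow external dependency whenever the integration itself is not what the test is checking. The TaxRateProvider port and InvoiceCalculator class are invented for this sketch.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class InvoiceTotalTest {

    // Invented port to an external system (e.g. a remote tax service).
    interface TaxRateProvider {
        double taxRateFor(String countryCode);
    }

    // Invented system under test.
    static class InvoiceCalculator {
        private final TaxRateProvider taxRates;
        InvoiceCalculator(TaxRateProvider taxRates) { this.taxRates = taxRates; }
        double totalWithTax(double net, String countryCode) {
            return net * (1 + taxRates.taxRateFor(countryCode));
        }
    }

    // Fast in-memory double: no network call, so the test runs in milliseconds.
    private final TaxRateProvider fixedTenPercent = new TaxRateProvider() {
        public double taxRateFor(String countryCode) { return 0.10; }
    };

    @Test
    public void totalIncludesTaxFromTheConfiguredRate() {
        InvoiceCalculator calculator = new InvoiceCalculator(fixedTenPercent);
        assertEquals(110.0, calculator.totalWithTax(100.0, "AU"), 0.0001);
    }
}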

Fast Feedback




Brittle Tests
                               • Contain time-bound data (see the sketch below)
                               • Have external dependencies
                               • Rely on UI layout/style
                               • Rely on sequence of
                                 execution
                               • Based on production data or
                                 environments
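Time-bound data (the first item above) is usually cured by injecting the clock rather than reading the system time inside the code under test. A minimal sketch; the Clock abstraction and Voucher class are invented for the illustration.

import org.junit.Test;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

public class VoucherExpiryTest {

    // Invented clock abstraction so tests control "now".
    interface Clock {
        long currentTimeMillis();
    }

    // Invented system under test.
    static class Voucher {
        private final long expiresAtMillis;
        Voucher(long expiresAtMillis) { this.expiresAtMillis = expiresAtMillis; }
        boolean isValid(Clock clock) {
            return clock.currentTimeMillis() < expiresAtMillis;
        }
    }

    // Fixed clock: the test never depends on the real date, so it will not
    // start failing when a hard-coded "future" date finally passes.
    private Clock fixedAt(final long millis) {
        return new Clock() {
            public long currentTimeMillis() { return millis; }
        };
    }

    @Test
    public void voucherIsValidBeforeItsExpiryTime() {
        assertTrue(new Voucher(1000L).isValid(fixedAt(999L)));
    }

    @Test
    public void voucherIsInvalidAfterItsExpiryTime() {
        assertFalse(new Voucher(1000L).isValid(fixedAt(1001L)));
    }
}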


Frustration




Unreliable Tests
                                 • False positives
                                 • Wastes time investigating
                                 • Failures start being ignored
                                 • Creates uncertainty of
                                   system health
                                 • Workarounds and alternate
                                   tests are created


Suggested approach
        • Treat automated tests with the same
          importance as production code
        • Review, refactor, improve ...
        • Apply a “Stop the line” approach to test failure
        • Eliminate (quarantine) unreliable tests (see the sketch below)
        • Ensure collaboration with developers
        • Up-skill / pair testers
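Quarantining can be as simple as tagging known-flaky tests and excluding that tag from the main build until they are fixed. A sketch using JUnit 4 categories; the Quarantined marker and the test content are invented.

import org.junit.Test;
import org.junit.experimental.categories.Categories;
import org.junit.experimental.categories.Categories.ExcludeCategory;
import org.junit.experimental.categories.Category;
import org.junit.runner.RunWith;
import org.junit.runners.Suite.SuiteClasses;
import static org.junit.Assert.assertEquals;

public class QuarantineExample {

    // Marker interface used to tag unreliable tests.
    public interface Quarantined {}

    public static class CheckoutTest {
        @Test
        public void reliableCheckoutRule() {
            assertEquals(4, 2 + 2);
        }

        // Known flaky test: tagged so the main suite ignores it until fixed.
        @Category(Quarantined.class)
        @Test
        public void flakyThirdPartyPaymentGateway() {
            assertEquals(1, 1);
        }
    }

    // The suite run by CI: everything except quarantined tests.
    @RunWith(Categories.class)
    @ExcludeCategory(Quarantined.class)
    @SuiteClasses(CheckoutTest.class)
    public static class StableSuite {}
}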

Avarice (Greed)
                            Trying to cut costs through automation




Lure of cheap testing
                              • Testing tool vendors
                                often try to calculate
                                ROI based on saving
                                labour




       • Analysis is unreliable and undervalues the
         importance of testing


Automation is not cheap
        • Adopting test automation tools and techniques
          requires significant investment
        • Investment in new ways of working
        • Investment in skills
        • Investment in collaboration
        • Ongoing investment in maintenance



Common symptoms
        • Investment in commercial tools
          using a business-case based on
          reducing headcount
        • Using a predicted ROI as a way of
          reducing budget for Testing
        • Consolidating automated testing
          within a special group




Suggested approach
        • Ensure the reasons for automation are
          clear and are NOT based purely on saving
          money/headcount
        • Ensure business case for automation
          includes costs for ongoing maintenance




7 Deadly Sins
     Envy                   Flawed comparison of manual testing and automation

     Gluttony Over-indulging in commercial test tools

     Lust                   User interface forms the basis for all testing

     Pride                  Too proud to collaborate when creating tests

     Sloth                  Too lazy to maintain automated tests

     Rage                   Frustration with slow, brittle or unreliable tests

     Greed                  Trying to cut costs through automation

Why automate testing?




How geeks really work




Thank you
    Dr Adrian Smith
    September 2012
                            Engineering Innovation.
