Creating Testing Tools to Support Development
Chema del Barco
SDET Manager, Akamai Technologies
Warming up
How many of you…
… Work as dedicated testers?
… Create automated tests?
… Think you have enough tests?
… Rely on automated tests from testers?
… Trust these automated tests?
Warming up
How do we use test automation at Akamai?
• Different products & systems
• Java, JS, Ruby, Python…
• A lot of backend, less frontend
• Part of workflow
• We also do manual testing
• Lots of challenges
Challenges of Test Automation
Challenges of test automation
1. Not testing the right thing
Not testing the right thing
Real Life Example:
Volvo’s Automatic Collision Avoidance
(2010)
Not testing the right thing
https://www.youtube.com/watch?v=aNi17YLnZpg
Not testing the right thing - lessons learned
1. Everybody makes mistakes
2. There is no such thing as 100% test coverage
3. There are always risks involved in any delivery
4. Testing can tell us where these risks are
5. We can mitigate risks if we know them
Not testing the right thing
Some failed demos later (2014)…
Not testing the right thing
https://www.youtube.com/watch?v=kWiwS-43xpk
Not testing the right thing
How can we mitigate this?
Not testing the right thing
Don’t lose the user’s point of view
PO/Testers can use
BDD / Spec by Example
Not testing the right thing
Developers
can use TDD
Not testing the right thing
Or you can all work
together to use ATDD!
Not testing the right thing
Alternative approach:
Use Domain Abstractions
(ex: Actors & Actions)
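As a rough illustration of the Actors & Actions idea, tests can be written against business-level steps instead of raw UI or API calls. All names below (Actor, Action, attemptsTo, the store actions) are invented for this sketch, not taken from a specific framework:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of a domain abstraction for tests: an Actor performs Actions
 * expressed in the user's vocabulary, so the test keeps the user's
 * point of view. Illustrative only, not a real framework.
 */
public class ActorActionSketch {

    /** A single business-level step the actor can perform. */
    interface Action {
        void performAs(Actor actor);
    }

    /** An actor accumulates a log of what it did, so tests read like a script. */
    static class Actor {
        final String name;
        final List<String> log = new ArrayList<>();

        Actor(String name) { this.name = name; }

        Actor attemptsTo(Action... actions) {
            for (Action a : actions) a.performAs(this);
            return this;
        }
    }

    /** Example actions; a real suite would drive the app (UI/API) here. */
    static Action openStore() {
        return actor -> actor.log.add(actor.name + " opens the store");
    }

    static Action addBookToCart(String title) {
        return actor -> actor.log.add(actor.name + " adds '" + title + "' to the cart");
    }

    static Action checkout() {
        return actor -> actor.log.add(actor.name + " checks out");
    }

    public static void main(String[] args) {
        Actor alice = new Actor("Alice");
        alice.attemptsTo(openStore(), addBookToCart("Moby Dick"), checkout());
        alice.log.forEach(System.out::println);
    }
}
```

Libraries such as Serenity BDD's Screenplay pattern formalize this kind of abstraction; the payoff is that a failing test names a broken user goal, not a broken locator.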
Challenges of test automation
2. The Test Environment Hell
The Test Environment Hell
• Complex applications tend to rely on big integrated environments for validating changes
• The environments are so complex that they require constant troubleshooting and maintenance
• Testers (and builds) are often blocked, and teams spend a lot of time investigating whether a problem is app- or environment-related
(Pipeline diagram — Dev stream: Build → UT, pushed to the DEV environment; SDET stream: Build → Validate → IT → ST → Delivery, running against the INT environment.)
Legend:
• UT – Unit Testing
• CT – Component Testing
• IT – Integration Testing
• ST – System Testing
• SIT – System Integration Testing
• Promote – Deploy to next test environment
• Push – Manual code push
Ex: Akamai’s Luna Control Center test environments
The Test Environment Hell
How can we mitigate this?
1. Test the right thing in the right environment
“Separation of Concerns”:
1. Test that your change works, in isolation (UT)
2. Test that your app works with your change, in
isolation (CT / IT)
3. Test that your app works with your change, in
integration (ST)
4. Test that your system works in integration (SIT)
1. Test the right thing in the right environment
• Isolated Component (Service) Testing
• Exclusive for Team
• MockServer to mock external
dependencies (provided by the
service owners)
• SUT thinks it’s in production
• Can also test integration (contract tests)
• Very stable, maintained by
devs/testers
• Testers can automate and validate
faster
• Everyone is less frustrated
(Diagram: the Service Under Test (SUT) runs in a local test environment, talking to MockServer instances that stand in for Services 2, 3, and 4.)
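To make the pattern concrete without pulling in the MockServer library itself, here is a minimal, dependency-free sketch using the JDK's built-in HttpServer: a stub that plays the role of an external service, so the SUT never leaves the local test environment. The port, endpoint, and canned body are arbitrary choices for the example:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

/**
 * Sketch of a local dependency stub. A real setup would use MockServer
 * (with expectations provided by the service owners); this stand-in
 * keeps the example runnable with the JDK alone.
 */
public class DependencyStubSketch {

    /** Start a stub that returns a canned JSON body for GET /status. */
    static HttpServer startStub(int port, String cannedBody) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/status", exchange -> {
            byte[] body = cannedBody.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    /** Fetch the stubbed endpoint the way the SUT would. */
    static String get(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (Scanner s = new Scanner(conn.getInputStream(), "UTF-8")) {
            return s.useDelimiter("\\A").next();
        }
    }

    public static void main(String[] args) throws IOException {
        HttpServer stub = startStub(18080, "{\"status\":\"UP\"}");
        try {
            System.out.println(get("http://localhost:18080/status"));
        } finally {
            stub.stop(0); // always free the port, even if the check fails
        }
    }
}
```

Because the stub answers deterministically, a failure in a component test points at the SUT or the contract, never at a flaky downstream environment.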
(Pipeline diagram — Dev stream: Build → UT → CT, pushed to the DEV environments; SDET stream: Build → Validate → IT → ST → SIT → Delivery, promoted through the INT, SYS, and QA environments.)
1. Test the right thing in the right environment
If a test fails here, we know for a fact that it will be because of:
1. The environment,
2. A service not following its API contract, or
3. An actual integration bug
(+ IT + ST)
2. Implement automatic retry-on-error techniques
Some test frameworks, like TestNG, offer several ways to automatically retry failed tests:
1. Re-run the generated testng-failed.xml (written to the test output directory) after a run with failed tests [Good]
2. Run TestNG programmatically and attach a “retry test” annotation transformer to the @Test annotation [Great!]
3. If you don’t want to retry everything, create a @Retry annotation and handle it in a “retry test” listener [Great!]
2. Implement automatic retry-on-error techniques (option 2)
(Code walkthrough shown as screenshots across three slides.)
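The core logic behind options 2 and 3 can be sketched in plain Java, independent of TestNG (in TestNG itself this would live inside an IRetryAnalyzer or a listener); the helper and its names are illustrative:

```java
import java.util.concurrent.Callable;

/**
 * Dependency-free sketch of retry-on-error: rerun a failing check up to
 * maxAttempts times and only surface the failure if every attempt fails.
 */
public class RetrySketch {

    /** Run the check, retrying on any exception; rethrows the last failure. */
    static <T> T runWithRetry(Callable<T> check, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return check.call();
            } catch (Exception e) {
                last = e; // remember the failure, then retry
            }
        }
        throw last; // all attempts failed: report the real failure
    }

    public static void main(String[] args) throws Exception {
        // Simulated flaky check: fails twice, then passes.
        int[] calls = {0};
        String result = runWithRetry(() -> {
            calls[0]++;
            if (calls[0] < 3) throw new IllegalStateException("flaky failure");
            return "passed on attempt " + calls[0];
        }, 5);
        System.out.println(result);
    }
}
```

In a real suite you would also log every retried attempt, so flakiness stays visible in the reports instead of being silently hidden.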
3. Use plugins in your CI to detect flaky tests
https://wiki.jenkins-ci.org/display/JENKINS/Test+Results+Analyzer+Plugin
Challenges of test automation
3. Doing test automation for the
wrong reasons
1. “I don’t have to test anything manually”
Manual testing is CRUCIAL in Agile because machines cannot think outside the box… yet ;)
“Agile Testing: A Practical Guide for Testers and Agile Teams”, Lisa Crispin & Janet Gregory
2. Testing as a separate workflow
(Diagram: the Dev stream builds and runs UT in the DEV environments; development is declared “DONE” and the change marked “Ready to Test”; the tester waits for it to be deployed to the QA environment, runs IT/ST, and only then provides FEEDBACK — often a FAIL, long after development “STARTS” on the next change.)
Not understanding WHY we need test automation
(Pipeline diagram: Build → UT → CT → IT → ST → SIT → Delivery, promoted through the DEV, INT, SYS, and QA environments, with the test stream as part of the build pipeline.)
The goal of test automation is to provide FEEDBACK
Test Automation should be another form of delivery
Not understanding WHY we need test automation
• Feedback MUST be:
• Reliable (no false positives/negatives)
• Fast (as early as possible)
• Scalable (stays fast as the suite grows)
• Runnable by anyone
• It’s a WHOLE-TEAM thing
Challenges of test automation
4. Building Vs. Using
Building vs. Using
• Now that testers are toolsmiths, they can also be infected by the “building-everything-from-scratch” disease
• There is a tool for pretty much anything. If there isn’t, search again (at the very least you will find a starting point)
• A good tester should always try to find the right tool for the right kind of testing
● In 2014 we had to send 10k+ HTML emails to our users
● HTML emails are inconsistently rendered by different email clients:
• Some do not support HTML at all,
• Some do not render it consistently with W3C specifications
● You never know which email client your users are using
• Desktop clients: Outlook 2002, 2013, Thunderbird
• Web clients: GMail (in Chrome, FF, IE), Yahoo! Mail, etc
● You cannot easily automate checking whether an email looks good, but you can automate rendering it in ~10 mail clients
Example
Example (cont)
Building Test Frameworks
How can a Test Framework be USEFUL?
1. Easy to Write Tests
when().
    get("/store").
then().
    body("store.book.findAll { it.price < 10 }.title",
         hasItems("Sayings of the Century", "Moby Dick"));

when().
    get("/store").
then().
    body("store.book.author.collect { it.length() }.sum()",
         greaterThan(50));
library: RestAssured
https://github.com/rest-assured/rest-assured
2. Reporting
library: cucumber-reporting
https://github.com/damianszczepanik/cucumber-reporting
2. Reporting
library: logging-selenium
http://loggingselenium.sourceforge.net/usage.html
3. Debugging
Library: curl-logger
(Maciej Gawinecki)
http://nomoretesting.logdown.com/
https://github.com/dzieciou/curl-logger

curl 'http://google.com/' -H 'Accept: */*' \
  -H 'Content-Length: 0' -H 'Host: google.com' \
  -H 'Connection: Keep-Alive' \
  -H 'User-Agent: Apache-HttpClient/4.5.1 (Java/1.8.0_45)' \
  --compressed --insecure --verbose
4. Transparent X-Platform & X-Browser Testing
Libraries:
WebDriver
http://www.seleniumhq.org
appium
https://github.com/appium/appium
5. Project Management Tool Integration
Zephyr API (ZAPI)
http://docs.getzephyr.apiary.io
Challenges of test automation
5. Thinking that end-to-end test
automation solves everything
Typical scenario of relying on end-to-end test automation
Example: the pain of a real delivery pipeline
Days Left | Pass % | Notes
1  | 5%  | Everything is broken! Signing in to the service is broken... Almost all tests sign in a user, so almost all tests failed.
0  | 4%  | A partner team we rely on deployed a bad build to their testing environment yesterday.
-1 | 54% | A dev broke the save scenario yesterday (or the day before?). Half the tests save a document at some point. Devs spent most of the day determining whether it's a frontend bug or a backend bug.
-2 | 54% | It's a frontend bug; devs spent half of today figuring out where.
-3 | 54% | A bad fix was checked in yesterday. The mistake was pretty easy to spot, though, and a correct fix was checked in today.
http://googletesting.blogspot.com/2015/04/just-say-no-to-more-end-to-end-tests.html
Example: the pain of a real delivery pipeline
Days Left | Pass % | Notes
-4 | 1%  | Hardware failures occurred in the lab for our testing environment.
-5 | 84% | Many small bugs hiding behind the big bugs (e.g., sign-in broken, save broken). Still working on the small bugs.
-6 | 87% | We should be above 90%, but are not for some reason.
-7 | 89.54% | (Rounds up to 90%, close enough.) No fixes were checked in yesterday, so the tests must have been flaky yesterday.
http://googletesting.blogspot.com/2015/04/just-say-no-to-more-end-to-end-tests.html
©2016 AKAMAI | FASTER FORWARD™
Problems with end-to-end tests
● Long
• Developers must wait a long time for feedback about their changes
● Flaky
• Sensitive to environment and subsystem failures, timeouts, etc.
• They reduce developers' trust in the tests; as a result, flaky tests are often ignored
● Hard to isolate the root cause
• Developers need to find the specific lines of code causing the bug
• For >1M LOC it's like trying to find a needle in a haystack
http://googletesting.blogspot.com/2015/04/just-say-no-to-more-end-to-end-tests.html
How to address it? Do end-to-end only if necessary!
(Test pyramid diagram: moving up toward end-to-end tests increases maintenance, slows tests down, and adds flakiness.)
Move Fast & Don't Break Things, GTAC 2014
Cooling down
Summary
Summary
• Test automation is a form of development and should be treated as such
• It suffers from the same problems
• Think about WHY you need it before doing it
• There are plenty of tools and libraries to make it more useful
Thank You!
Feel free to send questions to jdelbarc@akamai.com