Test automation engineer
Dec. 1st, 2021
Emura Sadaaki
Rakuten, Inc.
2
Agenda
1. What is test automation
2. Test automation pyramid
3. Test automation reporting
4. Test automation design
5. Test automation deployment / rollout
3
1.What is test automation
4
Why do you do test automation?
5
Test automation objective
1. Improve efficiency
2. Expand test function coverage
3. Reduce total cost
4. Execute tests a manual tester cannot
5. Improve test speed, increase test cycles (frequency)
6. Execute exactly the same behavior every time
※ "Improve quality" is intentionally not included
6
Efficiency & Total cost
[Charts: total cost and total time of manual testing vs. test automation, plotted against test repetition — automation starts higher but grows more slowly]
7
Expand test function coverage & execute test manual tester cannot
• Tests with many data-variation patterns
• Tests with many function combinations
• Repeating the same test many times
e.g. 10,000 test patterns
8
What are the merits and demerits of test automation?
9
Test automation merit & demerit
Merit
1. Increases test cycles
2. Can execute tests that are impossible, difficult, or complex for a manual tester
3. Executes tests faster
4. Reduces human mistakes in test execution
5. More efficient use of test resources
6. Quick feedback of test results
7. Improves consistency of tests
Demerit
1. Initial cost and maintenance cost must be considered
2. Requires technical skills and tools
3. Tendency to forget the true testing objective
4. Testing becomes complex
5. Additional investigation is needed for automation errors
6. Difficult to find new bugs
10
Execute test faster & Quick feedback of test result
Test A
Test B
Test C
Test D
Test E
Faster execution
Regression found in function D
11
Reduce human mistake to execute test & Improve consistency of test
Manual:
• Different operation every time
• Sometimes makes operation mistakes
vs. Automation:
• Easy to reproduce
• Follows the test spec perfectly
12
Consider initial cost and maintenance cost
AND
13
Require technical skill, tool
• Programming skill
• CI tool skill
• Maintenance strategy
• Middleware to connect to test devices: Appium, Selenium, …
• How to connect to reporting tools (JIRA, PractiTest)
14
Tend to forget true testing objective
OR
15
Additional investigation for automation testing error
Manual test:
1. Execute the test
2. A bug is found and the details are known immediately
Test automation:
1. The automated test fails
2. Check the test result
3. Re-run the automated steps manually
4. Only then are the details known
16
Difficult to find new bug
Automation:
• Regression
• Bugs are found during scripting
• Repeats the same behavior
vs. Manual:
• Exploratory testing
• Ad-hoc testing
17
What is test automation limitation?
18
Test automation limitation
1. Not all manual tests can be automated
2. Can only check machine-readable results
3. Can only check actual results against prepared expectations
4. Cannot do exploratory testing
19
Not all manual test can be automated
Difficult to automate:
• Exploratory tests
• Checking for broken visual design
• Cases the test automation tool does not support
20
Can only check machine-readable results against prepared expectations
• The actual data to validate must be obtainable
• Expectations must be prepared in advance
21
2.Test automation pyramid
22
Test automation pyramid
23
Test automation pyramid
Characteristics
End-to-end test:
• Front end & back end, whole-system test
• Black-box test
• Tested from the end user's point of view
• Test speed is slow / tests take time
• With a huge number of test cases, the tests cannot finish
• Tends to increase test cost
• Test environment is fragile
• Few test cycles
Integration test:
• Example: API tests
• Covers points that are difficult to test at the unit level
• Tests with a database (unit tests usually use stubs)
• Black-box test, based on the interface specification
Unit test:
• Small-size, component/function tests
• Easy to find bugs
• Independent of other units
• Checks success or failure
• The test set tends to grow
• Test speed is required
• Usually run on every code change or build
24
Anti-patterns of the test automation pyramid
Cupcake / Ice cream cone / Hourglass / Dual pyramid
What do you imagine?
25
Cupcake
• Each person in charge runs similar tests
• No collaboration
• Manual tests are run after the E2E tests
• Finally, exploratory testing is done out of worry
26
Ice cream cone
• Test automation volume increases as testing moves toward the UI
• Quality is pursued with black-box tests, not unit tests
• Test cost increases
• Finding a bug's root cause takes time
27
Hourglass
• Integration tests (API) are forgotten
• Not as big a problem as the ice cream cone, but causes cost and speed issues
28
Dual pyramid
• Each specialist team builds its own test platform
• Much of the test automation is duplicated
29
3.Test automation reporting
30
Test automation metrics
Metrics to plan the test automation strategy and to monitor its effect and efficiency
External metrics: the impact of test automation on other activities
• ROI (how much cost is reduced; installation cost; maintenance cost)
• Number of bugs found (bug-found ratio, comparison with manual testing)
• Performance (execution speed)
• Accuracy (script failure ratio, false negatives, false positives)
Internal metrics: measure the effect and efficiency of the automation itself
• Scripting cost
• Script bug ratio
• Performance
31
Note : False positive & False negative
● False positive: the report says there is a bug, but no bug exists
→ checking cost increases, but quality is kept
● False negative: the report says there is no bug, but a bug exists
→ a bug is missed
32
Test automation logging and reporting
● Log types
• Current test automation status and execution results
• Detailed execution steps, screenshots, test data
• System logs (crash dump, stack trace)
• A dashboard to grasp the overall test automation summary
• Historical test results are stored
● Structure the logs to deliver results properly
• Report content: which tests failed and the reasons for failure
• Published reports: whether test execution succeeded or not
● The report format depends on the receiver
33
4.Test automation design
34
Test automation design
1. Data driven
2. Keyword driven
3. Robust & Sensitive
4. Scenario independency
5. Scenario size
6. Setup &Teardown
7. Analyzable report
8. Repeatability
9. Break flaky
10. Flexible trap
11. Performance
12. Simple scripting
35
1. Data driven
Separate the data from the script
36
1. Data driven
Merits
1. Improves maintainability
2. The script can be reused with many data sets
3. Easy to execute repeatedly
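The merits above can be sketched in a few lines of Python. This is a minimal, hypothetical example (the `login` stand-in and the credentials are illustrative, not from the slides): the test logic is written once, and the data lives in a separate table.

```python
# Data-driven sketch: the test logic is written once; data rows live apart.
# `login` is a hypothetical stand-in for the system under test.
def login(user, password):
    return user == "alice" and password == "secret"

# Data is separated from the script: to add a case, add a row, not code.
TEST_DATA = [
    ("alice", "secret", True),   # valid credentials
    ("alice", "wrong", False),   # wrong password
    ("", "secret", False),       # empty user
]

def run_data_driven():
    # Reuse the same script with many data rows, repeatedly.
    return [login(u, p) == expected for u, p, expected in TEST_DATA]
```

In a real suite the same separation is what frameworks such as pytest's `parametrize` provide.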
37
2. Keyword driven
Readable scripts: abstract away the detailed steps.
Low level test steps
Keyword driven
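A keyword-driven script can be sketched as a table of readable keywords interpreted by a small runner. The keyword names and the state dict below are illustrative assumptions, not the deck's actual framework:

```python
# Keyword-driven sketch: readable keywords abstract the low-level steps.
def open_page(state, url):
    state["page"] = url          # low-level: navigate the browser

def type_text(state, field, text):
    state[field] = text          # low-level: locate the field, send keys

def click(state, button):
    state["clicked"] = button    # low-level: locate the button, click

KEYWORDS = {"OpenPage": open_page, "TypeText": type_text, "Click": click}

# The test case itself is data that a non-programmer can read and edit.
LOGIN_SCRIPT = [
    ("OpenPage", ("https://example.test/login",)),
    ("TypeText", ("user", "alice")),
    ("Click", ("Login",)),
]

def run_keyword_script(script):
    state = {}
    for keyword, args in script:
        KEYWORDS[keyword](state, *args)
    return state
```

The point is the split: low-level steps live in the keyword implementations, while the test case stays readable.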
38
3. Robust and Sensitive
Robust:
• Validates only a few points
• Scripting cost is small
• The test does not fail when the UI changes a little
• Investigating the reason for a failure takes time
• False negatives sometimes happen
• Test execution time is short
Sensitive:
• Validates many points
• Validates data formats in detail
• Scripting cost is large
• A small UI change impacts the test automation
• Test execution time is long
Which direction to choose should be decided before scripting, based on the test automation objective.
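The difference can be shown with two checks on the same result. The reservation fields below are illustrative assumptions; the date-format rule (zero-padded MM/DD) follows the speaker notes:

```python
# Robust vs. sensitive checks on the same result (illustrative fields only).
import re

def validate_robust(reservation):
    # Robust: one key point; survives small data changes, may miss detail bugs.
    return reservation["status"] == "confirmed"

def validate_sensitive(reservation):
    # Sensitive: many points, including a strict data format (zero-padded MM/DD).
    return (reservation["status"] == "confirmed"
            and reservation["guests"] == 2
            and re.fullmatch(r"\d{2}/\d{2}", reservation["date"]) is not None)
```

A reservation dated "3/7" instead of "03/07" passes the robust check but fails the sensitive one, illustrating the trade-off.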
39
Robust
40
Sensitive
41
4. Scenario independency
A test scenario should be independent and executable stand-alone,
so users can run the automation without preconditions.
But scenarios that grow too big just to stay independent are not good either:
a balance between independence and scenario size is required.
42
4. Scenario independency
Independent — Create ID:
1. Create a new ID
2. Log in to mypage to validate it
Dependent — Hotel reservation:
1. Reserve a hotel
2. Validate the reservation in mypage
3. Cancel the reservation
43
5. Scenario size
Test execution time depends on the test scenario size.
This matters when investigating a failed script:
1. Investigate the error reason with the logs
2. Fix the issue
3. Run the test scenario again
For a large scenario, step 3 takes a long time.
[Figure: timeline of 1st execution (failure), investigation, and 2nd execution (success)]
44
6. Setup and Teardown
Tasks before and after test execution, to keep tests consistent.
What tasks can you think of?
45
6. Setup and Teardown
Setup (tasks before test execution)
• Open the browser
• Open the native app
• Close unnecessary popups in the browser
• Clear cookies and cache
• Set default data
46
6. Setup and Teardown
Teardown (tasks after test execution)
These tasks should run even if the test fails in the middle
• Initialize data
• Close the browser
• Close the native app
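A `try`/`finally` block is one way to guarantee that teardown runs even when the test fails mid-way. The actions below are logged strings, purely illustrative (a real suite would use a framework fixture, e.g. a pytest `yield` fixture):

```python
# Setup/teardown sketch: try/finally guarantees teardown on failure too.
def run_test(test_body, log):
    # Setup: tasks before test execution.
    log.append("setup: open browser, clear cookies/cache, set default data")
    try:
        test_body(log)
    finally:
        # Teardown: runs even if the test raised an error in the middle.
        log.append("teardown: initialize data, close browser")

def failing_test(log):
    log.append("step 1")
    raise AssertionError("validation failed")
```

Running `failing_test` through `run_test` still appends the teardown entry, so the next test starts from a clean state.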
47
7. Analyzable report
A test report must state SUCCESS or FAILURE unambiguously.
It must also contain the information needed to investigate a failure.
What information is needed?
48
7. Analyzable report
A test report should have:
• The test steps
• The test data used
• Where and why the test failed
• A screenshot at failure
• A video at failure
• Traceability of changing values
• Execution time (total and per step)
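A minimal runner that produces such a report might look like this. The step names and report shape are illustrative assumptions, not a real framework's format:

```python
# Analyzable-report sketch: record each step, the data used, and
# where/why the test failed, with per-step timing.
import time

def run_with_report(steps, test_data):
    report = {"status": "SUCCESS", "test_data": test_data, "steps": []}
    for name, action in steps:
        start = time.perf_counter()
        try:
            action()
            outcome = "passed"
        except Exception as exc:
            outcome = f"failed: {exc}"       # where and why the test failed
            report["status"] = "FAILURE"
        report["steps"].append((name, outcome, time.perf_counter() - start))
        if report["status"] == "FAILURE":
            break                            # later steps are not reached
    return report
```

A real report would also attach screenshots and system logs at the failing step; the structure above is where they would hang.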
49
8. Repeatability
Repeatability means the same test scenario can be executed many times
• We want to run the tests on every deployment
• We want to re-run a test after fixing a failure
Which scenarios affect repeatability?
50
8. Repeatability
Example scenarios that affect repeatability:
1. One email address can register membership only once, but can withdraw
2. One email address can register membership only once and cannot withdraw
3. Setting a favorite
4. Purchasing items from pooled money
5. Reservation systems with a chosen date, such as golf or travel
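For scenario 1 or 2 above, one common workaround is to generate a fresh email address per run, so "register once per address" can be executed any number of times. The `example.test` domain and prefix are placeholders:

```python
# Repeatability sketch: a fresh email per run keeps "register once per
# address" scenarios repeatable. The domain is a placeholder.
import uuid

def unique_email(prefix="autotest"):
    return f"{prefix}+{uuid.uuid4().hex[:8]}@example.test"
```

This only works when the system under test tolerates such throwaway accounts; otherwise teardown (withdrawal) or data reset is needed instead.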
51
9. Break flaky
A flaky test is one that alternates between pass and fail without any application change
[Figure: test result history]
Root causes:
• Identifying objects by X,Y location
• Unstable environment (performance, network, etc.)
• Test preconditions unclear or changeable
• Test data not maintained
• Out-of-sync issues (data, UI, etc.)
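For the out-of-sync cause, the usual fix is to poll for a condition with a timeout instead of a fixed sleep (or an X,Y click at a moment the element may not exist). A generic polling helper, as a sketch (Selenium's `WebDriverWait` plays this role in real browser suites):

```python
# Anti-flaky sketch: poll for readiness with a timeout instead of a fixed
# sleep, so slow environments still pass and fast ones don't over-wait.
import time

def wait_until(condition, timeout=5.0, interval=0.05):
    deadline = time.monotonic() + timeout
    while True:
        if condition():
            return True
        if time.monotonic() >= deadline:
            return False                 # caller fails the test explicitly
        time.sleep(interval)
```

The test then fails with a clear timeout instead of intermittently racing the UI.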
52
Sample script
53
Sample script
54
10. Flexible trap
"Flexible scripting" is test automation that changes its behavior to suit the situation,
so it never fails in any situation.
But then it is not clear what is actually being tested on each run,
and it can cause false negatives.
55
Flexible trap example
Base test scenario: book for 2 people
Condition → Correct result → What the flexible scenario does instead
• Full → Fail → skips the test
• 1 available → Fail → books for 1 person
• 2 available → Success → books for 2 people
• No search result → Fail → skips the test
Is it OK for the test to behave this way with nobody observing?
56
11. Performance
One objective of test automation is feedback speed, so performance is important
• Optimize loop operations; remove repeated steps
• Search for objects quickly
• Reduce unnecessary waits
Example:
57
11. Performance
Before: for each X = 1, 2, 3 … — open Chrome, login, add favorite X, validate X, close Chrome
After: open Chrome and login once, then for each X — add favorite X, validate X — and close Chrome once
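The before/after loop optimization on this slide can be sketched directly: hoist the open/login steps out of the loop so only the per-item steps repeat. Actions are recorded as strings purely for illustration:

```python
# Performance sketch: hoist open/login out of the loop (per the slide's
# before/after example); actions are logged strings, not real browser calls.
def before(favorites):
    log = []
    for x in favorites:
        log += ["open chrome", "login",
                f"add favorite {x}", f"validate {x}", "close chrome"]
    return log

def after(favorites):
    log = ["open chrome", "login"]           # done once
    for x in favorites:
        log += [f"add favorite {x}", f"validate {x}"]
    log.append("close chrome")               # done once
    return log
```

For three favorites, the loop body shrinks from five steps to two, cutting total steps from 15 to 9.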
58
12. Simple scripting
Test automation is itself an application,
so it should be tested,
and therefore it should be kept simple.
59
5.Test automation deployment / rollout
60
Pilot → Deployment → Maintenance
61
Pilot project
● Choose a project
• Not too big, not too small
• Avoid important projects with hard schedules
● Clarify the test automation objective and scope
• How will it be used, and why?
• Decide the test scope
● Resources
• Test designer, script engineer, operator, etc.
• Test automation tool and environment
62
Deployment
● Reporting
• Build a system/flow to obtain metrics such as coverage and performance (automatically)
• Build a system to collect and analyze test results
● Process & documentation
• How test automation supports the project
• Guidelines (coding rules, testing, etc.)
• Training for new members
63
How to approach a project
Requirement → Design → Coding → Test → Release
• Is there a class/id guideline?
• When is the environment ready?
• When does it become stable?
• From which point can scripting start?
64
Maintenance
● Update scripts
• Create new scripts or update current ones to follow the latest specification
• Update scripts to improve performance and adapt to a changing environment
● Support the environment
• Update test environments (OS, browser, device)
• Update test tools (the tool itself, middleware such as Java)
• Scale up / scale out test resources
65
Reference
• ISTQB “Test Automation Engineer” Syllabus
https://www.istqb.org/downloads/category/48-advanced-level-test-automation-engineer-documents.html
• Experiences of Test Automation
https://books.rakuten.co.jp/rk/86e641c4dda434f9961203d68efdbd16/
Editor's Notes
  • #10: "Consistency" = 一貫性 (uniformity of behavior)
  • #26: The people in charge do not collaborate; similar tests are run at every layer; after the end-to-end tests, manual testing tries to cover the rest; finally, exploratory testing is done out of worry.
  • #27: As tests move closer to the end user's point of view (end-to-end), the automation volume grows; quality is pursued mainly through black-box tests; test feedback slows down; test cost rises; finding a bug's root cause takes time.
  • #28: Tests such as API tests are forgotten. Not as bad as the ice cream cone, but it causes problems with cost, speed, debugging, and so on.
  • #29: Each specialist team builds its own automation platform; the automation suites end up running duplicated tests.
  • #31: Metrics for the test automation strategy and for monitoring effect and efficiency. External metrics measure the impact of test automation on other activities: ROI (effort saved, build effort, maintenance effort), bug detection (number of defects found, comparison with manual testing), processing time (performance), accuracy (failure ratio, false negatives, false positives). Internal metrics measure the effect and efficiency of the automation itself: scripting effort, scripting defect density, processing efficiency (performance).
  • #33: The current automation status and execution results; detailed execution-step logs, screenshots, and test data (especially on error); system-side logs (crash dumps, stack traces, etc.); a dashboard or similar overview of the overall automation state; many test results are collected. Content of the reports: the information needed to analyze a test failure. Publishing the reports: the information needed to know whether the tests could be executed.
  • #35: The following points should be considered when designing test automation.
  • #36: Data-driven means designing the script and the data separately.
  • #37: Data-driven means designing the script and the data separately.
  • #38: Keyword-driven means readable scripting: the processing steps are abstracted.
  • #39: Depending on the objective of the test automation, you must decide at the start whether to lean robust or sensitive. Robust: checks a limited set of points; scripting effort stays small; the automation does not fail on small UI changes; failure analysis can take time; false negatives are possible; execution time is short. Sensitive: checks many points, down to fine details such as data formats (e.g. dates zero-padded in MM/DD form); scripting effort becomes large; easily affected by changes, so fixes occur often; execution time becomes long.
  • #42: A test scenario (one unit of automated execution) should be executable stand-alone (independent), so tests can run without worrying about which scenarios ran before. As scenarios multiply, heavy dependencies cause serious operational problems, such as having to document the execution order. However, over-emphasizing independence bloats each scenario, lengthening execution time and duplicating the same tests across scenarios, so a balance is required.
  • #44: A large scenario takes longer to complete, which affects the work when a test fails: 1. investigate the cause from the error logs; 2. fix the cause; 3. re-run and confirm success. With a large scenario, both the investigation and the time until the re-run completes successfully grow.
  • #45: Setup (processing before test execution): the environment must be ready when the test runs. Examples: launch the browser (web tests); launch the app (native-app tests); close unnecessary browser popups; clear cookies, or delete the relevant cookies; set default data.
  • #46: Setup (processing before test execution): the environment must be ready when the test runs. Examples: launch the browser (web tests); launch the app (native-app tests); close unnecessary browser popups; clear cookies, or delete the relevant cookies; set default data.
  • #47: Teardown (processing after test execution): clears the environment after the test finishes. Even if the test errors out and stops midway, teardown must always run. Examples: initialize the data; quit the browser (closing all tabs if several are open); quit the app. The state should be restored to what it was before execution so the next automated test is not affected.
  • #49: The test execution steps; the test data used; where and for what reason the test failed, stated clearly; a screenshot at failure; a video at failure; traceability of test data that changes via variables; execution time (total and per processing step).
  • #50: A test script should be runnable repeatedly, on its own, any number of times. Repeatability depends especially on test data. Example scenarios that affect repeatability: a member can sign up only once per email address (withdrawal possible); a member can sign up only once per email address (withdrawal not possible); registering favorites; purchasing items within a pooled amount of money; date-specific reservation processing, as for golf.
  • #51: A test script should be runnable repeatedly, on its own, any number of times. Repeatability depends especially on test data. Example scenarios that affect repeatability: a member can sign up only once per email address (withdrawal possible); a member can sign up only once per email address (withdrawal not possible); registering favorites; purchasing items within a pooled amount of money; date-specific reservation processing, as for golf.
  • #52: Flaky means the automation alternates between success and failure without a bug being the cause: objects are located by X,Y coordinates; the execution environment is unstable (performance, network, etc.); the test's starting preconditions are not fixed; the test data is not managed; synchronization is insufficient.
  • #55: The trap of a "flexible implementation". A flexible implementation switches the automation's processing according to the state at execution time, so the test does not fail in varied states. With excessive flexibility, what is being tested on each run becomes unclear, and false negatives become possible.
  • #57: Optimize loop processing; make object lookup as fast as possible; minimize the waits used for stabilization.
  • #62: Project selection: choose the project to automate (not too big, not too small); avoid important, schedule-critical projects. Clarify the objective and scope: make clear how and why the automation will be used (e.g. run daily as regression tests); decide the test scope (regression, importance, change frequency, etc.). Prepare resources: the people involved, such as test designers, script engineers, and operators; the automation tool and execution environment.
  • #63: Once the pilot project succeeds, introduce and roll out. Reporting: a mechanism (metrics) to measure the current state (coverage), performance, effect, etc.; a mechanism for effective collection and analysis of test results. Process and documentation: the test automation approach toward projects; guidelines for building the automation (coding rules, etc.); training new people.
  • #65: Script maintenance: create and modify scripts to track changes in the system under test; improve quality, e.g. performance. Environment maintenance: track new and updated test environments (OS, browsers, devices, etc.); handle test-tool updates (the tool itself, and required applications such as Java); scale up / scale out the test resources.