Context-Driven Performance Testing
Alexander Podelko
About the speaker
• Alexander Podelko
• Specializing in performance since 1997
• Currently Consulting Member of Technical Staff at Oracle (Stamford, CT office)
• Performance testing and optimization of Enterprise Performance Management
(EPM) a.k.a. Hyperion products
• Board director at Computer Measurement Group (CMG) – non-profit organization
of performance and capacity professionals
Disclaimer: The views expressed here are my personal views only and do not necessarily represent those of my current or previous employers. All brands and trademarks mentioned are the property of their owners.
Context-Driven Testing
Context-Driven Testing
• The context-driven approach was initially introduced by James Bach, Brian Marick, Bret Pettichord, and Cem Kaner
• http://context-driven-testing.com
• Declared a “school” in 2001 (Lessons Learned in Software Testing)
• Became political
Basic Principles
• The value of any practice depends on its context.
• There are good practices in context, but there are no best practices.
• People, working together, are the most important part of any project’s context.
• Projects unfold over time in ways that are often not predictable.
Basic Principles
• The product is a solution. If the problem isn’t solved, the product doesn’t work.
• Good software testing is a challenging intellectual process.
• Only through judgment and skill, exercised cooperatively throughout the entire project,
are we able to do the right things at the right times to effectively test our products.
«Traditional» Approach
• Load / Performance Testing is:
• Last moment before deployment
• Last step in the waterfall process
• Checking against given requirements / SLAs
• Throwing it back over the wall if requirements are not met
• System-level
• Realistic workload
• With variations when needed: stress, uptime, etc.
• Lab environment
• Often scaled down
• Protocol level record-and-playback
• Expensive tools requiring special skills
Early Performance Testing – Exploratory, Continuous
Agile Development
• Agile development should be a rather trivial case for performance testing
• You have a working system each iteration to test early by definition.
• You need a performance engineer for the whole project
• Savings come from detecting problems early
• You need to adjust requirements for implemented functionality
• Additional functionality will impact performance
The Main Issue on the Agile Side
• It doesn’t [always] work this way in practice
• That is why you have “Hardening Iterations”, “Technical Debt” and similar
notions
• Same old problem: functionality gets priority over performance
The Main Issue on the Testing Side
• Performance Engineering teams don’t scale well
• Even assuming that they are competent and effective
• Increased volume exposes the problem
• Early testing
• Each iteration
• Remedies: automation, making performance everyone’s job
Early Testing – Mentality Change
• Making performance everyone’s job
• Late record/playback performance testing -> Early Performance
Engineering
• System-level requirements -> Component-level requirements
• Record/playback approach -> Programming to generate load / create stubs (a minimal stub sketch follows this list)
• "Black Box" -> "Grey Box"
Continuum of Options
[Diagram: a two-dimensional continuum of options — System (new ↔ well-known) by Testing Approach (exploratory/agile ↔ automated/regression) — with traditional load testing occupying part of that space.]
Exploratory Performance Testing
• Rather alien to performance testing, but probably even more relevant here than for functional testing
• We learn about the system's performance as we start to run tests
• For a new system we have only guesses
• It is more a performance engineering process, bringing the system to the proper state, than just testing
Continuous Performance Testing
• You see Performance CI presentations at every conference nowadays… and still opinions vary
• From “full automation” to…
Different Perspectives
• Consultant: need to test the system
• In its current state
• Why bother about automation?
• External or internal
• Performance Engineer
• On an agile team
• Need to test it each build/iteration/sprint/etc.
• Automation Engineer / SDET / etc.
Automation: Considerations
• You need to know the system well enough to make meaningful automation
• If the system is new, the overheads are too high
• So almost no automation in traditional environments
• If the same system is tested again and again
• It makes sense to invest in setting up automation
• Automated interfaces should be stable enough
• APIs are usually more stable in the early stages
Time/Resource Considerations
• Performance tests take time and resources
• The larger the test, the more it takes
• May not be an option on each check-in
• A tiered solution is needed, for example (sketched below):
• Some performance measurements on each build
• Daily mid-size performance tests
• Periodic large-scale / uptime tests outside CI
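One way to wire such tiers into CI, shown only as a sketch (tier names, user counts and durations are assumptions, not a specific CI product's syntax):

```python
# Pick the performance test tier from an environment variable set by the pipeline.
import os

TIERS = {
    "per-build": {"virtual_users": 5,   "duration_sec": 120},      # quick sanity measurement
    "nightly":   {"virtual_users": 100, "duration_sec": 1800},     # mid-size daily test
    "weekly":    {"virtual_users": 500, "duration_sec": 8 * 3600}, # large-scale / uptime, outside CI
}

def select_tier() -> dict:
    # Default to the cheapest tier so a plain check-in build stays fast.
    return TIERS[os.environ.get("PERF_TIER", "per-build")]

if __name__ == "__main__":
    print("Running performance test with:", select_tier())
```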
Automation: Limitations
• Works great for finding regressions and checking against requirements (a minimal check is sketched after this list)
• Doesn’t cover:
• Exploratory tests
• Large scale / scope / duration / volume
• “Full automation” doesn’t look like a real option; a combination is needed
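What that automated check can look like, as a sketch only: assume the load test writes summary metrics to results.json and the previous good build's numbers are kept in baseline.json (metric names, requirements and the 10% tolerance are illustrative):

```python
# Fail the build if a metric violates a requirement or regresses vs. the baseline.
import json
import sys

REQUIREMENTS = {"login_p90_ms": 1000, "search_p90_ms": 2000}  # SLA-style limits
REGRESSION_TOLERANCE = 1.10                                   # >10% slower than baseline fails

def check(results: dict, baseline: dict) -> bool:
    ok = True
    for metric, value in results.items():
        if metric in REQUIREMENTS and value > REQUIREMENTS[metric]:
            print(f"FAIL {metric}: {value} ms exceeds requirement {REQUIREMENTS[metric]} ms")
            ok = False
        if metric in baseline and value > baseline[metric] * REGRESSION_TOLERANCE:
            print(f"FAIL {metric}: {value} ms regressed vs. baseline {baseline[metric]} ms")
            ok = False
    return ok

if __name__ == "__main__":
    with open("results.json") as r, open("baseline.json") as b:
        sys.exit(0 if check(json.load(r), json.load(b)) else 1)
```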
Environment / Scope / Granularity
Environment
• Cloud
• No more excuses for not having hardware
• Lab vs. Service (SaaS) vs. Cloud (IaaS)
• For both the system and load generators
• Test vs. Production
Scenarios
• System validation for high load
• Outside load (service or cloud), production system
• Wider scope, lower repeatability
• Performance optimization / troubleshooting
• Isolated lab environment
• Limited scope, high repeatability
• Testing in Cloud
• Lowering costs (in case of periodic tests)
• Limited scope, low repeatability
Find Your Way
• If performance risk is high, it may be a combination of environments, e.g.
• Outside tests against the production environment to test for max load
• Lab for performance optimization / troubleshooting
• Limited performance environments to be used as part of continuous integration
Scope / Granularity
• System level
• Component level
• Service Virtualization, etc.
• Server time
• Server + Network (WAN simulation, etc.)
• End-to-end (User Experience)
• Each may require a different approach / different tools
Load Generation
Record and Playback: Protocol Level
[Diagram: the load testing tool runs virtual users on a load generator, which sends protocol-level requests over the network to the application server.]
Considerations
• Usually doesn't work for testing components
• Each tool supports a limited number of technologies (protocols)
• Some technologies are very time-consuming to script
• Workload validity is not guaranteed when there is sophisticated logic on the client side (see the correlation sketch below)
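To illustrate the last point, here is a sketch of what a protocol-level virtual user amounts to once a recording is cleaned up by hand, using the Python requests library; the URLs, form fields and token handling are hypothetical. The correlation step (extracting the dynamic token) is exactly what a raw record-and-playback script would miss:

```python
# One protocol-level virtual user: raw HTTP requests plus correlation of a dynamic value.
import time
import requests

BASE_URL = "http://test-system.example.com"   # placeholder system under test

def virtual_user(user_id: int) -> None:
    session = requests.Session()

    # Step 1: log in and capture the session token generated by the server.
    login = session.post(f"{BASE_URL}/login",
                         data={"user": f"vu{user_id}", "password": "secret"})
    token = login.json()["token"]             # differs on every run -> must be correlated

    # Step 2: replay the recorded business step with the correlated token.
    start = time.perf_counter()
    session.get(f"{BASE_URL}/report", headers={"X-Auth-Token": token})
    print(f"vu{user_id} report time: {time.perf_counter() - start:.3f}s")

    time.sleep(5)                             # think time between iterations

if __name__ == "__main__":
    virtual_user(1)
```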
Record and Playback: UI Level
[Diagram: the load testing tool drives virtual users through real browsers on the load generator, which access the application server over the network.]
Considerations
• Scalability
• Still requires more resources per user (a browser-based virtual user is sketched after this list)
• Supported technologies
• Timing accuracy
• Playback accuracy
• For example, for HtmlUnit
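For contrast, a sketch of a UI-level virtual user: the tool drives a real (here headless) browser, so client-side logic and rendering are exercised, at the cost of far more resources per user. This assumes Selenium with a local Chrome/chromedriver install; the URL is a placeholder:

```python
# One UI-level virtual user measuring end-to-end page load time in a headless browser.
import time
from selenium import webdriver

def ui_virtual_user(url: str = "http://test-system.example.com/report") -> float:
    options = webdriver.ChromeOptions()
    options.add_argument("--headless")     # headless browsers scale better but may
                                           # behave slightly differently from real ones
    driver = webdriver.Chrome(options=options)
    try:
        start = time.perf_counter()
        driver.get(url)                    # full page load, including client-side JavaScript
        return time.perf_counter() - start
    finally:
        driver.quit()

if __name__ == "__main__":
    print(f"end-to-end page time: {ui_virtual_user():.2f}s")
```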
Programming
[Diagram: the load testing tool runs virtual users on a load generator, which drive the application server over the network through its API.]
Considerations
• Requires programming / access to APIs (see the sketch after this list)
• Tool support
• Extensibility
• Language support
• May require more resources
• The environment may need to be set up
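A sketch of the programming approach with those caveats in mind: api_call below is a hypothetical stand-in for whatever client API or endpoint the system exposes, and a thread pool plays the role of the load generator:

```python
# Generate load directly from code against an application API and report percentiles.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def api_call() -> float:
    """Invoke one operation of the system under test and return its latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.05)          # placeholder for the real call, e.g. client.run_report(...)
    return time.perf_counter() - start

def run_load(concurrency: int = 20, iterations: int = 50) -> None:
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(lambda _: api_call(), range(concurrency * iterations)))
    latencies_ms = sorted(t * 1000 for t in latencies)
    p90 = statistics.quantiles(latencies_ms, n=10)[-1]      # 90th percentile cut point
    print(f"requests={len(latencies_ms)} "
          f"median={statistics.median(latencies_ms):.1f}ms p90={p90:.1f}ms")

if __name__ == "__main__":
    run_load()
```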
Production Workload
• A/B testing, canary testing (a minimal traffic-split sketch follows)
• Should work well if:
• workloads are homogeneous and there is a way to control them precisely
• potential issues have minimal impact on user satisfaction and company image, and you can easily roll back the changes
• the architecture is fully parallel and scalable
Performance Testing / Engineering Strategy
Performance Risk Mitigation
• Single-user performance engineering
• Profiling, WPO, single-user performance
• Software Performance Engineering
• Modeling, Performance Patterns
• Instrumentation / APM / Monitoring
• Production system insights
• Capacity Planning/Management
• Resource Allocation
• Continuous Integration / Deployment
• Ability to deploy and remove changes quickly
Defining Performance Testing Strategy
• What are the performance risks we want to mitigate?
• What part of these risks should be mitigated by performance testing?
• Which performance tests will mitigate the risk?
• When should we run them?
• What process/environment/approach/tools do we need in our context to implement them?
Examples
• Handling full/extra load
• System level, production[-like env], realistic load
• Catching regressions
• Continuous testing, limited scale/env
• Early detection of performance problems
• Exploratory tests, targeted workload
• Performance optimization/investigation
• Dedicated env, targeted workload
Summary
• Testing strategy has become very non-trivial
• A lot of options along many dimensions
• Defined by context
• “Automation” is only one part of it
• Important for iterative development
• Part of performance engineering strategy
• Should be considered amongst other activities
Questions?
Alexander Podelko
alex.podelko@oracle.com
apodelko@yahoo.com
alexanderpodelko.com/blog
@apodelko
Thank you
