Top ten secret weapons for performance testing in
an agile environment
by
Alistair Jones & Patrick Kua
agile2009@thoughtworks.com
http://connect.thoughtworks.com/agile2009/
© ThoughtWorks 2009
Make Performance Explicit
So that I can make better investment decisions
As an investor
I want to see the value of my portfolio presented on a single web page
Condition attached to the card: must have "good" performance, less than 0.2s page load for about 10,000 concurrent users
So that investors have a high-quality experience as the business grows
As the Operations Manager
I want the portfolio value page to render within 0.2s when 10,000 users are logged in
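A story like this can be made executable. The sketch below is a minimal, hypothetical load test in Python: the endpoint, the user count (scaled well below the card's 10,000), and the percentile choice are all assumptions, and a real project would use a dedicated load tool rather than a hand-rolled script.

```python
# Illustrative sketch only: a scaled-down load test for the story above.
# URL, user count, and threshold are assumptions taken from the card.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

PORTFOLIO_URL = "http://localhost:8080/portfolio"  # hypothetical endpoint
CONCURRENT_USERS = 100   # scaled down from 10,000 for a local smoke test
THRESHOLD_SECONDS = 0.2

def timed_request(_):
    start = time.monotonic()
    urllib.request.urlopen(PORTFOLIO_URL).read()
    return time.monotonic() - start

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    latencies = sorted(pool.map(timed_request, range(CONCURRENT_USERS)))

# Judge against the story's acceptance criterion (95th percentile here).
p95 = latencies[int(len(latencies) * 0.95) - 1]
print(f"95th percentile: {p95:.3f}s")
assert p95 < THRESHOLD_SECONDS, "Story's performance criterion not met"
```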
One Team
Team Dynamics
Performance Testers Part of Team
Pair on Performance Test Stories
Rotate Pairs
Customer Driven
What was a good source of requirements?
Existing Pain Points
An example...
So that we can budget for future hardware needs as we grow
As the Data Centre Manager
I want to know how much traffic we can handle now
Another example
So that we have confidence in meeting our SLA
As the Operations Manager
I want to ensure that a sustained peak load does not take out our service
Personas
Who is the customer?
End Users, Power Users, Operations, Marketing, Investors
Discipline
The experimental cycle: Formulate a hypothesis → Design an experiment → Run the experiment → Observe test results → Is the hypothesis valid? → Change the application code (and repeat)
What do you see?
Why is it doing that?
How can I prove that’s what’s happening?
Take the time to gather the evidence, safe in the knowledge that I'm making it faster.
Sawtooth pattern (1-minute intervals).
Directory structure of yyyy/mm/minuteofday? Slowdown due to the number of files per directory?
If so, one directory should result in even worse performance...
We ran the test…
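To make the experiment concrete, here is a small illustrative sketch of the two layouts under comparison. The exact path format is an assumption based on the slide's yyyy/mm/minuteofday description.

```python
# Sketch of the two storage layouts compared in the experiment.
# The path format is an assumption from the slide's description.
from datetime import datetime

def sharded_path(ts: datetime, filename: str) -> str:
    """Original layout: files spread across per-minute directories."""
    minute_of_day = ts.hour * 60 + ts.minute
    return f"{ts:%Y}/{ts:%m}/{minute_of_day}/{filename}"

def flat_path(ts: datetime, filename: str) -> str:
    """Experimental layout: everything in a single directory."""
    return f"all/{filename}"

now = datetime(2009, 8, 24, 14, 30)
print(sharded_path(now, "req-001.dat"))  # 2009/08/870/req-001.dat
print(flat_path(now, "req-001.dat"))     # all/req-001.dat
```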
One Directory
Play Performance Early
Timeline: other projects start performance testing near the end; agile projects start performance testing as early as possible.
Iterate, Don't (Just) Increment
We ♥ Sashimi
Sashimi Slice By... Presentation
So that I can better see trends in performance
As the Operations Manager
I want a graph of requests per second
So that I can better see trends in performance
As the Operations Manager
I want a graph of average latency per second
So that I can easily scan results at a single glance
As the Operations Manager
I want a single page showing all results
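The analysis behind these stories is straightforward to sketch. Assuming the load generator emits (timestamp, latency) pairs (a hypothetical format), requests per second and average latency per second fall out of simple per-second bucketing:

```python
# Minimal sketch of the analysis behind the two graphs above:
# requests per second and average latency per second.
from collections import defaultdict

def per_second_stats(samples):
    """samples: iterable of (unix_timestamp, latency_seconds) pairs."""
    buckets = defaultdict(list)
    for ts, latency in samples:
        buckets[int(ts)].append(latency)
    return {
        second: (len(latencies),                   # requests/second
                 sum(latencies) / len(latencies))  # average latency
        for second, latencies in sorted(buckets.items())
    }

samples = [(1000.1, 0.12), (1000.7, 0.30), (1001.2, 0.15)]
for second, (rps, avg) in per_second_stats(samples).items():
    print(f"t={second}: {rps} req/s, avg latency {avg:.3f}s")
```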
Sashimi Slice By... Scenario
So that we never have a day like "October 10"
As the Operations Manager
I want to ensure that a sustained peak load does not take out our service
So that we never have a day like "November 12"
As the Operations Manager
I want to ensure that an escalating load up to xxx requests/second does not take out our service
Automate, Automate, Automate
A typical build pipeline: Automated Compilation → Automated Tests → Automated Packaging → Automated Deployment

Why Automation?
Automation => Reproducible and Consistent
Automation => Faster Feedback
Automation => Higher Productivity
The performance-test equivalent: Automated Application Deployment → Automated Load Generation → Automated Test Orchestration → Automated Analysis
Plus: Automated Scheduling, Automated Result Archiving
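The original deck showed basic Ant scripts for this chain (not captured in this transcript). As an illustrative stand-in, here is a minimal Python sketch of the same idea; every script name and path below is hypothetical:

```python
# Hedged sketch of the orchestration chain named above. Each step is a
# placeholder shell command; the real deck used Ant scripts, so all
# names and paths here are hypothetical.
import datetime
import shutil
import subprocess

RESULTS_DIR = "results"  # assumed output directory of the load run

def run(step, command):
    print(f"== {step} ==")
    subprocess.run(command, shell=True, check=True)  # fail fast

run("deploy application", "./deploy.sh staging")
run("generate load", "./load_test.sh --duration 600")
run("analyse results", "./analyse.sh results/raw.log")

# Archive results so runs can be compared over time.
stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
shutil.copytree(RESULTS_DIR, f"archive/{stamp}")
```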
Continuous Performance Testing
Application Build Pipelines
Compile & Unit Test → Build RPM → Functional Test → Performance
Test Drive Your Performance Test Code
V Model Testing
http://en.wikipedia.org/wiki/V-Model_(software_development)
[V-model diagram: feedback speed runs from fast at the unit-test level to slower and longer at the performance-testing level]
We make mistakes
[The same V-model diagram, annotated: unit test performance code to fail faster]
Fail Fast!
Fast feedback!
Faster learning
Faster results
Classic Performance Areas to Test:
Information Collection, Analysis, Visualisation, Publishing, Presentation
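Each of these areas is ordinary code and can be test-driven like any other. A minimal sketch, assuming a hypothetical mean_latency helper in the analysis area: the unit test catches a mistake in seconds instead of after an hour-long performance run.

```python
# Illustrative unit test for the "Analysis" area above. The function
# name and behaviour are hypothetical, not from the original deck.
import unittest

def mean_latency(latencies):
    if not latencies:
        raise ValueError("no samples")
    return sum(latencies) / len(latencies)

class MeanLatencyTest(unittest.TestCase):
    def test_averages_samples(self):
        self.assertAlmostEqual(mean_latency([0.1, 0.3]), 0.2)

    def test_rejects_empty_input(self):
        with self.assertRaises(ValueError):
            mean_latency([])

if __name__ == "__main__":
    unittest.main()
```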
Get Feedback
Frequently (Weekly) Showcase
"Here is what we learned this week..."
"And based on this... we changed our directory structure."
"Should we do something different knowing this new information?"
List of All Secret Weapons
1. Make Performance Explicit
2. One Team
3. Customer Driven
4. Discipline
5. Play Performance Early
6. Iterate, Don't (Just) Increment
7. Automate, Automate, Automate
8. Test Drive Your Performance Code
9. Continuous Performance Testing
10. Get Feedback
For more information
• Talk to us tonight at... ThoughtWorks' Agile Open Office @ 7pm: http://tiny.cc/uqtLa
• Email us... agile2009@thoughtworks.com
• Visit our website: http://www.thoughtworks.com
• Leave your business card at the back
Photo Credits (Creative Commons licence)
• Barbed wire picture: http://www.flickr.com/photos/lapideo/446201948/
• Eternal clock: http://www.flickr.com/photos/robbie73/3387189144/
• Sashimi: http://www.flickr.com/photos/mac-ash/3719114621/
Editor's Notes

  • #4: In a conventional project, we focus on the functionality that needs to be delivered. Performance might be important, but performance requirements are considered quite separate from functional requirements. One approach is to attach "conditions" to story cards, i.e. this functionality must handle a certain load. In our experience, where performance is of critical concern, it pays to pull out the performance requirement as its own story…
  • #5: Calling out performance requirements as their own stories allows you to: validate the benefit you expect from delivering the performance; prioritise performance work against other requirements; and know when you're done.
  • #9: Not sure if you like this picture; I was really looking for a good shot looking out over no-man's land at the Berlin wall. I want the idea of divisions along skill lines breeding hostility and a lack of cooperation.
  • #13: Everything should be based on some foreseeable scenario, and who benefits from it. Harder to do without repetition (involvement and feedback) [not sure if this makes sense anymore]. Extremely important to keep people focused, as it's easy to drift. Capture different profiles. Separate simulation from optimisation -> problem identification vs problem resolution (or, broken down further, solution brainstorm -> solution investigation). Linking back to why is even more essential -> map to existing problems or fears. Latency vs throughput -> determine which is the most useful metric and define service level agreements.
  • #15: http://www.flickr.com/photos/denniskatinas/2183690848/ Not sure which one you like better.
  • #17: Here’s an example... (in the style of Feature Injection) “What’s our upper limit?”
  • #19: Here’s another example... (in the style of Feature Injection), “Can we handle peaks in traffic again?” So that we have confidence in meeting our SLA As the Operations Manager I want to ensure that a sustained peak load does not take out our service
  • #21: It helps to be clear about who is going to benefit from any performance testing (tuning and optimisation) that is going to take place. Ensure that they get a stake in prioritisation; that will help with the next point...
  • #23: Evidence-based decision-making. Don’t commit to a code change until you know it’s the right thing to do.
  • #25: Evidence-based decision-making. Don’t commit to a code change until you know it’s the right thing to do.
  • #27: It helps to have the customer (mentioned in the previous slide) be a key stakeholder to prioritise.
  • #28: Making an application easy to performance test changes the design/architecture of a system, just as TDD does (need to find a reference for this). Measuring early helps reveal which changes contribute to slowness. Performance work takes longer: lead times are potentially large and sequential (think of where a Gantt chart may actually be useful), so run it as a parallel track of work alongside normal functionality rather than sequentially. Environment availability is minimal (environments are expensive and cannot be used concurrently). You need minimal functionality, or at least clearly defined interfaces, to operate against. You want time to respond to feedback, so work that into the process as early as possible and potentially change the architecture/design.
  • #29: Start with the simplest performance test scenarios -> sanity/smoke tests -> hit all aspects -> use them to drive out automated deployment (environment limitations, configuration issues, a minimal set of reporting needs: green/red) -> hit integration boundaries, but with a small problem rather than everything. The next story might be a more complex script or something that drives out more of the infrastructure. Performance stories should not be build-out tasks that enhance nothing without other stories. Log files -> get the contents early. Consumer driven: contracts for analysis. Keep results around, with notes on what was varied. INVEST stories. Avoid the large "performance test" story. Separate types of stories: Optimise vs Measure. Optimise covers riskier, less-known components where "done" is difficult to estimate. Measure is clearer and allows you to make better informed choices. Know when to stop, and when enough is enough.
  • #30: The best lessons are learned from iterating, not from incrementing. Iterate over your performance test harness, framework and test fixtures. Make it easier to increment into new areas by incrementing in a different direction each time. Start with simple performance test scenarios. Don't build too much infrastructure at once. Refine the test harness and the things used to create more tests. You should always be delivering value. Identify useful features in performance testing and involve the stakeholder(s) to help prioritise them. Prioritise and schedule in analysis stories (metrics and graphs). Some of this work will still be big.
  • #31: Sashimi is nice and bite sized. You don’t eat the entire fish at once. You’re eating a part of it. Sashimi slices are nice and thin. There are a couple of different strategies linking this in. Think of sashimi as the thinnest possible slice.
  • #33: Number of requests over time
  • #34: Latency over time
  • #35: “I don’t want to click through to each graph”
  • #37: “I don’t want to click through to each graph”
  • #38: “I don’t want to click through to each graph”
  • #40: An automated build is a key XP practice. The first stage of automating a build is often to automate compilation. However, for a typical project, we go on after compilation to run tests as another automated step. In fact we may have a whole series of automated steps that chain onto each other, automating many aspects of the development process, all the way from compiling source to deploying a complete application into the production environment.
  • #41: Automation is a powerful lever in software projects because: it gives us reproducible, consistent processes; we get faster feedback when something goes wrong; and we get overall higher productivity, since we can repeat an automated build much more often than we could if it were manual.
  • #42: In performance testing we can automate many of the common tasks in a similar way to how we automate a software build. For any performance test, there is a linear series of activities that can be automated (first row of slide). In our recent projects we've been using the build tool Ant for most of our performance scripting. You could use any scripting language, but here are some very basic scripts to show you the kind of thing we mean… [possibly animate transitions to the 4 following slides] Once we've automated the running of a single test, we can move on to even more aspects of automation, such as scheduling and result archiving, which lead us into… continuous performance testing.
  • #44: Performance tests can take a long time to run, and you need all the time you can get to produce good results. Lean on your automation to have tests running all the time, automatically using more hardware when it is available (in the evening or at the weekend, for example).
  • #45: For faster feedback, set up your CI server so that performance tests are always running against the latest version of the application.