Performance Testing in 5 Steps:
'A Guideline to a Successful Load Test'
Mieke Gevers, AQIS - Belgium
© 2008-2009 AQIS bvba
Agenda
1. What is load testing?
2. Methodology
3. Conclusion
Agenda
1. What is load testing?
a. Terminology
b. Challenges
2. Methodology
3. Conclusion
1. Terminology
• "Load testing" is usually defined as the process of exercising the system under test by feeding it the largest tasks it can operate with.
• Load testing is sometimes called volume testing, or longevity/endurance testing.
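The definition above can be sketched as a minimal, tool-free load generator: drive a stand-in transaction with increasingly large concurrent batches and time each batch. The names here (fake_request, run_load) are illustrative stand-ins, not from the slides or any tool's API.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(task_size: int) -> int:
    """Stand-in for a real request; cost grows with the task size."""
    return sum(range(task_size))

def run_load(users: int, task_size: int) -> float:
    """Exercise the system under test with `users` concurrent tasks;
    return the elapsed wall-clock seconds for the whole batch."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        list(pool.map(fake_request, [task_size] * users))
    return time.perf_counter() - start

# Feed the system progressively larger workloads, as in a load test.
timings = {u: run_load(u, 10_000) for u in (1, 10, 50)}
```

A real load test would replace fake_request with an HTTP call against the system under test and record per-request response times, not just batch totals.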
1. Terminology
• "Stress testing" tries to break the system under test by overwhelming its resources or by taking resources away from it (in which case it is sometimes called negative testing).
1. Terminology
• Failover Tests : verify redundancy mechanisms while under load.
• Can the system under test recover and/or react quickly enough? Can the remaining web servers (behind the load balancers) handle the sudden dumping of extra load?
• Advantage: technicians can address problems in advance, in the comfort of a testing situation, rather than in the heat of a production outage.
1. Terminology
• Soak Tests ~ Endurance Testing : running a system at high levels of load for prolonged periods of time.
• Ideal for finding memory leaks and other defects that make a system prone to crashing or breaking after a certain number of iterations.
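The iteration-driven defects a soak test targets can be sketched in miniature: run a transaction many times and compare net memory growth with and without a deliberate leak. The leaky transaction and all names here are contrived for illustration.

```python
import tracemalloc

_cache = []  # simulated leak: references kept forever

def transaction(leak: bool) -> None:
    data = list(range(1000))
    if leak:
        _cache.append(data)  # reference retained -> memory never freed

def soak(iterations: int, leak: bool) -> int:
    """Run many iterations and return net traced memory growth in bytes,
    the signature a soak test looks for."""
    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()
    for _ in range(iterations):
        transaction(leak)
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return after - before

growth_leaky = soak(200, leak=True)
_cache.clear()
growth_clean = soak(200, leak=False)
```

In a real soak test the same idea is applied over hours or days, watching process RSS or heap metrics from a monitor rather than tracemalloc.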
1. Terminology
• Network Sensitivity Tests : tests that set up scenarios of varying types of network activity (traffic, error rates...), and then measure the impact of that traffic on applications that are bandwidth dependent.
• e.g. chat applications, streaming media…
1. Terminology
• Targeted Infrastructure Test : isolated tests of each layer and/or component in an end-to-end application configuration. It covers all components of the AUT infrastructure and beyond.
• e.g. communications infrastructure, load balancers, web servers, application servers, crypto cards, Citrix servers, database…
• Goal: identify any performance issue that would fundamentally limit the overall ability of the system to deliver at a given performance level.
1. Terminology
[Capacity-planning chart, ©copyright 2004-2005 by Wilson Mar. All rights reserved. It plots workload against time: the current load grows toward the existing capacity, passing the point where response-time degradation becomes noticeable; the gap up to anticipated peak loads is the usable reserve capacity. The trigger point, set one upgrade lead time before the predicted point of failure, marks when the upgrade to the new capacity must start.]
Performance testing is the overall process; stress tests, load tests and scalability tests are parts of it.
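The chart's arithmetic can be made concrete. Assuming linear load growth (an assumption for illustration; real growth curves differ), the trigger point is simply the predicted failure time minus the upgrade lead time. All figures below are invented.

```python
def failure_time(capacity: float, current_load: float, growth_per_month: float) -> float:
    """Months until the workload reaches existing capacity, assuming linear growth."""
    return (capacity - current_load) / growth_per_month

def trigger_point(capacity: float, current_load: float,
                  growth_per_month: float, lead_time_months: float) -> float:
    """Latest month at which the upgrade must be triggered so the added
    capacity is in place before the predicted point of failure."""
    return failure_time(capacity, current_load, growth_per_month) - lead_time_months

# e.g. capacity 1000 req/s, current load 400 req/s, growth 50 req/s per month,
# and 4 months of lead time to provision an upgrade:
t_fail = failure_time(1000, 400, 50)          # months until predicted failure
t_trigger = trigger_point(1000, 400, 50, 4)   # month by which to start upgrading
```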
2. Challenges
2. Challenge 1 : Communication
• How the requirements specification defined it
• How the developers understood it
• How the problem was solved before
• How the problem is solved now
• The program after debugging
• How the program is described by the marketing department
• This, in fact, is what the customer wanted … ;-)
2. Challenge 2 : HW & SW
• Security
2. Challenge 3 : User Nature
Response Times: The Three Important Limits [Miller 1968; Card et al. 1991]:
• 0.1 second: about the limit for the user to feel that the system is reacting instantaneously.
• 1.0 second: about the limit for the user's flow of thought to stay uninterrupted, even though the user will notice the delay.
• 10 seconds: about the limit for keeping the user's attention focused on the dialogue. For longer delays, users will want to perform other tasks while waiting for the computer to finish, so they should be given feedback indicating when the computer expects to be done.
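The three limits translate directly into pass/fail bands for measured response times. A small sketch (the function name and labels are mine, not from the slides):

```python
def classify_delay(seconds: float) -> str:
    """Map a measured response time onto the three classic limits."""
    if seconds <= 0.1:
        return "instantaneous"    # no special feedback needed
    if seconds <= 1.0:
        return "flow maintained"  # delay noticed, thought uninterrupted
    if seconds <= 10.0:
        return "attention kept"   # user stays focused on the dialogue
    return "attention lost"       # show progress feedback

labels = [classify_delay(t) for t in (0.05, 0.4, 3.0, 15.0)]
```

In a load-test report, bucketing every transaction's response time this way gives a user-centred view alongside raw averages.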
2. Challenge 4 : Future
• Computer scientists identify future IT challenges: “Ambitious goals include harnessing the power of quantum physics, building systems that can't go wrong, and simulating living creatures in every detail.” (Peter Sayer, IDG News Service, January 25, 2005)
2. Challenge 5 : Resources
• HW
• Infrastructure
• Access
• People
• Time
• ....
Agenda
1. What is load testing?
2. Methodology
3. Conclusion
3. Methodology
Analyze
Documentation
1. Goals
1. Determine system performance, response time, and throughput under a specific load
2. Test the ability to handle load (and stress) to identify bottlenecks
3. Evaluate whether system resources are being utilised efficiently
4. Test system robustness and the capability to recover from errors
5. Test across different configurations or versions
6. Test systems for scalability
Examine SUT
• Architecture overviews
• Deployment topology
• 3rd-party components and SLAs
• Firewall capacity
• Load balancing
• Connectivity
• Network
• Session management
• All queues
• Caching models
• Security methods
• Failover mechanisms
• Redundancy
• Bandwidth
• ……
5 steps
Analyze
Plan/Define/Design scenarios
Design and Model
1. Transaction and workflow scenarios
2. User profiling and modeling
3. Workload scenarios
4. Test environment
5. Test data
1. Transition matrix (e.g. an online shopping visit)
 Make flow charts of functional paths through the application under test.
 Break these down into logical modules = transactions.
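A transition matrix of an online shopping visit can be put to work directly: random walks over it generate realistic session paths for the workload model. The states and probabilities below are invented for illustration, not from the slides.

```python
import random

# Hypothetical transition probabilities between logical modules
# (each row sums to 1; "exit" is absorbing).
MATRIX = {
    "home":     {"search": 0.6, "exit": 0.4},
    "search":   {"product": 0.7, "exit": 0.3},
    "product":  {"checkout": 0.3, "search": 0.4, "exit": 0.3},
    "checkout": {"exit": 1.0},
    "exit":     {},
}

def step(state: str, r: float) -> str:
    """Pick the next module by cumulative probability."""
    acc = 0.0
    for nxt, p in MATRIX[state].items():
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against float rounding: fall back to the last option

def simulate_visit(rng: random.Random) -> list[str]:
    """Walk the matrix from 'home' until the user exits."""
    path, state = ["home"], "home"
    while MATRIX[state]:
        state = step(state, rng.random())
        path.append(state)
    return path

rng = random.Random(42)
visits = [simulate_visit(rng) for _ in range(100)]
```

Feeding such generated paths to a load tool keeps the script mix proportional to how real users actually navigate.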
2. User Profiling
• User activities, transactions and usage
patterns
• Client platforms and preferences
• Client Internet access speeds and browser
types
• User geographic locations
3. Workflow Modeling
• Size of customer base
• Growth factor
• Site arrival rates and site abandonment
• Think times and latency
• Background noise
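Arrival rates and think times from the list above are commonly modeled as a Poisson process, i.e. exponential gaps between events. A hedged sketch under that assumption (the rates are invented):

```python
import random

def poisson_arrivals(rate_per_sec: float, duration_sec: float,
                     rng: random.Random) -> list[float]:
    """Generate arrival timestamps with exponential inter-arrival gaps
    (a Poisson process), a common model for site arrival rates."""
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_per_sec)
        if t >= duration_sec:
            return arrivals
        arrivals.append(t)

def think_time(rng: random.Random, mean_sec: float = 5.0) -> float:
    """Sample one user's think time between pages (exponential model)."""
    return rng.expovariate(1.0 / mean_sec)

rng = random.Random(1)
arrivals = poisson_arrivals(rate_per_sec=2.0, duration_sec=60.0, rng=rng)
```

Site abandonment and background noise can be layered on top by randomly truncating sessions and mixing in low-priority traffic.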
4. Test environment
• Production replica
–Expensive, usually not possible
• Scaled down
–Extrapolation factor
• Actual production equipment
–For new applications
5. Test data
• Data
• Randomized
• Database
• Cookies, session IDs, hidden IDs, values
• Certificates
• Security settings
• IP address identification
• User-specific identification, PKIs
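Randomized, collision-free test data of the kind listed above can be generated up front so virtual users do not share cookies, session IDs or cache-friendly inputs. A minimal sketch; the record fields are hypothetical examples, not a prescribed schema:

```python
import random
import string
import uuid

def make_test_user(rng: random.Random, index: int) -> dict:
    """Build one virtual user's record: a unique ID, a session ID,
    and randomized values so requests don't all hit server-side caches."""
    token = "".join(rng.choices(string.ascii_letters + string.digits, k=16))
    return {
        "user_id": f"vu{index:05d}",
        "session_id": str(uuid.UUID(int=rng.getrandbits(128))),
        "auth_token": token,
        "search_term": rng.choice(["shoes", "books", "phones", "garden"]),
    }

rng = random.Random(7)
pool = [make_test_user(rng, i) for i in range(1000)]
```

The fixed seed makes the data set reproducible between test runs, which helps when comparing results across configurations.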
5 steps
Analyze
Plan/Define/Design scenarios
Build/Record scenarios
Load Test scenarios
• Program in Perl, Java
• Stubs
• HW devices
• Freeware tools
• Commercial tools (capture & playback)
• ...
How to reproduce correct web behavior?
• Correct environment simulation
– Correct protocol (business level)
– Speed
– UI (browser/MMS/..)
– Security simulation/usage
– Calculating the number of virtual users and the arrival rate
– ...
5 steps
Analyze
Plan/Define/Design scenarios
Build/Record scenarios
Baseline + Load Test
What is a baseline?
• Baseline with one user
• Monitor the involved back-end systems
• Can the load test achieve the goals?
• Test-runs
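The single-user baseline above can be sketched as repeated timings of one transaction, summarised as a median and an approximate 95th percentile to compare later load runs against. The transaction body is a stand-in, not a real request:

```python
import statistics
import time

def transaction() -> None:
    """Stand-in for one user action against the system under test."""
    sum(i * i for i in range(20_000))

def baseline(runs: int = 30) -> dict:
    """Time a single user repeating the transaction; later load-test
    results are compared against these single-user figures."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        transaction()
        samples.append(time.perf_counter() - start)
    return {
        "runs": runs,
        "median_s": statistics.median(samples),
        "p95_s": statistics.quantiles(samples, n=20)[18],  # ~95th percentile
    }

base = baseline()
```

If the baseline already misses the response-time goals with one user, running the full load test is pointless until that is fixed.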
Load Testing: How?
Users → Internet → Firewall → Load Balancers → Web Servers → Application Servers → DB Servers (LAN)
Monitoring
Metrics: What to Measure
 Maximum login requests (per min / per sec)
 Average session length (at peak)
 Maximum concurrent sessions
 Average pages per session
 Average hits per page
 Average request distribution
 Average size of a page & size of specific pages
 Arrival rates
Tip: Session length (in min) = Max_concurrent_sessions / Login_requests_per_min
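The tip is a rearrangement of Little's law (L = λ·W: concurrency = arrival rate × duration). Both directions are useful when sizing a test:

```python
def session_length_min(max_concurrent_sessions: float,
                       login_requests_per_min: float) -> float:
    """The slide's tip: average session length in minutes, from
    concurrent sessions and the login (arrival) rate."""
    return max_concurrent_sessions / login_requests_per_min

def concurrent_sessions(login_requests_per_min: float,
                        session_length_minutes: float) -> float:
    """Forward form: how many sessions are live at once for a given
    arrival rate and average session duration."""
    return login_requests_per_min * session_length_minutes

# e.g. 1200 concurrent sessions at peak and 150 logins per minute
# imply 8-minute average sessions:
length = session_length_min(1200, 150)
```

The forward form is what determines how many virtual users the load tool must sustain for a target arrival rate.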
5 steps
Analyze
Plan/Define/Design scenarios
Build/Record scenarios
Baseline + Load Test
Reporting
Analysing Results
– Correlation of monitor and load-test data: root-cause analysis
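Correlating the load tool's response times with a server-side monitor metric points at the bottleneck. A stdlib-only Pearson correlation sketch; the sampled numbers are invented for illustration:

```python
import math

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation between two equally sampled time series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented per-minute samples from one load-test run:
response_ms = [120, 130, 180, 260, 400, 650]  # from the load-test tool
db_cpu_pct = [22, 25, 38, 55, 71, 93]         # from a monitor on the DB server
r = pearson(response_ms, db_cpu_pct)
```

A coefficient near 1 suggests the response-time degradation tracks the database CPU; correlation is a lead for root-cause analysis, not proof of causation.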
6 steps?
Analyze
Plan/Define/Design scenarios
Build/Record scenarios
Baseline + Load Test
Reporting
Iteration-Monitor
Tips
• Be aware of constant changes
• Train constantly
• User behaviours
• Increase of transactions (seasonal or permanent)
• Use common sense
Resources
 Capacity Planning for Web Performance, Menascé
 Building High-Scalability Server Farms, Microsoft Corporation, 1999
 Detecting System Bottlenecks in Sites Using Site Server 3.0 Commerce Edition, Microsoft Corporation, 1999
 Design for Scalability, IBM High-Volume Web Site team, December 1999, http://www7b.boulder.ibm.com/wsdd/library/techarticles/hvws/scalability.html
 Performance Testing Guidance for Web Applications, Scott Barber
 ISO 9126; ISO/IEC 12207:2008, International Organization for Standardization
Questions & Thank you !
Mieke Gevers
info@aqis.eu
www.aqis.eu
AQIS: Agile Quality in Information Systems
Editor's Notes
  • #6: In the testing literature, the term "load testing" is usually defined as the process of exercising the system under test by feeding it the largest tasks it can operate with. Load testing is sometimes called volume testing, or longevity/endurance testing.
  • #7: Stress testing tries to break the system under test by overwhelming its resources or by taking resources away from it (in which case it is sometimes called negative testing). The main purpose behind this madness is to make sure that the system fails and recovers gracefully -- this quality is known as recoverability.
• #14: Challenges of a human nature. Outsourcing.
• #15: 3 major challenges (there are more) of a HW/physical nature
• #17: Challenges on human nature. 0.1 second is about the limit for having the user feel that the system is reacting instantaneously, meaning that no special feedback is necessary except to display the result. 1.0 second is about the limit for the user's flow of thought to stay uninterrupted, even though the user will notice the delay; normally, no special feedback is necessary during delays of more than 0.1 but less than 1.0 second, but the user does lose the feeling of operating directly on the data. 10 seconds is about the limit for keeping the user's attention focused on the dialogue; for longer delays, users will want to perform other tasks while waiting for the computer to finish, so they should be given feedback indicating when the computer expects to be done. Feedback during the delay is especially important if the response time is likely to be highly variable, since users will then not know what to expect.
  • #27: Analyze at a higher level
• #29: Study and Analyze; Plan and Define scenarios; Design/Model scenarios; Build/Record scenarios into scripts; Establish Baseline; Evaluate, Tune and Report; Deploy and Monitor.
• #38: This phase: try out the scripts and the load test to see if the goals are achievable, e.g. with a small set of VUsers. Realistic environment simulation: modem speed, client IP simulation, web browser simulation, WAP, 3G phone types, HTTP features (redirection, authentication), connections/threads, calculating the number of virtual users.