What is Performance Testing?
An Overview of Performance Testing
Ministry of Testing Meet Up
James Venetsanakos, April 12, 2017
Who am I
• James Venetsanakos:
http://www.linkedin.com/in/jamesv/
• Performance Testing since 1998
• Automated Functional Testing since 2006
• Currently:
– Design and implement custom automation and
performance testing frameworks
– Lead automation and performance testing teams
– Integrate performance testing practices into iterative
SDLC frameworks
Agenda
• Performance Testing vs. Other Types of Testing
• Types of Performance Testing
• Performance Test Assets
• Performance Test Process
• Selecting Performance Testing Tools
• Wisdom and Insight
• Recommended Resources
Performance Testing vs. Others
• Functional Testing: Does it work?
• Security Testing: Is it secure?
• Performance Testing: How well does it work?
– Testing to determine how a system behaves under load
– Suite of different types of tests to answer different questions
– Results are environment dependent (hardware, server
configuration, network configuration, data, etc.)
– Same test run multiple times for consistency
– Test model can drastically influence results in a good or bad way
• eCommerce example
• Denny’s Super Bowl Ad
• HealthCare.gov
Types of Performance Tests
• End to End
• Component
• Profile
• Load
• Stress/Scale/Capacity
• Duration/Longevity/Soak/Volume
• Point Load
• CDN
• Failover and Disaster Recovery
• Tuning
• Cloud Elasticity
• Mobile Device
• Mixed GUI and Server
End-To-End Testing
• Most common form of performance testing
– Many commercial tools: LoadRunner, SmartBear, Rational
– Open source tools: JMeter, Grinder
– Hybrid tools: BlazeMeter, OctoPerf
• Tests software by simulating an end-to-end process with
multiple virtual users
– Login -> execute workflow -> logout -> repeat
• Approximates the end-user experience, minus local processing
– Performance test tools do not render web pages
– Some now execute JavaScript
– Only measure time to last byte
• Does not break down the total response time
– Time spent in UI vs. Middle Tier vs. DB
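The virtual-user loop above can be sketched in a few lines. This is an illustrative Python skeleton, not any particular tool's API; `do_step` is a hypothetical stand-in for the real HTTP calls, and a real tool would stop the clock at the last byte received, not at render completion:

```python
import threading
import time

def run_workflow(user_id, timings, do_step):
    """One virtual user: login -> execute workflow -> logout, timing each step."""
    for step in ("login", "execute_workflow", "logout"):
        start = time.perf_counter()
        do_step(step)                      # stand-in for the real HTTP calls
        elapsed = time.perf_counter() - start
        timings.append((user_id, step, elapsed))

def run_virtual_users(n_users, do_step):
    """Launch n_users concurrent virtual users and collect step timings."""
    timings = []
    threads = [threading.Thread(target=run_workflow, args=(u, timings, do_step))
               for u in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return timings

# Example with a stubbed step standing in for server round trips:
results = run_virtual_users(5, do_step=lambda step: time.sleep(0.01))
```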
Component Testing
• Tests specific pieces of the application in isolation
– Similar to Unit Testing in development
– Test directly or through a special UI or test harness (e.g. an API)
– Can use stubs or full downstream pieces
• Example: middle tier with and without DB active
• Isolates code for testing
– Only the component is tested
– Components can be tested independently of each other
– No need to wait for a complete end-to-end system
– Fewer things to break during testing
• More work than end-to-end testing
– Tests cannot be run concurrently
– Need to develop a test harness or UI to conduct testing
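Isolating a component with a stub might look like this minimal Python sketch; `price_order` and the in-memory stub are hypothetical examples, standing in for a middle-tier call and its downstream database:

```python
def price_order(order, db_lookup):
    """Hypothetical middle-tier component: total = sum of qty * unit price.
    `db_lookup` is injected, so the component can be exercised in isolation."""
    return sum(qty * db_lookup(sku) for sku, qty in order.items())

# Stubbed downstream DB isolates the component (no real database needed):
stub_db = {"widget": 2.50, "gadget": 4.00}.get
total = price_order({"widget": 3, "gadget": 1}, db_lookup=stub_db)
```

Swapping `stub_db` for a real database client is how the "with and without DB active" comparison on the slide would be run.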
Profile Testing
Run from build to build
• End-To-End or Component Testing
• Comparing current results to previous results rather than an SLA
• Looking for trends (e.g. performance degradation)
• Good for iterative performance testing process or no SLA
• Used to create a baseline for new features or pages
• Results dependent on System Under Test (SUT)
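A build-to-build comparison of the kind described can be sketched as follows; the transaction names, baseline numbers, and 10% tolerance are illustrative assumptions:

```python
def degraded(previous_ms, current_ms, tolerance=0.10):
    """Flag transactions whose response time regressed by more than
    `tolerance` (10% by default) versus the previous build's baseline."""
    return [name for name, prev in previous_ms.items()
            if name in current_ms and current_ms[name] > prev * (1 + tolerance)]

# Mean response times in milliseconds, previous build vs. current build:
baseline = {"login": 250, "search": 900, "checkout": 1200}
current  = {"login": 260, "search": 1100, "checkout": 1190}
regressions = degraded(baseline, current)   # only "search" crossed the line
```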
Load Testing
• Tests system at different levels of load
– Load is measured in number of users running at some rate
– Can be done as end-to-end or component testing
• Load is increased over time (ramps)
– Evenly: 10 users every 15 minutes
– Automatically: tool adjusts virtual users to maintain specific goal
– Manually: Engineer adjusts users in response to server metrics
• Well behaved system
– Response times stay flat or increase linearly with load
– Throughput increases with load
– Few, if any, failed transactions occur
– Best results: increasing load with no increase in response time
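The even ramp described above ("10 users every 15 minutes") can be expressed as a small schedule generator; this is an illustrative sketch, not a feature of any specific tool:

```python
def ramp_schedule(step_users, step_minutes, total_minutes):
    """Even ramp: add `step_users` every `step_minutes`, returning
    (minute, active_users) pairs for the life of the test."""
    return [(m, step_users * (m // step_minutes + 1))
            for m in range(0, total_minutes, step_minutes)]

# 10 users every 15 minutes over a one-hour test:
schedule = ramp_schedule(step_users=10, step_minutes=15, total_minutes=60)
```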
Stress/Scale/Capacity Testing
• Determines maximum load system or server can handle
– Testing hardware instead of software
– Looking for failure point instead of defect
– Important to use realistic production targets and workflow mixes
• Find configuration necessary to support a particular load
– Ensure implementation supports expected production loads
– Prevents overspending on hardware
– Helps with budgeting for new hardware and upgrades
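Finding the failure point can be framed as a search over load levels. In this hedged sketch, `passes_at` stands in for running a real load test at a given user count and checking it against targets; the simulated 350-user capacity is invented for the demo:

```python
def find_capacity(passes_at, low, high):
    """Binary-search the highest user count in [low, high] at which the
    system still meets its targets. `passes_at(n)` runs (or simulates) a
    load test at n users and returns True if targets were met."""
    best = low
    while low <= high:
        mid = (low + high) // 2
        if passes_at(mid):
            best, low = mid, mid + 1   # passed: try a heavier load
        else:
            high = mid - 1             # failed: back off
    return best

# Simulated system that degrades past 350 users (assumption for the demo):
capacity = find_capacity(lambda n: n <= 350, low=50, high=1000)
```

A real capacity run would replace the lambda with an actual test execution at each load level, which is why realistic workflow mixes matter so much.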
Duration/Longevity Testing
• Also called Soak or Volume Testing
• Observes system response over time
– Usually 8, 12, or 24 hours (shift or workday)
– Load is constant
– Run with less redundancy
– Focus on servers
• Looking for anomalous behavior
– Memory leaks
– Increasing response times
– Decreasing throughput
– CPU spikes
• Don’t run during system maintenance or batch jobs
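One way to spot the "memory leak / increasing response times" pattern in a soak run is to fit a slope to periodic samples; a persistently positive slope over hours of constant load is the anomaly. A minimal least-squares sketch, with invented sample values:

```python
def leak_slope(samples):
    """Least-squares slope of periodic samples (units per sample interval),
    e.g. process memory in MB sampled every 30 minutes of a soak run."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Invented memory samples (MB): flat vs. steadily climbing
steady  = [512, 514, 511, 513, 512, 514]
leaking = [512, 530, 548, 566, 584, 602]
```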
Point Load Testing
• Massive Load in a Short Duration
– Concert tickets going on sale
– Fire Sale
– Employee logon at the start of a shift
– Product mentioned in the news
• Fashion of Duchess of Cambridge brings down designer’s web site(s)
• Testing Adds Load Very Quickly
– Hundreds of Users in a few minutes or less
– Use “Rendezvous Point” in some tools
– Look for:
• Back up in queues
• Resource Contention
• Time outs
– All transactions should process completely (albeit slowly)
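A rendezvous point, as named in some tools, can be approximated with a barrier: every virtual user blocks until all have arrived, then all fire at once. An illustrative Python sketch using a stubbed transaction:

```python
import threading

def point_load(n_users, hit):
    """Rendezvous point: every virtual user waits at the barrier, then all
    fire simultaneously to create a near-instantaneous load spike."""
    barrier = threading.Barrier(n_users)
    results = []

    def user(uid):
        barrier.wait()             # all users released together
        results.append(hit(uid))   # stand-in for the real transaction

    threads = [threading.Thread(target=user, args=(u,)) for u in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Every transaction should complete, even under the spike:
done = point_load(20, hit=lambda uid: uid)
```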
CDN Testing
• CDN = Content Delivery Network (e.g. Akamai)
• Testing configuration of CDN and servers
• Do not assume CDN is all set
– Many times web servers are misconfigured
– Developers mistakenly write code to overwrite CDN options
– Static content sourced from origin
• CDN tests make sure static content comes from
local sources
– Must work with CDN provider to avoid excessive charges
• Configure non-CDN tests to ignore static
content to simulate use of CDN
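Verifying that static content comes from the edge often reduces to inspecting cache headers on the responses. Header names and values vary by provider, so the ones below ("X-Cache", "CF-Cache-Status") are examples to adapt to what your CDN actually emits, not a universal API:

```python
def served_from_cdn(headers):
    """Heuristic check that a static asset came from the CDN edge rather
    than the origin, based on common cache headers. Provider-specific:
    confirm the real header names with your CDN before relying on this."""
    value = (headers.get("X-Cache") or headers.get("CF-Cache-Status") or "").upper()
    return "HIT" in value

# Invented example responses: an edge cache hit vs. a fetch from origin
edge   = {"X-Cache": "TCP_HIT from edge-node"}
origin = {"X-Cache": "TCP_MISS from origin"}
```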
Failover/Disaster Recovery Testing
• Tests system response to loss of service
– Individual server goes down
– Loss of network connectivity
– Disaster at data center
• Incident should not be observable to end user
– Load balancing shifts to working servers
– Ability of remaining servers to handle work
– Single point of failure will cause complete system failure
• Availability versus cost decision
• Users never know different data center is used
Tuning Testing
• Tests configuration changes to system
– Network, Application, Web, or File Server Settings
• Timeout
• Load balancing
– Operating System
• Paging or Garbage Collection
– Hardware
• Amount of RAM
• Number of CPUs
• Number of servers
• No code changes
• Done after bugs are resolved and code is optimized
• Optimizes performance for a given code base
Cloud Elasticity
• Cloud service must include self-service to be considered
a cloud per IEEE (e.g. AWS, Azure, Rackspace)
• Opening a help desk ticket to create a virtual machine is
NOT a cloud environment
• System detects increase in load
– Adds servers automatically
– Shuts down servers when load drops below threshold
– Tests make sure system responds appropriately
• Must have alerts when too many servers are added
– Cloud is not free
• Pay for every cycle and every bit in and out of cloud
– “Dirty” applications are bad for elasticity
• Pay for unneeded resources
• Cloud costs can escalate and you wind up with a big bill
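The scale-up/scale-down/alert behavior described above can be sketched as a toy decision rule; the 70% utilization target, cap of 20 servers, and alert text are illustrative assumptions, not values from the talk:

```python
def scale_decision(utilization, servers, target=0.7, max_servers=20):
    """Toy elasticity rule: add a server when average utilization exceeds
    the target, remove one when it falls below half the target, and raise
    an alert instead of growing past the cap -- the cloud is not free."""
    if utilization > target:
        if servers >= max_servers:
            return servers, "ALERT: server cap reached"
        return servers + 1, None
    if utilization < target / 2 and servers > 1:
        return servers - 1, None
    return servers, None
```

An elasticity test drives load up and down and checks that the system's real scaling actions match a policy like this one, and that the alert actually fires at the cap.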
Mobile Device Testing
• Simulated or Real Devices
– Emulators vs. Perfecto Mobile
• Mobile devices are like old computers
– Limited CPU
– Limited Memory
– Limited Network
• New Concerns
– Battery Use
• CPU Consumption
– Spotty Network
– Transfer from Wi-Fi to Cellular
– Stream or Store
– Open Connection Duration
• Ads eat into data limits
Mixed GUI and Server Testing
• Run simulated users along with real browsers
– JMeter and Selenium
– LoadRunner and UFT
• Get server metrics and browser metrics
– Obtain full page rendering times
– Some issues only found locally
• Example of page rendering issue solved with paging
• All Browser Testing
– Run real browsers in cloud
– No simulated users
– Flood.io
Performance Test Assets
• Test Plan
– States performance targets or expectations
• “Response times do not exceed 5 seconds more than 90% of the
time using the performance environment at a rate of 5 TPS”
– Details areas to be tested and how they will be tested
• Tests scripts and scenarios
• Transaction rates
• Test Environment
– Complete production-like hardware and data
• Test Scripts
– Simulate real-life work executed by real users at real rates
– One script represents one or multiple workflows
• Test Scenarios
– Consists of several scripts to simulate a multi-user system
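The quoted target reads as a percentile check: the 90th-percentile response time must stay at or under 5 seconds. A minimal nearest-rank implementation, with invented sample data:

```python
def meets_sla(response_times_s, limit_s=5.0, percentile=0.90):
    """Check a target of the form 'response times do not exceed limit_s
    more than (1 - percentile) of the time', i.e. the 90th-percentile
    response time stays at or under 5 seconds by default."""
    ordered = sorted(response_times_s)
    # nearest-rank percentile index
    idx = max(0, int(percentile * len(ordered) + 0.5) - 1)
    return ordered[idx] <= limit_s

# Invented runs: 95% of responses under the limit passes; only 80% fails
fast = [1.0] * 95 + [6.0] * 5
slow = [1.0] * 80 + [6.0] * 20
```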
Performance Test Reports
• Application Measurements
– Response times
– Failure rates
– Throughput/Hits
– HTTP codes
• LoadRunner-specific Server Metrics (integrated with LoadRunner
report)
– Any metric available from PerfMon (CPU, SQL Server, ASP, .Net)
– Any metric available from rstat daemon for Unix-based systems
– Limited metrics for other technologies
• Oracle, DB2, WebLogic, WebSphere, etc.
• Expandable through HP Diagnostics or SiteScope
• Tool Independent Results (not integrated with LoadRunner report)
– Database traces
– Server Logs
– Event logs
– Systems reports
• Performance Testing without Analytics is Useless
Performance Test Process
• Performance Testing is run like a project
inside of a project
– Requirements Stage
– Design Stage
– Development Stage
– Testing Stage
– Deployment Stage (execution of tests)
– Monitoring Stage (analysis of test results)
• Can be automated and run against builds
• Performance Test Checklists
Performance Testing Steps
• Install Build
• Verify Build
• Verify Test Scripts and Data
• Execute Test Suite
• Write Reports
• Confer with DBA, Systems, and Project Team
• Assign Defects
• Re-test Fixes
Tool Selection
• You must consider
– Technology to Support
– Skill Level of Performance Testers
– Test Asset Maintainability
– Tool Monitoring Capabilities
– Tool Reporting Capabilities
– Budget for Tools, People, and Environment (this is NOT the
driver)
• Types
– Pure Commercial (LoadRunner, SOASTA, Rational)
– Pure Open Source (JMeter, Gatling)
– Commercial built on open source (OctoPerf, BlazeMeter)
• Free isn’t Free
– You pay with money or time
Wisdom and Insight
• Performance is NOT a bolt-on
– It must be built into the application
– Design choices MUST consider performance
• Performance testing only identifies issues
– It does not identify problem code (most of the time)
– It does not fix issues
• One hour of manual testing takes 10 hours to automate
• Small changes to the application can break the entire
suite of test scripts
• Performance test scripts are data sensitive
• Full testing cannot begin until a full environment is ready
– Includes code, hardware, and data
Recommended Resources
• Excellent Audio Blog for Testers, PMs, Business, QA
– http://www.perfbytes.com
• BlazeMeter Blog
– http://www.blazemeter.com/blog
• Computer Measurement Group (Capacity Planning)
– http://www.cmg.org
• Alexander Podelko (Links to Performance Information)
– http://www.alexanderpodelko.com/PerfDesign.html
• Greg Tutunjian (Scrum and Agile)
– http://www.scrumdoc.com
Questions?
