Developing Software at Scale: Lessons from 20+ Years at Microsoft
Todd Warren – CS 394, Spring 2011
Today
- Team structure at Microsoft
- Product complexity and scheduling
- Quality and software testing
- Knowing when to ship
Programs vs. Software Products
- Per Brooks: turning a program into a programming product costs roughly 3x; turning it into a component of a programming system also costs roughly 3x; a full programming systems product costs roughly 9x the original program.
Source: Fred Brooks Jr., The Mythical Man-Month, "The Tar Pit"
Software Products vs. Custom Software Development
Source: Hoch, Roeding, Purkert, Lindner, Secrets of Software Success, 1999
Roles
McConnell's roles:
- Product Manager
- User-Interface Designer
- End-User Liaison
- Project Manager
- Architect
- Developers
- Toolsmith
- QA/Testers
- Build Coordinator
- Risk Officer
- End-User Documentation
Microsoft's disciplines:
- Program Management
- Software Development Engineers
- Test and Quality Assurance
- User Assistance / Education
Source: McConnell
Resources: Size Matters!
- Different methodologies and approaches; the scope of features and quality matters
- Team size affects the level of process needed, and its overhead:
  - 5-person teams: moderate process, shared roles
  - 24-person teams (PMC): moderate process, lifecycle-oriented roles and specialization; good for "Extreme"-style process
  - 60-100 people (MS Project): moderate process, some loose functional specialization and lifecycle roles
  - 100-200-person teams (Windows CE): medium to heavy process, lifecycle roles and functional specialization
  - 1000+-person teams (Windows Mobile): heavy process, multiple methodologies, formal integration process
- Higher quality == more rigorous process
- True also for open-source and online projects; Apache is the best example of a very specified culture of contribution
Organization and Its Effect on Products
[Chart: organizations plotted along a spectrum from casual to more formal to very formal]
Project Interdependency Matters: "Star" or "Mesh"
[Diagram: two dependency topologies built from "core" and "edge" components, a star with edges radiating from a single core and a mesh of interconnected cores and edges, illustrating Office and Windows]
A 'Typical' Product Group
- 25% Developers
- 45% Testers
- 10% Program Management
- 10% User Education / Localization
- 7% Marketing
- 3% Overhead
Small Product: Portable Media Center
- 1 UI designer
- 5 program managers
- 8 developers
- 10 testers
Microsoft Project
- 30 Developers (27%)
- 36 Testers (33%)
- 15 Program Mgrs (14%)
- 20 UA/Localization (18%)
- 6 Marketing (5%)
- 3 Overhead (3%)
Exchange Numbers (circa 2000)
- 112 Developers (25.9%)
- 247 Testers (57.3%)
- 44 Program Mgrs. (10.2%)
- 12 Marketing (2.7%)
- 16 Overhead (3.7%)
Windows CE (circa 2002)
Windows Mobile (circa 2009)
Amount of Time
- 3 months is a good rule-of-thumb maximum for a stage/milestone; it is hard for people to focus on anything longer than 3 months.
- Never let things go unbuilt for longer than a week.
Sample Staged Timeline (Project 2000)
How Long?
- 216 days of development (truthfully, probably more like 260)
- 284 days of "testing" in the example:
  - Component tests: 188 days
  - System-wide tests: ~97 days
- Roughly a 50/50 split between design/implement and test/fix (checked below)
- Some projects (e.g., operating systems, servers) have a longer integration period, more like 2:1 test to development
- Factors: how distributed the team is, the number of "moving parts"
- Shows why some of the Extreme Programming methodology is appealing
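A quick arithmetic check of the 50/50 claim, using the adjusted development figure above:

$$ \frac{260}{260+284} \approx 0.48, \qquad \frac{284}{260+284} \approx 0.52 $$

so design/implement vs. test/fix comes out to roughly 48/52, close enough to call 50/50.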
Fred Brooks' OS/360 Rules of Thumb
- 1/3 planning
- 1/6 coding
- 1/4 component test and early system test
- 1/4 system test, all components in hand
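As a worked example (mine, not Brooks's), the fractions applied to a hypothetical 12-month project:

$$ 12 \cdot \tfrac{1}{3} = 4 \text{ mo. planning}, \quad 12 \cdot \tfrac{1}{6} = 2 \text{ mo. coding}, \quad 12 \cdot \tfrac{1}{4} = 3 \text{ mo. component test}, \quad 12 \cdot \tfrac{1}{4} = 3 \text{ mo. system test} $$

The fractions sum to one ($\tfrac{1}{3}+\tfrac{1}{6}+\tfrac{1}{4}+\tfrac{1}{4}=1$), and only a sixth of the schedule is actual coding.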
Office 2000 Schedule
A few projects compared to Brooks
Quality and Testing
- Design in scenarios up front: what is necessary for the component?
  - A UI is different than an API; a server is different than a client
- Set criteria and usage scenarios
- Understand (and control, if possible) the environment in which the software is developed and used

"The last bug is found when the last customer dies." - Brian Valentine, SVP eCommerce, Amazon
Example of Complexity: Topology Coverage
- Exchange versions: 4.0 (latest SP), 5.0 (latest SP), and 5.5
- Windows NT versions: 3.51, 4.0 (latest SPs)
- Languages (Exchange/Windows NT): USA/USA, JPN/Chinese, JPN/Taiwan, JPN/Korean, JPN/JPN, GER/GER, FRN/FRN
- Platforms: Intel, Alpha (MIPS, PPC: 4.0 only)
- Connectors, X.400: over TCP, TP4, TP0/X.25
- Connectors, IMS: over LAN, RAS, ISDN
- Connectors, RAS: over NetBEUI, IPX, TCP
- Connector interop: MS Mail, Mac Mail, cc:Mail, Notes
- News: NNTP in/out
- Admin: daily operations
- Store: public store >16 GB and private store >16 GB
- Replication: 29 sites, 130 servers, 200,000 users, 10 AB views
- Client protocols: MAPI, LDAP, POP3, IMAP4, NNTP, HTTP
- Telecommunication: Slow Link Simulator, noise simulation
- Fault tolerance: Windows NT clustering
- Security: Exchange KMS server, MS Certificate Server
- Proxy firewall: server-to-server and client-to-server
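Even the first four dimensions show why exhaustive coverage is hopeless. A rough count, assuming (for illustration) that each choice is independent:

$$ \underbrace{3}_{\text{Exchange}} \times \underbrace{2}_{\text{NT}} \times \underbrace{7}_{\text{languages}} \times \underbrace{4}_{\text{platforms}} = 168 $$

That is 168 base topologies before connectors, client protocols, store sizes, replication shapes, and security options multiply the matrix further.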
Complexity 2: Windows CE
- 5M lines of code
- 4 processor architectures: ARM/XScale, MIPS, x86, SH
- 20 board support packages
- Over 1000 possible operating system components
- 1000s of peripherals
Complexity 3: Windows Mobile 6.x
- 2 code instances ("Standard" and "Pro")
- 4 ARM chip variants
- 3 memory configuration variations
- 8 screen sizes (QVGA, VGA, WVGA, square, ...)
- 60 major interacting software components
- 3 network technologies (CDMA, GSM, WiFi)
- Some distinct features for 7 major vendors
- 100 dependent 3rd-party apps for a complete "phone"
Bugs over the lifecycle
Bugs over the lifecycle
Flow of Tests During the Cycle
Feature is specified → test design is written → test release document → feature implemented → unit tests implemented → component testing → specialized testing → system test → bug-fix regression tests
Ways of Testing
Types of tests:
- Black box
- White box
- "Gray" box
By stage of cycle:
- Unit test / verification test
- Component acceptance test
- System test
- Performance test
- Stress test
- External testing (alpha/beta/"dogfood")
- Regression testing
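Black box vs. white box in miniature. A hypothetical saturating-add function (not from the deck), with one test written only against its spec and one written with the source in hand, deliberately targeting the overflow branch:

```c
#include <assert.h>
#include <limits.h>
#include <stdio.h>

/* Hypothetical function under test: adds two ints, clamping at INT_MAX. */
static int add_saturating(int a, int b)
{
    if (a > 0 && b > INT_MAX - a)   /* would overflow */
        return INT_MAX;
    return a + b;
}

/* Black-box test: checks the spec ("clamps at INT_MAX") without
 * knowing how the function is implemented. */
static void test_black_box(void)
{
    assert(add_saturating(2, 3) == 5);
    assert(add_saturating(INT_MAX, 1) == INT_MAX);
}

/* White-box test: written against the internal branch structure,
 * probing the boundary on each side of the overflow check. */
static void test_white_box(void)
{
    assert(add_saturating(INT_MAX - 1, 1) == INT_MAX);  /* branch not taken */
    assert(add_saturating(INT_MAX - 1, 2) == INT_MAX);  /* branch taken */
}

int main(void)
{
    test_black_box();
    test_white_box();
    puts("all tests passed");
    return 0;
}
```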
Four Rules of Testing
- Guard the process
- Catch bugs early
- Test with the customer in mind
- Make it measurable
[Timeline: M0 → M1 → M2 → RTM, ending at the ship requirement]
ProOnGo LLC – May 2009
Inside the Mind of a Tester
M0 (specs & test plans):
- What are we building, and why?
- What metrics and criteria summarize customer demands?
- Can we reliably measure these metrics and criteria?
M1 .. Mn (development & test):
- How close are we to satisfying the agreed-upon metrics/criteria?
- What do our bug trends say about our progress?
- Based on current trends, when will we pass all criteria?
RTM milestone (confirm, ship):
- Are the criteria passing stably, every time we test?
- Do we pass all criteria? If not: what, why, how?
- How risky is this last-minute code check-in?
M0: Specs & Test Plans
- What are we building, and why? Any problems with this?

```c
// An API that draws a line from x to y
VOID LineTo(INT x, INT y);
```

- What metrics and criteria summarize customer demands?
- Can we reliably measure these metrics and criteria?
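The one-line comment leaves almost everything unstated: the signature takes a single point, so the line must start at some implicit current position (as the real Win32 LineTo does); whether the endpoint pixel is drawn, how failure is reported (the return type is VOID), and whether the current position moves afterward are all open questions. A minimal sketch of how a tester might pin these down against a mock; the mock's behavior is one assumed reading of the spec, not the actual API:

```c
#include <assert.h>
#include <stdio.h>

/* Mock device context: records drawn segments instead of rasterizing. */
typedef struct { int x, y; } Point;
typedef struct {
    Point cur;              /* implicit "current position" */
    Point seg_from, seg_to; /* last segment "drawn" */
    int   segs;
} MockDC;

static void MoveTo(MockDC *dc, int x, int y)
{
    dc->cur.x = x;
    dc->cur.y = y;
}

/* Assumed semantics: draws from the current position to (x, y),
 * then moves the current position there. */
static void LineTo(MockDC *dc, int x, int y)
{
    dc->seg_from = dc->cur;
    dc->seg_to.x = x;
    dc->seg_to.y = y;
    dc->segs++;
    dc->cur = dc->seg_to;
}

int main(void)
{
    MockDC dc = {0};

    /* Q1: where does the line start? The comment says "from x to y",
     * but x and y name one point; the start must be implicit. */
    MoveTo(&dc, 10, 10);
    LineTo(&dc, 20, 20);
    assert(dc.seg_from.x == 10 && dc.seg_from.y == 10);

    /* Q2: does LineTo update the current position? (Assumed: yes.) */
    LineTo(&dc, 30, 10);
    assert(dc.seg_from.x == 20 && dc.seg_from.y == 20);

    /* Q3 (unanswerable from the comment): is the endpoint pixel drawn?
     * Q4: what is returned on failure? VOID gives the caller nothing. */
    printf("segments drawn: %d\n", dc.segs);
    return 0;
}
```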
Balance: Fast vs. Thorough
- Canary / check-in tests: most frequent, shallowest coverage
- Build verification tests
- Automated test pass
- Manual test pass: least frequent, completes coverage
Canary & Check-In Tests
Fast tests that can automatically run at check-in time:
- Static code analysis (like lint)
- Trial build, before the check-in is committed to SCM
- Form-field tests (a sketch of one such gate follows):
  - Does the check-in cite a bug number?
  - Is the code-reviewer field filled out?
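A minimal sketch of the "cites a bug number" gate; the message format ("bug #NNNN") is an assumption for illustration:

```c
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Returns 1 if the check-in message contains "bug #" followed by at
 * least one digit, e.g. "fix crash on resume (bug #1234)". */
static int cites_bug_number(const char *msg)
{
    const char *p = strstr(msg, "bug #");
    return p != NULL && isdigit((unsigned char)p[5]);
}

int main(void)
{
    printf("%d\n", cites_bug_number("fix crash on resume (bug #1234)")); /* 1 */
    printf("%d\n", cites_bug_number("misc cleanup"));                    /* 0 */
    return 0;
}
```

A stricter gate would also validate the cited number against the bug database before accepting the check-in.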
Build Verification Test
Goal: find bugs so heinous that they could...
- Block the ability to dogfood
- Derail a substantial portion of the test pass (5%?)
Unwritten contract:
- You break the build, you fix it within an hour, day or night
- A broken build holds up the productivity of the entire team
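A minimal sketch of a BVT harness: a table of named smoke tests run in order, with a nonzero exit code so the build lab can gate on the result. The tests here are placeholders; real BVTs would boot the product, open a document, send a message, and so on:

```c
#include <stdio.h>

typedef int (*bvt_fn)(void);   /* returns 0 on pass */

/* Placeholder smoke tests. */
static int test_boot(void)     { return 0; }
static int test_open_doc(void) { return 0; }

static const struct { const char *name; bvt_fn fn; } bvts[] = {
    { "boot",     test_boot },
    { "open_doc", test_open_doc },
};

int main(void)
{
    int failures = 0;
    for (size_t i = 0; i < sizeof bvts / sizeof bvts[0]; i++) {
        int rc = bvts[i].fn();
        printf("%-10s %s\n", bvts[i].name, rc == 0 ? "PASS" : "FAIL");
        failures += (rc != 0);
    }
    return failures;  /* nonzero exit blocks the build from release */
}
```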
Automated Test Pass
Example on a Microsoft product:
- Number of test cases: 6 digits
- Number of test runs: 7 digits
- 14 different target device flavors
- Runs 24/7, results available via web
- Automatic handling of device resets / failsafe
Requires creativity:
- How would you automate an image editor? A 3D graphics engine?
Manual Test Pass
- Cost of automating vs. cost of running manually: a rational, quantitative way to decide whether to automate (see the break-even check below)
- Reality: few organizations maximize the benefits of automation; therefore, manual testing lives on
- Tough "automated vs. manual" decisions:
  - Testing for audio glitches (does the audio crackle?)
  - Does the UI feel responsive enough?
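The decision reduces to a break-even run count (notation mine, not from the deck): automate when the one-time cost of building the automation, plus its per-run cost, is less than the manual cost over the expected number of runs,

$$ C_{\text{automate}} + n \cdot c_{\text{auto}} \;<\; n \cdot c_{\text{manual}} \quad\Longleftrightarrow\quad n \;>\; \frac{C_{\text{automate}}}{c_{\text{manual}} - c_{\text{auto}}} $$

For example, a test that costs 20 hours to automate and saves half an hour per run pays for itself after 40 runs.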
Tracking Bugs
A bug record captures:
- Who found it, and when
- What it is, and its severity
- How to reproduce it
- What part of the product
- Create a …
- Where it was fixed, and by whom
- State: Open, Resolved, Closed
- Disposition: Fixed, Not Fixed, Postponed, "By Design"
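These fields map directly onto a record type. A minimal sketch; the field names and sizes are mine, not a real Microsoft bug-database schema:

```c
#include <stdio.h>
#include <time.h>

typedef enum { BUG_OPEN, BUG_RESOLVED, BUG_CLOSED } BugState;
typedef enum {
    DISP_FIXED, DISP_NOT_FIXED, DISP_POSTPONED, DISP_BY_DESIGN
} BugDisposition;

typedef struct {
    int            id;
    char           found_by[64];      /* who found it */
    time_t         found_at;          /* when */
    char           title[128];        /* what it is */
    int            severity;          /* e.g. 1 (crash) .. 4 (cosmetic) */
    char           repro_steps[1024]; /* how to reproduce */
    char           component[64];     /* what part of the product */
    char           fixed_in[64];      /* where fixed (build/changelist) */
    char           fixed_by[64];      /* and by whom */
    BugState       state;
    BugDisposition disposition;
} BugRecord;

int main(void)
{
    BugRecord b = { .id = 4711, .severity = 1, .state = BUG_OPEN,
                    .disposition = DISP_NOT_FIXED };
    snprintf(b.title, sizeof b.title, "crash on resume");
    printf("bug %d: %s (sev %d)\n", b.id, b.title, b.severity);
    return 0;
}
```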
Release Criteria
What must be true for a release to be done or complete. Includes a mix of criteria:
- All features implemented and reviewed
- Documentation complete
- All bugs closed (not necessarily fixed)
- Performance criteria met
[Video]
Bug "Triage"
- Late in the cycle, a process for determining what to fix
- Get people together and prioritize by impact on release criteria and overall stability goals
- Even known crashing bugs are postponed, depending on the criteria
Summary
- With software products, know what to build for the customer
- Have checkpoints for progress (milestones)
- Many types of testing and structure: use the right tool for the job
- Determine and measure ship criteria

Editor's Notes

• #30: Stories
  - Q1: David & testing on emulator
  - Q2: "We're at 89%, how is that different than 90%?"
  - Q3: QFEs
  - Q4: Times that I held the line, and times that I didn't