Software Security Excellence
How to Get the
Most out of Security Tools
Jason Taylor, CTO
About Security Innovation
• Authority in Software Security
• 15+ years research on software vulnerabilities
• Security testing methodology adopted by SAP,
Symantec, Microsoft and McAfee
• Authors of 18 books
• Helping organizations minimize risk
• Assessment: Show me the gaps
• Education: Guide me to the right decisions
• Standards: Set goals and make it easy and natural
• Tech-enabled services for both breadth and depth
• Gartner MQ Leader for security computer-based training
Agenda
Attacker Perspective
• When/where to use tools vs. manual testing methods
• Focusing on hot spots and "seeing" clues that tools cannot
• Putting it together to reduce risk
The World Has Changed
• Attackers are more and more sophisticated
• Increasing number of skilled attackers
• Attacks are targeted
• Consequences are higher
• Attackers are:
• Criminal organizations
• Government sponsored (or employed)
• Activist/personal motivation
The Costs Keep Piling Up
• Monetary
• Average cost of a data breach in the US is $6.5 million*
• Dealing with phishing attacks costs the average 10,000-employee company $3.77
million a year*
• Phishing is the 2nd most common attack vector**
• Average cost per record $217, and $259 in financial sector*
• Reputational
• 69% of consumers would be less inclined to do business
with a breached organization**
• The FTC is involved...
• Court says that FTC can hit companies with fines for failing to protect consumer
information
*Ponemon 2015 ** Verizon 2015
Not Much of a Deterrent
• Approximately 5% of cyber criminals are caught*
• Although a 5% prosecution rate is troubling, what’s worse is that only 2%
of all intrusions ever reach the attention of law enforcement
• Not surprisingly, a 5% conviction rate does not lend itself to meaningful
deterrence
*McAfee ** Georgetown Law Journal
Attackers & Motivations
Classic Actors The New School
Individual Hackers
• Exploration
• Notoriety
State Sponsored
• Espionage
• Warfare
Criminals
• Money
Political/Hacktivists
• Disruption/Embarrassment
• Intelligence
How Did We Get Here?
• Software has grown up in a trusting, insecure world
• Systems have historically been built to share data and facilitate collaboration
• In the early days, trust was (safely) assumed
• Software developers got used to this trust
• Universities didn’t bother teaching security
• As an industry we…
• …know too little
• …trust too much
• …have too much faith in boundary defenses
Opposing Goals
• Security vs. Features
• More features == more bugs
• Security vs. Reliability
• Writing error handling code means more chances of
introducing security bugs
• When error handlers fail, the app fails…who then
is in charge of security?
• Security vs. Performance
• Security code slows apps down
• Security vs. Usability
• Running every process as admin/root sure is convenient for the user
• Security options are confusing and awkward
Solutions
• Integrate security into the entire development process:
• Beginning at Requirements
• Continuing in Design
• Implemented in Coding
• Verified in Testing
• Response Processes in Place
• Knowledge and training is the first step
• Tools and automation are a lever on your security expertise
Agenda
• Attacker Perspective
When/where to use tools vs. manual testing methods
• Focusing on hot spots and "seeing" clues that tools cannot
• Putting it together to reduce risk
Generally Accepted Realities
• Given the absence of time constraints, manual testing will discover most
defects; automated testing will find a varying percentage
• Automation will find known and common vulnerabilities faster than humans
• It is difficult for automated tools to uncover compound vulnerabilities and
defects related to the business functionality
• Automation produces false positives; humans rarely do
Partners in Crime: Humans and Robots
• Tools, like humans, excel at certain tasks
• Not all applications need a deep pen test, and many need to go
“beyond the scan”
• Both play a critical role because they find different types of
vulnerabilities
• The challenge is finding the optimal balance to achieve breadth and
depth of coverage in the most efficient manner
Power of Automation
• Finding low hanging fruit
• Lowering skills barrier to testing
• Identifying weak spots for an experienced tester to focus on conducting deeper,
more specialized attacks
• Recognizing a pattern and applying similar attacks
• E.g. if a scanner recognizes a SQL database error message it may try variants of SQL
injection attacks
• Comparing strings to cover version checks, error messages, configuration issues,
TLS cipher suites and options, HTTP server verbs, etc.
• Cross-Site Request Forgery (CSRF) checks that require sending similar requests with a
few headers changed based on authorization rules
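The pattern-recognition behavior described above can be sketched in a few lines. The error signatures and injection variants below are illustrative placeholders, not a real scanner's rule set:

```python
import re

# Hypothetical sketch of what a scanner does: if a response contains a known
# SQL error signature, queue follow-up injection variants for that parameter.
SQL_ERROR_SIGNATURES = [
    re.compile(r"you have an error in your sql syntax", re.I),                # MySQL
    re.compile(r"unclosed quotation mark after the character string", re.I),  # SQL Server
    re.compile(r"pg::syntaxerror", re.I),                                     # PostgreSQL
]

INJECTION_VARIANTS = ["'", "' OR '1'='1", "'; --", "\" OR \"1\"=\"1"]

def looks_like_sql_error(response_body: str) -> bool:
    """Return True if the body matches any known SQL error signature."""
    return any(sig.search(response_body) for sig in SQL_ERROR_SIGNATURES)

def followup_payloads(base_value: str) -> list[str]:
    """Generate SQL injection variants to retry against the same parameter."""
    return [base_value + variant for variant in INJECTION_VARIANTS]
```

The same shape of loop covers the string-comparison checks mentioned above (version banners, TLS options, HTTP verbs): match a signature, then replay a templated follow-up across the attack surface.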
Challenges with SAST/DAST Tools
• Missed coverage due to:
• Constraints such as authentication for role-sensitive applications
• Use of technology that eludes “spidering” or “crawling” in web applications
• The result is often an incompletely enumerated attack surface
• Logic is difficult to program into a linear automated tool
• Whereas a human can understand business use cases and alter tests
• False positives can be time consuming to deal with
• Findings often lack business/risk context
• Complacency – people tend to trust coverage and reports
• Often limited to Web applications
The Human Value-Add
• Imaginative
• In the absence of code, we need to make assumptions as to how the technology is
commonly used, patterns people follow, what is running the back end
• Observant
• Ability to look for clues or things that are out of place or have changed (e.g., URLs,
content types)
• Logical
• Can evaluate error messages from tools, encodings, page layouts, load times, etc.
• Accurate
• Virtually no false positives and contextual vulnerability risk rating
• Specialized
• Not limited to Web applications
• Can identify root cause and design flaws
In-Practice: Value of Intuition
• During automated testing of a CRM application, the results did not include a
Session Management Weakness
• A bit surprising, as scanners are usually good at this
• Manual testing discovered a Direct Object Access (DOA) vulnerability
• When this previously missing parameter was added, the user was given access
to resources previously unavailable
• Further manual testing discovered that any tampering with those parameters
resulted in server sessions being terminated
Tools can be programmed to perform complicated checks, but not
intuition checks, which often result in the discovery of critical
vulnerabilities
In Practice: Chaining Vulnerabilities
• An example from our testing:
• Automation is good at finding simple file upload
vulnerabilities as well as cross site scripting problems
• In this case the automation found nothing since there
was code to block malicious upload
• Manual testing discovered a problem in this security code
• Clever testing allowed a malicious upload and uncovered an XSS problem
• Required a very specific malicious file with PNG file headers, an .html file extension,
and a malicious JScript payload embedded within the image
• This file was used to run script code on the server
Automation is not capable of this type of testing, but your attackers are!
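For illustration, a polyglot test file like the one described could be built as follows. The exact payload from the engagement is not given in the deck; the filename, payload, and construction here are hypothetical:

```python
# Illustrative sketch of the polyglot test file described above: valid PNG
# magic bytes up front (to pass naive content sniffing), saved with an .html
# extension, and a script payload embedded in the body.
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"

def build_polyglot(script_payload: str) -> bytes:
    """Prepend PNG magic bytes to an HTML body carrying a script payload."""
    html_body = f"<html><body><script>{script_payload}</script></body></html>"
    return PNG_MAGIC + html_body.encode("utf-8")

# Saved as e.g. probe.html, the file starts with PNG magic bytes but is
# served and rendered as HTML, so the embedded script executes.
blob = build_polyglot("alert('xss-test')")
```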
The Challenges with Manual Testing
• More expensive and time consuming
• Requires highly skilled testers
• Tests are not reusable, though patterns are
• Inconsistency from tester to tester
Agenda
• Attacker Perspective
• When/where to use tools vs. manual testing methods
Focusing on hot spots and "seeing" clues that tools cannot
• Putting it together to reduce risk
What Are Hot Spots?
• Areas that provide disproportionate return for your effort
• Security defects tend to cluster in areas
• Some areas of the code have more security impact than others
• This combination is a Hot Spot
• Focus your efforts
• If you know what you’re looking for, you will find more vulnerabilities
• Un-guided testing/review is a waste of time
• Use Hot Spots as a heat map to guide security design, code and deployment
inspections
Hot Spots are the best places to apply human effort
Reflecting on Security Hot Spots
• You can use hot spots and common vulnerabilities to share:
• Principles, patterns, and practices
• Knowledge around threats, attacks, vulnerabilities, and
countermeasures
• To keep them relevant and effective, consider the following questions
• How can you improve security results in your organization?
• How can you organize your bodies of knowledge?
• How can you improve sharing patterns, anti-patterns, and checklists?
• How can you tune and prune your security inspections?
Agenda
• Attacker Perspective
• When/where to use tools vs. manual testing methods
• Focusing on hot spots and "seeing" clues that tools cannot
Putting it together to reduce risk
AppSec Program Goals
• Effective Vulnerability Management
• Regular, iterative testing ensures continually improving test results and will catch
vulnerabilities more quickly
• Measure risk of each application through vulnerability discovery & remediation metrics
• Discover trends and weaknesses that can be used to improve the overall AppSec and
secure coding program through standards and training
• Optimized Frequency and Depth of Testing
• Match level of testing and analysis to application criticality
• Ensure high risk applications get more attention and low-risk ones are not over tested
• Cost Management
• Predictable cost
• Investment matched to level of risk
Best Practices for Assessments
• Use automated tools for heavy lifting
• Find common and known vulnerabilities faster than humans
• Adopt when you have the skills to use properly
• Understand what was found - security implication is not always obvious
• Be sure they are integrated into SDLC and used at key checkpoints
• Complement with manual efforts
• Necessary to find deeply rooted and elusive vulnerabilities that tools cannot
• Be sure to leverage a threat model to focus on high-risk areas
• Support vulnerability remediation
• Problem isn’t solved when found, only when corrected properly
• Match test efforts with your organization’s ability to remediate
Take a Risk-Based Approach
• Conventional approaches to application security are not risk-based
• Typically no more than automated scans that look for some pre-determined
set of common vulnerabilities
• Frequently fail to address each application’s unique code-, system- and
workflow-level vulnerabilities
• Provide little practical guidance on prioritizing defect remediation
• Do not yield a roadmap to guide enterprise AppSec posture improvements
• The majority of application security programs focus on:*
• Automated security testing during development (41%)
• Secure coding standards that are adhered to (32%)
• A secure SDLC process improvement plan (30%)
*”The State of Application Security Maturity” – Ponemon Institute & Security Innovation, 2013
Why Risk-Rank?
• Helps your organization to
• Quantitatively categorize application assets
• Plan assessment and mitigation activities cost effectively
• Ensure prioritization is based on real business risk
• Inappropriate security assessments are costly
• Deep inspection on all applications is neither feasible nor necessary
• Spending time on a low-priority application while a high-risk one remains
vulnerable can be devastating (data breach, DDoS, etc.)
• Helps gauge the security maturity level of your teams
• Enables risk-based decisions for managing deployed applications
• Remove, replace, take off-line, implement compensating controls
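A minimal sketch of the quantitative risk-ranking above. The factors, weights, and tier thresholds are assumptions for illustration; a real program would derive them from data classification, exposure, and compliance requirements:

```python
# Hypothetical factor weights for combining 1-5 ratings into a risk score.
WEIGHTS = {"data_sensitivity": 0.4, "exposure": 0.35, "user_count": 0.25}

def risk_score(data_sensitivity: int, exposure: int, user_count: int) -> float:
    """Combine 1-5 factor ratings into a weighted 1-5 risk score."""
    factors = {"data_sensitivity": data_sensitivity,
               "exposure": exposure,
               "user_count": user_count}
    return round(sum(WEIGHTS[k] * v for k, v in factors.items()), 2)

def tier(score: float) -> str:
    """Map a score to an assessment depth, so high-risk apps get more attention."""
    if score >= 4.0:
        return "deep manual assessment"
    if score >= 2.5:
        return "scan plus targeted manual review"
    return "automated scan"
```

The point of the tiers is the slide's cost argument: deep inspection goes only where the score justifies it.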
Analyze Root Cause
• Watch for duplicates
• Dozens of vulnerabilities can result from a single root that creates multiple paths to
exploit
• Static Analysis tools can help identify these root causes
• Determine root cause and likelihood of an exploit
• Requires some manual code review skills to do dataflow/control flow analysis
• You can use DREAD to evaluate severity
• Damage Potential
• Reproducibility
• Exploitability
• Affected Users
• Discoverability
• Review trend information to determine whether the vulnerability has existed before and
what actions were taken to reduce or eliminate it
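The DREAD evaluation above can be captured in a small helper. Rating each category on a 1-10 scale and averaging is one common convention; the scale and aggregation vary by organization:

```python
# The five DREAD categories from the slide.
DREAD_CATEGORIES = ("damage", "reproducibility", "exploitability",
                    "affected_users", "discoverability")

def dread_score(ratings: dict[str, int]) -> float:
    """Average the five DREAD category ratings (each assumed to be 1-10)."""
    missing = set(DREAD_CATEGORIES) - ratings.keys()
    if missing:
        raise ValueError(f"missing DREAD categories: {sorted(missing)}")
    return sum(ratings[c] for c in DREAD_CATEGORIES) / len(DREAD_CATEGORIES)
```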
Hands-on Security Code Reviews
• Manual Code Review complements automated scanning
• Should be part of your normal SDLC practices
• The use of manual reviews vs. automated tools shouldn't be an either-or
proposition
• Static testing tools can find common problems a lot faster than humans
• Only humans can find design-level issues like poor identity-verification questions,
business logic attacks, and compound vulnerabilities
No Scanner is 100% effective ‘out of the box’
• Fine-tuning scans to reduce false positives can shorten time spent chasing
“fake vulnerabilities”
• When a human recognizes a certain pattern or input that causes
misbehavior, automation can be leveraged to check for that particular
input/pattern across a much broader surface area quickly
• Need to:
• Customize and fine-tune
• Integrate into the SDLC at key checkpoints
• Complement with manual reviews
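The fine-tuning step can be as simple as a human-curated suppression list applied to scanner output. The finding fields and false-positive entries below are assumptions; adapt them to your scanner's report format:

```python
# Findings a human has triaged as false positives, keyed on rule ID plus
# location. Entries here are illustrative examples, not real scanner rules.
KNOWN_FALSE_POSITIVES = {
    ("XSS-REFLECTED", "/health"),         # endpoint echoes no user input
    ("SQLI-ERROR", "/static/docs.html"),  # static page, no database behind it
}

def triage(findings: list[dict]) -> list[dict]:
    """Drop findings that match the human-curated false-positive list."""
    return [f for f in findings
            if (f["rule_id"], f["path"]) not in KNOWN_FALSE_POSITIVES]
```

Each suppressed entry is time not spent chasing a “fake vulnerability” on the next scan, which is where the efficiency gain comes from.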
Conclusion
• Automation provides a fast method for flagging common vulnerabilities
• However, it yields only partial code coverage, cannot detect certain vulnerabilities,
and can lose efficiency gains with lots of false positives
• Humans can leverage expertise and creativity to hunt down unknown, complex,
and stealthy vulnerabilities
• However, this can be time consuming, so focus on critical areas
• Different tools will find different vulnerabilities in different types of applications
• Ensure your teams know what their tools are capable of
• Leveraging automation early on to find low hanging fruit and other potential
issues provides a useful jumpstart in determining whether deeper testing is needed
How Security Innovation Can Help
• Standards – Set goals and make it easy
• Align development activities with policies and compliance mandates
• Secure coding standards
• Education – Enable me to make the right decisions
• 120+ courses cover all major technologies, platforms and roles
• Mobile and Web Application Security CTF hackathons make it
fun to “learn by doing”
• Assessment – Show me the Gaps!
• Security design and code reviews
• Software penetration tests
• Secure SDLC Gap Analysis
Whitepaper
Smart Software Security Testing
• Findings from study comparing
manual and automated techniques
• Available end of March
• getsecure@securityinnovation.com
to request a copy

More Related Content

PPTX
Security best practices for regular users
PDF
Threat Modeling to Reduce Software Security Risk
PPTX
The QA Analyst's Hacker's Landmark Tour v3.0
PDF
What Every Developer And Tester Should Know About Software Security
PDF
Declaration of Mal(WAR)e
PDF
Application Security Testing for Software Engineers: An approach to build sof...
PPTX
5 things i wish i knew about sast (DSO-LG July 2021)
PDF
Evil User Stories - Improve Your Application Security
Security best practices for regular users
Threat Modeling to Reduce Software Security Risk
The QA Analyst's Hacker's Landmark Tour v3.0
What Every Developer And Tester Should Know About Software Security
Declaration of Mal(WAR)e
Application Security Testing for Software Engineers: An approach to build sof...
5 things i wish i knew about sast (DSO-LG July 2021)
Evil User Stories - Improve Your Application Security

What's hot (20)

PDF
AppSec in an Agile World
PDF
What is pentest
PPTX
Regular Expression Denial of Service RegexDoS
PPTX
WTF is Penetration Testing v.2
PPTX
Technical Writing for Consultants
PDF
Threat Intelligence - Routes to a Proactive Capability
PDF
Fully Integrated Defense Operation
PDF
Penetration Testing Execution Phases
PPTX
Security Culture from Concept to Maintenance: Secure Software Development Lif...
PDF
The Future of Software Security Assurance
PDF
Security Training: Making your weakest link the strongest - CircleCityCon 2017
PPTX
For Business's Sake, Let's focus on AppSec
PPTX
Software Security Assurance - Program Building (You're going to need a bigger...
PPTX
NextGen Endpoint Security for Dummies
PPTX
Vulnerability assessment and penetration testing
PDF
Ensuring Security through Continuous Testing
PPTX
Full stack vulnerability management at scale
PPTX
How to Build a Successful Incident Response Program
PDF
Rapid Threat Modeling Techniques
AppSec in an Agile World
What is pentest
Regular Expression Denial of Service RegexDoS
WTF is Penetration Testing v.2
Technical Writing for Consultants
Threat Intelligence - Routes to a Proactive Capability
Fully Integrated Defense Operation
Penetration Testing Execution Phases
Security Culture from Concept to Maintenance: Secure Software Development Lif...
The Future of Software Security Assurance
Security Training: Making your weakest link the strongest - CircleCityCon 2017
For Business's Sake, Let's focus on AppSec
Software Security Assurance - Program Building (You're going to need a bigger...
NextGen Endpoint Security for Dummies
Vulnerability assessment and penetration testing
Ensuring Security through Continuous Testing
Full stack vulnerability management at scale
How to Build a Successful Incident Response Program
Rapid Threat Modeling Techniques
Ad

Viewers also liked (16)

PDF
Get Smart about Ransomware: Protect Yourself and Organization
PPTX
Security Best Practices for Regular Users
PDF
Enterprise Mobility: Challenge or opportunity
PDF
Io uso Tor e non lascio tracce! Sei proprio sicuro?
PPT
Software Security Testing
PPTX
Smart City: i temi e le idee per la città del futuro Etna Hitech_corso ISVI 2...
PDF
Matteo meucci Software Security - Napoli 10112016
PPT
Software security
PDF
Pramata Tech Dinosaurs ePaper - Social Sharing
PPT
Visual Studio 2013 - Recursos da IDE
PDF
Presence Agent y Presence Scripting para personas con limitaciones visuales
PDF
Wedia Social Media presentation at DigitalDays
PDF
Rsa2012 下一代安全的战略思考-绿盟科技赵粮
PDF
Why Consider #FlashStorage in your #DataCenter
PPTX
Perceptive Software Scope
PPTX
Getting started with performance testing
Get Smart about Ransomware: Protect Yourself and Organization
Security Best Practices for Regular Users
Enterprise Mobility: Challenge or opportunity
Io uso Tor e non lascio tracce! Sei proprio sicuro?
Software Security Testing
Smart City: i temi e le idee per la città del futuro Etna Hitech_corso ISVI 2...
Matteo meucci Software Security - Napoli 10112016
Software security
Pramata Tech Dinosaurs ePaper - Social Sharing
Visual Studio 2013 - Recursos da IDE
Presence Agent y Presence Scripting para personas con limitaciones visuales
Wedia Social Media presentation at DigitalDays
Rsa2012 下一代安全的战略思考-绿盟科技赵粮
Why Consider #FlashStorage in your #DataCenter
Perceptive Software Scope
Getting started with performance testing
Ad

Similar to How to Get the Most Out of Security Tools (20)

PPTX
Hacker vs Tools: Which to Choose?
PPTX
Hacker vs tools
PPTX
threat_and_vulnerability_management_-_ryan_elmer_-_frsecure.pptx
PPTX
Assessing System Risk the Smart Way
PPTX
Threat Modeling - Locking the Door to Vulnerabilities
PDF
Top Security Challenges Facing Credit Unions Today
PPTX
Cyber Security # Lec 3
PPTX
Application Threat Modeling
PDF
Threat modelling & apps testing
PPTX
Reduce Third Party Developer Risks
PPTX
Cybersecurity Frameworks and You: The Perfect Match
PDF
DevSecCon Asia 2017 Pishu Mahtani: Adversarial Modelling
PPTX
Module 6.pptx
PPTX
Vendors, and Risk, and Tigers, and Bears, Oh My: How to Create a Vendor Revie...
PPTX
5 Ways to Reduce 3rd Party Developer Risk
PPTX
Building an AppSec Team Extended Cut
PPTX
Mike Spaulding - Building an Application Security Program
PPTX
Threat modelling(system + enterprise)
PPTX
Can You Really Automate Yourself Secure
PPTX
Your cyber security webinar
Hacker vs Tools: Which to Choose?
Hacker vs tools
threat_and_vulnerability_management_-_ryan_elmer_-_frsecure.pptx
Assessing System Risk the Smart Way
Threat Modeling - Locking the Door to Vulnerabilities
Top Security Challenges Facing Credit Unions Today
Cyber Security # Lec 3
Application Threat Modeling
Threat modelling & apps testing
Reduce Third Party Developer Risks
Cybersecurity Frameworks and You: The Perfect Match
DevSecCon Asia 2017 Pishu Mahtani: Adversarial Modelling
Module 6.pptx
Vendors, and Risk, and Tigers, and Bears, Oh My: How to Create a Vendor Revie...
5 Ways to Reduce 3rd Party Developer Risk
Building an AppSec Team Extended Cut
Mike Spaulding - Building an Application Security Program
Threat modelling(system + enterprise)
Can You Really Automate Yourself Secure
Your cyber security webinar

More from Security Innovation (20)

PPTX
Securing Applications in the Cloud
PPTX
Modernizing, Migrating & Mitigating - Moving to Modern Cloud & API Web Apps W...
PPTX
Develop, Test & Maintain Secure Systems (While Being PCI Compliant)
PPTX
Protecting Sensitive Data (and be PCI Compliant too!)
PDF
5 Ways To Train Security Champions
PPTX
Aligning Application Security to Compliance
PPTX
How to Hijack a Pizza Delivery Robot with Injection Flaws
PPTX
How an Attacker "Audits" Your Software Systems
PPTX
Opening the Talent Spigot to Securing our Digital Future
PDF
Slashing Your Cloud Risk: 3 Must-Do's
PPTX
A Fresh, New Look for CMD+CTRL Cyber Range
PPTX
Security Testing for IoT Systems
PPTX
Cyber Ranges: A New Approach to Security
PPTX
Is Blockchain Right for You? The Million Dollar Question
PPTX
Privacy: The New Software Development Dilemma
PPTX
Privacy Secrets Your Systems May Be Telling
PPTX
Secure DevOps - Evolution or Revolution?
PPTX
IoT Security: Debunking the "We Aren't THAT Connected" Myth
PDF
GDPR: The Application Security Twist
PDF
The New OWASP Top Ten: Let's Cut to the Chase
Securing Applications in the Cloud
Modernizing, Migrating & Mitigating - Moving to Modern Cloud & API Web Apps W...
Develop, Test & Maintain Secure Systems (While Being PCI Compliant)
Protecting Sensitive Data (and be PCI Compliant too!)
5 Ways To Train Security Champions
Aligning Application Security to Compliance
How to Hijack a Pizza Delivery Robot with Injection Flaws
How an Attacker "Audits" Your Software Systems
Opening the Talent Spigot to Securing our Digital Future
Slashing Your Cloud Risk: 3 Must-Do's
A Fresh, New Look for CMD+CTRL Cyber Range
Security Testing for IoT Systems
Cyber Ranges: A New Approach to Security
Is Blockchain Right for You? The Million Dollar Question
Privacy: The New Software Development Dilemma
Privacy Secrets Your Systems May Be Telling
Secure DevOps - Evolution or Revolution?
IoT Security: Debunking the "We Aren't THAT Connected" Myth
GDPR: The Application Security Twist
The New OWASP Top Ten: Let's Cut to the Chase

Recently uploaded (20)

PDF
7 ChatGPT Prompts to Help You Define Your Ideal Customer Profile.pdf
PDF
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
PDF
Peak of Data & AI Encore- AI for Metadata and Smarter Workflows
PPTX
Cloud computing and distributed systems.
PPTX
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
PDF
Unlocking AI with Model Context Protocol (MCP)
PPT
“AI and Expert System Decision Support & Business Intelligence Systems”
PDF
Encapsulation theory and applications.pdf
PDF
Machine learning based COVID-19 study performance prediction
PDF
Per capita expenditure prediction using model stacking based on satellite ima...
DOCX
The AUB Centre for AI in Media Proposal.docx
PPTX
A Presentation on Artificial Intelligence
PDF
cuic standard and advanced reporting.pdf
PDF
Modernizing your data center with Dell and AMD
PDF
Blue Purple Modern Animated Computer Science Presentation.pdf.pdf
PDF
Bridging biosciences and deep learning for revolutionary discoveries: a compr...
PDF
Dropbox Q2 2025 Financial Results & Investor Presentation
PDF
Encapsulation_ Review paper, used for researhc scholars
PDF
Shreyas Phanse Resume: Experienced Backend Engineer | Java • Spring Boot • Ka...
PPTX
20250228 LYD VKU AI Blended-Learning.pptx
7 ChatGPT Prompts to Help You Define Your Ideal Customer Profile.pdf
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
Peak of Data & AI Encore- AI for Metadata and Smarter Workflows
Cloud computing and distributed systems.
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
Unlocking AI with Model Context Protocol (MCP)
“AI and Expert System Decision Support & Business Intelligence Systems”
Encapsulation theory and applications.pdf
Machine learning based COVID-19 study performance prediction
Per capita expenditure prediction using model stacking based on satellite ima...
The AUB Centre for AI in Media Proposal.docx
A Presentation on Artificial Intelligence
cuic standard and advanced reporting.pdf
Modernizing your data center with Dell and AMD
Blue Purple Modern Animated Computer Science Presentation.pdf.pdf
Bridging biosciences and deep learning for revolutionary discoveries: a compr...
Dropbox Q2 2025 Financial Results & Investor Presentation
Encapsulation_ Review paper, used for researhc scholars
Shreyas Phanse Resume: Experienced Backend Engineer | Java • Spring Boot • Ka...
20250228 LYD VKU AI Blended-Learning.pptx

How to Get the Most Out of Security Tools

  • 1. Software Security Excellence How to Get the Most out of Security Tools Jason Taylor, CTO
  • 2. About Security Innovation • Authority in Software Security • 15+ years research on software vulnerabilities • Security testing methodology adopted by SAP, Symantec, Microsoft and McAfee • Authors of 18 books • Helping organizations minimize risk • Assessment: Show me the gaps • Education: Guide me to the right decisions • Standards: Set goals and make it easy and natural • Tech-enabled services for both breadth and depth • Gartner MQ Leader for security computer-based training
  • 3. Agenda Attacker Perspective • When/where to use tools vs. manual testing methods • Focusing on hot spots and "seeing" clues that tools cannot • Putting it together to reduce risk
  • 4. The World Has Changed • Attackers are more and more sophisticated • Increasing number of skilled attackers • Attacks are targeted • Consequences are higher • Attackers are: • Criminal organizations • Government sponsored (or employed) • Activist/personal Motivation
  • 5. The Costs Keep Piling Up • Monetary • Average cost of a data breach in the US is $6.5 million* • Dwith phishing attacks costs the average 10,000-employee company $3.77 million a year* • Phishing is the 2nd most common attack vector** • Average cost per record $217, and $259 in financial sector* • Reputational • 69% of consumers would be less inclined to do business with a breached organization** • The FTC is involved... • Court says that FTC can hit companies with fines for failing to protect consumer information *Ponemon 2015 ** Verizon 2015
  • 6. Not Much of a Deterrent • Approximately 5% of cyber criminals are caught* • Although a 5% prosecution rate is troubling, what’s worse is that only 2% of all intrusions ever reach the attention of law enforcement • Not surprisingly, a 5% conviction rate does not lend itself to meaningful deterrence *McAfee ** Georgetown Law Journal
  • 7. Attackers & Motivations Classic Actors The New School Individual Hackers • Exploration • Notoriety State Sponsored • Espionage • Warfare Criminals • Money Political/Hacktivists • Disruption/Embarrassment • Intelligence
  • 8. How Did We Get Here? • Software has grown up in a trusting, insecure world • Systems have historically been built to share data and facilitate collaboration • In the early days, trust was (safely) assumed • Software developers got used to this trust • Universities didn’t bother teaching security • As an industry we… • …know too little • …trust too much • …have too much faith in boundary defenses
  • 9. Opposing Goals • Security vs. Features • More features == more bugs • Security vs. Reliability • Writing error handling code means more chances of introducing security bugs • When error handlers fail, the app fails…who then is in charge of security? • Security vs. Performance • Security code slows apps down • Security vs. Usability • Running every process as admin/root sure is convenient for the user • Security options are confusing and awkward
  • 10. Solutions • Integrate security into the entire development process: • Beginning at Requirements • Continuing in Design • Implemented in Coding • Verified in Testing • Response Processes in Place • Knowledge and training is the first step • Tools and automation are a lever on your security expertise
  • 11. Agenda • Attacker Perspective When/where to use tools vs. manual testing methods • Focusing on hot spots and "seeing" clues that tools cannot • Putting it together to reduce risk
  • 12. Generally Accepted Realities • Given the absence of time constraints, manual testing will discover most defects; automated testing will find a varying percentage • Automation will find known and common vulnerabilities faster than humans • It is difficult for automated tools to uncover compound vulnerabilities and defects related to the business functionality • Automation produces false positives; humans rarely do
  • 13. Partners in Crime: Humans and Robots • Tools, like humans, excel at certain tasks • Not all applications need a deep pen test, and many need to go “beyond the scan” • Both play a critical role because they find different types of vulnerabilities • The challenge is finding the optimal balance to achieve breadth and depth of coverage in the most efficient manner
  • 14. Power of Automation • Finding low hanging fruit • Lowering skills barrier to testing • Identifying weak spots for an experienced tester to focus on conducting deeper, more specialized attacks • Recognizing a pattern and applying similar attacks • E.g. if a scanner recognizes a SQL database error message it may try variants of SQL injection attacks • Comparing strings to cover version checks, error messages, configuration issues, TLS cipher suites and options, HTTP server verbs, etc. • Cross Site Request Forgery (CSRF) that require sending similar requests with a few headers changed based on authorization rules
  • 15. Challenges with SAST/DAST Tools • Missed coverage due to: • Constraints such as authentication for role-sensitive applications • Use of technology that eludes “spidering” or “crawling” in web application • The result is often an incomplete and enumerated attack surface • Logic is difficult to program into a linear automated tool • Whereas a human can understand business use cases and alter tests • False positives can be time consuming to deal with • Findings often lack business/risk context • Complacency – people tend to trust coverage and reports • Often limited to Web applications
  • 16. The Human Value-Add • Imaginative • In the absence of code, we need to make assumptions as to how the technology is commonly used, patterns people follow, what is running the back end • Observant • Ability to look for clues or other things are out of place or have changed (i.e. URL, content types) • Logical • Can evaluate error messages from tools, encodings, page layouts, load time, etc • Accurate • Virtually no false positives and contextual vulnerability risk rating • Specialized • Not limited to Web applications • Can identify root cause and design flaws
  • 17. In-Practice: Value of Intuition • During automated testing of a CRM application, results did not find Session Management Weakness • Bit surprising as scanners are usually good at this • Manual testing discovered a Direct Object Access (DOA) vulnerability • When this previously missing parameter was added, user was given access to resources previously unavailable • Further manual discovered that any tampering of those parameters resulted in server sessions being terminated Tools can be programmed to perform complicated checks, but not intuition checks, which often result in the discovery of critical vulnerabilities
  • 18. In Practice: Chaining Vulnerabilities • An example from our testing: • Automation is good at finding simple file upload vulnerabilities as well as cross site scripting problems • In this case the automation found nothing since there was code to block malicious upload • Manual testing discovered a problem in this security code • Clever testing allowed malicious upload as well as discovering a XSS problem • Required a very specific malicious file with PNG file headers, an html file extension and a malicious jscript payload embedded within the image • This file was used to run script code on the server Automation is not capable of this type of testing, but your attackers are!
• 19. The Challenges with Manual Testing
• More expensive and time consuming
• Requires highly skilled testers
• Tests are not reusable, though patterns are
• Inconsistency from tester to tester
• 20. Agenda
• Attacker Perspective
• When/where to use tools vs. manual testing methods
• Focusing on hot spots and "seeing" clues that tools cannot
• Putting it together to reduce risk
• 21. What Are Hot Spots?
• Areas that provide disproportionate return for your effort
• Security defects tend to cluster in areas
• Some areas of the code have more security impact than others
• This combination is a Hot Spot
• Focus your efforts
• If you know what you’re looking for, you will find more vulnerabilities
• Un-guided testing/review is a waste of time
• Use Hot Spots as a heat map to guide security design, code and deployment inspections
Hot Spots are the best places to apply human effort
• 22. Reflecting on Security Hot Spots
• You can use hot spots and common vulnerabilities to share:
• Principles, patterns, and practices
• Knowledge around threats, attacks, vulnerabilities, and countermeasures
• To keep them relevant and effective, consider the following questions
• How can you improve security results in your organization?
• How can you organize your bodies of knowledge?
• How can you improve sharing patterns, anti-patterns, and checklists?
• How can you tune and prune your security inspections?
• 23. Agenda
• Attacker Perspective
• When/where to use tools vs. manual testing methods
• Focusing on hot spots and "seeing" clues that tools cannot
• Putting it together to reduce risk
• 24. AppSec Program Goals
• Effective Vulnerability Management
• Regular, iterative testing ensures continually improving test results and will catch vulnerabilities more quickly
• Measure risk of each application through vulnerability discovery & remediation metrics
• Discover trends and weaknesses that can be used to improve the overall AppSec and secure coding program through standards and training
• Optimized Frequency and Depth of Testing
• Match level of testing and analysis to application criticality
• Ensure high-risk applications get more attention and low-risk ones are not over-tested
• Cost Management
• Predictable cost
• Investment matched to level of risk
• 25. Best Practices for Assessments
• Use automated tools for heavy lifting
• Find common and known vulnerabilities faster than humans
• Adopt when you have the skills to use them properly
• Understand what was found - the security implication is not always obvious
• Be sure they are integrated into the SDLC and used at key checkpoints
• Complement with manual efforts
• Necessary to find deeply rooted and elusive vulnerabilities that tools cannot
• Be sure to leverage a threat model to focus on high-risk areas
• Support vulnerability remediation
• The problem isn’t solved when found, only when corrected properly
• Match test efforts with your organization’s ability to remediate
• 26. Take a Risk-Based Approach
• Conventional approaches to application security are not risk-based
• Typically no more than automated scans that look for a pre-determined set of common vulnerabilities
• Frequently fail to address each application’s unique code-, system- and workflow-level vulnerabilities
• Provide little practical guidance on prioritizing defect remediation
• Do not yield a roadmap to guide enterprise AppSec posture improvements
• The majority of application security programs focus on:*
• Automated security testing during development (41%)
• Secure coding standards that are adhered to (32%)
• A secure SDLC process improvement plan (30%)
*”The State of Application Security Maturity” – Ponemon Institute & Security Innovation, 2013
• 27. Why Risk-Rank?
• Helps your organization to
• Quantitatively categorize application assets
• Plan assessment and mitigation activities cost effectively
• Ensure prioritization is based on real business risk
• Inappropriate security assessments are costly
• Deep inspection on all applications is neither feasible nor necessary
• Spending time on a low-priority application while a high-risk one remains vulnerable can be devastating (data breach, DDoS, etc.)
• Helps gauge the security maturity level of your teams
• Enables risk-based decisions for managing deployed applications
• Remove, replace, take off-line, implement compensating controls
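Quantitative risk-ranking can be as simple as a weighted score per application. The factors, weights, and 1-5 scale below are assumptions for illustration (the deck prescribes no formula); the point is that a consistent, explainable ranking decides which applications earn deep inspection first.

```python
# Illustrative application risk-ranking sketch. Factor names, weights,
# and the 1-5 scale are assumptions, not a prescribed methodology.

WEIGHTS = {"data_sensitivity": 0.40, "exposure": 0.35, "criticality": 0.25}

def risk_score(app):
    """Weighted sum of the 1-5 factor ratings for one application."""
    return sum(app[factor] * weight for factor, weight in WEIGHTS.items())

apps = [
    {"name": "payroll",   "data_sensitivity": 5, "exposure": 2, "criticality": 5},
    {"name": "marketing", "data_sensitivity": 1, "exposure": 5, "criticality": 2},
    {"name": "customer",  "data_sensitivity": 4, "exposure": 4, "criticality": 4},
]

# Highest-risk applications first: these get the deepest assessments
ranked = sorted(apps, key=risk_score, reverse=True)
print([a["name"] for a in ranked])  # ['customer', 'payroll', 'marketing']
```

Keeping the weights in one place makes the prioritization auditable: when the business disputes a ranking, the conversation is about the inputs, not the process.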
• 28. Analyze Root Cause
• Watch for duplicates
• Dozens of vulnerabilities can result from a single root cause that creates multiple paths to exploit
• Static Analysis tools can help identify these root causes
• Determine root cause and likelihood of an exploit
• Requires some manual code review skills to do dataflow/control flow analysis
• You can use DREAD to evaluate severity
• Damage Potential
• Reproducibility
• Exploitability
• Affected Users
• Discoverability
• Review trend information to determine whether the vulnerability has existed before and what actions were taken to reduce or eliminate it
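The five DREAD categories above are commonly averaged into a single severity number. A minimal sketch, assuming the common 1-10-per-category convention (the deck lists the categories but not a scale, and the severity thresholds here are illustrative):

```python
# Minimal DREAD scorer. The 1-10 scale and the severity buckets are
# assumptions to be tuned per organization, not part of the deck.

DREAD = ("damage", "reproducibility", "exploitability",
         "affected_users", "discoverability")

def dread_score(ratings):
    """Average the five category ratings into a single 1-10 severity."""
    missing = [c for c in DREAD if c not in ratings]
    if missing:
        raise ValueError(f"missing categories: {missing}")
    return sum(ratings[c] for c in DREAD) / len(DREAD)

def severity(score):
    """Bucket the averaged score; thresholds are illustrative."""
    if score >= 8:
        return "Critical"
    if score >= 6:
        return "High"
    if score >= 4:
        return "Medium"
    return "Low"

# Example: rating a hypothetical SQL injection finding
sqli = {"damage": 9, "reproducibility": 8, "exploitability": 7,
        "affected_users": 9, "discoverability": 7}
print(dread_score(sqli), severity(dread_score(sqli)))  # 8.0 Critical
```

Scoring every finding the same way makes vulnerability reports comparable across testers, which is exactly the consistency gap manual testing suffers from.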
• 29. Hands-on Security Code Reviews
• Manual Code Review complements automated scanning
• Should be part of your normal SDLC practices
• The use of manual reviews vs. automated tools shouldn't be an either-or proposition
• Static testing tools can find common problems a lot faster than humans
• Only humans can find design-level issues like poor identity-verification questions, business logic attacks, and compound vulnerabilities
• 30. No Scanner is 100% effective ‘out of the box’
• Fine-tuning scans to reduce false positives can shorten time spent chasing “fake vulnerabilities”
• When a human recognizes a certain pattern or input that causes misbehavior, automation can be leveraged to check for that particular input/pattern across a much broader surface area quickly
• Need to:
• Customize and fine-tune
• Integrate into the SDLC at key checkpoints
• Complement with manual reviews
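The human-to-automation handoff described above can be sketched directly: once a tester finds one input that triggers misbehavior, a loop replays it everywhere. The trigger string, endpoint names, and response oracle below are illustrative assumptions.

```python
# Sketch of scaling a human discovery with automation: replay one
# trigger input across the whole attack surface. Endpoints, the
# trigger, and the oracle are hypothetical examples.

TRIGGER = "';--"   # an input a tester found to cause a SQL error on one page

def check_response(body):
    """Human-derived oracle: the misbehavior surfaces as a raw SQL error."""
    return "sql syntax" in body.lower()

def sweep(endpoints, send):
    """Replay the trigger against every endpoint; return those that misbehave.

    `send(endpoint, payload)` is injected so any HTTP client can be used.
    """
    return [ep for ep in endpoints if check_response(send(ep, TRIGGER))]

# Stub transport standing in for a live application in this walk-through
responses = {
    "/search": "You have an error in your SQL syntax",
    "/login":  "Invalid credentials",
    "/export": "SQL syntax error near ';--'",
}
print(sweep(responses, lambda ep, payload: responses[ep]))  # ['/search', '/export']
```

This is the division of labor the slide argues for: the human supplies the insight (trigger and oracle), the tool supplies the coverage.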
• 31. Conclusion
• Automation provides a fast method for flagging common vulnerabilities
• However, it yields only partial code coverage, cannot detect certain vulnerabilities, and can lose its efficiency gains to large numbers of false positives
• Humans can leverage expertise and creativity to hunt down unknown, complex and stealthy vulnerabilities
• However, this can be time consuming, so focus on critical areas
• Different tools will find different vulnerabilities in different types of applications
• Ensure your teams know what their tools are capable of
• Leveraging automation early on to find low-hanging fruit and other potential issues provides a useful jumpstart in determining whether deeper testing is needed
• 32. How Security Innovation Can Help
• Standards – Set goals and make it easy
• Align development activities with policies and compliance mandates
• Secure coding standards
• Education – Enable me to make the right decisions
• 120+ courses cover all major technologies, platforms and roles
• Mobile and Web Application Security CTF hackathons make it fun to “learn by doing”
• Assessment – Show me the Gaps!
• Security design and code reviews
• Software penetration tests
• Secure SDLC Gap Analysis
• 33. Whitepaper: Smart Software Security Testing
• Findings from a study comparing manual and automated techniques
• Available end of March
• Email getsecure@securityinnovation.com to request a copy