Application Assessment Techniques

OWASP Northern Virginia

August 6th, 2009
Agenda
•   Background
•   Common Pitfalls in Application Assessment
•   Moving Beyond
    – Threat Modeling
    – Code Review
    – Dynamic Testing
•   Presenting Results
•   Questions / Panel Discussion




Background
•   Dan Cornell
    – Principal at Denim Group www.denimgroup.com
    – Software Developer: MCSD, Java 2 Certified Programmer
    – OWASP: Global Membership Committee, Open Review Project, SA Chapter Lead


•   Denim Group
    – Application Development
        • Java and .NET
    – Application Security
        • Assessments, penetration tests, code reviews, training, process consulting




How Not To Do It




How Not To Do It
•   Q: What are you all doing to address application security concerns in
    your organization?
•   A: We bought “XYZ Scanner”
•   Q: Okay… Are you actually using it?
•   A: We ran some scans
•   Q: And how did that go?
•   A: Oh we found some stuff…
•   Q: How did you address those issues?
•   A: I think we sent the report to the developers. Not sure what they did
    with it. I guess I ought to check in on that…



Goals of Application Assessment
•   Vary by organization, by application and by assessment

•   Determine the security state of an application
•   Characterize risk to executives and decision makers
•   Prove a point
•   Set the stage for future efforts




Common Pitfalls in Application Assessment




Common Pitfalls in Application Assessment
•   Ad hoc approach
    – Non-repeatable, non-comprehensive
•   Reliance on automated tools
    – Can only find a subset of vulnerabilities – false negatives
    – Even the good tools need tuning to reduce false positives
•   Current commercial tools are biased
    – Rulesets and capabilities typically over-focused on web applications
•   Too focused on one approach
    – Static and dynamic testing have different strengths
    – Economic concerns constrain the amount of testing that can be performed – make
      the most of the time you have




Moving Beyond
•   Automated versus Manual
•   Threat Modeling
•   Dynamic Testing
•   Source Code Review




Automated Versus Manual




Automated Versus Manual
•   Automated tools are great at:
     – Consistency - not getting tired
     – Data flow analysis
•   Automated tools are terrible for:
     – Understanding business context
•   Manual testing is great at:
     – Identifying business logic flaws
•   Manual testing is terrible for:




Threat Modeling
•   Provides high-level understanding of the system
    – Useful for creating a structured test plan
•   Provides application context
    – Crucial for characterizing results
•   Complementary with Abuse Cases




Threat Modeling Approach
•   Establish scope and system boundaries
•   Decompose the system into a Data Flow Diagram (DFD)
•   Assign potential threats based on asset types (see the mapping table and sketch below)




Threat Model Example




Mapping Threats to Asset Types
Threat Type                  External     Process   Data Flow   Data Store
                             Interactor
S – Spoofing                 Yes          Yes
T – Tampering                             Yes       Yes         Yes
R – Repudiation              Yes          Yes                   Yes
I – Information Disclosure                Yes       Yes         Yes
D – Denial of Service                     Yes       Yes         Yes
E – Elevation of Privilege                Yes
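
As a rough illustration, the mapping above can be encoded directly and used to generate a per-element threat checklist from a decomposed DFD. A minimal Python sketch, assuming hypothetical element names (nothing here comes from a real assessment):

```python
# Encode the STRIDE-per-element-type table above, then walk a decomposed
# DFD and emit a checklist. The DFD contents are illustrative placeholders.
STRIDE_BY_ELEMENT = {
    "external_interactor": ["Spoofing", "Repudiation"],
    "process": ["Spoofing", "Tampering", "Repudiation",
                "Information Disclosure", "Denial of Service",
                "Elevation of Privilege"],
    "data_flow": ["Tampering", "Information Disclosure", "Denial of Service"],
    "data_store": ["Tampering", "Repudiation",
                   "Information Disclosure", "Denial of Service"],
}

# Hypothetical decomposition of a simple web application
dfd = [
    ("User", "external_interactor"),
    ("Web App", "process"),
    ("User -> Web App (HTTP)", "data_flow"),
    ("Accounts DB", "data_store"),
]

for name, element_type in dfd:
    for threat in STRIDE_BY_ELEMENT[element_type]:
        print(f"[ ] {threat}: {name}")
```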
Threat Modeling
•   Result is a structured, repeatable list of threats to check
     – Strength is to find known problems repeatably
•   Augment with Abuse Cases
     – “What could go wrong” scenarios
     – More creative and unstructured




Dynamic, Static and Manual Testing

Source Code Review
•   Advantages
•   Disadvantages
•   Approaches




Static Analysis Advantages
•   Have access to the actual instructions the software will be executing
     – No need to guess or interpret behavior
     – Full access to all the software’s possible behaviors
•   Remediation is easier because you know where the problems are

Static Analysis Disadvantages
•   Require access to source code or at least binary code
     – Typically need access to enough software artifacts to execute a build
•   Typically require proficiency running software builds
•   Will not find issues related to operational deployment environments

Approaches
•   Run automated tools with default ruleset
     – Provides a first-cut look at the security state of the application
     – Identify “hot spots”
•   Craft custom rules specific to the application (a toy sketch follows this list)
     –   3rd party code
     –   Break very large applications into manageable chunks
     –   Application-specific APIs – sources, sinks, filter functions
     –   Compliance-specific constructs
•   This is an iterative process
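
To make the source/sink/filter vocabulary concrete, here is a deliberately simplified sketch of what a custom rule expresses. Real static analysis tools perform data flow analysis across statements and procedures; this toy version only pattern-matches single lines, and every API name in it (getUserInput, runQuery, sanitize) is hypothetical:

```python
import re
import sys

# Application-specific rule vocabulary - all names are hypothetical
SOURCES = [r"getUserInput\("]   # taint sources (user-controlled data)
SINKS = [r"runQuery\("]         # dangerous sinks (e.g. SQL execution)
FILTERS = [r"sanitize\("]       # filter functions that neutralize taint

def scan(path):
    """Flag lines where a source reaches a sink with no filter applied.
    Real tools track taint across statements; this only shows the idea."""
    with open(path, encoding="utf-8") as fh:
        for lineno, line in enumerate(fh, start=1):
            if (any(re.search(p, line) for p in SOURCES)
                    and any(re.search(p, line) for p in SINKS)
                    and not any(re.search(p, line) for p in FILTERS)):
                print(f"{path}:{lineno}: possible unfiltered source-to-sink flow")

if __name__ == "__main__":
    for source_file in sys.argv[1:]:
        scan(source_file)
```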




Approaches
•   Auditing results from an automated scan
    – Typically must sample for larger applications (or really bad ones)
    – Many results tend to cluster on a per-application basis – coding idioms for error
      handling, resource lifecycle (see the sampling sketch after this list)
•   Manual review
    – Must typically focus the effort for economic reasons
    – Hot spots from review of automated results
    – Security-critical functions from review of automated results – encoding,
      canonicalization
    – Security-critical areas
    – Startup, shutdown
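
A minimal sketch of the sampling point above: cluster raw findings by rule and file so the auditor reviews one representative of each coding idiom rather than every instance. The CSV column names are an assumption about the scanner’s export format:

```python
import csv
from collections import defaultdict

# Group findings so each (rule, file) cluster is audited once; the
# scan_results.csv file and its rule/file/line columns are assumptions.
clusters = defaultdict(list)
with open("scan_results.csv", newline="", encoding="utf-8") as fh:
    for finding in csv.DictReader(fh):
        clusters[(finding["rule"], finding["file"])].append(finding["line"])

# Largest clusters first - likely hot spots or repeated idioms
for (rule, path), lines in sorted(clusters.items(), key=lambda kv: -len(kv[1])):
    print(f"{len(lines):4d} findings  {rule}  {path}  e.g. line {lines[0]}")
```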




Dynamic Testing
•   Advantages
•   Disadvantages
•   Approaches




Dynamic Analysis Advantages
•   Only requires a running system to perform a test
•   No requirement to have access to source code or binary code
•   No need to understand how to write software or execute builds
     – Tools tend to be more “fire and forget”
•   Tests a specific, operational deployment
     – Can find infrastructure, configuration and patch errors that Static Analysis tools will
       miss

Dynamic Analysis Disadvantages
•   Limited scope of what can be found
     – Application must be footprinted to find the test area
     – That can cause areas to be missed
     – You can only test what you have found
•   No access to actual instructions being executed
     – Tool is exercising the application
     – Pattern matching on requests and responses

Approaches
•   Where possible/reasonable, confirm findings of the source code review
•   Determine if mitigating factors impact severity
     – WAFs, SSO, etc.
     – Be careful with this
•   Look at things easiest to test on a running application (a probe sketch follows below)
     – Macro error handling
     – Authentication and authorization implementation
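
For illustration, both checks can start as simple probes against the running deployment. A hedged sketch using Python’s requests library; the base URL, paths, and response heuristics are placeholders that would need tuning per application:

```python
import requests

BASE = "https://staging.example.com"  # assumed test deployment

# Macro error handling: does malformed input produce a verbose error page?
resp = requests.get(f"{BASE}/search", params={"q": "'\"<oops>"}, timeout=10)
if resp.status_code >= 500 or "Exception" in resp.text:
    print(f"Verbose error behavior worth reviewing: HTTP {resp.status_code}")

# Authorization: is a protected page served without any session?
resp = requests.get(f"{BASE}/admin/users", allow_redirects=False, timeout=10)
if resp.status_code == 200:
    print("Protected resource returned 200 without authentication")
```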




Bringing Approaches Together
•   These approaches feed one another
    – Valuable to be able to re-run tools and iterate between static and dynamic testing
•   Results must be communicated in the context of the Threat Model
    – Severity, compliance implications, etc.




Presenting Results




Presenting Results
•   Universal developer reaction:
     – “That’s not exploitable”
     – “That’s not the way it works in production”
•   Demonstrations of attacks can inspire comprehension
     – This can be a trap – often demonstrating exploitability of a vulnerability takes longer
       than fixing the vulnerability
•   Properly characterize mitigating factors
     – Often deployed incorrectly
     – Code has a tendency to migrate from application to application
•   Risk is important – so is the level of effort required to fix




Questions?
Dan Cornell
dan@denimgroup.com
Twitter: @danielcornell

(210) 572-4400

Web: www.denimgroup.com
Blog: denimgroup.typepad.com



