ADVERSARY SIMULATION: “RED CELL” APPROACHES TO IMPROVING SECURITY
Talk Background
• Introduction and overview of red teaming
• What are our organization's challenges & opportunities?
• What makes red teaming / a red cell effective?
• What is adversary simulation?
• TL;DR… extra resources
$whoami
• Chris Hernandez
• Red Teamer
• Former:
• Pentester @Veris Group ATD
• Lots of other stuff
• Exploit / Bug Research
• Blog: nopsled.ninja
• @piffd0s
What is Red Teaming?
• A mindset and set of tactics
• Takes many forms: tabletop exercises, alternative analysis, computer models, and vulnerability probes
• Not limited to InfoSec
• Rooted in critical thinking and cognitive psychology
What are its origins?
• Originated in 1960s military war-game exercises
• “Red” = the Soviet Union
• 1963 – the first public, documented example was a red team exercise structured around procuring a long-range bomber
• Most early examples centered on assessing the Soviet Union's capabilities
Why does this matter to me?
Pass the salt…
Try This…
Secure 360 – Adversary Simulation
What happens when we fail?
Unified Vision ’01 & Millennium Challenge ’02
• Millennium Challenge ’02
• Red Cell is highly restricted in its actions
• Red Cell pre-emptively attacks the US Navy fleet with all of its air and sea resources, sinking 21 Navy vessels
• White Cell “refloats” the sunken Navy vessels
• Unified Vision ’01
• White Cell informs Red Cell that Blue Team has destroyed all 21 of their hidden ballistic missile silos
• The Blue Team commander never actually knew the location of any of the 21 silos
What happens when we succeed?
Red Team Success Stories
• New York Marathon, NYPD, and New York Road Runners
• Cover scenarios like:
• How do you identify tainted water sources?
• How do you respond if drones show up in specific locations?
• The race can be diverted at any point
• Israel Defense Forces – “Ipcha Mistabra”
• “The opposite is most likely”
• Small group in the intelligence branch
• Briefs officials and leaders on opposite explanations for scenarios
How does any of that apply to my business?
• Red Team failure
• Agendas
• Restricted actions
• Poor communication
• Narrow scope
• Unrealistic scenarios
• Not having a red team
• Red Team success
• Good questions
• Make no assumptions
• Open access
• Fluid communication
• Realistic scenarios
• No agendas
What makes a red team effective?
Red Cell Effectiveness
• Ex: the USAF 57th Adversary Tactics Group
• Only highly skilled pilots are allowed to become “aggressors”
• Allowed to use only known adversary tactics and techniques, depending on who they are emulating
• The same should apply to all red teams
• Adversary emulation is key to realistic simulations
Red Cell Effectiveness
• Effective adversary emulation can mean being a “worse” threat actor
• Tests defenders' “post-compromise” security posture, a.k.a. the “assumed breach” model
• Starting post-compromise / from a foothold can also save valuable time and money
What are the benefits of an effective Red Cell?
• Train and measure IR teams' detection and response
• Microsoft measures this as MTTD and MTTR: Mean Time to Detect and Mean Time to Recovery
• Validates investment in very expensive security products, services, and subscriptions
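As a rough illustration, MTTD and MTTR are straightforward to compute from exercise timelines. The incident records and timestamps below are hypothetical, and MTTR is measured here from detection to recovery; some teams measure it from initial compromise instead.

```python
from datetime import datetime
from statistics import mean

# Hypothetical records from three red team exercises: when the simulated
# attack began, when the IR team detected it, and when the environment
# was remediated. All timestamps are illustrative.
incidents = [
    {"start": datetime(2016, 1, 4, 9, 0),
     "detected": datetime(2016, 1, 4, 13, 30),
     "recovered": datetime(2016, 1, 5, 10, 0)},
    {"start": datetime(2016, 2, 1, 8, 0),
     "detected": datetime(2016, 2, 1, 9, 15),
     "recovered": datetime(2016, 2, 1, 17, 0)},
    {"start": datetime(2016, 3, 7, 14, 0),
     "detected": datetime(2016, 3, 7, 14, 45),
     "recovered": datetime(2016, 3, 8, 9, 0)},
]

def mean_hours(deltas):
    """Average a collection of timedeltas, expressed in hours."""
    return mean(d.total_seconds() for d in deltas) / 3600

# MTTD: attack start -> detection.  MTTR: detection -> recovery.
mttd = mean_hours([i["detected"] - i["start"] for i in incidents])
mttr = mean_hours([i["recovered"] - i["detected"] for i in incidents])

print(f"MTTD: {mttd:.1f} hours, MTTR: {mttr:.1f} hours")
```

Tracking these two numbers across repeated exercises is what turns a one-off red team engagement into a measurable trend for the IR program.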
Putting it all together – Adversary simulation
• Emulate realistic threat actors' TTPs
• Assume-breach model
• Model attacker activity to your environment / risk
• Information exchange between red and blue teams*
• Protect red team culture
• Repeat within a reasonable amount of time
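The emulation and information-exchange steps above can be tracked with something as simple as a per-TTP scorecard shared between red and blue after each operation. This is a minimal sketch; the phases, technique names, and detection results are invented for illustration.

```python
# A per-TTP scorecard for an adversary-emulation exercise: each technique
# the red cell planned to emulate, and whether the blue team detected it.
# Phase and TTP names are illustrative, not a real actor profile.
emulation_plan = [
    {"phase": "initial access",   "ttp": "spearphishing attachment",   "detected": True},
    {"phase": "execution",        "ttp": "powershell download cradle", "detected": False},
    {"phase": "lateral movement", "ttp": "pass-the-hash",              "detected": True},
    {"phase": "exfiltration",     "ttp": "data staged over HTTPS",     "detected": False},
]

def detection_coverage(plan):
    """Fraction of emulated TTPs that the defenders detected."""
    return sum(1 for step in plan if step["detected"]) / len(plan)

coverage = detection_coverage(emulation_plan)
print(f"Blue team detected {coverage:.0%} of emulated TTPs")

# The undetected rows are the debrief agenda: exactly where detection
# engineering effort should go before the exercise is repeated.
gaps = [step["ttp"] for step in emulation_plan if not step["detected"]]
```

A scorecard like this makes the post-op debrief concrete: the gaps list is the blue team's work queue, and the coverage number is comparable across repeated exercises.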
ADDITIONAL RESOURCES
Books:
Red Team – Micah Zenko
The Applied Critical Thinking Handbook – UFMCS
Online:
Microsoft Enterprise Cloud Red Teaming whitepaper
Red Team Tradecraft / Adversary Simulation (2015) – Raphael Mudge
The Pyramid of Pain – David Bianco
Veris Group – Adaptive Threat Division – Will Schroeder and Justin Warner
The Adversary Manifesto – CrowdStrike

Editor's Notes

  • #2: It was a dark and stormy night. “Captain, captain, wake up!” “Ohh … What is it?” “Sorry to awaken you sir, but we have a serious problem.” “Well what is it?” “There’s a ship in our sea-lane about 20 miles away, and they refuse to move.” “Tell them to move.” “Sir, we have. They won’t move.” “I’ll tell them!” The signal goes out: “Move starboard 20 degrees. At once!” The signal returns: “Move starboard yourself 20 degrees. At once.” “I can’t believe this. I mean, I’m a captain. Let them know who I am. I am important.” The signal goes out: “This is Captain Horatio Hornblower the 26th commanding you to move starboard 20 degrees at once.” The signal returns: “This is seaman Carl Jones the third commanding you to move starboard 20 degrees at once.” “What arrogance! Who is this joker? I mean, we’re a battleship! We could just blow them out of the water. Let them know who we are!” The signal goes out: “This is the Mighty Missouri, Flagship of the Seventh Fleet!” The signal returns: “This is the Lighthouse.” That’s a story that's found in the Naval Proceedings Manual, where they literally interpreted a lighthouse to be a ship. I like the story because it helps me introduce this subject: there are specific, very common ways of thinking that make it very difficult to be effective at securing an organization. I want to share some of those thought patterns with you today so you will be aware of them and can hopefully operate your organization more effectively with that knowledge. So, I’d like to share with you some things I’ve learned in my career in information security. These are my perspectives and opinions on techniques for improving the security of your organization. The ideas are not new or revolutionary; I’m just trying to share what, in my experience, works well with regard to red teaming.
  • #3: So at a high level, we talk about…
  • #4: Just briefly, let me tell you my story… I’ve worn various security hats in my career, some defensive and some offensive. From helpdesk to red teaming, I’ve done about everything in between, and I like to think that gives me some perspective on the challenges of security in an organization.
  • #6: Both approach, mindset, and tactics. If you are a leader in an environment, you probably don’t know everything that is going on. If you are wise enough to come to this conclusion, you need a red team to bring an alternate perspective. That alternative perspective applies both to your problems and to the problems of your adversary.
  • #8: The earliest evidence of the origins of red teaming came out of military war-gaming exercises. 1976: hardliners in the Ford administration didn’t agree with the CIA’s conclusion and believed that the U.S. had a capability gap. “Team B,” a group of experts with access to all information about known Soviet military capabilities, came to an alternative conclusion compared to the CIA report.
  • #10: Example of a scotoma: a partial loss of vision, or a blind spot in an otherwise normal visual field. The red team’s responsibility is to see other teams’ blind spots and predict failures. To do this, they need to be aware of their own blind spots.
  • #11: Let’s try a game… Find all the red you can in the room… Now… where is the brown?
  • #14: Military examples. Translate these to real-world / business scenarios.
  • #16: Multiple contingency plans for multiple scenarios. As a result of the red team simulation, they are able to better protect the marathon. The IDF team is directed to come to the opposite conclusion of whatever the current plan or conventional wisdom is. They don’t just brief generals. They go to parliament. They brief the prime minister’s office and the prime minister’s Cabinet. One of the individuals I know who did the briefings describes the job as exhaustive: you have to essentially be argumentative by design. You have to challenge and doubt everything that happens.
  • #19: The key takeaway here is to understand that it is the highly skilled individual who can become an aggressor. You have to be good enough to restrict yourself to a specific capability or skillset, and that capability and skillset changes based on who you are emulating.
  • #20: Image credit: David Bianco
  • #21: Nobody wants to drop $100k on a FireEye appliance and find out it’s configured wrong.
  • #22: This is a great argument for red teams ingesting threat intelligence reports: they can work it into their tradecraft for red team operations. If you want to spend a year on an op working to get in with an 0-day, you can; but the simple fact is, if an adversary wants in badly enough, they will get in. Again, if you know an adversary’s MO, storyboard it, and determine where it could get caught and where defenses are lacking. Debrief after op completion. Teams need to be external in terms of culture, but internal and aware in terms of critical thought. It is demoralizing if the blue team gets crushed week in and week out.