Reverse Engineering the Wetware
Understanding Human Behavior to Improve Information Security
#retww
Alexandre Sieira
@AlexandreSieira
CTO @NiddelCorp
Principal @MLSecProject
Matt Hathaway
@TheWay99
Products Leader @Rapid7
Let’s Start On A Very Serious Note
Because we are very serious people. Seriously.
People… are not rational.
More evidence is available at @awardsdarwin if you’re still not convinced.
The People-Process-Technology Triad
• People define and execute processes.
• People decide funding, implement, operate, and/or monitor the technology.
• Your adversaries are people.
• People are key.
Yes, people. Really.
• A lot of humans are needed to keep any organization secure
• Executive teams – funding (or lack thereof)
• Security teams – build the process, use the technology
• IT teams – implement controls, operate infrastructure
• Security vendors – ummm… perfect people, of course, right?
• The end users – your organization wouldn’t exist without them
• Blame game achieves nothing
• Executives don’t get it.
• It’s the shitty tech, not us.
• It’s the shitty team, not the tech.
• I took that dumb training, but I want my bonus.
Why do Humans Think Like They Do?
• Wetware evolved to meet the survival requirements of a hunter-gatherer lifestyle and threat model:
• Low-latency, near-real-time decisions;
• High aversion to loss;
• False positives have much lower cost than false negatives;
• Social living in small communities where it is easy to keep track of reputations.
... so it's not really surprising that these heuristics don't work as well now.
Why do Humans Think Like They Do?
From "The Art of Thinking Security Clearly", RSAC USA 2015 presentation by @apbarros
System 1 operates automatically and quickly, with little or no effort, no sense of voluntary control, and no self-awareness.
"System 1 runs the show; that's the one you want to move." – Daniel Kahneman
System 2 allocates attention to the effortful mental activities that demand it, including complex computations. Its operations are often associated with the subjective experience of agency, choice, and concentration.
InfoSec, Know Thyself
No, not in the biblical sense. Wrong conference.
You Are Not Above Average At Everything
• Ever heard of this town? Yeah, me neither.
• "…all the women are strong, all the men are good-looking, and all the children are above average." – Garrison Keillor
• I'm a better-than-average X
• I am better than the sysadmin who got pwned at RSA.
Experts Who Are Actually Amateurs
• Successful lawyers of [major law firm]: "I spend all day avoiding traps. No phishing email is going to work on me."
• Some silly finance team wired $1 million to help the CFO close a last-minute deal in China. How could they fall for that?
Experts Who Feel Like Amateurs
• Can't get through all your email?
• In awe of a colleague's knowledge?
• Hungover and can't focus?
• Feel underqualified and at risk of being exposed at any moment?
• Recommended reading: @sroberts' "Imposter Syndrome in DFIR" blog
“Expert” Reality
• If you don't think you can be outwitted, you probably already are
• If you don't ever feel like an imposter, you're not challenging yourself to get better
Professional Certifications
• Can be objectively useful as signals and filters in the market for professionals and organizations (information asymmetry);
• Getting a certificate causes you to overestimate your own expertise;
H/T to @fsmontenegro
• Positive bias toward the vendor and/or other certificate holders:
• "If I went through the effort of certifying on that vendor's product and I consider myself a good person, then that vendor must be good too";
• Endowment effect;
• Might impact vendor selection and/or hiring processes.
More Tired == Less Rational
• Tired people are less likely to make rational decisions:
• Less oversight from System 2;
• Less capacity to avoid and resist temptation.
• Think IT maintenance windows and DFIR:
• Mandatory downtime for the people involved?
• Follow-the-sun teams working on their own mornings?
• References:
• https://hbr.org/2016/02/dont-make-important-decisions-late-in-the-day
• http://sloanreview.mit.edu/article/why-sleep-is-a-strategic-resource/
You Will Be Judged Unfairly
• Think the team was at fault for the Target/Neiman Marcus breaches?
• February 2014:
• "Hackers Set Off 60,000 Alerts While Bagging Credit Card Data"
• December 2014:
• Lawsuit not dismissed because of "disabling certain security features"
Your Heart Is Not Scientific
• Always had a "gut feeling" about something you couldn't prove?
• Know in your heart that every time you got sick, you waited too long?
• Maybe you have a specific event which always seems to reveal the truth?
We Search For Explanations In What We Have
• Ever heard an educated person explain that their team only lost because they didn't stick to their ritual?
• What if simply checking the hash against VirusTotal was enough to find malware the first time, so you kept doing it?
Test. Trick. Get Attention. Trigger System 2.
• Still not considering phishing your employees?
• It might be the only way they'll ever think twice
• Incident response exercises seem cheesy?
• Consider using random data or fake incidents to go further
Cheating and Morality
Stop shifting in your seats, not that kind of cheating.
Why and When do Humans Cheat?
• Rational humans would cheat when a cost-benefit analysis merited it:
• Personal gain from cheating;
• Chance of being caught;
• Penalty if caught.
• Most actual humans cheat when:
• There's a gain for self or others;
• It's possible to justify or rationalize;
• "Fudge factor" model.
Dan Ariely – Dishonesty
• "Distance" makes cheating easier to justify:
• Moving the golf ball: hand << foot << club;
• Same monetary values, different behaviors:
• Stolen Skype account and charges against a PayPal account
• "Would this person have taken cash from my wallet?"
Cybercrime Feels Victimless
• Just as cheating for tokens instead of cash removes a lot of the pain
• Only writing software or sending emails prevents criminals from feeling the guilt
• The vast majority would never rob a stranger at knifepoint
Conflicts of Interest and Self-Deception
• Dentists who purchase CAD/CAM machines prescribe more unnecessary treatments that use them;
• Art appreciation under fMRI experiment – positive bias towards vendors.
• "Perfectly well-meaning people can get tripped up on the quirks of the human mind, make egregious mistakes, and still consider themselves to be good and moral."
The WTH ("What the Hell") Effect
• The more we perceive ourselves as immoral, the more immoral behavior we allow ourselves;
• "Normalization of Deviance" effect over time;
• Pay attention to feedback from outsiders; make data-driven decisions when possible;
• "Resetting" events / rituals might help overcome this: confession, New Year's resolutions, the Truth and Reconciliation Commission in South Africa.
Payback and Altruism
• Coffee shop experiment shows revenge can be an unconscious justification for dishonesty:
• 45% returned extra money if the experimenter was polite;
• Only 14% if impolite.
• Social utility is also a factor:
• "Robin Hood" effect;
• Impact of not cheating on peers or underlings due to group quotas and bonuses.
The Value of Good Examples
"The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves", Dan Ariely
Priming the Morality Cache
• If you remind people about moral codes or their responsibility to act correctly, cheating is reduced:
• Honor code;
• Religious commandments;
• Personal commitment.
• Simply moving the signature to the top of an insurance form instead of the bottom increased self-reported car mileage by 15%;
• Implications for process and UI design.
Be Grateful for Unfriendly Auditors!
… now that's something you don't hear every day.
Designing Rules and Incentives
"When the rules are somewhat open to interpretation, when there are gray areas, and when people are left to score their own performance – even honorable games may be traps for dishonesty"
"The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves" – Dan Ariely
http://www.zdnet.com/article/facebook-engages-in-instagram-bug-spat-with-security-researcher/
But At Least We Can Trust Vendors, Right?
You must be nodding right now. Why else would this section be here?
Don't Trust Statistics… Without Specifics
• "…the malware that was used would have slipped or probably got past 90 percent of internet defenses that are out there today in private industry and [would have] challenged even state government." -- Joseph Demarest, assistant director of the FBI's cyber division
• "FBI: 90% Of US Companies Could Be Hacked Just Like Sony" -- Business Insider
Why Does FUD Marketing Still Exist?
• Ever hear "such a dangerous time we live in"?
• Did you FLY HERE?!
• What if you told your CFO a solution would prevent stolen HVAC accounts from authenticating?
InfoSec marketing rings a bell?
"The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves", Chapter 7
The Experts Again. From Adjacent Fields.
• Ever heard of "green lumber" trading?
• An expert on green lumber would make a great trader, right?
• NSA retiree == InfoSec expert?
• Intelligence official [asked about Snowden's skills]: "It's 2013 and the NSA is stuck in 2003 technology."
• Buy our detection because… HD!
Always Consider What's Unsaid
• Would you buy something that "eliminated 90% of herpes"?
• What if it said "10% of herpes survives"?
• So now "blocks 90% of malware"?
• "Pricing starts at $99"?
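A quick back-of-the-envelope calculation makes the reframing concrete (the yearly sample volume below is a made-up assumption for illustration, not a vendor figure):

```python
# What "blocks 90% of malware" leaves unsaid: the 10% that gets through.
samples_per_year = 50_000   # hypothetical number of malicious files hitting the org
block_rate = 0.90           # the marketing claim

missed = round(samples_per_year * (1 - block_rate))
print(f"'Blocks 90%' also means {missed:,} malicious samples get through per year")
```

Restating the same number as what survives, rather than what is blocked, is exactly the kind of System 2 check this slide is arguing for.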
Find Leadership Who Lets You Admit #FAIL
• Ever feel like it would be a waste not to finish your meal at a restaurant?
• Think you've already committed to this talk, so you should stay?
• How about "no more money for detection until the SIEM ROI is proven"?
Should You Always Push For 'Build' Over 'Buy'?
• Never give a product consideration because of a shitty demo?
• IT and security software UIs have been terrible for a decade, so they won't ever change, right?
• Think your security team can build better software every time?
Human Minds Find Patterns In Randomness
• Confident you can differentiate between random and malicious?
• Remember the iPod shuffle-mode complaints when a song was repeated?
• Manually reviewing logs all day would probably be best for finding patterns, though…
Look To Take The Conclusion From Humans
• Battling cognitive biases is hard
• We will make the wrong decision often and consistently (as covered here)
• Be aware of this and strive to call yourself out
• Humans are best at deciding what to do
• Identifying symptoms – machines
• Identifying root cause – machines
• Identifying solutions – humans
Recommended Reading
• The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves and everything else by @danariely
• The Black Swan: The Impact of the Highly Improbable and everything else by Nassim Nicholas Taleb
• Thinking, Fast and Slow by the legendary Daniel Kahneman
• Economic aspects of tech certifications by @fsmontenegro:
• https://fsmontenegro.wordpress.com/2016/01/03/professional-certifications-information-asymmetry/
• https://fsmontenegro.wordpress.com/2016/01/13/professional-certifications-behavioural-economics/
• RSAC USA 2015 – The Art of Thinking Security Clearly by @apbarros: https://www.rsaconference.com/events/us15/agenda/sessions/1531/the-art-of-thinking-security-clearly
Thank you!
Alexandre Sieira
@AlexandreSieira
CTO @NiddelCorp
Matt Hathaway
@TheWay99
Products Leader @Rapid7
• Q & A?
• Feedback?


Editor's Notes

  • #6: "I did my job" is not an effective approach. Love "you can't patch stupid", but it's inaccurate.
  • #10: If you think you’re the best at everything, you’re either horribly delusional or just not observant. Lake Wobegon effect (illusory superiority)
  • #11: Will never forget the discussion with the law firm's director of security. Fake e-mail from the CEO at Magnolia Health Corporation – just a week ago.
  • #13: Expert biases can certainly go the other way, too.
  • #14: This all means you probably have the wrong view of your own expertise, so you have to directly consider that
  • #15: Experiment: Allow people to cheat, and they do; Ask them to predict future performance knowing that cheating will not be possible; People estimate on cheating performance (75%), not actual performance (50%); Monetary reward for correct performance estimation made no difference.
  • #16: Ego depletion. Judges reviewing parole cases: more granted early morning and right after lunch. Social engineering at the end of shifts?
  • #17: Public, with insufficient evidence. Dan Ariely founded the "Center for Advanced Hindsight" at Duke University as a joke about this effect. http://www.bloomberg.com/bw/articles/2014-02-21/neiman-marcus-hackers-set-off-60-000-alerts-while-bagging-credit-card-data
  • #18: Going with your gut will only justify the "quick to judge" crowd. Illusion of validity.
  • #19: Illusory correlation. Have you prevented attacks by rebooting the firewall appliance regularly?
  • #20: To summarize: you need to question everything. Especially, your first reaction.
  • #21: Why do we care? How attackers think; why IT people bypass security measures.
  • #23: Club: 23%; kick: 14%; hand: 10%. Make losses concrete in awareness training; make consequences clearer in UI / tickets.
  • #24: What if you're not even stealing the data? Think your conscience is capable of justifying such behavior?
  • #25: Your actions can impact how independent your decisions are. Conflict of interest: a consultant doing risk analysis / recommendations while also selling controls. When it was known a gallery sponsored the lunch break, the reward center (ventromedial frontal cortex) of subjects' brains reacted more favorably to the same painting when associated with the gallery's logo. When asked, subjects genuinely said the gallery had no impact on their decision.
  • #28: Start of discussion on practical ways to reduce dishonesty.
  • #29: The reminder works even if the reference is wrong, such as a non-existent MIT/Yale honor code or atheists swearing on the Bible.
  • #31: On bug bounties, researchers are basically deciding by themselves what to do and how to proceed, but they are the ones who stand to gain from finding vulns – a conflict of interest of sorts. Creating rules and processes allows that to be done. Shout out to BugCrowd and HackerOne.
  • #33: Don’t trust the headline. Read the details.
  • #34: Availability heuristic (and then, availability cascade). Visible and newsworthy threats lead to funding – not necessarily the most relevant ones.
  • #36: When a company pitches its people as a reason to buy its software, does it seem relevant? http://www.dailykos.com/story/2013/08/27/1234282/-The-NSA-s-astounding-internal-security-nbsp-failures#
  • #37: If it sounds really exciting, make sure to invoke System 2 before proceeding. Framing Effect
  • #38: Always try to acknowledge when your investments fail so you can find an alternative sooner SUNK-COST Fallacy
  • #39: Don’t constantly look for yet another reason that your team should build everything. Keep your expectations high, but look for something to surpass them. Confirmation Bias
  • #40: Because opting to use no technology will get you stuck in rabbit holes. Clustering illusion Manual review of large datasets yields patterns, even when none exist
  • #41: So we all have to do what? Second guess ourselves. With too few security pros, we need to use technology where effective. Two meanings of analytics: allowing analyst to do whatever analysis he wants or automating decision making and detection on a statistically sound way.