Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information
provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the
specific circumstances of your situation.
Darkness Falls Fast
Developing Actionable Security Metrics
to Protect Enterprises
Brian Bissett
Department of the Treasury
Bureau of Fiscal Service
Bio-IT World Conference 2021
Good Metrics
Irrespective of industry or initiative, a good metric will be:
 Specific.
 Relevant.
 Repeatable (results fall within an acceptable margin of error when run under identical circumstances).
 Aligned with business goals, quantitative, controllable, and able to be control charted (trendable).
 A leading indicator with defensible causal relationships to business outcomes.1
 Objective, bearing a clear relationship to the business of the enterprise and its goals, with context and meaning.
 Low in overhead.
Great Metrics
A Great Metric will:
 Immediately convey whether a situation is good or bad, normal or abnormal.2
 Provide insight into business implications.
 Be objective and bear a clear relationship to the business of the enterprise and its goals, with context and meaning.
 Be composed of cardinal numbers: ratios, absolute numbers, or percentages.3
 Articulate what is most important to the organization.
 Have a first-order cause-and-effect relationship (ideal).
The Element of Time
 Forrester Research - enterprises need a mixture of
metrics that lag, lead, and are coincident to the
enterprise.4
 Gartner’s recommendation (Rule Number 4) for security contexts: “Choose metrics that are forward looking.”5
 So who is right?
 Leading or forward-looking metrics are the most valuable to an organization, yet such metrics are speculation based on past performance, expert opinion, and other factors subject to debate and error.
 Expert opinion lies at the bottom of the hierarchy of
evidence. This makes Gartner’s recommendation very
controversial.
Leading, Lagging, or Coincident?
 A lagging metric will highlight the results of past
decisions.
 A coincident or real time metric will provide a snapshot
of the current situation.
 Leading indicators provide predictive data points.
 Past performance may be the single best predictor of
future behavior, but:
 Trends do not continue forever; they reach an asymptotic limit, “burn out,” or crash or spike due to a supply issue.
 Future behavior is modeled on parameter estimation from expert opinion, and “experts” are frequently wrong.
 Everyone is subject to confirmation bias.
Cisco Metric Types
The Cisco framework utilizes two types of metrics6:
1. A ratio or percentage measurement (typically a pass/fail type of metric).
2. An on-time correction metric (measures whether a vulnerability was rectified in the time allotted for its closure).
Federal standards for rectification time limits are* (usually – exceptions exist):

Severity    Rectification Time Limit
Critical    15 Days
High        30 Days
Moderate    30 Days

* Binding Operational Directive 19-02.
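The on-time correction metric lends itself to a simple calculation. The following is a minimal, illustrative sketch only (not Cisco's implementation); the time limits mirror the table above, and the finding fields (severity, detected, remediated) are hypothetical names.

```python
from datetime import date

# Rectification time limits in days, mirroring the table above (illustrative).
TIME_LIMITS = {"critical": 15, "high": 30, "moderate": 30}

def remediated_on_time(severity: str, detected: date, remediated: date) -> bool:
    """True if the vulnerability was closed within its allotted window."""
    return (remediated - detected).days <= TIME_LIMITS[severity.lower()]

def on_time_correction_rate(findings: list) -> float:
    """Share of closed findings remediated within their severity's time limit."""
    closed = [f for f in findings if f.get("remediated") is not None]
    if not closed:
        return 0.0
    on_time = sum(
        remediated_on_time(f["severity"], f["detected"], f["remediated"])
        for f in closed
    )
    return on_time / len(closed)

# Example with made-up findings:
findings = [
    {"severity": "Critical", "detected": date(2021, 3, 1), "remediated": date(2021, 3, 10)},
    {"severity": "High", "detected": date(2021, 3, 1), "remediated": date(2021, 4, 15)},
]
print(f"On-time correction rate: {on_time_correction_rate(findings):.0%}")  # 50%
```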
Cisco Vulnerability Metric Framework
The Cisco Vulnerability Metric Framework divides
vulnerabilities into three categories: technology, process,
and people.
1. Technology - factors such as antimalware compliance,
stack compliance, application security weaknesses, and
open security exceptions.
2. Process - weaknesses in the architecture of the enterprise and the processes that allow access to the enterprise.
3. People - security awareness of the people who have access to the enterprise.
Cisco “maturity level” for metrics
Cisco defines a “maturity level” for metrics.
Order from least to most is:
1. Ad hoc (least)
2. Reactive
3. Proactive
4. Predictive (most)
Common Industry Metrics7
IT security spending as a percent of total IT spending:
= (IT security spending / total IT spending) x 100
 The relative level of investment to support the security of the enterprise from the perspective of the total IT portfolio.
 IT security spending per employee:
= IT security spending / total employees
 Insight on the level of investment the enterprise is making to develop and maintain both security-conscious employees and the protection of the environments they work within.
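Both spending metrics are straightforward ratios. A minimal sketch with made-up figures (the dollar amounts and headcount below are illustrative, not benchmark values):

```python
def security_spend_pct_of_it(it_security_spending: float, total_it_spending: float) -> float:
    """IT security spending as a percent of total IT spending."""
    return it_security_spending / total_it_spending * 100

def security_spend_per_employee(it_security_spending: float, total_employees: int) -> float:
    """IT security spending per employee."""
    return it_security_spending / total_employees

# Illustrative figures only.
print(security_spend_pct_of_it(4_500_000, 60_000_000))  # 7.5 (%)
print(security_spend_per_employee(4_500_000, 9_000))    # 500.0 (per employee)
```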
Common Industry Metrics II7
IT security spending per thousand dollars of revenue:
= IT security spending / (total revenue / 1000)
 The metric is a ratio, and the denominator is expressed in thousands to prevent the value of the metric from being a very small number.
 Security spending distribution by functional area:
= Area IT security spending / Total IT security spending
 Indicates types of investments the enterprise is making.
 Mapping of where resources are being applied relative to operational risk and agency strategic plans.
 Snapshot of tradeoffs made, and the winners and losers.
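Continuing the sketch above with the same illustrative (made-up) figures, the revenue-normalized and functional-area versions look like this:

```python
def security_spend_per_thousand_revenue(it_security_spending: float, total_revenue: float) -> float:
    """IT security spending per thousand dollars of revenue."""
    return it_security_spending / (total_revenue / 1000)

def spend_distribution_by_area(area_spending: dict) -> dict:
    """Share of total IT security spending attributable to each functional area."""
    total = sum(area_spending.values())
    return {area: spend / total for area, spend in area_spending.items()}

# Illustrative figures only.
print(security_spend_per_thousand_revenue(4_500_000, 2_000_000_000))  # 2.25
print(spend_distribution_by_area({
    "Identity and Access Management": 1_200_000,
    "Network Security": 1_800_000,
    "End Point Security": 900_000,
    "Data Security": 600_000,
}))  # e.g. Network Security -> 0.4
```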
Human Capital7
 Measure of IT security support intensity from a human capital perspective.
 IT security FTEs as a percentage of Total Employees:
= (IT security FTEs / Total Employees) x 100
 IT security FTEs as a percentage of Total IT FTEs:
= (IT security FTEs / Total IT FTEs) x 100
 Assists in determining if staff size for the enterprise is appropriate.
 Can also be broken down to personnel in common areas such as: Identity and Access Management, Network Security, End Point Security, and Data Security.
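Again a trivial calculation; the headcounts below are assumed purely for illustration:

```python
def pct(numerator: float, denominator: float) -> float:
    """Simple percentage helper."""
    return numerator / denominator * 100

it_security_ftes, total_it_ftes, total_employees = 45, 600, 9_000  # assumed headcounts

print(f"Security FTEs as % of total employees: {pct(it_security_ftes, total_employees):.2f}%")  # 0.50%
print(f"Security FTEs as % of total IT FTEs:   {pct(it_security_ftes, total_it_ftes):.2f}%")    # 7.50%
```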
Human Capital by Functional Area
IT security staffing distribution by functional area indicates personnel investments by function.
Common Functional Areas Include:
 Identity and Access Management
 Network Security
 End Point Security
 Data Security
 Governance
 Risk
 Compliance Management
 The distribution of operational infrastructure security
staffing by task provides an understanding of how security
FTEs are dispersed to support the technology
environments.
These areas tend to be personnel intensive, with significant qualitative factors.
Limitations of Existing Metrics
 Existing security metrics exhibit a low level of
correlation with vulnerabilities and attacks.
 Often, they fail to provide an adequate assessment of
security.
 The number of vulnerability exploits is not proportional
to the total number of vulnerabilities discovered in a
Windows operating system.
 There is no apparent correlation between the number
of vulnerabilities discovered, and the size of the OS
code.
 This suggests the existence of deployment-specific
factors, yet to be characterized systematically, that
influence the security of systems in active use.
Commonly Neglected (but Important) Metrics
 Coverage – the type of scanning: agent-based, authenticated with a username and password, or unauthenticated?
 Vulnerability Dwell Time – the time a known vulnerability remains active on an enterprise.
 Average number of Vulnerabilities per Asset over time – measure vulnerabilities over a continuous period of time. Do not rely on scan results that may not have seen all assets during a scan and show drops that are in actuality simply deviations (scanning gaps).
 Remediation of vulnerabilities vs. SLAs – how quickly an organization or its agents successfully remediate its vulnerabilities demonstrates program effectiveness (especially for cloud-based assets).
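Dwell time and per-asset averages are easy to compute once findings are tracked continuously. A minimal sketch with hypothetical data structures (the field layout is assumed, not taken from any particular scanner):

```python
from datetime import date
from typing import Optional

def dwell_time_days(first_detected: date, closed: Optional[date], today: date) -> int:
    """Days a known vulnerability has remained active on the enterprise (until closure or today)."""
    return ((closed or today) - first_detected).days

def avg_vulns_per_asset(daily_open_counts: dict, asset_count: int) -> float:
    """Average open vulnerabilities per asset over a continuous observation period, not single scans."""
    return sum(daily_open_counts.values()) / (len(daily_open_counts) * asset_count)

# Illustrative data only.
today = date(2021, 5, 1)
print(dwell_time_days(date(2021, 3, 1), None, today))  # 61 days and counting
daily_counts = {date(2021, 4, d): 1200 + 10 * d for d in range(1, 31)}
print(round(avg_vulns_per_asset(daily_counts, asset_count=500), 2))  # 2.71
```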
Novel New Metrics
 Symantec Research Labs and The University of
Maryland at College Park have proposed new security
metrics which are measured in the deployment
environment.
 Once a system is deployed, security becomes a moving
target as attackers exploit new vulnerabilities (to
subvert the system's functionality), vendors distribute
software updates (to patch vulnerabilities and improve
security), and users reconfigure the system to add
functionality.
 The following four new metrics are derived from field-
gathered data and thus capture the state of system
security as experienced by the end users.
Vulnerabilities Exploited in the Wild
V_p^ex = a count of vulnerabilities exploited in the wild.8
 For a product p, it counts the subset of the product's disclosed vulnerabilities that have been exploited in the wild.
 Prior research has suggested that these signatures
represent the best indicator for which vulnerabilities are
exploited in real-world attacks.
 Metric combines information from the National
Vulnerability Database (NVD) and Symantec’s databases of
attack signatures to obtain the subset of a product’s
disclosed vulnerabilities that have been exploited.
 The NVD is a public database of software vulnerabilities which is widely accepted for vulnerability research.
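In practice V_p^ex reduces to a set intersection between a product's disclosed CVE identifiers and the CVEs referenced by attack signatures. The sketch below uses placeholder CVE sets standing in for the NVD and signature feeds; it does not query either source.

```python
def exploited_vulnerability_count(product_cves: set, exploited_cves: set) -> int:
    """V_p^ex: disclosed vulnerabilities of product p that also appear in attack-signature data."""
    return len(product_cves & exploited_cves)

# Placeholder CVE sets standing in for NVD entries and attack-signature mappings.
nvd_cves_for_p = {"CVE-2014-0001", "CVE-2014-0002", "CVE-2014-0003"}
signature_cves = {"CVE-2014-0002", "CVE-2014-0099"}
print(exploited_vulnerability_count(nvd_cves_for_p, signature_cves))  # 1
```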
Exploitation ratio
The exploitation ratio is the proportion of disclosed
vulnerabilities for a product p that have been exploited up
until time t.8
ER_p(t) = V_p^ex(t) / V_p(t)
 It captures the likelihood that a vulnerability will be
exploited at time t.
 Ratio is time dependent.
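As a quick worked example (counts invented for illustration): if 480 vulnerabilities of p have been disclosed by time t and 12 of them have been exploited, then ER_p(t) = 12 / 480 = 0.025, a 2.5% exploitation ratio at that point in time.

```python
def exploitation_ratio(exploited_count: int, disclosed_count: int) -> float:
    """ER_p(t) = V_p^ex(t) / V_p(t) for the counts observed up to time t."""
    return exploited_count / disclosed_count if disclosed_count else 0.0

print(exploitation_ratio(exploited_count=12, disclosed_count=480))  # 0.025
```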
Attack Volume
 Attack Volume is a measure of how frequently a
product p is attacked.8
 AV_p = (attacks exploiting a vulnerability of p) / (machine-months with p installed)
 Intuitively, it is the average number of attacks
experienced by a machine in a month due to a product
p being installed.
 It is the number of attacks that exploit a vulnerability of
p against hosts with p installed, normalized by the total
number of machine-months during which p was
installed.
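A minimal sketch with invented totals (2,400 observed attacks across 120,000 machine-months of installation):

```python
def attack_volume(attack_count: int, machine_months_installed: float) -> float:
    """AV_p: attacks exploiting vulnerabilities of p, per machine-month with p installed."""
    return attack_count / machine_months_installed

print(attack_volume(2_400, 120_000))  # 0.02 attacks per machine-month
```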
Exercised Attack Surface
 The exercised attack surface captures the portion of
theoretical attack surface of a host that is targeted in a
particular month.8
 EAS_h^p(m), for host h, product p, and month m
 Intuitively, the exercised attack surface is the number of
distinct vulnerabilities that are exploited on a host h in
a given month m.
 The exercised attack surface attributable to a particular
product can be computed for a particular time interval
depending upon situational awareness required.
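Counting distinct exploited vulnerabilities per host and month is a small grouping exercise. A sketch over a hypothetical stream of (host, month, CVE) attack observations:

```python
from collections import defaultdict

def exercised_attack_surface(attack_events: list) -> dict:
    """Distinct exploited vulnerabilities per (host, month): the exercised attack surface."""
    seen = defaultdict(set)
    for host, month, cve in attack_events:
        seen[(host, month)].add(cve)
    return {key: len(cves) for key, cves in seen.items()}

# Illustrative attack observations: (host, month, exploited CVE).
events = [
    ("host-a", "2014-03", "CVE-2014-0002"),
    ("host-a", "2014-03", "CVE-2014-0002"),  # repeat exploit of the same CVE counts once
    ("host-a", "2014-03", "CVE-2013-3897"),
]
print(exercised_attack_surface(events))  # {('host-a', '2014-03'): 2}
```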
CVSS Vulnerability Measurement
 Vulnerability measurement remains one of the most
popular areas for metric development.
 The Common Vulnerability Scoring System (CVSS) was
designed to provide an overall composite score
representing the severity and risk of a vulnerability.9
 The CVSS score is derived from metrics and formulas.
 Metrics are in three distinct categories that can be
quantitatively or qualitatively measured.
 Base metrics contain qualities that are intrinsic to any given
vulnerability that do not change over time or in different
environments.
 Temporal metrics contain vulnerability characteristics which evolve over the lifetime of a vulnerability.
 Environmental metrics contain those vulnerability characteristics
which are tied to a specific implementation in an enterprise.
The 7 CVSS Base Metrics9
1. Access Vector (AV): is the vulnerability exploitable locally or remotely?
2. Access Complexity (AC): the complexity of the attack required to exploit the vulnerability once access is gained to the target system (high or low).
3. Authentication (A): does an attacker need to be authenticated to the target system in order to exploit the vulnerability?
4. Confidentiality Impact (CI): the impact on confidentiality of a successful exploit of the vulnerability on the target system (none, partial, or complete).
5. Integrity Impact (II): the impact on integrity of a successful exploit of the vulnerability on the target system (none, partial, or complete).
6. Availability Impact (AI): the impact on availability of a successful exploit of the vulnerability on the target system (none, partial, or complete).
7. Impact Bias (IB): allows a score to convey greater weighting to one of the three impact metrics over the other two.
CVSS Temporal & Environmental10
Temporal Metrics represent the time-dependent features of the vulnerability under the CVSS framework:
1. Exploitability (the difficulty involved in exploiting the
vulnerability).
2. Remediation level (the maturity level of a fix).
3. Report confidence (the credibility of the threat).
Environmental Metrics represent the implementation- and environment-specific features of the vulnerability under the CVSS framework:
1. Collateral Damage Potential (CDP), measures the potential
for a loss of physical equipment, property damage, or loss
of life or limb. (None, low, medium, or high).
2. Target Distribution (TD), measures the relative size of the
field of target systems susceptible to the vulnerability.
(None, low, medium, or high).
A Novel Vulnerability Method
 Moving Averages can help discern when Vulnerability
growth exceeds historical norms.
 A short-term moving average and a long-term moving average are calculated for the enterprise.
 When the short-term moving average crosses above the long-term moving average, it is indicative of faster-than-normal vulnerability growth and/or a lack of sufficient remediation.
 When the short-term moving average crosses below the long-term moving average, it is indicative of successful vulnerability remediation efforts restoring norms.
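A minimal sketch of this crossover test, assuming daily open-vulnerability counts as input; the 7-day and 30-day windows are arbitrary here and should be tuned as discussed under Moving Average Considerations below.

```python
def moving_average(series, window):
    """Trailing moving average; positions before a full window are None."""
    return [
        sum(series[i + 1 - window : i + 1]) / window if i + 1 >= window else None
        for i in range(len(series))
    ]

def crossover_flags(daily_vuln_counts, short_window=7, long_window=30):
    """Flag days where the short-term MA crosses the long-term MA."""
    short_ma = moving_average(daily_vuln_counts, short_window)
    long_ma = moving_average(daily_vuln_counts, long_window)
    flags = []
    for i in range(1, len(daily_vuln_counts)):
        if None in (short_ma[i - 1], long_ma[i - 1], short_ma[i], long_ma[i]):
            continue
        prev_diff = short_ma[i - 1] - long_ma[i - 1]
        diff = short_ma[i] - long_ma[i]
        if prev_diff <= 0 < diff:
            flags.append((i, "exceeding norms"))       # growth outpacing remediation
        elif prev_diff >= 0 > diff:
            flags.append((i, "restoration to norms"))  # remediation catching up
    return flags

# Illustrative series: a stable baseline, a growth spike, then remediation.
counts = [100.0] * 40 + [100.0 + 5 * d for d in range(1, 21)] + [200.0 - 8 * d for d in range(1, 21)]
print(crossover_flags(counts))
```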
Moving Average Flags
[Chart: short-term vs. long-term moving averages of open vulnerability counts, with crossover points annotated “Exceeding Norms” and “Restoration to Norms.”]
Moving Average Considerations
 The short and long term moving average intervals
(windows) must be set appropriately, taking into
consideration vulnerability severity.
 The more severe the vulnerability, the smaller the short
term moving average interval (window) should be.
 The short term window should not exceed the required
remediation time for the severity of the vulnerability.
 Baseline creep is a reality when utilizing the moving
average technique.
 The baseline tends to deviate from its starting norms
with time, often upward.
 This can be adjusted for with hard stop upper and
lower control limits based on historical norms.
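One possible (purely illustrative) way to apply the hard stops: compare the raw counts against fixed upper and lower control limits derived from historical norms, independent of the drifting moving-average baseline. The thresholds below are assumptions for the example, not recommended values.

```python
def control_limit_breaches(daily_vuln_counts, lower_limit, upper_limit):
    """Indices of days on which the count escapes the hard-stop control limits."""
    return [
        i for i, count in enumerate(daily_vuln_counts)
        if not (lower_limit <= count <= upper_limit)
    ]

# Illustrative: a ~100-vulnerability historical baseline with limits set at 70 and 130.
counts = [100.0] * 40 + [100.0 + 5 * d for d in range(1, 21)]
print(control_limit_breaches(counts, lower_limit=70.0, upper_limit=130.0))  # days 46 onward
```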
Balanced Scorecards
 Robert Kaplan and David Norton of Harvard University
developed a concept called the “Balanced Scorecard.”11
 Adapted to security, the Balanced Scorecard helps bridge
the gap between information security and management.
 The Center for Internet Security (CIS) has defined twenty-eight significant metrics encompassing seven business functions: incident management, vulnerability management, patch management, configuration management, change management, application security, and financial metrics.12
 The Center for Internet Security advocates security
scorecards with only three main sections: Impact,
Operations, and Financial.
Forrester Best Scorecard Practices
 Forrester Research recommends that seven or fewer
metrics be used when presenting metrics on a
scorecard to senior executives.13
 Scorecard updating should be automated.
 Do not rely solely on absolute numbers; Forrester advocates tracking proportions as well.
 Forrester lists six categories to track in Balanced Security Scorecards:
 1) demographics; 2) security; 3) compliance; 4) administration cost and efficiency; 5) business agility and service delivery; and 6) customer-facing Identity and Access Management.
Wrap Up
 There exists a plethora of metrics for discerning situational awareness of the state of IT security within an enterprise.
 Significant gaps in awareness exist with many of the
more commonly used metrics, while some commonly
neglected metrics frequently tell a more holistic story.
 Not all metrics are applicable to every enterprise, and
customized metrics and/or parameterization may be
necessary depending on organizational needs.
 Recognize that metrics which measure “what got you here” may not facilitate getting to the next goal post.
 Metrics are frequently time dependent and have boundary conditions; determine these limitations.
References
1. Paul Proctor, Jeffrey Wheatman, Rob McMillan. “Develop Key Risk Indicators
and Security Metrics That Influence Business Decision Making.”, Page 1,
(Gartner Research. ID G00276149. 31 July 2015.)
2. Rob McMillan. “Sharpen Your Security Metrics to Make Them Relevant and
Effective.”, Page 6, (Gartner Research. ID G00259303. Refreshed 5 December
2016, Published 13 May 2014.)
3. Rob McMillan. “Five Required Characteristics of Security Metrics.”, Page 2,
(Gartner Research. ID G00245748. Refreshed 3 March 2017, Published 5
December 2012.)
4. Stephanie Balaouras, Laura Koetzle, Chase Cunningham, Jeff Pollard, Heidi
Shey, Bill Barringham, Peggy Dostie. “Craft Zero Trust Security Metrics That
Matter To Your Business, Performance Management: The Security
Architecture and Operations Playbook.”, Page 4, (Forrester Research. March
27, 2018.)
5. Jeffrey Wheatman, Rob McMillan. “Apply Five Rules to Your Security
Metrics.”, Page 8, (Gartner Research. ID G00341872. 7 November 2017).
6. Gerwin Tijink, Hessel Heerebout. “Unified Security Metrics”, Page 5, Cisco
White Paper. C11-737409. 2016.
7. Stegman. “IT Key Metrics Data 2019.”, Page 21, (Gartner Research. ID G00375660).
8. Kartik Nayak, Daniel Marino, Petros Efstathopoulos, Tudor Dumitras. “Some Vulnerabilities Are Different Than Others: Studying Vulnerabilities and Attack Surfaces in the Wild.”, Pages 1-2, (University of Maryland, College Park. Symantec Research Labs. International Symposium on Research in Attacks, Intrusions and Defenses 2014. 17 September 2014.)
9. Victor-Valeriu Patriciu, Iustin Priescu, Sebastian Nicolaescu, “Security Metrics
for Enterprise Information Systems”, page 153, (Journal of Applied
Quantitative Methods, Vol 1, No.2, Winter 2006).
10. Wayne Jansen. “Directions in Security Metrics Research.”, Page 8, (Computer Security Division, Information Technology Laboratory, National Institute of Standards and Technology. NISTIR 7564. April 2009).
11. Andrew Jaquith. “Proving Your Worth: Follow These Steps to Create a
Successful Security Metrics Program.”, Page 31, Information Security. March
2010.
12. “The CIS Security Metrics.”, Page 8, (The Center for Internet Security.
November 1, 2010.)
13. Andras Cser, Merritt Maxim, Stephanie Balaouras, Madeline Cyr, Bill
Barringham, Peggy Dostie. “Develop Actionable Business-Centric Identity And
Access Management Metrics, Performance Management: The Identity And
Access Management Playbook.”, Page 5, (Forrester Research. July 27, 2018.).
More Related Content

PPTX
Program Management of SSA's Data Center OMB 300 Program
PPTX
Information Security Risk Management
PDF
Why Your Organization Should Leverage Data Science for Risk Intelligence and ...
PDF
Information Technology Vendor Risk Management
PDF
Aligning Risk Management with ITIL
PDF
Effective Security Metrics
PDF
Using Security Metrics to Drive Action
PPTX
Presenting Metrics to the Executive Team
Program Management of SSA's Data Center OMB 300 Program
Information Security Risk Management
Why Your Organization Should Leverage Data Science for Risk Intelligence and ...
Information Technology Vendor Risk Management
Aligning Risk Management with ITIL
Effective Security Metrics
Using Security Metrics to Drive Action
Presenting Metrics to the Executive Team

What's hot (20)

DOCX
case studies on risk management in IT enabled organisation(vadodara)
PPTX
Security Metrics Program
PDF
Cybersecurity Incident Management PowerPoint Presentation Slides
PDF
Integrated risk management
PDF
Improving Security Metrics
PDF
Key Risk Indicators - Changing the Reference Points
PDF
Information Security Strategic Management
PDF
An Intro to Core
PPTX
Key risk indicators shareslide
PPT
Risk Management: A Holistic Organizational Approach
PPTX
PDF
IT Security and Risk Management - Visionet Systems
PDF
IIA Facilitated Risk Workshop
PDF
u10a1-Risk Assessment Report-Beji Jacob
PDF
Allgress Brochure
PPTX
KRI (Key Risk Indicators) & IT
PPTX
Emerging Need of a Chief Information Security Officer (CISO)
PDF
Infographic - Critical Capabilities of a Good Risk Management Solution
PDF
The Journey to Integrated Risk Management: Lessons from the Field
PDF
Reporting to the Board on Corporate Compliance
case studies on risk management in IT enabled organisation(vadodara)
Security Metrics Program
Cybersecurity Incident Management PowerPoint Presentation Slides
Integrated risk management
Improving Security Metrics
Key Risk Indicators - Changing the Reference Points
Information Security Strategic Management
An Intro to Core
Key risk indicators shareslide
Risk Management: A Holistic Organizational Approach
IT Security and Risk Management - Visionet Systems
IIA Facilitated Risk Workshop
u10a1-Risk Assessment Report-Beji Jacob
Allgress Brochure
KRI (Key Risk Indicators) & IT
Emerging Need of a Chief Information Security Officer (CISO)
Infographic - Critical Capabilities of a Good Risk Management Solution
The Journey to Integrated Risk Management: Lessons from the Field
Reporting to the Board on Corporate Compliance
Ad

Similar to IT Security Metrics (20)

PPTX
Managing Reputation
PDF
WHEN Group Investor Deck
PDF
How close is your organization to being breached | Safe Security
PPT
2007 CPM West Keynote Presentation
PDF
EMEA: Using Security Metrics to Drive Action - 22 Experts Share How to Commun...
PDF
Titas Global Ltd
PDF
Marsh Analytics - CFO com
PPTX
Global trends in AI and healthcare
PDF
Trustwave: 7 Experts on Transforming Your Threat Detection & Response Strategy
PDF
Risksense: 7 Experts on Threat and Vulnerability Management
PDF
The Critical Incident Response Maturity Journey
 
PDF
DSS Investor Deck
PDF
DSS Investor Deck
PDF
Risk Mgmt - Define_And_Articulate
PDF
Experion Data Breach Response Excerpts
DOCX
PADM 530 CASE STUDY PROJECT PART 1 DECLARED JURISDICTION.docx
PDF
disaster-recovery-online
PDF
Emerging Threats Report 2013
PPT
ComplianceOnline PPT Format 2015 Developing an Effective Fraud Risk Managemen...
Managing Reputation
WHEN Group Investor Deck
How close is your organization to being breached | Safe Security
2007 CPM West Keynote Presentation
EMEA: Using Security Metrics to Drive Action - 22 Experts Share How to Commun...
Titas Global Ltd
Marsh Analytics - CFO com
Global trends in AI and healthcare
Trustwave: 7 Experts on Transforming Your Threat Detection & Response Strategy
Risksense: 7 Experts on Threat and Vulnerability Management
The Critical Incident Response Maturity Journey
 
DSS Investor Deck
DSS Investor Deck
Risk Mgmt - Define_And_Articulate
Experion Data Breach Response Excerpts
PADM 530 CASE STUDY PROJECT PART 1 DECLARED JURISDICTION.docx
disaster-recovery-online
Emerging Threats Report 2013
ComplianceOnline PPT Format 2015 Developing an Effective Fraud Risk Managemen...
Ad

More from Brian Bissett (17)

PDF
Automating Data Analysis with Excel Bio-IT World 2018
PDF
Deaths by Shooting in Baltimore before and after the Firearms Safety Act of 2...
PPTX
Bio-IT 2017 Automation
PPSX
Presentation given at Bio-IT World 2016 as a Senior Member of the IEEE on the...
PPTX
Lies, Damn Lies, and Big Data
PPTX
Data Analytics of Strategic Information Technology Asset Reviews
PDF
ElogDoct: A Tool for Lipophilicity Determination in Drug Discovery. 2. Basic ...
PDF
ElogPoct: A Tool for Lipophilicity Determination in Drug Discovery
PDF
Automating pKa Curve Fitting Using Origin
PDF
Physicochemical Profiling In Drug Research
PDF
Addressable Location Indicator Apparatus and Method
PDF
Automated Kinetic Solubility Assay Apparatus and Method
PDF
Multivariate Analysis Of Energy Policy Options Using Lindo
PPT
Bio-IT World 2009: Adjusting Information Flow from In-house HTS to Global Out...
PPT
Advanced Excel Technologies In Early Development Applications
PPS
Development of Pfizer's Third Generation Turbidimetric Solubility Assay - An ...
PPS
Bridging Pharma And IT 2008
Automating Data Analysis with Excel Bio-IT World 2018
Deaths by Shooting in Baltimore before and after the Firearms Safety Act of 2...
Bio-IT 2017 Automation
Presentation given at Bio-IT World 2016 as a Senior Member of the IEEE on the...
Lies, Damn Lies, and Big Data
Data Analytics of Strategic Information Technology Asset Reviews
ElogDoct: A Tool for Lipophilicity Determination in Drug Discovery. 2. Basic ...
ElogPoct: A Tool for Lipophilicity Determination in Drug Discovery
Automating pKa Curve Fitting Using Origin
Physicochemical Profiling In Drug Research
Addressable Location Indicator Apparatus and Method
Automated Kinetic Solubility Assay Apparatus and Method
Multivariate Analysis Of Energy Policy Options Using Lindo
Bio-IT World 2009: Adjusting Information Flow from In-house HTS to Global Out...
Advanced Excel Technologies In Early Development Applications
Development of Pfizer's Third Generation Turbidimetric Solubility Assay - An ...
Bridging Pharma And IT 2008

Recently uploaded (20)

PDF
168300704-gasification-ppt.pdfhghhhsjsjhsuxush
PPTX
Supervised vs unsupervised machine learning algorithms
PDF
Recruitment and Placement PPT.pdfbjfibjdfbjfobj
PDF
Fluorescence-microscope_Botany_detailed content
PPTX
Microsoft-Fabric-Unifying-Analytics-for-the-Modern-Enterprise Solution.pptx
PDF
Lecture1 pattern recognition............
PPT
Quality review (1)_presentation of this 21
PPTX
Introduction to Firewall Analytics - Interfirewall and Transfirewall.pptx
PDF
Clinical guidelines as a resource for EBP(1).pdf
PPTX
iec ppt-1 pptx icmr ppt on rehabilitation.pptx
PPTX
climate analysis of Dhaka ,Banglades.pptx
PPTX
Business Acumen Training GuidePresentation.pptx
PPTX
The THESIS FINAL-DEFENSE-PRESENTATION.pptx
PPT
Reliability_Chapter_ presentation 1221.5784
PDF
.pdf is not working space design for the following data for the following dat...
PPTX
mbdjdhjjodule 5-1 rhfhhfjtjjhafbrhfnfbbfnb
PPTX
ALIMENTARY AND BILIARY CONDITIONS 3-1.pptx
PPTX
STUDY DESIGN details- Lt Col Maksud (21).pptx
PPTX
Introduction to Knowledge Engineering Part 1
PDF
Galatica Smart Energy Infrastructure Startup Pitch Deck
168300704-gasification-ppt.pdfhghhhsjsjhsuxush
Supervised vs unsupervised machine learning algorithms
Recruitment and Placement PPT.pdfbjfibjdfbjfobj
Fluorescence-microscope_Botany_detailed content
Microsoft-Fabric-Unifying-Analytics-for-the-Modern-Enterprise Solution.pptx
Lecture1 pattern recognition............
Quality review (1)_presentation of this 21
Introduction to Firewall Analytics - Interfirewall and Transfirewall.pptx
Clinical guidelines as a resource for EBP(1).pdf
iec ppt-1 pptx icmr ppt on rehabilitation.pptx
climate analysis of Dhaka ,Banglades.pptx
Business Acumen Training GuidePresentation.pptx
The THESIS FINAL-DEFENSE-PRESENTATION.pptx
Reliability_Chapter_ presentation 1221.5784
.pdf is not working space design for the following data for the following dat...
mbdjdhjjodule 5-1 rhfhhfjtjjhafbrhfnfbbfnb
ALIMENTARY AND BILIARY CONDITIONS 3-1.pptx
STUDY DESIGN details- Lt Col Maksud (21).pptx
Introduction to Knowledge Engineering Part 1
Galatica Smart Energy Infrastructure Startup Pitch Deck

IT Security Metrics

  • 1. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 1 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Darkness Falls Fast Developing Actionable Security Metrics to Protect Enterprises Brian Bissett Department of the Treasury Bureau of Fiscal Service Bio-IT World Conference 2021
  • 2. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 2 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Good Metrics Irrespective of industry or initiative, a good metric will be:  Specific  Relevant  Repeatable (Results fall within an acceptable margin of error when run under identical circumstances)  Aligned with business goals, quantitative, demonstrate controllability, and can be control charted (trendable).  A Leading indicator with defensible causal relationships to business outcomes.1  Objective and bear a clear relationship to the business of the enterprise and its goals with context and meaning.  Low in overhead.
  • 3. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 3 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Great Metrics A Great Metric will:  Immediately convey whether a situation is good or bad, normal or abnormal.2  Provide insight into business implications.  Objective and bear a clear relationship to the business of the enterprise and its goals with context and meaning.  Composed of cardinal numbers: ratios, absolute numbers, or percentages.3  Articulate what is most important to the organization.  Have a first-order cause-and-effect relationship (ideal).
  • 4. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 4 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. The Element of Time  Forrester Research - enterprises need a mixture of metrics that lag, lead, and are coincident to the enterprise.4  Gartner’s recommendation (Rule Number 4) “Choose Metrics that are forward looking” for security contexts.5  So who is right?  Leading or Forward-Looking metrics are most valuable to an organization. Such metrics are speculation based on past performance, expert opinion, and other factors subject to debate and error.  Expert opinion lies at the bottom of the hierarchy of evidence. This makes Gartner’s recommendation very controversial.
  • 5. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 5 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Leading, Lagging, or Coincident?  A lagging metric will highlight the results of past decisions.  A coincident or real time metric will provide a snapshot of the current situation.  Leading indicators provide predictive data points.  Past performance may be the single best predictor of future behavior, but:  Trends do not continue forever, they will reach an asymptotic limit, “burn out”, or crash or spike due to a supply issue.  Future behavior is modeled on parameter estimation from expert opinion, and “experts” are frequently wrong.  Everyone is subject to conformational bias.
  • 6. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 6 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Cisco Metric Types The Cisco framework utilizes two types of metrics6: 1. A ratio or percentage type measurement (typically a pass/fail type of metric) 2. An on-time correction metric (measures if a vulnerability was rectified in the time allotted for its closure) 3. Federal Standards are* (Usually – exceptions exist) * Binding Operational Directive 19-02. Severity Rectification Time Limit Critical 15 Days High 30 Days Moderate 30 Days
  • 7. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 7 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Cisco Vulnerability Metric Framework The Cisco Vulnerability Metric Framework divides vulnerabilities into three categories: technology, process, and people. 1. Technology - factors such as antimalware compliance, stack compliance, application security weaknesses, and open security exceptions. 2. Process - weaknesses in the architecture of the enterprise and the processes that allow access to the enterprise 3. People - security awareness of the people who have access to the enterprise
  • 8. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 8 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Cisco “maturity level” for metrics Cisco defines a “maturity level” for metrics Order from least to most is: Ad hoc Reactive Proactive Predictive L E A S T M O S T
  • 9. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 9 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Common Industry Metrics7 IT security spending as a percent of total IT spending: = 𝐼𝑇 𝑠𝑒𝑐𝑢𝑟𝑖𝑡𝑦 𝑠𝑝𝑒𝑛𝑑𝑖𝑛𝑔 𝑡𝑜𝑡𝑎𝑙 𝐼𝑇 𝑠𝑝𝑒𝑛𝑑𝑖𝑛𝑔 x 100  the relative level of investment to support the security of the enterprise from the perspective of the total IT portfolio.  IT security spending per employee = 𝐼𝑇 𝑠𝑒𝑐𝑢𝑟𝑖𝑡𝑦 𝑠𝑝𝑒𝑛𝑑𝑖𝑛𝑔 𝑇𝑜𝑡𝑎𝑙 𝑒𝑚𝑝𝑙𝑜𝑦𝑒𝑒𝑠  insight on the level of investment the enterprise is making to develop and maintain both security conscious employees and the protection of the environments they work within.
  • 10. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 10 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Common Industry Metrics II7 IT security spending per thousand dollars of revenue: = 𝐼𝑇 𝑠𝑒𝑐𝑢𝑟𝑖𝑡𝑦 𝑠𝑝𝑒𝑛𝑑𝑖𝑛𝑔 𝑡𝑜𝑡𝑎𝑙 𝑟𝑒𝑣𝑒𝑛𝑢𝑒 1000  The metric is a ratio and the denominator is expressed in thousands to prevent the value of the metric from being a very small number.  Security spending distribution by functional area:  = 𝐴𝑟𝑒𝑎 𝐼𝑇 𝑠𝑒𝑐𝑢𝑟𝑖𝑡𝑦 𝑠𝑝𝑒𝑛𝑑𝑖𝑛𝑔 𝑇𝑜𝑡𝑎𝑙 𝐼𝑇 𝑠𝑒𝑐𝑢𝑟𝑖𝑡𝑦 𝑠𝑝𝑒𝑛𝑑𝑖𝑛𝑔  Indicates types of investments the enterprise is making.  Mapping of where resources are being applied relative to operational risk and agency strategic plans.  Snapshot of tradeoffs made, and the winners and losers.
  • 11. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 11 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Human Capital7  Measure of IT security support intensity from a human capital perspective.  IT security FTEs as percentage of Total Employees:  𝐼𝑇 𝑠𝑒𝑐𝑢𝑟𝑖𝑡𝑦 𝐹𝑇𝐸𝑠 𝑇𝑜𝑡𝑎𝑙 𝐸𝑚𝑝𝑙𝑜𝑦𝑒𝑒𝑠 x 100  IT security FTEs as percentage of Total IT FTEs:  𝐼𝑇 𝑠𝑒𝑐𝑢𝑟𝑖𝑡𝑦 𝐹𝑇𝐸𝑠 𝑇𝑜𝑡𝑎𝑙 𝐼𝑇 𝐹𝑇𝐸𝑠 x 100  Assists in determining if staff size for the enterprise is appropriate.  Can also granulate to personnel in Common Areas such as: Identity and Access Management, Network Security, End Point Security, and Data Security.
  • 12. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 12 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Human Capital by Function Area IT Security staffing distribution by functional area indicates personnel investments by functions. Common Functional Areas Include:  Identity and Access Management  Network Security  End Point Security  Data Security  Governance  Risk  Compliance Management  The distribution of operational infrastructure security staffing by task provides an understanding of how security FTEs are dispersed to support the technology environments. Tend to be Personnel Intensive Significant Qualitative Factors.
  • 13. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 13 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Limitations of Existing Metrics  Existing security metrics exhibit a low level of correlation with vulnerabilities and attacks.  Often, they fail to provide an adequate assessment of security.  The number of vulnerability exploits is not proportional to the total number of vulnerabilities discovered in a Windows operating system.  There is no apparent correlation between the number of vulnerabilities discovered, and the size of the OS code.  This suggests the existence of deployment-specific factors, yet to be characterized systematically, that influence the security of systems in active use.
  • 14. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 14 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Commonly Neglected Metrics  Coverage – the type of scanning. agent based, authenticated with a username and password, or unauthenticated?  Vulnerability Dwell Time – the time a known vulnerability remains active on an enterprise.  Average number of Vulnerabilities per Asset over time – measure vulnerabilities over a continuous period of time. Do not rely on scan results which may have not seen all the assets during a scan and reflect drops that in actuality are simply deviations (scanning gaps).  Remediation of vulnerabilities vs. SLAs – How quickly an organization or its agents successfully remediate its vulnerabilities demonstrates program effectiveness. (Especially for Cloud based Assets). (but important)
  • 15. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 15 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Novel New Metrics  Symantec Research Labs and The University of Maryland at College Park have proposed new security metrics which are measured in the deployment environment.  Once a system is deployed, security becomes a moving target as attackers exploit new vulnerabilities (to subvert the system's functionality), vendors distribute software updates (to patch vulnerabilities and improve security), and users reconfigure the system to add functionality.  The following four new metrics are derived from field- gathered data and thus capture the state of system security as experienced by the end users.
  • 16. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 16 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Vulnerabilities Exploited in Wild 𝑉𝑝𝑒𝑥 = A count of vulnerabilities exploited in the wild.8  For a product 𝑝, it obtains the subset of a product's disclosed vulnerabilities that have been exploited in the wild.  Prior research has suggested that these signatures represent the best indicator for which vulnerabilities are exploited in real-world attacks.  Metric combines information from the National Vulnerability Database (NVD) and Symantec’s databases of attack signatures to obtain the subset of a product’s disclosed vulnerabilities that have been exploited.  The NVD is a public vulnerability is a database of software vulnerabilities which is widely accepted for vulnerability research.
  • 17. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 17 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Exploitation ratio The exploitation ratio is the proportion of disclosed vulnerabilities for a product p that have been exploited up until time t.8 𝐸𝑅 𝑝 𝑡 = 𝑉 𝑝 𝑒𝑥 (𝑡) 𝑉 𝑝(𝑡)  It captures the likelihood that a vulnerability will be exploited at time t.  Ratio is time dependent.
  • 18. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 18 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Attack Volume  Attack Volume is a measure of how frequently a product p is attacked.8  𝐴𝑉 𝑝  Intuitively, it is the average number of attacks experienced by a machine in a month due to a product p being installed.  It is the number of attacks that exploit a vulnerability of p against hosts with p installed, normalized by the total number of machine-months during which p was installed.
  • 19. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 19 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. Exercised Attack Surface  The exercised attack surface captures the portion of theoretical attack surface of a host that is targeted in a particular month.8  𝐸𝐴𝑆ℎ 𝑝 (𝑚)  Intuitively, the exercised attack surface is the number of distinct vulnerabilities that are exploited on a host h in a given month m.  The exercised attack surface attributable to a particular product can be computed for a particular time interval depending upon situational awareness required.
  • 20. L E A D ∙ T R A N S F O R M ∙ D E L I V E R Page 20 Disclaimer: Not an official spokesperson for Treasury. The views expressed herein by the author do not necessarily reflect the views of Treasury. The information provided is of a general, broad, and wide-spread nature, and only a competent authority with specialized knowledge of your unique environment can address the specific circumstances of your situation. CVSS Vulnerability Measurement  Vulnerability measurement remains one of the most popular areas for metric development.  The Common Vulnerability Scoring System (CVSS) was designed to provide an overall composite score representing the severity and risk of a vulnerability.9  The CVSS score is derived from metrics and formulas.  Metrics are in three distinct categories that can be quantitatively or qualitatively measured.  Base metrics contain qualities that are intrinsic to any given vulnerability that do not change over time or in different environments.  Temporal metrics contain vulnerability characteristics which evolve over the lifetime of vulnerability.  Environmental metrics contain those vulnerability characteristics which are tied to a specific implementation in an enterprise.
  • 21. The 7 CVSS Base Metrics9
1. Access Vector (AV): is the vulnerability exploitable locally or remotely?
2. Access Complexity (AC): the complexity of the attack required to exploit the vulnerability once access to the target system has been gained (high or low).
3. Authentication (A): does an attacker need to be authenticated to the target system in order to exploit the vulnerability?
4. Confidentiality Impact (CI): the impact on confidentiality of a successful exploit of the vulnerability on the target system (none, partial, or complete).
5. Integrity Impact (II): the impact on integrity of a successful exploit of the vulnerability on the target system (none, partial, or complete).
6. Availability Impact (AI): the impact on availability of a successful exploit of the vulnerability on the target system (none, partial, or complete).
7. Impact Bias (IB): allows a score to give greater weight to one of the three impact metrics over the other two.
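For a concrete sense of how base metrics combine into a score, the sketch below uses the published CVSS v2 base-score equation and weights (a later revision of the scheme listed above that drops the Impact Bias metric); it is shown only as an illustration of the scoring mechanics, not as the formula referenced on this slide:

```python
# CVSS v2 base-score weights (published values; the original v1 weights differ).
AV  = {"local": 0.395, "adjacent": 0.646, "network": 1.0}    # Access Vector
AC  = {"high": 0.35, "medium": 0.61, "low": 0.71}            # Access Complexity
AU  = {"multiple": 0.45, "single": 0.56, "none": 0.704}      # Authentication
CIA = {"none": 0.0, "partial": 0.275, "complete": 0.660}     # C/I/A impact

def cvss2_base(av: str, ac: str, au: str, c: str, i: str, a: str) -> float:
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f = 0.0 if impact == 0 else 1.176
    return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f, 1)

# A remotely exploitable, low-complexity, unauthenticated vulnerability with
# complete confidentiality, integrity, and availability impact scores 10.0.
print(cvss2_base("network", "low", "none", "complete", "complete", "complete"))
```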
  • 22. CVSS Temporal & Environmental10
Temporal metrics represent the time-dependent features of a vulnerability under the CVSS framework:
1. Exploitability (the difficulty involved in exploiting the vulnerability).
2. Remediation Level (the maturity level of a fix).
3. Report Confidence (the credibility of the threat).
Environmental metrics represent the implementation- and environment-specific features of a vulnerability under the CVSS framework:
1. Collateral Damage Potential (CDP) measures the potential for loss of physical equipment, property damage, or loss of life or limb (none, low, medium, or high).
2. Target Distribution (TD) measures the relative size of the field of target systems susceptible to the vulnerability (none, low, medium, or high).
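The temporal metrics are applied as multipliers on the base score. The sketch below uses the published CVSS v2 temporal equation and multiplier values, again purely for illustration:

```python
# CVSS v2 temporal adjustment: TemporalScore = round(Base * E * RL * RC, 1).
E  = {"unproven": 0.85, "proof-of-concept": 0.90, "functional": 0.95, "high": 1.00}
RL = {"official-fix": 0.87, "temporary-fix": 0.90, "workaround": 0.95, "unavailable": 1.00}
RC = {"unconfirmed": 0.90, "uncorroborated": 0.95, "confirmed": 1.00}

def cvss2_temporal(base_score: float, e: str, rl: str, rc: str) -> float:
    """Scale a base score by exploit maturity, fix maturity, and report credibility."""
    return round(base_score * E[e] * RL[rl] * RC[rc], 1)

# A 10.0 base score with a functional exploit, an official fix, and a confirmed
# report drops to 10.0 * 0.95 * 0.87 * 1.00 = 8.3 (rounded to one decimal).
print(cvss2_temporal(10.0, "functional", "official-fix", "confirmed"))
```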
  • 23. A Novel Vulnerability Method
 Moving averages can help discern when vulnerability growth exceeds historical norms.
 A short-term moving average and a long-term moving average are calculated for the enterprise.
 When the short-term moving average crosses above the long-term moving average, it indicates faster-than-normal vulnerability growth and/or a lack of sufficient remediation.
 When the short-term moving average crosses below the long-term moving average, it indicates that remediation efforts have succeeded in restoring norms.
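A minimal sketch of the crossover detection, assuming a periodic (e.g., weekly) series of open-vulnerability counts; the 4- and 12-period windows are illustrative placeholders rather than recommended values:

```python
from typing import List, Optional, Tuple

def moving_average(series: List[float], window: int) -> List[Optional[float]]:
    """Trailing simple moving average; None until enough history exists."""
    out: List[Optional[float]] = []
    for i in range(len(series)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(series[i + 1 - window:i + 1]) / window)
    return out

def crossover_signals(counts: List[float], short_window: int = 4,
                      long_window: int = 12) -> List[Tuple[int, str]]:
    """Flag periods where the short-term average crosses the long-term average."""
    short_ma = moving_average(counts, short_window)
    long_ma = moving_average(counts, long_window)
    signals = []
    for i in range(1, len(counts)):
        if None in (short_ma[i - 1], long_ma[i - 1], short_ma[i], long_ma[i]):
            continue
        if short_ma[i - 1] <= long_ma[i - 1] and short_ma[i] > long_ma[i]:
            signals.append((i, "exceeding norms"))       # growth outpacing the baseline
        elif short_ma[i - 1] >= long_ma[i - 1] and short_ma[i] < long_ma[i]:
            signals.append((i, "restoration to norms"))  # remediation catching back up
    return signals
```

In practice the series would be the enterprise's open-vulnerability counts per reporting period, with window lengths tuned to vulnerability severity as discussed on the Moving Average Considerations slide.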
  • 24. Moving Average Flags
[Chart: short- and long-term moving averages of vulnerability counts, with crossover points annotated "Exceeding Norms" and "Restoration to Norms."]
  • 25. Moving Average Considerations
 The short- and long-term moving average intervals (windows) must be set appropriately, taking vulnerability severity into consideration.
 The more severe the vulnerability, the smaller the short-term moving average window should be.
 The short-term window should not exceed the required remediation time for the severity of the vulnerability.
 Baseline creep is a reality when using the moving average technique.
 The baseline tends to deviate from its starting norms over time, often upward.
 This can be adjusted for with hard-stop upper and lower control limits based on historical norms.
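One way to implement the hard-stop limits mentioned above, sketched under the assumption that the baseline mean and standard deviation are computed once from a frozen historical window (the three-sigma width is an illustrative choice, not a prescription from the talk):

```python
from typing import List, Tuple

def hard_stop_breaches(counts: List[float], baseline_mean: float,
                       baseline_std: float, k: float = 3.0) -> List[Tuple[int, float]]:
    """Flag observations outside fixed control limits derived from historical norms.

    baseline_mean and baseline_std come from a frozen historical window, so the
    limits do not drift upward with the moving averages (countering baseline creep).
    """
    upper = baseline_mean + k * baseline_std
    lower = max(0.0, baseline_mean - k * baseline_std)
    return [(i, c) for i, c in enumerate(counts) if c > upper or c < lower]
```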
  • 26. Balanced Scorecards
 Robert Kaplan of Harvard Business School and David Norton developed a concept called the "Balanced Scorecard."11
 Adapted to security, the Balanced Scorecard helps bridge the gap between information security and management.
 The Center for Internet Security (CIS) has defined twenty-eight significant metrics that encompass seven business functions: incident management, vulnerability management, patch management, configuration management, change management, application security, and financial metrics.12
 The Center for Internet Security advocates security scorecards with only three main sections: Impact, Operations, and Financial.
  • 27. Forrester Best Scorecard Practices
 Forrester Research recommends using seven or fewer metrics when presenting a scorecard to senior executives.13
 Scorecard updates should be automated.
 Do not rely solely on absolute numbers; Forrester advocates tracking proportions as well.
 Forrester lists six categories to track in balanced security scorecards: 1) demographics; 2) security; 3) compliance; 4) administration cost and efficiency; 5) business agility and service delivery; and 6) customer-facing Identity and Access Management.
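A minimal sketch of what such an executive scorecard could look like in data form, following the guidance above (at most seven metrics, proportions rather than raw counts, grouped by category); the metric names, values, and targets are hypothetical:

```python
# Each row is one scorecard metric expressed as a proportion against a target.
scorecard = [
    {"category": "security",     "metric": "hosts patched within SLA",      "value": 0.92, "target": 0.95},
    {"category": "compliance",   "metric": "systems with current ATO",      "value": 0.88, "target": 1.00},
    {"category": "demographics", "metric": "workforce completing training", "value": 0.97, "target": 0.95},
]

def off_target(rows):
    """Surface only the metrics that miss their targets for executive attention."""
    return [r for r in rows if r["value"] < r["target"]]

for row in off_target(scorecard):
    print(f'{row["category"]}: {row["metric"]} at {row["value"]:.0%} vs target {row["target"]:.0%}')
```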
  • 28. Wrap Up
 A plethora of metrics exists for discerning situational awareness of the state of IT security within an enterprise.
 Significant gaps in awareness exist with many of the more commonly used metrics, while some commonly neglected metrics frequently tell a more holistic story.
 Not all metrics are applicable to every enterprise, and customized metrics and/or parameterization may be necessary depending on organizational needs.
 Recognize that metrics which measure "what got you here" may not facilitate getting to the next goal post.
 Metrics are frequently time dependent and have boundary conditions; determine these limitations.
  • 29. References
1. Paul Proctor, Jeffrey Wheatman, Rob McMillan. "Develop Key Risk Indicators and Security Metrics That Influence Business Decision Making." Page 1. Gartner Research, ID G00276149, 31 July 2015.
2. Rob McMillan. "Sharpen Your Security Metrics to Make Them Relevant and Effective." Page 6. Gartner Research, ID G00259303, published 13 May 2014, refreshed 5 December 2016.
3. Rob McMillan. "Five Required Characteristics of Security Metrics." Page 2. Gartner Research, ID G00245748, published 5 December 2012, refreshed 3 March 2017.
4. Stephanie Balaouras, Laura Koetzle, Chase Cunningham, Jeff Pollard, Heidi Shey, Bill Barringham, Peggy Dostie. "Craft Zero Trust Security Metrics That Matter To Your Business, Performance Management: The Security Architecture and Operations Playbook." Page 4. Forrester Research, March 27, 2018.
5. Jeffrey Wheatman, Rob McMillan. "Apply Five Rules to Your Security Metrics." Page 8. Gartner Research, ID G00341872, 7 November 2017.
6. Gerwin Tijink, Hessel Heerebout. "Unified Security Metrics." Page 5. Cisco White Paper, C11-737409, 2016.
7. Stegman. "IT Key Metrics Data 2019." Page 21. Gartner Research, ID G00375660.
  • 30. References
8. Kartik Nayak, Daniel Marino, Petros Efstathopoulos, Tudor Dumitras. "Some Vulnerabilities Are Different Than Others: Studying Vulnerabilities and Attack Surfaces in the Wild." Pages 1-2. University of Maryland, College Park and Symantec Research Labs; International Symposium on Research in Attacks, Intrusions and Defenses, 17 September 2014.
9. Victor-Valeriu Patriciu, Iustin Priescu, Sebastian Nicolaescu. "Security Metrics for Enterprise Information Systems." Page 153. Journal of Applied Quantitative Methods, Vol. 1, No. 2, Winter 2006.
10. Wayne Jansen. "Directions in Security Metrics Research." Page 8. Computer Security Division, Information Technology Laboratory, National Institute of Standards and Technology, NISTIR 7564, April 2009.
11. Andrew Jaquith. "Proving Your Worth: Follow These Steps to Create a Successful Security Metrics Program." Page 31. Information Security, March 2010.
12. "The CIS Security Metrics." Page 8. The Center for Internet Security, November 1, 2010.
13. Andras Cser, Merritt Maxim, Stephanie Balaouras, Madeline Cyr, Bill Barringham, Peggy Dostie. "Develop Actionable Business-Centric Identity And Access Management Metrics, Performance Management: The Identity And Access Management Playbook." Page 5. Forrester Research, July 27, 2018.