Copyright © 2013 Splunk Inc.

Analytics with Splunk Enterprise
Legal Notices
During the course of this presentation, we may make forward-looking statements regarding future events or the
expected performance of the company. We caution you that such statements reflect our current
expectations and estimates based on factors currently known to us and that actual events or results could differ
materially. For important factors that may cause actual results to differ from those contained in our forward-looking
statements, please review our filings with the SEC. The forward-looking statements made in this presentation are
being made as of the time and date of its live presentation. If reviewed after its live presentation, this presentation
may not contain current or accurate information. We do not assume any obligation to update any forward-looking
statements we may make. In addition, any information about our roadmap outlines our general product direction
and is subject to change at any time without notice. It is for informational purposes only and shall not be
incorporated into any contract or other commitment. Splunk undertakes no obligation either to develop the
features or functionality described or to include any such feature or functionality in a future release.
Splunk, Splunk>, Splunk Storm, Listen to Your Data, SPL and The Engine for Machine Data are trademarks and registered trademarks of
Splunk Inc. in the United States and other countries. All other brand names, product names, or trademarks belong to their respective
owners.

©2013 Splunk Inc. All rights reserved.
Search is hard.
Analytics Big Picture

Pivot – Build complex reports without the search language

Data Model – Provides a more meaningful representation of underlying raw machine data

Analytics Store – Acceleration technology delivers up to 1000x faster analytics over Splunk 5
Operational Intelligence Across the Enterprise

Raw data sample: [10/11/12 18:57:04 000000b0 UTC]

IT professional
– Create and share data models
– Accelerate data models and custom searches with the analytics store
– Create reports with pivot

Developer
– Leverage data models to abstract data
– Leverage pivot in custom apps

Analyst
– Create reports using pivot based on data models created by IT
Pivot is a query builder.
Data Models 101

[Diagram: multiple sources mapped onto a data set – grouped by sourcetype (Success, Failure, Warning), by business division, or by technology (Technology 1–3) into a common model – to provide context]
Splunk Search Language
search and filter | munge | report | clean-up
sourcetype=access_combined source = "/home/ssorkin/banner_access.log.2013.6.gz"
| eval unique=(uid + useragent) | stats dc(unique) by os_name
| rename dc(unique) as "Unique Visitors" os_name as "Operating System"
Hurdles
index=main source=*/banner_access* uri_path=/js/*/*/login/* guid=* useragent!=*KTXN* useragent!=*GomezAgent* clientip!=206.80.3.67
clientip!=198.144.207.62 clientip!=97.65.63.66 clientip!=175.45.37.78 clientip!=209.119.210.194 clientip!=212.36.37.138 clientip!=204.156.84.0/24
clientip!=216.221.226.0/24 clientip!=207.87.200.162 | rex field=uri_path "/js/(?<t>[^/]*)/(?<v>[^/]*)/login/(?<l>[^/]*)" | eval license = case(l LIKE "prod%" AND
t="pro", "enterprise", l LIKE "trial%" AND t="pro", "trial", t="free", "free") | rex field=v "^(?<vers>\d.\d)" | bin span=1d _time as day | stats values(vers) as vers
min(day) as min_day min(eval(if(vers=="5.0", _time, null()))) as min_day_50 dc(day) as days values(license) as license by guid | eval type =
if(match(vers,"4.*"), "upgrade", "not upgrade") + "/" + if(days > 1, "repeat", "not repeat")| search license=enterprise | eval _time = min_day_50| timechart
count by type| streamstats sum(*) as *

• Simple searches easy… Multi-stage munging/reporting is hard!
• Need to understand data’s structure to construct search
• Non-technical users may not have data source domain knowledge
• Splunk admins do not have end-user search context
Data Model Goals
• Make it easy to share/reuse domain knowledge
• Admins/power users build data models
• Non-technical users interact with data via pivot UI
Data Models 101
What is a Data Model?
A data model is a search-time mapping of data onto a hierarchical structure

Encapsulate the knowledge
needed to build a search
Pivot reports are built on top
of data models

Data-independent

Screenshot here
A Data Model is a Collection of Objects

Screenshot here
Objects Have Constraints and Attributes

Screenshot here
Child Objects Inherit Constraints and Attributes

Building Data Models
Three Root Object Types

Event
– Maps to Splunk events
– Requires constraints and attributes

Search
– Maps to an arbitrary Splunk search (may include generating, transforming and reporting search commands)
– Requires a search string and attributes

Transaction
– Maps to groups of Splunk events or groups of Splunk search results
– Requires objects to group, fields/conditions to group by, and attributes
Object Attributes
Auto-extracted – default and predefined fields
Eval expression – a new field based on an expression that you define
Lookup – leverage an existing lookup table
Regular expression – extract a new field based on a regex
Geo IP – add geolocation fields such as latitude, longitude, country, etc.
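As an illustration, each attribute type corresponds to a fragment of SPL appended to the object's search. The field and lookup names below (bytes, geoip, clientip, uri_path) are hypothetical, not taken from the slides:

```spl
... | eval response_kb=bytes/1024
    | lookup geoip clientip OUTPUT country
    | rex field=uri_path "(?<page>[^/]+)$"
```

Here `eval` defines a calculated field, `lookup` enriches events from an existing lookup table, and `rex` extracts a new field with a named capture group.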
Object Attributes
Set field types

Configure various flags
Note: Child object configuration can differ from parent
Best Practices
Use event objects as often as possible
– Benefit from data model acceleration

Resist the urge to use search objects instead of event objects!
– Event-based searches can be optimized better

Minimize object hierarchy depth when possible
– Constraint-based filtering is less efficient deeper down the tree

Put the event object with the deepest tree (and most matching results) first
– Model-wide acceleration covers only the first event object and its descendants
Warnings!
Object constraints and attributes cannot contain pipes or subsearches
A transaction object requires at least one event or search object in the data model
Lookups used in attributes must be globally visible (or at least visible to the app
using the data model)
No versioning on data models (and objects)!
From Data Models to Reports
Using the UI

[Screenshot: pivot UI – count of http_success events, split by useragent]
Under the Hood: Object Search String Generation
Event Object
Syntax:
<constraints search> | <my attribute
definitions>

Example:
sourcetype=access_* OR sourcetype=iis*
uri=* uri_path=* status=* clientip=* referer=*
useragent=*
Under the Hood: Object Search String Generation
Search Object
Syntax:
<base search> | <my attribute definitions>

Example:
_time=* host=* source=* sourcetype=* uri=*
status<600 clientip=* referer=* useragent=*
(sourcetype=access_* OR source=*.log) | eval
userid=clientip | stats first(_time) as
earliest, last(_time) as latest, list(uri_path) as
uri_list by userid
| earliest=* latest=* uri_list=*
Under the Hood: Object Search String Generation
Transaction Object
Syntax:
<objects to group search> | transaction <group
by fields> <group by params>
| <my attribute definitions>

Example:
sourcetype=access_* uri=* uri_path=* status=*
clientip=* referer=* useragent=* | transaction
clientip useragent | eval
landingpage=mvindex(uri_path,1) | eval
exitpage=mvindex(uri_path,-1)
Under the Hood: Object Search String Generation
Child Object
Syntax:
<parent object search> | search <my constraints> |
<my attribute definitions>

Example:
sourcetype=access_* uri=* uri_path=* status=*
clientip=* referer=* useragent=* status=2* | <my
attribute definitions>
Using the Splunk Search Language
Object Search String
| datamodel <modelname> <objectID> search

Example:
| datamodel WebIntelligence HTTP_Request search

Behind the scenes:
sourcetype=access_* OR sourcetype=iis* uri=* uri_path=* status=* clientip=* referer=*
useragent=*
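Two related invocations may help when exploring (behavior as of Splunk 6; worth verifying against the docs for your version): running the command with no arguments lists the available data models, and naming just a model returns its JSON description.

```spl
| datamodel

| datamodel WebIntelligence
```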
Under the hood: Pivot Search String Generation
Pivot search = object search + filters + reporting + formatting
Example:
(sourcetype=access_* OR sourcetype=iis*) status=2*
uri=* uri_path=* status=* clientip=* referer=* useragent=*
| stats count AS "Count of HTTP_Success" by "useragent"
| sort limit=0 "useragent" | fields - _span
| fields "useragent" "Count of HTTP_Success"
| fillnull "Count of HTTP_Success"
| fields "useragent" *
Using the Splunk Search Language
Pivot Search String
| pivot <modelname> <objectID> [statsfns, rowsplit, colsplit, filters, …]

Example:
| pivot WebIntelligence HTTP_Request count(HTTP_Request) AS "Count of HTTP_Request" SPLITROW status
AS "status" SORT 0 status

Behind the scenes:
sourcetype=access_* OR sourcetype=iis* uri=* uri_path=* status=* clientip=* referer=* useragent=*
| stats count AS "Count of HTTP_Request" by "status"
| sort limit=0 "status" | fields - _span
| fields "status", "Count of HTTP_Request"
| fillnull "Count of HTTP_Request"
| fields "status" *
Warnings
• | datamodel and | pivot are generating commands
  – They must be at the beginning of the search string
• Use objectIDs, NOT user-visible object names
Managing Data Models
Data Model on Disk
Each data model is a separate JSON file
Lives in <myapp>/local/data/models
(or <myapp>/default/data/models for
pre-installed models)
Has associated conf stanzas
and metadata
Editing Data Model JSON
At your own risk!

Models edited via the UI are validated
Manually edited data models: NOT SUPPORTED
Exception: installing a new model by adding the file to
<myapp>/<local OR default>/data/models is probably okay
Deleting a Data Model
Use the UI for appropriate cleanup
Potential for bad state if manually deleting model on disk
Interacting With a Data Model
Use data model builder and pivot UI – safest option!
Use REST API – for developers (see docs for details)

Use | datamodel and | pivot Splunk search commands
Permissions
Data models have
permissions just like
other Splunk objects
Edit permissions
through the UI
Data Model Acceleration

Admin or power user:
– Turn on acceleration via the UI; the setting is written to a conf file

Backend magic:
– Poll: are there new accelerated models?
– Kick off summary collection

Non-technical user:
– Run a pivot report
– Acceleration: run the search using on-disk acceleration
– No acceleration: kick off ad-hoc acceleration and run the search
Model-Wide Acceleration
Only accelerates the first event-based object and its descendants
Does not accelerate search- and transaction-based objects

Pivot search:
| tstats count AS "Count of HTTP_Success" from datamodel="WebIntelligence" where
(nodename="HTTP_Request") (nodename="HTTP_Request.HTTP_Success") prestats=true | stats count AS
"Count of HTTP_Success"
Ad-Hoc Object Acceleration
Kick off acceleration on pivot page (re)load for non-accelerated models
and search/transaction objects
Amortize the cost of ad-hoc acceleration over repeated pivoting on the
same object
Pivot search:
| tstats count AS "Count of HTTP_Success" from sid=1379116434.663 prestats=true | stats count AS
"Count of HTTP_Success"
Acceleration Disclaimers
Works with search-head pooling – we collect on indexers
Cannot edit accelerated models
Thank You

More Related Content

PPTX
SplunkLive! Beginner Session
PPTX
SplunkLive! Data Models 101
PPTX
Splunk live beginner training nyc
PPTX
SplunkLive! Presentation - Data Onboarding with Splunk
PPTX
Data Models Breakout Session
PPTX
Splunk overview
PPTX
SplunkLive 2011 Beginners Session
PPTX
Splunk Ninjas: New features, pivot, and search dojo
SplunkLive! Beginner Session
SplunkLive! Data Models 101
Splunk live beginner training nyc
SplunkLive! Presentation - Data Onboarding with Splunk
Data Models Breakout Session
Splunk overview
SplunkLive 2011 Beginners Session
Splunk Ninjas: New features, pivot, and search dojo

What's hot (19)

PDF
SplunkSummit 2015 - A Quick Guide to Search Optimization
PPTX
SplunkLive! Getting Started with Splunk Enterprise
PPTX
Getting Started with Splunk Break out Session
PDF
Splunk
PPTX
Workshop splunk 6.5-saint-louis-mo
PPTX
SplunkLive! Dallas Nov 2012 - Metro PCS
PPTX
Splunk live! ninjas_break-out
PDF
Splunk Insights
PDF
SplunkLive! Hamburg / München Advanced Session
PDF
Splunk as a_big_data_platform_for_developers_spring_one2gx
PPTX
Getting Data into Splunk
PPTX
Splunk Ninjas: New Features, Pivot, and Search Dojo
PPTX
Data Onboarding Breakout Session
PDF
Splunk Ninjas: New Features, Pivot and Search Dojo
PDF
Splunking configfiles 20211208_daniel_wilson
PPTX
SplunkLive! Detroit April 2013 - Domino's Pizza
PDF
Nationwide Splunk Ninjas!
PPTX
Taking Splunk to the Next Level - Architecture
PDF
Data Onboarding
SplunkSummit 2015 - A Quick Guide to Search Optimization
SplunkLive! Getting Started with Splunk Enterprise
Getting Started with Splunk Break out Session
Splunk
Workshop splunk 6.5-saint-louis-mo
SplunkLive! Dallas Nov 2012 - Metro PCS
Splunk live! ninjas_break-out
Splunk Insights
SplunkLive! Hamburg / München Advanced Session
Splunk as a_big_data_platform_for_developers_spring_one2gx
Getting Data into Splunk
Splunk Ninjas: New Features, Pivot, and Search Dojo
Data Onboarding Breakout Session
Splunk Ninjas: New Features, Pivot and Search Dojo
Splunking configfiles 20211208_daniel_wilson
SplunkLive! Detroit April 2013 - Domino's Pizza
Nationwide Splunk Ninjas!
Taking Splunk to the Next Level - Architecture
Data Onboarding
Ad

Viewers also liked (9)

PPTX
SplunkLive! Analytics with Splunk Enterprise - Part 1
PPT
Hq pixton nte rm
PPTX
Network Forensics for Splunk, an Emulex presentation
PDF
SplunkLive! München 2016 - Splunk für Security
PDF
SplunkLive! Hamburg 2016 - Use Case Otto
PDF
SplunkLive! München 2016 - Splunk @ Datev
PPTX
Splunk sales presentation
PPTX
Building a Security Information and Event Management platform at Travis Per...
PPTX
Splunk Overview
SplunkLive! Analytics with Splunk Enterprise - Part 1
Hq pixton nte rm
Network Forensics for Splunk, an Emulex presentation
SplunkLive! München 2016 - Splunk für Security
SplunkLive! Hamburg 2016 - Use Case Otto
SplunkLive! München 2016 - Splunk @ Datev
Splunk sales presentation
Building a Security Information and Event Management platform at Travis Per...
Splunk Overview
Ad

Similar to SplunkLive! Analytics with Splunk Enterprise (20)

PPTX
Analytics with splunk - Advanced
PPTX
SplunkLive! Analytics with Splunk Enterprise - Part 2
PPTX
Data models pivot with splunk break out session
PPTX
SplunkLive! Munich 2018: Data Onboarding Overview
PPTX
SplunkLive! Zurich 2018: Integrating Metrics and Logs
PPTX
SplunkLive! Frankfurt 2018 - Data Onboarding Overview
DOCX
CMGT410 v19Project Charter TemplateCMGT410 v19Page 2 of 3P.docx
PPTX
Business Analytics Paradigm Change
PPT
Cognos framework manager
PPTX
Advanced Use Cases for Analytics Breakout Session
PDF
Getting Started with Splunk Enterprise
PDF
Azure_Purview.pdf
PDF
(ATS6-APP01) Unleashing the Power of Your Data with Discoverant
PPTX
Microsoft Purview
PPTX
SplunkLive! What's New in Splunk 6 Session
PPTX
SplunkLive! Frankfurt 2018 - Integrating Metrics & Logs
PPTX
Splunk Ninjas: New Features, Pivot, and Search Dojo
PPTX
SplunkLive! Frankfurt 2018 - Get More From Your Machine Data with Splunk AI
PDF
Elastic Stack: Using data for insight and action
PPTX
Splunk Enterprise 6.3 - Splunk Tech Day
Analytics with splunk - Advanced
SplunkLive! Analytics with Splunk Enterprise - Part 2
Data models pivot with splunk break out session
SplunkLive! Munich 2018: Data Onboarding Overview
SplunkLive! Zurich 2018: Integrating Metrics and Logs
SplunkLive! Frankfurt 2018 - Data Onboarding Overview
CMGT410 v19Project Charter TemplateCMGT410 v19Page 2 of 3P.docx
Business Analytics Paradigm Change
Cognos framework manager
Advanced Use Cases for Analytics Breakout Session
Getting Started with Splunk Enterprise
Azure_Purview.pdf
(ATS6-APP01) Unleashing the Power of Your Data with Discoverant
Microsoft Purview
SplunkLive! What's New in Splunk 6 Session
SplunkLive! Frankfurt 2018 - Integrating Metrics & Logs
Splunk Ninjas: New Features, Pivot, and Search Dojo
SplunkLive! Frankfurt 2018 - Get More From Your Machine Data with Splunk AI
Elastic Stack: Using data for insight and action
Splunk Enterprise 6.3 - Splunk Tech Day

More from Splunk (20)

PDF
Splunk Leadership Forum Wien - 20.05.2025
PDF
Splunk Security Update | Public Sector Summit Germany 2025
PDF
Building Resilience with Energy Management for the Public Sector
PDF
IT-Lagebild: Observability for Resilience (SVA)
PDF
Nach dem SOC-Aufbau ist vor der Automatisierung (OFD Baden-Württemberg)
PDF
Monitoring einer Sicheren Inter-Netzwerk Architektur (SINA)
PDF
Praktische Erfahrungen mit dem Attack Analyser (gematik)
PDF
Cisco XDR & Splunk SIEM - stronger together (DATAGROUP Cyber Security)
PDF
Security - Mit Sicherheit zum Erfolg (Telekom)
PDF
One Cisco - Splunk Public Sector Summit Germany April 2025
PDF
.conf Go 2023 - Data analysis as a routine
PDF
.conf Go 2023 - How KPN drives Customer Satisfaction on IPTV
PDF
.conf Go 2023 - Navegando la normativa SOX (Telefónica)
PDF
.conf Go 2023 - Raiffeisen Bank International
PDF
.conf Go 2023 - På liv og død Om sikkerhetsarbeid i Norsk helsenett
PDF
.conf Go 2023 - Many roads lead to Rome - this was our journey (Julius Bär)
PDF
.conf Go 2023 - Das passende Rezept für die digitale (Security) Revolution zu...
PDF
.conf go 2023 - Cyber Resilienz – Herausforderungen und Ansatz für Energiever...
PDF
.conf go 2023 - De NOC a CSIRT (Cellnex)
PDF
conf go 2023 - El camino hacia la ciberseguridad (ABANCA)
Splunk Leadership Forum Wien - 20.05.2025
Splunk Security Update | Public Sector Summit Germany 2025
Building Resilience with Energy Management for the Public Sector
IT-Lagebild: Observability for Resilience (SVA)
Nach dem SOC-Aufbau ist vor der Automatisierung (OFD Baden-Württemberg)
Monitoring einer Sicheren Inter-Netzwerk Architektur (SINA)
Praktische Erfahrungen mit dem Attack Analyser (gematik)
Cisco XDR & Splunk SIEM - stronger together (DATAGROUP Cyber Security)
Security - Mit Sicherheit zum Erfolg (Telekom)
One Cisco - Splunk Public Sector Summit Germany April 2025
.conf Go 2023 - Data analysis as a routine
.conf Go 2023 - How KPN drives Customer Satisfaction on IPTV
.conf Go 2023 - Navegando la normativa SOX (Telefónica)
.conf Go 2023 - Raiffeisen Bank International
.conf Go 2023 - På liv og død Om sikkerhetsarbeid i Norsk helsenett
.conf Go 2023 - Many roads lead to Rome - this was our journey (Julius Bär)
.conf Go 2023 - Das passende Rezept für die digitale (Security) Revolution zu...
.conf go 2023 - Cyber Resilienz – Herausforderungen und Ansatz für Energiever...
.conf go 2023 - De NOC a CSIRT (Cellnex)
conf go 2023 - El camino hacia la ciberseguridad (ABANCA)

Recently uploaded (20)

PDF
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
PPTX
Digital-Transformation-Roadmap-for-Companies.pptx
PPT
“AI and Expert System Decision Support & Business Intelligence Systems”
PPTX
Effective Security Operations Center (SOC) A Modern, Strategic, and Threat-In...
PDF
The Rise and Fall of 3GPP – Time for a Sabbatical?
PPTX
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
PDF
Network Security Unit 5.pdf for BCA BBA.
PDF
Machine learning based COVID-19 study performance prediction
PDF
TokAI - TikTok AI Agent : The First AI Application That Analyzes 10,000+ Vira...
DOCX
The AUB Centre for AI in Media Proposal.docx
PDF
How UI/UX Design Impacts User Retention in Mobile Apps.pdf
PPTX
20250228 LYD VKU AI Blended-Learning.pptx
PDF
Diabetes mellitus diagnosis method based random forest with bat algorithm
PDF
Agricultural_Statistics_at_a_Glance_2022_0.pdf
PDF
Per capita expenditure prediction using model stacking based on satellite ima...
PPTX
sap open course for s4hana steps from ECC to s4
PDF
Empathic Computing: Creating Shared Understanding
PPTX
Cloud computing and distributed systems.
PDF
Building Integrated photovoltaic BIPV_UPV.pdf
PPTX
Spectroscopy.pptx food analysis technology
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
Digital-Transformation-Roadmap-for-Companies.pptx
“AI and Expert System Decision Support & Business Intelligence Systems”
Effective Security Operations Center (SOC) A Modern, Strategic, and Threat-In...
The Rise and Fall of 3GPP – Time for a Sabbatical?
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
Network Security Unit 5.pdf for BCA BBA.
Machine learning based COVID-19 study performance prediction
TokAI - TikTok AI Agent : The First AI Application That Analyzes 10,000+ Vira...
The AUB Centre for AI in Media Proposal.docx
How UI/UX Design Impacts User Retention in Mobile Apps.pdf
20250228 LYD VKU AI Blended-Learning.pptx
Diabetes mellitus diagnosis method based random forest with bat algorithm
Agricultural_Statistics_at_a_Glance_2022_0.pdf
Per capita expenditure prediction using model stacking based on satellite ima...
sap open course for s4hana steps from ECC to s4
Empathic Computing: Creating Shared Understanding
Cloud computing and distributed systems.
Building Integrated photovoltaic BIPV_UPV.pdf
Spectroscopy.pptx food analysis technology

SplunkLive! Analytics with Splunk Enterprise

  • 1. Copyright © 2013 Splunk Inc. Analytics with Splunk Enterprise
  • 2. Legal Notices During the course of this presentation, we may make forward-looking statements regarding future events or the expected performance of the company. We caution you that such statements reflect our current expectations and estimates based on factors currently known to us and that actual events or results could differ materially. For important factors that may cause actual results to differ from those contained in our forward-looking statements, please review our filings with the SEC. The forward-looking statements made in this presentation are being made as of the time and date of its live presentation. If reviewed after its live presentation, this presentation may not contain current or accurate information. We do not assume any obligation to update any forward-looking statements we may make. In addition, any information about our roadmap outlines our general product direction and is subject to change at any time without notice. It is for informational purposes only and shall not, be incorporated into any contract or other commitment. Splunk undertakes no obligation either to develop the features or functionality described or to include any such feature or functionality in a future release. Splunk, Splunk>, Splunk Storm, Listen to Your Data, SPL and The Engine for Machine Data are trademarks and registered trademarks of Splunk Inc. in the United States and other countries. All other brand names, product names, or trademarks belong to their respective owners. ©2013 Splunk Inc. All rights reserved.
  • 4. Analytics Big Picture Pivot Build complex reports without the search language Data Model Provides more meaningful representation of underlying raw machine data Analytics Store Acceleration technology delivers up to 1000x faster analytics over Splunk 5 4
  • 5. Operational Intelligence Across the Enterprise [10/11/12 18:57:04 000000b0 UTC] Raw Data IT professional Create and share data models Accelerate data models and custom searches with the analytics store Create reports with pivot Analytics Store Developer Leverage data models to abstract data Leverage pivot in custom apps Data Model Pivot Analyst Create reports using pivot based on data models created by IT
  • 6. Pivot is a query builder.
  • 13. Splunk Search Language search and filter | munge | report | clean-up sourcetype=access_combined source = "/home/ssorkin/banner_access.log.2013.6.gz" | eval unique=(uid + useragent) | stats dc(unique) by os_name | rename dc(unique) as "Unique Visitors" os_name as "Operating System"
  • 14. Hurdles index=main source=*/banner_access* uri_path=/js/*/*/login/* guid=* useragent!=*KTXN* useragent!=*GomezAgent* clientip!=206.80.3.67 clientip!=198.144.207.62 clientip!=97.65.63.66 clientip!=175.45.37.78 clientip!=209.119.210.194 clientip!=212.36.37.138 clientip!=204.156.84.0/24 clientip!=216.221.226.0/24 clientip!=207.87.200.162 | rex field=uri_path "/js/(?<t>[^/]*)/(?<v>[^/]*)/login/(?<l>[^/]*)” | eval license = case(l LIKE "prod%" AND t="pro", "enterprise", l LIKE "trial%" AND t="pro", "trial", t="free", "free”) | rex field=v "^(?<vers>d.d)” | bin span=1d _time as day | stats values(vers) as vers min(day) as min_day min(eval(if(vers=="5.0", _time, null()))) as min_day_50 dc(day) as days values(license) as license by guid | eval type = if(match(vers,"4.*"), "upgrade", "not upgrade") + "/" + if(days > 1, "repeat", "not repeat")| search license=enterprise | eval _time = min_day_50| timechart count by type| streamstats sum(*) as * • Simple searches easy… Multi-stage munging/reporting is hard! • Need to understand data’s structure to construct search • Non-technical users may not have data source domain knowledge • Splunk admins do not have end-user search context
  • 15. Data Model Goals • Make it easy to share/reuse domain knowledge • Admins/power users build data models • Non-technical users interact with data via pivot UI
  • 17. What is a Data Model? A data model is a search-time mapping of data onto a hierarchical structure Encapsulate the knowledge needed to build a search Pivot reports are build on top of data models Data-independent Screenshot here
  • 18. A Data Model is a Collection of Objects Screenshot here
  • 19. Objects Have Constraints and Attributes Screenshot here
  • 20. Child Objects Inherit Constraints and Attributes Screenshot here
  • 21. Child Objects Inherit Constraints and Attributes
  • 23. Three Root Object Types Event – MapstoSplunkevents – Requiresconstraints andattributes
  • 24. Three Root Object Types Event – MapstoSplunkevents – Requiresconstraints andattributes Search – MapstoarbitrarySplunksearch(may includegenerating,transformingand reportingsearchcommands) – Requiressearchstringattributes • Transaction – Mapsto groupsof Splunkeventsor groupsof Splunksearchresults – Requiresobjectsto group,fields/ conditionstogroupby,andattributes
  • 25. Three Root Object Types Event – MapstoSplunkevents – Requiresconstraints andattributes Search – MapstoarbitrarySplunksearch(may includegenerating,transformingand reportingsearchcommands) Requiressearchstringattributes Transaction – Mapsto groupsof Splunkeventsor groupsof Splunksearchresults – Requiresobjectsto group,fields/ conditionstogroupby,andattributes
  • 26. Object Attributes Auto-extracted – default and predefined fields Eval expression – a new field based on an expression that you define Lookup – leverage an existing lookup table Regular expression – extract a new field based on regex Geo IP – add geolocation fields such as latitude, longitude, country, etc.
  • 27. Object Attributes Set field types Configure various flags Note: Child object configuration can differ from parent
  • 28. Best Practices Use event objects as often as possible – Benefit from data model acceleration Resist the urge to use search objects instead of event objects!! – Event based searches can be optimized better Minimize object hierarchy depth when possible – Constraint based filtering is less efficient deeper down the tree Event object with deepest tree (and most matching results) first – Model-wide acceleration only for first event object and its descendants
  • 29. Warnings! Object constraints and attributes cannot contain pipes or subsearches A transaction object requires at least one event or search object in the data model Lookups used in attributes must be globally visible (or at least visible to the app using the data model) No versioning on data models (and objects)!
  • 30. From Data Models to Reports
  • 31. Using the UI Subhead Count of http_success events, split by useragent events fields
  • 32. Under the Hood: Object Search String Generation Event Object Syntax: <constraints search> | <my attribute definitions> Example: sourcetype=access_* OR sourcetype=iis* uri=* uri_path=* status=* clientip=* referer=* useragent=*
  • 33. Under the Hood: Object Search String Generation Search Object Syntax: <base search> | <my attribute definitions> Example: _time=* host=* source=* sourcetype=* uri=* status<600 clientip=* referer=* useragent=* (sourcetype=access_* OR source=*.log) | eval userid=clientip | stats first(_time) as earliest, last(_time) as latest, list(uri_path) as uri_list by userid | earliest=* latest=* uri_list=*
  • 34. Under the Hood: Object Search String Generation Transaction Object Syntax: <objects to group search> | transaction <group by fields> <group by params> | <my attribute definitions> Example: sourcetype=access_* uri=* uri_path=* status=* clientip=* referer=* useragent=* | transaction clientip useragent | eval landingpage=mvindex(uri_path,1) | eval exitpage=mvindex(uri_path,-1)
  • 35. Under the Hood: Object Search String Generation Child Object Syntax: <parent object search> | search <my constraints> | <my attribute definitions> Example: sourcetype=access_* uri=* uri_path=* status=* clientip=* referer=* useragent=* status=2* | <my attribute definitions>
  • 36. Using the Splunk Search Language Object Search String | datamodel <modelname> <objectID> search Example: | datamodel WebIntelligence HTTP_Request search Behind the scenes: sourcetype=access_* OR sourcetype=iis* uri=* uri_path=* status=* clientip=* referer=* useragent=*
  • 37. Under the hood: Pivot Search String Generation Pivot search = object search + filters + reporting + formatting Example: (sourcetype=access_* OR sourcetype=iis*) status=2* uri=* uri_path=* status=* clientip=* referer=* useragent=* | stats count AS "Count of HTTP_Sucess" by ”useragent" | sort limit=0 "useragent" | fields - _span | fields "useragent" "Count of HTTP_Success" | fillnull "Count of HTTP_Success" | fields "useragent" *
  • 38. Using the Splunk Search Language Pivot Search String | pivot <modelname> <objectID> [statsfns, rowsplit, colsplit, filters, …] Example: | pivot WebIntelligence HTTP_Request count(HTTP_Request) AS "Count of HTTP_Request" SPLITROW status AS "status" SORT 0 status Behind the scenes: sourcetype=access_* OR sourcetype=iis* uri=* uri_path=* status=* clientip=* referer=* useragent=* | stats count AS "Count of HTTP_Request" by "status" | sort limit=0 "status" | fields - _span | fields "status", "Count of HTTP_Request" | fillnull "Count of HTTP_Request" | fields "status" *
  • 39. Warnings • | datamodel and | pivot are generating commands – They must be at the beginning of the search string • Use objectIDs NOT user-visible object names
  • 41. Data Model on Disk Each data model is a separate JSON file Lives in <myapp>/local/data/models (or <myapp>/default/data/models for pre-installed models) Has associated conf stanzas and metadata
  • 42. Editing Data Model JSON At your own risk! Models edited via the UI are validated Manually edited data models: NOT SUPPORTED Exception: installing a new model by adding the file to <myapp>/<local OR default>/data/models is probably okay
  • 43. Deleting a Data Model Use the UI for appropriate cleanup Potential for bad state if manually deleting model on disk
  • 44. Interacting With a Data Model Use data model builder and pivot UI – safest option! Use REST API – for developers (see docs for details) Use | datamodel and | pivot Splunk search commands
  • 45. Permissions Data models have permissions just like other Splunk objects Edit permissions through the UI
  • 46. Data Model Acceleration Turn on acceleration via UI Setting written to conf file Admin or power user Backend magic Poll: are there new accelerated models? Kick off collection Acceleration Non-technical user Run search using on-disk acceleration Run a pivot report No acceleration Kick off ad-hoc acceleration and run search
  • 47. Model-Wide Acceleration Only accelerates the first event-based object and its descendants Does not accelerate search- and transaction-based objects Pivot search: | tstats count AS "Count of HTTP_Success" from datamodel="WebIntelligence" where (nodename="HTTP_Request") (nodename="HTTP_Request.HTTP_Success") prestats=true | stats count AS "Count of HTTP_Success"
  • 48. Ad-Hoc Object Acceleration Kick off acceleration on pivot page (re)load for non-accelerated models and search/transaction objects Amortize cost of ad-hoc acceleration over repeated pivoting on same object Pivot search: | tstats count AS "Count of HTTP_Success" from sid=1379116434.663 prestats=true | stats count AS "Count of HTTP_Success"
  • 49. Acceleration Disclaimers Works with search-head pooling – we collect on indexers Cannot edit accelerated models

Editor's Notes

  • #5: Splunk 6 takes large-scale machine data analytics to the next level by introducing three breakthrough innovations: Pivot – opens up the power of Splunk search to non-technical users with an easy-to-use drag-and-drop interface to explore, manipulate and visualize data. Data Model – defines meaningful relationships in underlying machine data, making the data more useful to a broader base of non-technical users. Analytics Store – patent-pending technology that accelerates data models by delivering extremely high-performance data retrieval for analytical operations, up to 1000x faster than Splunk 5. Let's dig into each of these new features in more detail.
  • #6: How do the Analytics Store, Data Model and Pivot benefit users across the enterprise? Let's start with the IT professional – this includes the Splunk administrator or an advanced Splunk user familiar with SPL. Using Splunk 6 they can: create data models; share data models with other users, delivering a consistent view of the data; accelerate data models using the Analytics Store; and create reports using Pivot (although being power users, they may prefer using SPL directly!). Next we have the enterprise developer. Using Splunk 6 they can: leverage data models built by IT, making searches more portable (using common data models ensures predictability of results); and leverage the Pivot interface in custom enterprise apps. Finally, there are additional users that can now benefit – for example, the business or data analyst. Using Splunk 6 they can create reports, dashboards, charts and other visualizations using the Pivot interface, based on data models that provide an abstracted view of the raw data. Splunk 6 is not meant to replace existing BI and business analytics tools, but it does provide new visibility, insights and intelligence from operational data that business analysts can use to augment these tools. Data from Splunk software can also be leveraged directly using the Splunk API and SDKs and integrated into existing business analytics tools. For example, the recently announced Pentaho Business Analytics for Splunk® Enterprise (http://apps.splunk.com/app/1554) enables business users to utilize Pentaho to rapidly visualize and gain additional insights from Splunk's machine data platform using existing in-house skills.
  • #14: The Splunk search language is very expressive. It can perform a wide variety of tasks ranging from filtering to data munging and reporting, and there are various search commands for complex transformations and statistics (e.g. correlation, prediction, etc.)
  • #15: What does the search do? Basically, first it normalizes the individual accesses, which should be representable as a model object. Next it aggregates by guid to create an "instance" object, which should be representable in a DM. It calculates a field on that instance object, "type". Then it builds a timechart of those, using a special "_time" value. Low overhead to start, but the learning curve quickly gets steep. Obtaining website usage metrics should not require understanding Apache vs. IIS format. Admins won't know a priori what questions are being asked of the data, so they can't provide canned dashboards for all scenarios. Backup search for example: eventtype=pageview | eval stage_2=if(searchmatch("uri=/download*"), _time, null()) | eval stage_1=if(searchmatch("uri=/product*"), _time, null()) | eval stage_3=if(searchmatch("uri=*download_track*"), _time, null()) | stats min(stage_*) as stage_* by cookie | search stage_1=* | where isnull(stage_2) OR stage_2 >= stage_1 | where isnull(stage_3) OR stage_3 >= stage_2 | eval stage = case(isnull(stage_2), "stage_1", isnull(stage_3), "stage_2", 1==1, "stage_3") | stats count by stage | reverse | accum count as cumulative_count | reverse | streamstats current=f max(cumulative_count) as stage_1_count last(cumulative_count) as prev_count
  • #19: What are the important “things” in your data? E.g. WebIntelligence might have HTTPAccess, HTTPSuccess, User Session. How are they related? There's more than one “right” way to define your objects
  • #20: Constraints filter down to a set of data. Attributes are the fields and knowledge associated with the object. Both are inherited!
  • #21: A child object is a type of its parent object: e.g. an HTTP_Success object is a type of HTTP_Access. Adding a child object is essentially a way of adding a filter on the parent. A parent-child relationship makes it easy to do queries like “What percentage of my HTTP_Access events are HTTP_Success events?”
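The "child = parent plus a filter" idea above can be sketched as constraint concatenation: an object's effective search is its ancestors' constraints followed by its own. The constraint strings below come from the pivot example earlier in the deck; the `ModelObject` helper itself is hypothetical, not Splunk's implementation.

```python
# Minimal sketch: a data model object's effective search string is the
# concatenation of its ancestors' constraints plus its own filter.
class ModelObject:
    def __init__(self, name, constraint, parent=None):
        self.name = name
        self.constraint = constraint
        self.parent = parent

    def effective_search(self):
        # Walk up the parent chain, then emit constraints root-first
        parts = []
        node = self
        while node is not None:
            parts.append(node.constraint)
            node = node.parent
        return " ".join(reversed(parts))

http_request = ModelObject("HTTP_Request", "(sourcetype=access_* OR sourcetype=iis*)")
http_success = ModelObject("HTTP_Success", "status=2*", parent=http_request)

print(http_success.effective_search())
# -> (sourcetype=access_* OR sourcetype=iis*) status=2*
```

This is why the "percentage of HTTP_Access events that are HTTP_Success events" question is cheap: both searches share the same parent constraints, differing only by the child's filter.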
  • #24: Constraints are essentially the search broken down into a hierarchy, attributes are the associated fields and knowledge
  • #25: Arbitrary searches that include transforming commands to define the dataset that they representFix example here? TODO
  • #26: Enable the creation of objects that represent transactionsUse fields that have already been added to the model via event or search objects
  • #27: This is how we capture knowledge
  • #28: Required: Only events that contain this field will be returned in Pivot. Optional: The field doesn't have to appear in every event. Hidden: The field will not be displayed to Pivot users when they select the object in Pivot. Use this for fields that are only being used to define another attribute, such as an eval expression. Hidden & Required: Only events that contain this field will be returned, and the field will be hidden from use in Pivot
  • #29: Be careful about lookup permissions – must be available in the context where you want to use them
  • #47: Divanny will help make this slide MUCH prettier
  • #48: This could be more slides, more details
  • #50: This could be more slides, more details