Data encoding and 
metadata for streams
Data encoding and Metadata for Streams
> Introduction 
Me at a glance 
• My name is Jonathan Winandy (@ahoy_jon). 
• I am a Data pipeline engineer : 
• I worked on a “DataLake” ! 
• I use tools in the larger Java ecosystem like 
Java, Scala, Clojure, Hadoop … 
• And I am an “entrepreneur”.
> Introduction 
I cofounded two companies and they use streams 
as their data backbone. 
Healthcare-oriented software engineering. 
Provides: 
coordination for healthcare professionals.
I cofounded two companies and they use streams 
as their data backbone. 
@PrimaticeData 
“Good dataviz, surreal backends.” 
> Introduction 
Provides: 
tools and methods for data capitalisation.
> Introduction 
What are Streams? 
A stream is an abstract data structure with the following: 
operations: 
• append(bytes) -> void? 
• readAt(int) -> null | bytes 
rule 1: 
∀p ∈ ℕ, for some definition of '==': 
x := readAt(p) 
y := readAt(p) 
x != null => x == y 
Rule 1 implies: infinite cacheability 
once the data is available at a position.
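The operations and rule 1 can be sketched as a toy in-memory stream (Python; the class and method names are illustrative, not from the slides):

```python
class Stream:
    """Append-only log: a non-null read at a position never changes."""

    def __init__(self):
        self._log = []

    def append(self, data: bytes) -> None:
        self._log.append(data)

    def read_at(self, position: int):
        # Returns None past the end of the stream, bytes otherwise.
        if 0 <= position < len(self._log):
            return self._log[position]
        return None

s = Stream()
s.append(b"hello")
x = s.read_at(0)
y = s.read_at(0)
assert x is not None and x == y   # rule 1: a non-null read is stable
assert s.read_at(5) is None       # not yet available at this position
```

Because a position's value can never change once non-null, any consumer may cache it forever.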
Streams are the simplest way 
to manage data. 
0 1 2 3 4 5 6 
And they are naturally compatible with the perception of information from a 
singular observer … 
> Introduction
But be careful: streams are definitely 
not like queues, ESBs, EAI, or whatever 
messaging solution comes to mind …
> Introduction 
There is a lot to say about streams 
• Sub events: events are pre-projected into … 
• Quantum of action: a 'user' action generates zero or one event (no 
more). 
• Structural sharing for large payloads (cf. Content Addressable 
Storage). 
• Garbage collection for append-only data structures. 
This presentation covers: 
• Causality enforcement in asynchronous contexts: on important 
requests, causality is enforced. 
• Binary encoding and Metadata.
> Introduction 
A quick note on Causality 
If you don't ensure causality in 
web apps, some strange 
behaviours may arise: 
Sometimes, as a user, I 
cannot see my own "edits". 
Sometimes, as a client, I 
cannot buy on the website 
after I check out my basket. 
"Who is the fastest: 
the data bus 
or the client?" 
You don't want to bet, 
especially under load.
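One common fix for the "I cannot see my own edits" case is read-your-own-writes tracking. A hedged sketch, assuming the client remembers the position it appended at and a replica knows how far it has applied the log (all names here are made up):

```python
class Replica:
    """A read replica that has applied some prefix of the stream."""

    def __init__(self):
        self.applied = []              # prefix of the log applied so far

    def apply(self, log, upto):
        self.applied = log[:upto]

    def read(self, min_position):
        # Refuse to serve a read that would violate causality.
        if len(self.applied) <= min_position:
            raise RuntimeError("replica lagging: retry or read elsewhere")
        return self.applied[min_position]

log = []
log.append("edit-1")
my_position = len(log) - 1             # client keeps the position it wrote

replica = Replica()
try:
    replica.read(my_position)          # bus has not caught up yet
except RuntimeError:
    pass                               # "I cannot see my own edits"

replica.apply(log, len(log))           # broadcast catches up
assert replica.read(my_position) == "edit-1"
```

Instead of betting on who is faster, the client carries its last written position and the replica either serves a causally consistent view or says "not yet".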
Data encoding and 
metadata for streams
> Content 
Content : 
• Data encoding 
• Identity 
• Metadata 
• Datagram 
• Conclusion
> Data encoding 
State of data encodings in the 
industry 
• As always, worse is considered better. 
• Most streams have data encoded in: 
• CSV/TSV 
• JSON 
• Platform-specific serialisations (e.g. Java 
serialisation, Kryo)
> Data encoding 
Why is this important? 
• Some streams may contain very large amounts of 
data; the chosen encoding must be CPU- and space- 
efficient. 
• Streams are processed 
by many programs, 
and many intermediaries, 
for many years; 
the chosen encoding must be processable in a 
generic way.
JSON is the lowest common denominator 
Pros: 
• It reaches the browser: you can produce and consume data from 
inside a web page. 
A lot of cons: 
• Inefficient, 
• No dates, no proper numerics, 
• Very basic data structures, 
• Very error-prone. 
We all need JSON, 
but we should use it 
only when we can't 
avoid it. 
> Data encoding 
E.g.: in our databases, we can 
avoid JSON ;)
> Data encoding 
How bad is JSON? 
{"name":"Bob", 
"age":11, 
"gender":"Male"} 
39 bytes for 
10 bytes of data 
:02:06:62:6f:62:02:16:02:da:01
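The overhead is easy to measure. A small sketch, where the "compact" layout is a hand-rolled illustration (a length-prefixed name, one byte of age, one byte for gender), not a real Avro or Fressian frame:

```python
import json
import struct

record = {"name": "Bob", "age": 11, "gender": "Male"}
as_json = json.dumps(record, separators=(",", ":")).encode("utf-8")

# Made-up compact layout: 1-byte length + name bytes + 1-byte age + 1-byte gender.
compact = (
    struct.pack("B", len("Bob")) + b"Bob"
    + struct.pack("B", 11)
    + b"\x01"                      # e.g. 0x01 = "Male" in a fictional enum
)

print(len(as_json), "bytes as JSON vs", len(compact), "bytes compact")
assert len(as_json) == 39          # matches the slide's figure
assert len(compact) < len(as_json) // 3
```

Even on a trivial record, the JSON framing (quotes, braces, repeated field names) dwarfs the payload; on a high-volume stream that ratio is paid on every element.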
> Data encoding 
relevant 
ones 
popular binaries low tech cognitect “papa ?!” 
Avro Thrift Proto Buf JSON CSV Fressian Transit EDN XML RDF 
binary YES NO YES OK NO ?? 
generic YES ?? NO YES YES YES 
schema 
based 
YES NO YES NO ?? meta 
specific 
encoding 
S” YES OK Literal 
YES NO “STRING 
s 
reach the 
browser 
YES NO +++++ OK NO YES OK 
easy ? NO I PASS “true” YEP 
HUM 
? 
… 
safe ? YES HUM? NO NO MISM 
ATCH 
<! 
YES 
has 
dates? Soon NO NO YES YES
Identity 
> Identity 
• Most mechanisms around streams assure "at 
least once" delivery. 
• An identity definition is necessary to ensure 
idempotency.
> Identity 
There are two ways to refer to a message: 
• with a fingerprint calculated from the message 
(a digest). 
• with an external identifier (like a UUID).
> Identity 
UUIDs allow : 
F0991FD1-D58A-4A5F-8D13-903F368882D1 
8AA5C612-B365-4F8F-AF3F-DF623E1F6B22 
93A87D37-0658-47C9-84F6-801E83A5821C 
• to manage things that are not encoded yet. 
• to avoid hashing and parsing payloads. 
Recommendation: add a UUID (128 bits) to 
every element of the stream.
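Both ways of referring to a message, and the idempotent consumption an explicit identity enables, can be sketched as follows (illustrative names; the second `process` call stands in for a redelivery by an at-least-once transport):

```python
import hashlib
import uuid

payload = b'{"name":"Bob","age":11}'

# Way 1: a fingerprint computed from the message itself (digest).
digest = hashlib.sha1(payload).hexdigest()
assert hashlib.sha1(payload).hexdigest() == digest   # same bytes, same identity

# Way 2: an external 128-bit identifier attached at append time.
element = {"id": str(uuid.uuid4()), "body": payload}

seen = set()

def process(el, out):
    """Idempotent consumer: duplicates are detected by identity and skipped."""
    if el["id"] in seen:
        return
    seen.add(el["id"])
    out.append(el["body"])

out = []
process(element, out)
process(element, out)              # duplicate delivery: no effect
assert out == [payload]
```

The UUID route costs 16 bytes per element but lets consumers deduplicate without hashing, or even parsing, the payload.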
Metadata 
> Metadata 
• Metadata uses range from the very useful (like 
HTTP headers) to the very meta meta[1]. 
• Metadata on stream elements is most of the 
time implicit, for example the Content-Type: 
• "It's a stream of JSONs": then every element 
of the stream has "content-type=application/json".
[1] I am looking at you RDF !
What kinds of metadata are there for stream elements? 
• Content-type or data-encoding: 
e.g.: application/json 
• Type or Profile: indicates that the given element 
is an instance of a given type. 
e.g.: domain.model.MessageSent 
• Provenance information: 
e.g.: {"env":"test", 
"application":{"name":"webapp", 
"version":{"commit":"68546ca…"}}}
> Metadata
> Metadata 
A quick note on provenance 
Provenance is practical in distributed systems; we 
want to know: 
• from which node an element comes. 
• on behalf of which agent the element was created. 
• from which environment[1] an element comes. 
[1] With new architectures and Data Labs, environments are sometimes 
shared on the same infrastructure (e.g. no pre-production platform). 
It's then very useful to safeguard against the pollution of data.
> Metadata 
{ 
"content-type":"application/json", 
"profile":"domain.model.MessageSent", 
"provenance":{ 
"application":{ 
"name":"webapp", 
"version":"68546ca6e963981a8279aa327cc1e1362d15554e" 
}, 
"node":{ 
"environement":"test", 
"network":{ 
"interface":{ 
"en0":{ 
"addresses":{ 
"192.168.0.13":{ 
"family":"inet", 
"netmask":"255.255.255.0", 
"broadcast":"192.168.0.255" 
} 
} 
} 
} 
}, 
"hostname":["Blaze"], 
"platform_family":"mac_os_x" 
} 
} 
} 
• The metadata of an element can 
represent a significant piece of data. 
Sometimes more than the data itself. 
• !! The same piece of metadata can be 
shared across many elements. !!
> Datagram 
Anatomy of an element: :ID, :HEADERS, :BODY 
e.g. 
:ID DB7D919B-248F-4676-8494-2698B48C69C3 
:HEADERS 57158663-5933-4CE6-A54E-8179ECFBFCCA 
:BODY ["ich","bin","ein","JSON"]
> Datagram 
1. Create and register your headers 
(in a distributed key/value store, for example).
4813EDF2-B04E-4B70-AB04-0F9EA456E032 
{ 
"content-type":"application/json", 
"profile":"domain.model.MessageSent", 
"provenance":{ 
"application":{ 
"name":"webapp", 
"version":"68546ca6e963981a8279aa327cc1e1362d15554e" 
}, 
"node":{ 
"environement":"test", 
"network":{ 
"interface":{ 
"en0":{ 
"addresses":{ 
"192.168.0.13":{ 
"family":"inet", 
"netmask":"255.255.255.0", 
"broadcast":"192.168.0.255" 
} 
} 
} 
} 
}, 
"hostname":["Blaze"], 
"platform_family":"mac_os_x" 
} 
} 
}
> Datagram 
2. Use it in your stream! 
5462E738-ABAA-452F-87E0-FD38AEB9DF81 
4813EDF2-B04E-4B70-AB04-0F9EA456E032 
{"cid": {"idStr": "498683D2-1192-4794-8C23-5BE49EEEC763"}, 
"userId": {"idStr": "BC3D8614-AF1F-48C8-B91F-0D907FD0FAF3"}, 
"content": "Contenu de message de test"} 
81C76676-7B19-428E-856D-984BB67287D1 
4813EDF2-B04E-4B70-AB04-0F9EA456E032 
{"cid": {"idStr": "498683D2-1192-4794-8C23-5BE49EEEC763"}, 
"userId": {"idStr": "BC3D8614-AF1F-48C8-B91F-0D907FD0FAF3"}, 
"content": "Contenu de message de test"}
> Datagram 
Oh: you can also have a stream of headers … 
4813EDF2-B04E-4B70-AB04-0F9EA456E032 :HEADERS 
4813EDF2-B04E-4B70-AB04-0F9EA456E032 
5462E738-ABAA-452F-87E0-FD38AEB9DF81 
4813EDF2-B04E-4B70-AB04-0F9EA456E032 
81C76676-7B19-428E-856D-984BB67287D1 
4813EDF2-B04E-4B70-AB04-0F9EA456E032 
69DFC711-9D21-4DD6-A51D-C04A7A6E20A9 
0 1 2
> Conclusion 
If you don't yet use streams instead of 
databases, start using one next Monday 
(even with JSON and no headers…). 
If you already use streams … well, 
you know what to do! ;)
Data encoding and Metadata for Streams
Bonus: What is a CAS? 
A Content Addressable Storage is a specific "key/ 
value store":
operations : 
• store(bytes) -> key 
• get(key) -> null | bytes 
rule 1 : 
key = h(data) 
h being a cryptographic hash 
function like md5 or sha1. 
rule 2 : 
∀data 
get(store(data)) = data 
Rules 1 and 2 imply: 
infinite cacheability 
and scalability.
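A toy CAS satisfying both rules (sha1 is used here purely for illustration, matching the slide; a production CAS would prefer a stronger hash):

```python
import hashlib

cas = {}                                   # stands in for the backing store

def store(data: bytes) -> str:
    key = hashlib.sha1(data).hexdigest()   # rule 1: key = h(data)
    cas[key] = data
    return key

def get(key: str):
    return cas.get(key)                    # None when absent

k = store(b"large shared payload")
assert get(k) == b"large shared payload"   # rule 2: get(store(data)) == data
assert store(b"large shared payload") == k # duplicates collapse to one key
```

Because the key is fully determined by the content, identical payloads are stored once (structural sharing) and any node can cache or serve an entry forever without coordination.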
Example architectures 
CLASSICAL: two APPs sharing a DB. 
WITH STREAMS: APPs append to the stream; 
the stream broadcasts to the APPs.
Example architectures 
The broadcast mechanism is equivalent to 
a DB replication mechanism. 
CLASSICAL: an APP writes to a DB, which replicates 
(bin/log) to a second DB read by the other APPs. 
WITH STREAMS: APPs append to the stream; 
the stream broadcasts to the APPs.
