www.clairvoyantsoft.com
Airflow
A CLAIRVOYANT Story
Quick Poll
| 2
| 3
Robert Sanders
Big Data Manager and Engineer
Shekhar Vemuri
CTO
Shekhar works with clients across various industries to help
define data strategy and lead the implementation of data
engineering and data science efforts.
He was Co-founder and CTO of Blue Canary, a predictive
analytics solution to help with student retention; Blue Canary
was acquired by Blackboard in 2015.
One of the early employees of Clairvoyant, Robert
primarily works with clients to enable them along their
big data journey. Robert has a deep background in web
and enterprise systems, having worked on full-stack
implementations before focusing on data
management platforms.
| 4
About
Background Awards & Recognition
Boutique consulting firm centered on building data solutions and
products
All things Web and Data Engineering, Analytics, ML and User
Experience to bring it all together
Support core Hadoop platform, data engineering pipelines and
provide administrative and devops expertise focused on Hadoop
| 5
Currently working on building a data security solution to help enterprises
discover, secure and monitor sensitive data in their environment.
| 6
● What is Apache Airflow?
○ Features
○ Architecture
● Use Cases
● Lessons Learned
● Best Practices
● Scaling & High Availability
● Deployment, Management & More
● Questions
Agenda
| 7
Hey Robert, I heard about this new
hotness that will solve all of our
workflow scheduling and
orchestration problems. I played
with it for 2 hours and I am in love!
Can you try it out?
Must be pretty cool. I
wonder how it compares
to what we’re using. I’ll
check it out!
Genesis
| 8
Building Expertise vs Yak Shaving
| 9
| 10
● Mostly used Cron and Oozie
● Did some crazy things with Java and Quartz in a past life
● A lot of operational support effort was going into debugging Oozie workloads and the issues we ran
into with it
○ 4+ Years of working with Oozie “built expertise??”
● Needed a scalable, open source, user friendly engine for
○ Internal product needs
○ Client engagements
○ Making our Ops and Support teams' lives easier
Why?
| 11
Scheduler Landscape
| 12
● “Apache Airflow is an Open Source platform to programmatically Author, Schedule and Monitor workflows”
○ Workflows as Python Code (this is huge!!!!!)
○ Provides monitoring tools like alerts and a web interface
● Written in Python
● Apache Incubator Project
○ Joined Apache Foundation in early 2016
○ https://github.com/apache/incubator-airflow/
What is Apache Airflow?
| 13
● Lightweight Workflow Platform
● Full-blown Python scripts as the DSL
● More flexible execution and workflow generation
● Feature Rich Web Interface
● Worker Processes can Scale Horizontally and Vertically
● Extensible
Why use Apache Airflow?
| 14
Building Blocks
| 15
Different Executors
● SequentialExecutor
● LocalExecutor
● CeleryExecutor
● MesosExecutor
● KubernetesExecutor (Coming Soon)
Executors
What are Executors?
Executors are the mechanism by which task
instances get run.
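To pick an executor, set it in airflow.cfg. A minimal sketch (the CeleryExecutor additionally needs a
message broker such as Redis or RabbitMQ configured in the [celery] section; exact key names vary by
Airflow version):
[core]
executor = CeleryExecutor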
| 16
Single Node Deployment
| 17
Multi-Node Deployment
| 18
# Library Imports
from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta

# Define global variables and default arguments
START_DATE = datetime.now() - timedelta(minutes=1)
default_args = {
    'owner': 'Airflow',
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

# Define the DAG
dag = DAG('dag_id', default_args=default_args, schedule_interval='0 0 * * *', start_date=START_DATE)

# Define the Tasks
task1 = BashOperator(task_id='task1', bash_command="echo 'Task 1'", dag=dag)
task2 = BashOperator(task_id='task2', bash_command="echo 'Task 2'", dag=dag)
task3 = BashOperator(task_id='task3', bash_command="echo 'Task 3'", dag=dag)

# Define the Task Relationships
task1.set_downstream(task2)
task2.set_downstream(task3)
Defining a Workflow
| 19
dag = DAG('dag_id', …)

last_task = None
for i in range(1, 3):
    task = BashOperator(
        task_id='task' + str(i),
        bash_command="echo 'Task " + str(i) + "'",
        dag=dag)
    if last_task is None:
        last_task = task
    else:
        last_task.set_downstream(task)
        last_task = task
Defining a Dynamic Workflow
| 20
● Action Operators
○ BashOperator(bash_command)
○ SSHOperator(ssh_hook, ssh_conn_id, remote_host, command)
○ PythonOperator(python_callable=python_function)
● Transfer Operators
○ HiveToMySqlTransfer(sql, mysql_table, hiveserver2_conn_id, mysql_conn_id, mysql_preoperator, mysql_postoperator, bulk_load)
○ MySqlToHiveTransfer(sql, hive_table, create, recreate, partition, delimiter, mysql_conn_id, hive_cli_conn_id, tblproperties)
● Sensor Operators
○ HdfsSensor(filepath, hdfs_conn_id, ignored_ext, ignore_copying, file_size, hook)
○ HttpSensor(endpoint, http_conn_id, method, request_params, headers, response_check, extra_options)
Many More
Operators
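A quick illustration of an action operator paired with a sensor (import paths are from the Airflow 1.x
era, and dag, the my_api connection, and the health endpoint are assumptions for the example):
from airflow.operators.python_operator import PythonOperator
from airflow.operators.sensors import HttpSensor

def print_hello():
    print('hello from a PythonOperator')

# Sensor: poke the endpoint until it responds successfully,
# then allow downstream tasks to run.
wait_for_api = HttpSensor(task_id='wait_for_api', http_conn_id='my_api',
                          endpoint='health', dag=dag)
hello = PythonOperator(task_id='hello', python_callable=print_hello, dag=dag)
wait_for_api.set_downstream(hello)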
| 21
● Kogni discovers sensitive data across all data sources in the enterprise
● Needed to configure scans with various schedules, running standalone or with a Spark cluster
● Orchestrate, execute and manage dozens of pipelines that scan and ingest data in a secure
fashion
● Needed a tool to manage this outside of the core platform
● Started with exporting Oozie configuration from the core app - but conditional aspects and
visibility became an issue
● Needed something that supported deep DAGs and Broad DAGs
First Use Case (Description)
| 22
● Daily ETL Batch Process to Ingest data into Hadoop
○ Extract
■ 1226 tables from 23 databases
○ Transform
■ Impala scripts to join and transform data
○ Load
■ Impala scripts to load data into common final tables
● Other requirements
○ Make it extensible to allow the client to import more databases and tables in the future
○ Status emails to be sent out after daily job to report on success and failures
● Solution
○ Create a DAG that dynamically generates the workflow based off data in a Metastore
Second Use Case (Description)
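A hypothetical sketch of that dynamic generation (get_tables_to_ingest() is an assumed helper standing
in for a query against the metastore; the real DAG depended on the client's schema):
from datetime import datetime
from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator

def get_tables_to_ingest():
    # Assumed helper: the real pipeline queried the metastore for the
    # list of tables to extract; hard-coded here for illustration.
    return ['db1.orders', 'db1.customers', 'db2.events']

dag = DAG('daily_etl', schedule_interval='0 0 * * *', start_date=datetime(2017, 1, 1))

for table in get_tables_to_ingest():
    BashOperator(
        task_id='extract_' + table.replace('.', '_'),
        bash_command="echo 'extracting {}'".format(table),  # placeholder command
        dag=dag)
With this shape, importing another database or table only requires a new row in the metastore, which
is what made the workflow extensible for the client.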
| 23
Second Use Case (Architecture)
| 24
Second Use Case (DAG)
1,000 ft view 100,000 ft view
| 25
● Support
● Documentation
● Bugs and Odd Behavior
● Monitor Performance with Charts
● Tune Retries
● Tune Parallelism
Lessons Learned
| 26
● Load Data Incrementally
● Process Historic Data with Backfill operations
● Enforce Idempotency (retry safe)
● Execute Conditionally (BranchPythonOperator, ShortCircuitOperator - see the sketch after this list)
● Alert if there are failures (task failures and SLA misses) (Email/Slack)
● Use Sensor Operators to determine when to Start a Task (if applicable)
● Build Validation into your Workflows
● Test as much as you can - but it needs some thought
Best Practices
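A minimal sketch of conditional execution with BranchPythonOperator - the callable returns the task_id
of the branch to follow and the other branch is skipped (the weekday condition and task names are
illustrative; dag is assumed to be defined as in the earlier examples):
from airflow.operators.python_operator import BranchPythonOperator
from airflow.operators.dummy_operator import DummyOperator

def choose_path(execution_date, **kwargs):
    # Run the heavy processing on weekdays only; skip it on weekends.
    return 'process_data' if execution_date.weekday() < 5 else 'skip_processing'

branch = BranchPythonOperator(
    task_id='branch',
    python_callable=choose_path,
    provide_context=True,  # Airflow 1.x: pass execution context (incl. execution_date)
    dag=dag)

branch.set_downstream(DummyOperator(task_id='process_data', dag=dag))
branch.set_downstream(DummyOperator(task_id='skip_processing', dag=dag))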
| 27
Scaling & High Availability
| 28
High Availability for the Scheduler
Scheduler Failover Controller: https://github.com/teamclairvoyant/airflow-scheduler-failover-controller
| 29
● pip install the Airflow site packages on all nodes
● Set AIRFLOW_HOME env variable before setup
● Utilize MySQL or PostgreSQL as a Metastore
● Update Web App Port
● Utilize SystemD or Upstart Scripts (https://github.com/apache/incubator-airflow/tree/master/scripts)
● Set Log Location
○ Local File System, S3 Bucket, Google Cloud Storage
● Daemon Monitoring (Nagios)
● Cloudera Manager CSD (Coming Soon)
Deployment & Management
| 30
● Web App Authentication
○ Password
○ LDAP
○ OAuth: Google, GitHub
● Role Based Access Control (RBAC) (Coming Soon)
● Protect airflow.cfg (expose_config, read access to airflow.cfg)
● Web App SSL
● Kerberos Ticket Renewer
Security
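For example, password-backed web authentication is enabled in airflow.cfg via the Airflow 1.x contrib
backend (exact settings vary by version):
[webserver]
authenticate = True
auth_backend = airflow.contrib.auth.backends.password_auth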
| 31
● PyUnit - Unit Testing
● Test DAG Tasks Individually
airflow test [--subdir SUBDIR] [--dry_run] [--task_params
TASK_PARAMS_JSON] dag_id task_id execution_date
● Airflow Unit Test Mode - Loads configurations from the unittests.cfg file
[tests]
unit_test_mode = true
● Always at the very least ensure that the DAG is valid (can be done as part of CI - see the sketch below)
● Take it a step further with mock pipeline testing (with inputs and outputs), especially if your DAGs
are broad
Testing
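A minimal DAG-validity check that can run in CI (the dags/ folder path is an assumption):
import unittest
from airflow.models import DagBag

class TestDagValidity(unittest.TestCase):
    def test_dags_import_cleanly(self):
        # Fails if any DAG file in the folder raises on import.
        dag_bag = DagBag(dag_folder='dags/', include_examples=False)
        self.assertEqual(len(dag_bag.import_errors), 0,
                         'DAG import errors: {}'.format(dag_bag.import_errors))

if __name__ == '__main__':
    unittest.main()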
Questions?
| 32
We are hiring!
| 33
@shekharv
shekhar@clairvoyant.ai
linkedin.com/in/shekharvemuri
@rssanders3
robert@clairvoyant.ai
linkedin.com/in/robert-sanders-cs