This document summarizes a presentation about using Apache Storm as an ETL engine for ingesting data into Hadoop. It discusses when Storm is a better fit than alternative ingestion tools, how Lookout built real-time Storm topologies that ingest user event data into HDFS and device connection data into HBase, and operational concerns such as deployment, monitoring, and infrastructure. The presentation also covers best practices for building scalable Storm data pipelines and lessons learned along the way.
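The topologies described follow Storm's spout/bolt model: a spout emits a stream of tuples, and bolts transform and route them toward sinks such as HDFS or HBase writers. Storm's actual API is Java (`org.apache.storm`'s `TopologyBuilder`, spouts, and bolts); the sketch below is only a conceptual Python illustration of that dataflow, with all names and the sample events invented for the example.

```python
# Conceptual sketch of Storm's spout -> bolt -> sink dataflow.
# All function names and events here are hypothetical; real Storm
# topologies are built in Java with org.apache.storm classes.

from typing import Iterator, Dict, List


def event_spout() -> Iterator[Dict]:
    """Hypothetical spout: emits raw user-event tuples."""
    for i in range(3):
        yield {"user_id": i, "event": "login"}


def parse_bolt(events: Iterator[Dict]) -> Iterator[Dict]:
    """Hypothetical bolt: parses/enriches each tuple in-stream."""
    for event in events:
        event["parsed"] = True
        yield event


def hdfs_sink_bolt(events: Iterator[Dict]) -> List[Dict]:
    """Stand-in for a terminal bolt that batches tuples out to HDFS."""
    return list(events)


# Tuples flow through the chain one at a time, mirroring how Storm
# streams tuples between components rather than batching upfront.
written = hdfs_sink_bolt(parse_bolt(event_spout()))
print(len(written))  # number of tuples that reached the sink
```

The key property this illustrates is that each stage is independent and composable, which is what lets Storm scale individual spouts and bolts out across workers.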