This document discusses data ingestion into Hadoop. It describes how data can be ingested in real time or in batches. Common tools for ingesting data into Hadoop include Apache Flume, Apache NiFi, and Apache Sqoop. Flume is designed for streaming ingestion and uses a source-channel-sink architecture to move event data reliably into Hadoop. NiFi provides flow-based routing, transformation, and delivery of data in real time, configured through a visual interface. Sqoop performs batch imports and exports of structured data between Hadoop and relational databases.
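
To make the source-channel-sink model concrete, the following is a minimal sketch of a Flume agent configuration, assuming a hypothetical agent that tails an application log and writes the events to HDFS. The agent name, log path, and NameNode address are illustrative assumptions, not details taken from this document.

    # Components of this (hypothetical) agent
    agent.sources  = logsrc
    agent.channels = memch
    agent.sinks    = hdfssink

    # Source: tail an application log (path is an assumption)
    agent.sources.logsrc.type = exec
    agent.sources.logsrc.command = tail -F /var/log/app/app.log
    agent.sources.logsrc.channels = memch

    # Channel: buffer events in memory between source and sink
    agent.channels.memch.type = memory
    agent.channels.memch.capacity = 10000

    # Sink: write events into HDFS (NameNode address is an assumption)
    agent.sinks.hdfssink.type = hdfs
    agent.sinks.hdfssink.hdfs.path = hdfs://namenode:8020/flume/events
    agent.sinks.hdfssink.hdfs.fileType = DataStream
    agent.sinks.hdfssink.channel = memch

Such a configuration would be started with the flume-ng command, pointing it at this file and the agent name. Similarly, a batch import of a relational table into HDFS with Sqoop might look like the sketch below; the JDBC URL, table, and target directory are hypothetical.

    sqoop import \
      --connect jdbc:mysql://dbhost/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4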