This document discusses Flume and Camus, two tools for ingesting event data into HDFS. Flume is an event-streaming tool built around agents: sources ingest events, channels buffer them in transit, and sinks deliver them to a destination such as HDFS. Camus is an open-source LinkedIn project that runs periodic MapReduce jobs to pull batches of messages from Kafka and write them to HDFS. The document provides configuration examples and compares the two on ease of setup, performance, customizability, and deployment. It concludes that Flume is easier to maintain but has a more complex configuration, while Camus suffers from poor documentation, and its reliance on MapReduce can hurt performance, since every batch pays the overhead of launching a full job.
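To make the source/channel/sink pipeline concrete, here is a minimal sketch of a single-agent Flume configuration in the standard properties format. The agent name (agent1), the tailed log path, and the HDFS destination are hypothetical placeholders, not values from the original document:

```
# Hypothetical agent: tail a log file, buffer in memory, write to HDFS
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Source: ingest events by tailing an application log (path is illustrative)
agent1.sources.src1.type = exec
agent1.sources.src1.command = tail -F /var/log/app/events.log
agent1.sources.src1.channels = ch1

# Channel: in-memory buffer between source and sink
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000

# Sink: write buffered events to date-partitioned HDFS paths
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = ch1
agent1.sinks.sink1.hdfs.path = hdfs://namenode/flume/events/%Y-%m-%d
agent1.sinks.sink1.hdfs.fileType = DataStream
```

Note that each source lists its channels and each sink binds to exactly one channel; this wiring, spread across many keys, is the configuration complexity the comparison refers to.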
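Camus, by contrast, is driven by a Java properties file and launched as a Hadoop job. The sketch below is an assumption-laden illustration: the broker addresses, topic name, paths, and jar name are hypothetical, while the property keys and the CamusJob entry point come from the Camus project:

```
# camus.properties — hypothetical values; keys from the Camus project
camus.message.decoder.class=com.linkedin.camus.etl.kafka.coders.JsonStringMessageDecoder
etl.record.writer.provider.class=com.linkedin.camus.etl.kafka.common.StringRecordWriterProvider
kafka.brokers=broker1:9092,broker2:9092
kafka.whitelist.topics=events
etl.destination.path=/camus/topics
etl.execution.base.path=/camus/exec
etl.execution.history.path=/camus/exec/history
```

```
# Each invocation is a full MapReduce job, typically run on a schedule (e.g., cron)
hadoop jar camus-example-<version>-shaded.jar \
  com.linkedin.camus.etl.kafka.CamusJob -P camus.properties
```

This launch model illustrates the performance caveat: unlike Flume's always-on agents, every Camus batch incurs MapReduce job-startup cost, and data lands in HDFS only as often as the job is scheduled.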