Apache Hadoop is an open-source framework for storing and processing big data across clusters of commodity machines. It uses the Hadoop Distributed File System (HDFS) to split large files into blocks and replicate them across nodes, and MapReduce to process that data in parallel, close to where it is stored. Hadoop is designed to handle massive data volumes, tolerate machine failures, and scale horizontally by adding more nodes. It's a core tool in the Big Data ecosystem.
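To make the MapReduce model concrete, here is a minimal sketch of the classic word-count job in Java, Hadoop's native language. The mapper emits a (word, 1) pair for every word it sees, and the reducer sums the counts for each word; Hadoop handles splitting the input, shuffling pairs by key, and running the tasks in parallel across the cluster. The input and output locations are HDFS paths passed as command-line arguments.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: for each line of input, emit (word, 1) for every word.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sum all the counts emitted for a given word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    // The reducer doubles as a combiner: partial sums are computed
    // on each mapper node before data crosses the network.
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input path
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output path
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar, a job like this would typically be submitted with something like `hadoop jar wordcount.jar WordCount /user/me/input /user/me/output` (the jar name and paths here are just placeholders). The same code runs unchanged whether the input is a few kilobytes on a laptop or terabytes spread across hundreds of nodes, which is the core appeal of the model.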