Apache Hadoop. ... Hadoop is an open-source distributed processing framework that manages data storage and processing for big data applications running on clustered systems. The core of Apache Hadoop consists of a storage part, known as the Hadoop Distributed File System (HDFS), and a processing part based on the MapReduce programming model. Hadoop splits files into large blocks and distributes them across the nodes in a cluster.
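As a rough illustration of the MapReduce programming model, the sketch below outlines the familiar word-count job written against Hadoop's standard org.apache.hadoop.mapreduce API. The class name WordCount and the input/output paths are placeholders chosen for this example, not anything prescribed by Hadoop; the map phase runs close to the HDFS blocks holding the input, and the reduce phase aggregates the shuffled intermediate pairs.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: runs on the nodes holding each HDFS block and emits (word, 1) pairs.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: receives all counts for a given word and sums them.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // optional local aggregation before the shuffle
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Packaged into a jar, such a job would typically be submitted with the hadoop jar command, passing an HDFS input directory and a (not yet existing) output directory as arguments.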