Hadoop consists of two core pieces:
* MapReduce
* HDFS
* MapReduce
This is the processing part of Hadoop.
For example: you submit a job, and it is split into map and reduce tasks that run across the cluster nodes (see the word-count sketch below).
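To make the processing side concrete, here is a minimal sketch of the classic word-count job using Hadoop's Java MapReduce API (org.apache.hadoop.mapreduce). The mapper emits (word, 1) pairs and the reducer sums them; the input and output paths come from the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every word in its input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the counts received for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

You would package this into a jar and run it with something like `hadoop jar wordcount.jar WordCount /input /output`.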
* HDFS: Hadoop Distributed File System.
This is the storage part: files are split into blocks and stored, with replication, across the cluster (see the sketch below).
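For the storage side, here is a minimal sketch of writing and reading a file through the HDFS FileSystem Java API. It assumes core-site.xml is on the classpath so fs.defaultFS points at your cluster; the path /user/bala/hello.txt is just a made-up example.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
  public static void main(String[] args) throws Exception {
    // Picks up fs.defaultFS from core-site.xml on the classpath.
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Hypothetical path, used only for illustration.
    Path path = new Path("/user/bala/hello.txt");

    // Write a small file; HDFS splits it into blocks and replicates them.
    try (FSDataOutputStream out = fs.create(path, true)) {
      out.write("Hello, HDFS!\n".getBytes(StandardCharsets.UTF_8));
    }

    // Read the file back and print it.
    try (FSDataInputStream in = fs.open(path);
         BufferedReader reader = new BufferedReader(
             new InputStreamReader(in, StandardCharsets.UTF_8))) {
      String line;
      while ((line = reader.readLine()) != null) {
        System.out.println(line);
      }
    }
  }
}
```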
Happy coding.
Bala