What will you learn from this Hadoop Commands tutorial?
This Hadoop tutorial will give you a list of commonly used hadoop fs commands that can be used to manage files on a Hadoop cluster. These hadoop hdfs commands can be run on a pseudo-distributed cluster or from any of the vendor VMs, such as Hortonworks or Cloudera.
Pre-requisites to follow this Hadoop tutorial
1) help HDFS Shell Command
Syntax of help hdfs Command
$ hadoop fs -help
The help HDFS shell command shows hadoop developers all the available hadoop fs commands and how to use them.
Variations of the Hadoop fs Help Command
$ hadoop fs -help ls
Using the help command with a specific command lists the usage information along with the options to use the command.
2) Usage HDFS Shell Command
$ hadoop fs -usage ls
The usage command lists all the options that can be used with a particular hdfs command.
3) ls HDFS Shell Command
Syntax for ls Hadoop Command -
$ hadoop fs -ls
This command will list all the available files and subdirectories under the default directory. For instance, in our example the default directory for the Cloudera VM is /user/cloudera
Variations of Hadoop ls Shell Command
$ hadoop fs -ls /
Returns all the available files and subdirectories present under the root directory.
$ hadoop fs -ls -R /user/cloudera
Returns all the available files and recursively lists all the subdirectories under /user/cloudera
4) mkdir - Used to create a new directory in HDFS at a given location.
Example of HDFS mkdir Command -
$ hadoop fs -mkdir /user/cloudera/dezyre1
The above command will create a new directory named dezyre1 under the location /user/cloudera
Note : Cloudera and other hadoop distribution vendors provide a default home directory under /user/ for each user (in our case, /user/cloudera).
$ sudo -u hdfs hadoop fs -mkdir /dezyre
This command will create a new directory named dezyre under the / (root directory).
5) copyFromLocal -
Copies a file from the local filesystem to a given HDFS location.
For the following examples, we will use sample files (Sample1.txt, Sample2.txt, Sample3.txt) available in the /home/cloudera location.
Example - $ hadoop fs -copyFromLocal Sample1.txt /user/cloudera/dezyre1
Copy/Upload Sample1.txt available in /home/cloudera (local default) to /user/cloudera/dezyre1 (hdfs path)
6) put -
This hadoop command uploads a single file or multiple source files from local file system to hadoop distributed file system (HDFS).
Example - $ hadoop fs -put Sample2.txt /user/cloudera/dezyre1
Copy/Upload Sample2.txt available in /home/cloudera (local default) to /user/cloudera/dezyre1 (hdfs path)
7) moveFromLocal -
This hadoop command functions similar to the put command, but the source file is deleted after copying.
Example - $ hadoop fs -moveFromLocal Sample3.txt /user/cloudera/dezyre1
Move Sample3.txt available in /home/cloudera (local default) to /user/cloudera/dezyre1 (hdfs path). Source file will be deleted after moving.
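The copy-then-delete behaviour of moveFromLocal can be illustrated on the local filesystem (the /tmp paths below are hypothetical, chosen only for this demonstration):

```shell
# Illustration of moveFromLocal semantics on the local filesystem:
# the data is copied to the destination, then the source is deleted.
echo "hello" > /tmp/src.txt       # the "local" source file
cp /tmp/src.txt /tmp/dst.txt      # step 1: copy to the destination
rm /tmp/src.txt                   # step 2: remove the source
ls /tmp/src.txt 2>/dev/null || echo "source gone"
```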
8) du -
Displays the disk usage for all the files available under a given directory.
Example - $ hadoop fs -du /user/cloudera/dezyre1
9) df -
Displays the capacity, free space, and used space of the hadoop distributed file system.
Example - $ hadoop fs -df
10) expunge -
This HDFS command empties the trash, permanently deleting the files and directories it holds.
Example - $ hadoop fs -expunge
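The trash behaviour that expunge acts on is controlled by the fs.trash.interval property (in minutes) in core-site.xml; a value of 0 disables the trash feature entirely. A sketch of the relevant fragment (the 1440-minute value is only an example):

```xml
<!-- core-site.xml: keep deleted files in the trash for 1440 minutes (1 day).
     The value shown is only an example; 0 disables the trash feature. -->
<property>
  <name>fs.trash.interval</name>
  <value>1440</value>
</property>
```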
11) cat -
This is similar to the cat command in Unix and displays the contents of a file.
Example - $ hadoop fs -cat /user/cloudera/dezyre1/Sample1.txt
12) cp -
Copies files from one HDFS location to another HDFS location.
Example - $ hadoop fs -cp /user/cloudera/dezyre/war_and_peace /user/cloudera/dezyre1/
13) mv -
Moves files from one HDFS location to another HDFS location.
Example - $ hadoop fs -mv /user/cloudera/dezyre1/Sample1.txt /user/cloudera/dezyre/
14) rm -
Deletes or removes a file from the mentioned HDFS location.
Example - $ hadoop fs -rm /user/cloudera/dezyre1/Sample3.txt
Using rm with the -r option deletes or removes a directory and its contents from the HDFS location in a recursive manner.
Example - $ hadoop fs -rm -r /user/cloudera/dezyre3
15) tail -
This hadoop command shows the last kilobyte of the file on stdout.
Example – $ hadoop fs -tail /user/cloudera/dezyre/war_and_peace
Example - $ hadoop fs -tail -f /user/cloudera/dezyre/war_and_peace
Using the tail command with the -f option continuously shows data appended to the file as it grows, similar to Unix tail -f.
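Since hadoop fs -tail prints the last 1 KB of a file, the local-filesystem equivalent is tail -c 1024. A quick local sketch (the /tmp path is hypothetical):

```shell
# hadoop fs -tail shows the last kilobyte of a file;
# tail -c 1024 does the same for a local file.
seq 1 1000 > /tmp/numbers.txt             # a ~3.9 KB sample file
tail -c 1024 /tmp/numbers.txt | wc -c     # exactly 1024 bytes come back
tail -n 1 /tmp/numbers.txt                # the file ends with 1000
```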
16) copyToLocal -
Copies the files to the local filesystem. This is similar to the hadoop fs -get command, but in this case the destination location must be a local file reference.
Example - $ hadoop fs -copyToLocal /user/cloudera/dezyre1/Sample1.txt /home/cloudera/hdfs_bkp/
Copy/Download Sample1.txt available in /user/cloudera/dezyre1 (hdfs path) to /home/cloudera/hdfs_bkp/ (local path)
17) get -
Downloads or copies the files to the local filesystem.
Example - $ hadoop fs -get /user/cloudera/dezyre1/Sample2.txt /home/cloudera/hdfs_bkp/
Copy/Download Sample2.txt available in /user/cloudera/dezyre1 (hdfs path) to /home/cloudera/hdfs_bkp/ (local path)
18) touchz -
Used to create an empty file at the specified location.
Example - $ hadoop fs -touchz /user/cloudera/dezyre1/Sample4.txt
It will create a new empty file Sample4.txt in /user/cloudera/dezyre1/ (hdfs path)
19) setrep -
This hadoop fs command is used to set the replication for a specific file.
Example - $ hadoop fs -setrep -w 1 /user/cloudera/dezyre1/Sample1.txt
It will set the replication factor of Sample1.txt to 1. The -w flag makes the command wait until the replication is complete.
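setrep overrides the cluster-wide default replication factor, which is set by the dfs.replication property in hdfs-site.xml (the value 3 shown here is Hadoop's usual default, included only as an example):

```xml
<!-- hdfs-site.xml: default replication factor for new files.
     hadoop fs -setrep overrides this on a per-file basis. -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```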
20) chgrp -
This hadoop command is used to change the group ownership of a file or directory.
Example - $ sudo -u hdfs hadoop fs -chgrp -R cloudera /dezyre
It will change the /dezyre directory group membership from supergroup to cloudera (To perform this operation superuser permission is required)
21) chown -
This command lets you change both the owner and the group name simultaneously.
Example - $ sudo -u hdfs hadoop fs -chown -R cloudera /dezyre
It will change the /dezyre directory ownership from hdfs user to cloudera user (To perform this operation superuser permission is required)
22) chmod -
Used to change the permissions of a given file/dir.
Example - $ hadoop fs -chmod 700 /dezyre
It will change the /dezyre directory permission to 700 (drwx------).
Note : hadoop fs -chmod 777
To execute this, the user must be the owner of the file or must be a superuser. On executing this command, all users will get read, write and execute permission on the file.
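The octal modes used by chmod work the same way as in Unix: each digit is the sum of read (4), write (2) and execute (1) for owner, group and others respectively. A local sketch, assuming GNU stat and a hypothetical /tmp path:

```shell
# Octal permission modes: 7 = 4(read) + 2(write) + 1(execute).
# 700 therefore means rwx for the owner, nothing for group and others.
mkdir -p /tmp/perm_demo
chmod 700 /tmp/perm_demo
stat -c '%a %A' /tmp/perm_demo    # 700 drwx------
```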