
Permission issues using some HDFS Commands



When I try to create an input directory using hadoop fs -mkdir input, I get the following error:

mkdir: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

Looks like a classic wrong-permissions case.
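For context, the ownership that is causing the error can be confirmed first; this is just a quick sketch of the checks I would run, and the exact listing will of course vary by cluster:

whoami              # confirms the command is running as root
hadoop fs -ls /     # shows /user owned by hdfs:supergroup with drwxr-xr-x, matching the error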

Tuhino (in the post titled "need help with permission") is having the same issue as I am. I tried the solution mentioned in his post, which was to specify the /user/cloudera folder as shown below:

hadoop jar hadoop-examples.jar teragen 1000 /user/cloudera/teragen/out2

I logged in as user cloudera and ran the following command as suggested:

hadoop jar hadoop-examples.jar teragen 1000 /user/cloudera/teragen/out2

I received the following error:
Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-examples.jar
    at org.apache.hadoop.util.RunJar.main(RunJar.java:135)
Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.jar.JarFile.<init>(JarFile.java:135)
    at java.util.jar.JarFile.<init>(JarFile.java:72)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:133)
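From what I can tell, this ZipException means RunJar could not find or open a file called hadoop-examples.jar in the current directory, rather than a permissions problem. The path below is only my assumption of where the examples jar might sit on the Cloudera VM; locate the jar first and adjust the path accordingly:

find /usr/lib -name "hadoop*examples*.jar" 2>/dev/null    # locate the examples jar; the location differs between CDH releases
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar teragen 1000 /user/cloudera/teragen/out2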

I found two solutions that work, described below, but I would like to know which is the better solution and why:

Solution 1 - hdfs is the user that owns the folder being accessed, so we can run the command as the hdfs user by prefixing it with "sudo -u hdfs", i.e. "sudo -u hdfs hadoop fs -mkdir /dezyre". The command then executes without error. The downside is that using this method consistently requires consciously modifying every command found in normal documentation.
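If I go with Solution 1, I assume the usual follow-up is to hand ownership of the new directory back to the regular user so that later commands do not need sudo (cloudera:cloudera is just my guess at the appropriate owner and group here):

sudo -u hdfs hadoop fs -mkdir /dezyre
sudo -u hdfs hadoop fs -chown cloudera:cloudera /dezyre
hadoop fs -ls /     # verify the directory exists and is now owned by cloudera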

Solution 2 - supergroup is the group that owns the folder being accessed, so any user in this group will be able to access the files. Would the ideal solution be to create a group called supergroup and add every user that should access files under that ownership in HDFS to it? Since HDFS group membership is tied to the groups defined on the local file system for any user accessing the cluster, this could be done with the following commands (make sure to su to root first):

groupadd supergroup
usermod -a -G supergroup cloudera
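My understanding, and it is only an assumption about the default configuration, is that this works because dfs.permissions.superusergroup defaults to "supergroup", so members of that group are treated as HDFS superusers. After adding the group I would verify it like this:

id cloudera                 # confirm cloudera now appears in supergroup
su - cloudera               # start a fresh session so the new group membership is picked up
hadoop fs -mkdir /dezyre    # should now succeed without sudo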

Could someone please let me know which solution is best and why, and whether there are any other solutions that would work?

4 Answer(s)



Hi Robert,
I am using Cloudera in VirtualBox. As part of assignment 1, I was trying to run the following command in the terminal window:
sudo addgroup dezyre_group

I get the error:
sudo: addgroup: command not found

I was wondering if you have any insights on this.
Thanks,
Kapil


Hi Kapil,

"addgroup" is specific to Ubuntu; try these instead:

groupadd hadoop
usermod -a -G hadoop hduser
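If it helps, membership can be checked afterwards (hduser here is just the username from the commands above; substitute your own):

id hduser    # the hadoop group should now appear in the group list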


As Shobhanc said, but make sure to su to root first, then run the commands:

groupadd hadoop
usermod -a -G hadoop hduser


Please also look at the post titled "need help with permission" for more information.

I now understand what to do regarding the hadoop jar command for the hadoop-examples.jar file (see post titled "need help with permission").

Regarding creating a directory: if I am inside a folder owned by the cloudera user, such as /user/cloudera, then there is no need to change permissions in order to make a directory. But if we want to make the /dezyre directory, as on page 2 of the HDFS command assignment, then we must change permissions, since hdfs is the user that owns that folder.

My question for this situation, where we want to make a directory inside a folder owned by hdfs: what is the best practice, my Solution 1 or my Solution 2 as stated above, or something different?

Thank you
