Error when loading data in Hive




I have run into an issue. We always log in as cloudera, and all the local files on Linux are owned by cloudera, as are all the HDFS files.

 

Here is what I do (this is on CDH 4.4):

[cloudera@localhost class7]$ sudo hive
Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-0.10.0-cdh4.4.0.jar!/hive-log4j.properties
Hive history file=/tmp/root/hive_job_log_656dc0c0-8bb9-4d82-a403-688b9f054a7c_1077772704.txt

 

-- I already have the page_view table, currently without data.
hive> show tables;
OK
page_view
Time taken: 2.92 seconds


hive> LOAD DATA LOCAL INPATH '/home/cloudera/class7/page_view_20140415_IND.csv' INTO TABLE page_view PARTITION(dt='2014-04-15', country='IN');
Copying data from file:/home/cloudera/class7/page_view_20140415_IND.csv
Copying file: file:/home/cloudera/class7/page_view_20140415_IND.csv
Loading data to table kandb.page_view partition (dt=2014-04-15, country=IN)
chgrp: changing ownership of 'hdfs://localhost.localdomain:8020/user/hive/warehouse/kandb.db/page_view/dt=2014-04-15/country=IN/page_view_20140415_IND.csv': User does not belong to hive
Partition kandb.page_view{dt=2014-04-15, country=IN} stats: [num_files: 1, num_rows: 0, total_size: 780, raw_data_size: 0]
Table kandb.page_view stats: [num_partitions: 1, num_files: 1, num_rows: 0, total_size: 780, raw_data_size: 0]
OK
Time taken: 4.592 seconds
hive> select count(*) from page_view;
(This fails. It seems there is some permission issue. Can you please help me?)
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
 at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:224)
 at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:204)
 at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:149)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4705)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4687)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4661)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:3032)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:2996)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2977)
 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:669)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:419)
 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44970)


1 Answer(s)



Hi Kan,

You are running hive under sudo, so the shell (and the MapReduce jobs it submits) executes as the superuser root instead of cloudera. root has no home directory under /user in HDFS and cannot create one, because /user is owned by hdfs:supergroup with mode drwxr-xr-x; that is exactly the AccessControlException in your stack trace. Drop the sudo and run hive as cloudera, and the problem will go away. A sketch of the steps is below.
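For reference, a minimal sketch of the fix. The warehouse path is copied from your load log; the `sudo -u hdfs ... -chown` cleanup step is an assumption on my part — it is only needed because the earlier LOAD ran as root and likely left the partition file owned by root, and it relies on the hdfs superuser account available on the CDH quickstart VM:

# Start hive as the normal cloudera user (no sudo).
[cloudera@localhost class7]$ hive

hive> use kandb;   -- only if kandb is not already your default database
hive> select count(*) from page_view;

# Optional cleanup: re-own the partition file that was loaded as root.
[cloudera@localhost class7]$ sudo -u hdfs hadoop fs -chown -R cloudera /user/hive/warehouse/kandb.db/page_view

Run this way, the MapReduce job stages its files under /user/cloudera, which cloudera owns, so the AccessControlException no longer occurs.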

Hope this helps.
