Issues with copying file from Local to Hadoop
5 Answer(s)
Robert
Rufus, are you talking about doing the copyFromLocal of words.txt in the "Run your first MapReduce program - WordCount" video in Module 4? The copyFromLocal seemed to work okay for me without needing sudo. The one thing I can think of is that if you type the command as-is, you need to make sure you are currently in the directory in the Linux file system that contains words.txt. Hope this helps.
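The relative-path pitfall Robert describes can be sketched like this (the /tmp/demo path is just an illustration, not from the course; the hadoop command is shown as a comment since it needs the VM):

```shell
# A bare "words.txt" is resolved against the current directory, so
# copyFromLocal only finds it if you cd to the right place first.
mkdir -p /tmp/demo && echo "hello world" > /tmp/demo/words.txt

cd /            # from anywhere else, the relative name does not resolve
ls words.txt 2>/dev/null || echo "not found from here"

cd /tmp/demo    # from the directory that holds the file, it works
ls words.txt

# With Hadoop installed you would now run:
#   hadoop fs -copyFromLocal words.txt /user/cloudera
```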
Sep 26 2015 04:53 PM
Robert
Here is a snapshot of the command sequence that worked for me. This was using the Cloudera Quickstart VM within VirtualBox.
Sep 26 2015 05:09 PM
Andre
Still no success. It keeps telling me that it cannot find the file or directory 'words.txt'. I can see the file by simply typing ls.
Sep 26 2015 07:10 PM
DeZyre Support
Hi Rufus,
Make sure that the file exists in the local file system:
local file system -> /home/cloudera/words.txt
HDFS file system -> /user/cloudera
Assuming words.txt is in the local file system, this will work -> hadoop fs -put /home/cloudera/words.txt /user/cloudera
Please check the following video as well -> https://www.youtube.com/watch?v=1er-Kkl7mq8
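As a hedged sketch of the full sequence on the Cloudera Quickstart VM (paths taken from the answer above; the hadoop steps are guarded so the script is safe to dry-run on a machine without Hadoop installed):

```shell
SRC=/home/cloudera/words.txt   # local source file, per the answer above
DST=/user/cloudera             # HDFS home directory for the cloudera user

# Step 1: confirm the local file really exists before touching HDFS.
if [ ! -f "$SRC" ]; then
    echo "local file $SRC is missing - fix this first"
fi

# Steps 2-4 need the hadoop CLI, so they only run when it is available.
if command -v hadoop >/dev/null 2>&1; then
    hadoop fs -mkdir -p "$DST"        # make sure the HDFS target exists
    hadoop fs -put "$SRC" "$DST"      # copy local -> HDFS
    hadoop fs -ls "$DST/words.txt"    # confirm the copy landed
fi
```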
Sep 26 2015 07:20 PM
Robert
One additional thought: which VM image are you using? I remember that when I took the VMware image from the DeZyre site and ran it with VMware, I ran into weird file issues (permission problems, etc.). When I downloaded directly from the Cloudera site, as Deb suggested, and ran it with VirtualBox, those problems went away and things ran fine. This might be a red herring, but I thought I'd mention it as one data point.
Sep 26 2015 09:28 PM