Getting an exception while executing the wordcountv2 program


2 Answer(s)


You have 3 problems:
1) Your input file must be copied into an HDFS input folder (NOT your local folder!!)
2) The output path you specify must be an HDFS folder (NOT your local folder!!)
3) You didn't specify the input file. The directory name cannot be treated as the input file; if your input file is called wordcount.txt, you should specify it explicitly:

This is my sample command line:
hadoop jar wordcountv2.jar input/courses/wordcount.txt output_word
(the output folder output_word must not already exist!!)

This works because it's being run from the local folder where wordcountv2.jar is, and I have already copied my wordcount.txt into HDFS.
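Putting the three fixes together, the full sequence might look like this on a typical single-node setup. The paths (`/home/cloudera/wordcount.txt`, `input/courses`, `output_word`) are just examples from this thread; substitute your own.

```shell
# Copy the local input file into an HDFS input folder (NOT your local folder)
hadoop fs -mkdir -p input/courses
hadoop fs -put /home/cloudera/wordcount.txt input/courses/wordcount.txt

# Remove any stale output folder -- the job fails if it already exists
hadoop fs -rm -r -f output_word

# Run the job against the HDFS file itself, not just the directory
hadoop jar wordcountv2.jar input/courses/wordcount.txt output_word

# Inspect the result written by the reducer
hadoop fs -cat output_word/part-r-00000
```

These commands assume a running HDFS cluster, so they are a sketch of the workflow rather than something you can paste blindly.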



Hi Soumyajit,

Your input file is not in HDFS, it's in your local FS. Please copy it using:
hadoop fs -put /home/cloudera/inputfile.txt /user/cloudera/inputfile.txt
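After the copy, you can confirm the file actually landed in HDFS (the paths below reuse the example paths above; adjust for your user):

```shell
# List the HDFS target directory to verify the upload
hadoop fs -ls /user/cloudera/

# Or print the first lines of the file directly from HDFS
hadoop fs -cat /user/cloudera/inputfile.txt | head
```

If `hadoop fs -ls` doesn't show the file, the job will keep failing with a file-not-found error no matter how you run the jar.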

Thanks