Exception occurred executing MapReduce using a jar file



I created the class 'WordCount', built a jar file, and executed the MapReduce job using the jar file, but I am getting an exception:
[cloudera@localhost ~]$ hadoop jar WC.jar WordCount /WordCount_Input/WordCount_Input.txt /WordCount_Output
Exception in thread "main" java.lang.ClassNotFoundException: WordCount
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.main(RunJar.java:201)

2 Answer(s)



Hi Vijaykanth,

It looks like the fully-qualified name of the class you created is different from "WordCount". I suggest you execute the following command to find the package name and class name:

jar -tvf WC.jar

Once you have the package and class name, use the fully-qualified name when running the job, something like:

hadoop jar WC.jar com.module3.WordCount /WordCount_Input/WordCount_Input.txt /WordCount_Output

Thanks
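To see why `hadoop jar WC.jar WordCount` throws `ClassNotFoundException` for a packaged class, note that once a class is declared inside a package, its runtime name includes that package, so the class loader must be given the fully-qualified name. A minimal sketch, using `com.example` as a stand-in package name (not from the original post):

```java
package com.example;

public class NameDemo {
    public static void main(String[] args) {
        // The runtime name includes the package prefix, which is why
        // 'hadoop jar WC.jar WordCount' cannot find a packaged class.
        System.out.println(NameDemo.class.getName());
        try {
            // Bare-name lookup fails, just like RunJar in the question.
            Class.forName("NameDemo");
        } catch (ClassNotFoundException e) {
            System.out.println("not found: " + e.getMessage());
        }
    }
}
```

Hadoop's RunJar uses `Class.forName` on the name you pass, so it hits exactly this lookup.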



[cloudera@localhost ~]$ jar tvf WC.jar
25 Tue Aug 05 10:14:52 PDT 2014 META-INF/MANIFEST.MF
368 Tue Aug 05 09:59:42 PDT 2014 .project
14271 Tue Aug 05 10:02:26 PDT 2014 .classpath

After executing the command I am not getting a package name or class name; the listing shows no .class files in the jar, only the Eclipse .project and .classpath files.

My WordCount program is below:

package com.dezyre;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emits (word, 1) for every token in the input line.
    // Note the type parameters: with the raw 'extends Mapper', map() would
    // not override the framework method and the identity mapper would run.
    public static class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {

        Configuration conf = new Configuration();
        Job job = new Job(conf, "WordCount");

        job.setJarByClass(WordCount.class);

        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.waitForCompletion(true);
    }
}
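For reference, the same tokenize-and-sum logic can be checked without a cluster. This is just a plain-Java sketch of what the mapper and reducer do together (hypothetical class and method names, not Hadoop code):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.StringTokenizer;

public class LocalWordCount {
    // Mirrors the job: tokenize each line (mapper), sum per word (reducer).
    static Map<String, Integer> count(String[] lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                counts.merge(tokenizer.nextToken(), 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = count(new String[] {"to be or not to be"});
        System.out.println(counts.get("to"));
        System.out.println(counts.get("be"));
        System.out.println(counts.get("or"));
    }
}
```

Running this prints the counts 2, 2, and 1, which matches what the job should emit for those words.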