How do I resolve the issue below?


11 Answer(s)


hi Christopher,

Step 1: Verify that Oozie is up and running from the UI: http://localhost:11000/oozie/

Step 2: Verify that the correct port and hostname are provided in job.properties.

Step 3: Remove the forward slash from the command; here is the modified command:
oozie job -oozie http://localhost:11000/oozie -config /usr/local/oozie WordCountTest/job.properties -run
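The -config argument can be sanity-checked before calling the CLI, since Oozie rejects any path that does not end in .properties or .xml. This is only a sketch; the URL and path are the ones used in this thread, and the actual submission is left commented out:

```shell
# Sketch: validate the -config path before calling the Oozie CLI.
# OOZIE_URL and CONFIG are the values from this thread; adjust as needed.
OOZIE_URL="http://localhost:11000/oozie"
CONFIG="/usr/local/oozie/WordCountTest/job.properties"

case "$CONFIG" in
  *.properties|*.xml)
    # The path has an accepted suffix; submit the job (uncomment to run).
    echo "submitting with config: $CONFIG"
    # oozie job -oozie "$OOZIE_URL" -config "$CONFIG" -run
    ;;
  *)
    # This mirrors the CLI's own complaint for anything else.
    echo "configuration must be a '.properties' or a '.xml' file" >&2
    ;;
esac
```

A path containing a stray space (e.g. "/usr/local/oozie WordCountTest/job.properties" split into two arguments) fails this check, which is exactly the IllegalArgumentException shown below.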

Thanks

Your answer does not resolve the issue. Here is the result:
root@xubuntu:/usr/local/oozie/distro/target/oozie-3.3.2-distro/oozie-3.3.2/bin# oozie job -oozie http://localhost:11000/oozie -config /usr/local/oozie WordCountTest/job.properties -run
java.lang.IllegalArgumentException: configuration must be a '.properties' or a '.xml' file
at org.apache.oozie.cli.OozieCLI.getConfiguration(OozieCLI.java:639)
at org.apache.oozie.cli.OozieCLI.jobCommand(OozieCLI.java:784)
at org.apache.oozie.cli.OozieCLI.processCommand(OozieCLI.java:504)
at org.apache.oozie.cli.OozieCLI.run(OozieCLI.java:477)
at org.apache.oozie.cli.OozieCLI.main(OozieCLI.java:179)
configuration must be a '.properties' or a '.xml' file

Here is the latest error:
root@xubuntu:/usr/local/oozie/distro/target/oozie-3.3.2-distro/oozie-3.3.2/bin# oozie job -oozie http://localhost:11000/oozie -config /usr/local/oozie/WordCountTest/job.properties -run
Error: E0501 : E0501: Could not perform authorization operation, Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "xubuntu/127.0.0.1"; destination host is: "localhost.localdomain":8020;
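For what it's worth, E0501 with a broken pipe to "localhost.localdomain":8020 usually means the nameNode value in job.properties does not match the address the NameNode actually listens on. A hedged sketch for comparing the two (the config paths below are assumptions; adjust them to your installation):

```shell
# Sketch: compare the NameNode address Hadoop is configured with against
# the nameNode value in job.properties. Both paths are assumptions.
HADOOP_CONF="${HADOOP_CONF_DIR:-/usr/local/hadoop/conf}"
JOB_PROPS="/usr/local/oozie/WordCountTest/job.properties"

if [ -f "$HADOOP_CONF/core-site.xml" ]; then
  echo "NameNode per core-site.xml:"
  grep -A1 'fs.default.name' "$HADOOP_CONF/core-site.xml" | grep '<value>'
fi
if [ -f "$JOB_PROPS" ]; then
  echo "nameNode per job.properties:"
  grep '^nameNode=' "$JOB_PROPS"
fi
```

If the two hosts differ, the job submission will be sent to a NameNode address nothing is listening on, which surfaces as exactly this kind of IOException.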

hi Christopher,

I followed these steps and was able to see the output created in hdfs
* Open WordCountTest/workflow.xml and make sure the values below are correct:

<property>
    <name>mapred.input.dir</name>
    <value>/user/cloudera/wc_input</value>
</property>
<property>
    <name>mapred.output.dir</name>
    <value>/user/cloudera/output1</value>
</property>


* Open WordCountTest/job.properties and make sure the values are as follows:
nameNode=hdfs://localhost.localdomain:8020
jobTracker=localhost.localdomain:8021
queueName=default
oozie.wf.application.path=${nameNode}/user/cloudera/WordCountTest

* hadoop dfs -copyFromLocal WordCountTest /user/cloudera/
* [cloudera@localhost module9]$ oozie job -oozie http://localhost:11000/oozie -config WordCountTest/job.properties -run

The above command should give the following output on the terminal

job: 0000006-140820211110604-oozie-oozi-W

* Go to the JobTracker and you should see the job listed there; once the job is completed, check whether the 'output1' folder was created in HDFS.

If you still don't see it, go to the JobTracker, click on the particular job ID, and read the complete log to find what exactly the error is.

I also see from your post that you were executing as the "root" user. As a norm, all these steps need to be executed as the "cloudera" user.
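The steps above can be collected into one sketch. The hostnames and paths are the ones from this thread, and the hadoop/oozie invocations are left commented out so the file contents are the focus:

```shell
# Sketch of the steps above. Values are from this thread; substitute your own.
cat > job.properties <<'EOF'
nameNode=hdfs://localhost.localdomain:8020
jobTracker=localhost.localdomain:8021
queueName=default
oozie.wf.application.path=${nameNode}/user/cloudera/WordCountTest
EOF

# Then copy the application to HDFS and submit (uncomment to run):
# hadoop dfs -copyFromLocal WordCountTest /user/cloudera/
# oozie job -oozie http://localhost:11000/oozie -config WordCountTest/job.properties -run

# Quick check: every line should be a key=value pair.
grep -c '=' job.properties
```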

Thanks

hi Christopher,

If you still don't get the output, try changing the hostname in job.properties to xubuntu, which is your hostname.
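A sketch of that edit, assuming the job.properties contents from the earlier answer ("xubuntu" is the hostname shown in the asker's shell prompt; the file is recreated here for illustration, so edit your real file instead):

```shell
# Sketch: rewrite the host part of nameNode/jobTracker, keeping the ports.
cat > job.properties <<'EOF'
nameNode=hdfs://localhost.localdomain:8020
jobTracker=localhost.localdomain:8021
queueName=default
oozie.wf.application.path=${nameNode}/user/cloudera/WordCountTest
EOF

# GNU sed in-place edit; both addresses get the new host.
sed -i 's/localhost\.localdomain/xubuntu/g' job.properties
cat job.properties
```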

Thanks

Thanks for your answer. Two things:
1. Should the user be cloudera even though the user name on my system is root?
2. Is the job.properties file the only place where the hostname should be listed as xubuntu?

I am posting these questions again:

1. Should the user be cloudera even though the user name on my system is root?

2. Is the job.properties file the only place where the hostname should be listed as xubuntu?

hi Christopher,

Looks like you are on some kind of Ubuntu distribution. In that case, you might have installed Hadoop as some user, let's say "hduser". I suggest that you use that user to execute all commands. "root" should only be used when you need elevated access to execute a superuser command.

In cloudera distribution, normally all commands will be executed as "cloudera".
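One way to check which user you are before running these commands (a sketch; the Hadoop install path is an assumption):

```shell
# Print the current user; on a Cloudera VM this should be "cloudera",
# on a self-built install it should be the user that owns Hadoop.
whoami

# The owner of the Hadoop installation is a good hint for which user to
# use (the path below is an assumption; adjust to your installation):
[ -d /usr/local/hadoop ] && ls -ld /usr/local/hadoop || true
```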

Thanks

Yes, but this does not resolve the issue. What about part 2 of the question?

This question is still unanswered.

I am getting the same issue on Mac OS X Lion 10.7.5. I have Hadoop and Oozie installed under the hadoop user, and while executing the Oozie job I get the same error.

Christopher: Were you able to solve this issue?