Problem in Single Node Hadoop installation.



0

I'm using CentOS 7 (http://centos.webwerks.com/centos/7/isos/x86_64/CentOS-7-x86_64-Everything-1511.iso) and followed the steps shown in the video (Option 1 - Install Hadoop 1.2.1 Single_Node). Everything worked fine, but after running the bin/start-all.sh command, localhost:50070 is not reachable. I'm attaching a photo of the output of this command.

Java version: Oracle JDK 1.7.0_79



12 Answer(s)


0

Hi Ravi,

Thanks for sharing the snapshot. You are getting the "permission denied" error shown in the snapshot.

You are getting the error because you entered the wrong password.

Please stop the Hadoop services and start them again. It will ask for the password again; enter the correct password and that should resolve the issue.

You can also set up a passwordless SSH connection to avoid the password prompt altogether.
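Passwordless SSH to localhost takes only a few commands. This is a sketch: it assumes the default key path and an empty passphrase, and it skips key generation if a key already exists. Run it as the user that starts Hadoop.

```shell
# Create ~/.ssh if it is missing and generate an RSA key with an empty passphrase.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -q -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize the key for SSH logins to this same machine (localhost):
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# ssh localhost   # should now log in without asking for a password
```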
 

Hope this helps.

Thanks.


0

I ran ./stop-all.sh and then ./start-all.sh, but I'm still getting the same problem. I'm attaching a snapshot of these commands.



0

Hi Ravi,

Could you please share the error snapshot, and also share the output of the $ jps command?
Thanks


0

I'm sharing the error shown at localhost:50070; the $ jps command returns "command not found".
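"command not found" for jps usually just means the JDK's bin directory is not on your PATH; jps ships with the JDK itself. A sketch, assuming the Oracle JDK was installed under /usr/java/jdk1.7.0_79 (that path is an assumption; adjust it to your actual install location):

```shell
# jps lives in $JAVA_HOME/bin, so put that directory on the PATH for this shell.
export JAVA_HOME=/usr/java/jdk1.7.0_79   # assumed path - point this at your JDK
export PATH="$JAVA_HOME/bin:$PATH"
# jps   # should now list the running Hadoop JVMs (NameNode, DataNode, ...)
```

To make this permanent, add the two export lines to ~/.bashrc.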



0

Output of jps:



0

Hi Ravi,

Hadoop's start scripts log in over SSH to launch the daemons, so your machine needs a running SSH server (sshd); it is sshd, not a web server, that Hadoop depends on. Please make sure the SSH server is installed and running.

To install the SSH server -$ sudo yum -y install openssh-server

Check the status of the SSH server -$ sudo systemctl status sshd

Now configure your system to start sshd at boot time:

$ sudo systemctl start sshd.service

$ sudo systemctl enable sshd.service

Steps to resolve the issue:

  1. If sshd is not installed, install it and make sure it is running and enabled.

  2. Next, make sure the firewall is not blocking the Hadoop web ports (50070 is the NameNode UI). On CentOS 7 the default firewall is firewalld:
    # systemctl stop firewalld
    # systemctl disable firewalld

    If your system uses the older iptables service instead, enter:
    # service iptables save
    # service iptables stop
    # chkconfig iptables off

    If you are using the IPv6 firewall, enter:
    # service ip6tables save
    # service ip6tables stop
    # chkconfig ip6tables off

  3. Recreate the HDFS directory (location_of_HDFS is the data directory you configured in your Hadoop config files):

    $ sudo rm -r location_of_HDFS

    $ sudo mkdir location_of_HDFS

    $ sudo chown hadoop_username location_of_HDFS

  4. Format the namenode:

    $ hadoop namenode -format

  5. Start the Hadoop services:

    $ start-all.sh or

    $ start-dfs.sh and $ start-mapred.sh

    (start-yarn.sh applies to Hadoop 2.x; on Hadoop 1.2.1 the MapReduce daemons are started with start-mapred.sh.)
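Once the daemons are up, you can sanity-check them. The probe below is a small sketch; it uses bash's /dev/tcp feature so it needs no extra tools, and it assumes the default Hadoop 1.x NameNode web UI port 50070.

```shell
# On a healthy Hadoop 1.x single node, jps should list all five daemons:
# NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker.
# jps

# Probe the NameNode web UI port (bash-only: /dev/tcp/<host>/<port> opens a TCP
# connection; the subshell closes it again automatically).
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if port_open localhost 50070; then
  echo "NameNode web UI is listening on port 50070"
else
  echo "nothing listening on 50070 - check the NameNode log under \$HADOOP_HOME/logs"
fi
```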

Hope this helps.

Thanks


0

Still, I'm facing the same problem. I executed the commands as shown in the attached snapshot.

I can't see where I'm making the mistake.



0

Hi Ravi,

Is this a good time to call you? Please let me know.

I look forward to your response.

Thanks


0

Yes, you can. Mobile no.: 9864320757


0

Hi Ravi,

It is quite late for India time. I will call you tomorrow morning; we will surely resolve your issue.
Please let me know a convenient time to call you.
Thanks


0

Any time between 10 and 12 in the morning, or after 6 pm.


0

Sure, I will call you tomorrow at 8:00 pm. I hope this works for you.
