Installed Hadoop; How to Access Hadoop through URLs?



0

Hello,

I have installed Hadoop following instructions from this document:

https://docs.google.com/document/d/1dF_wB8R3rIbYwGGmwV5OQ4E3woixzMXOjNz8akW5xb0/edit#

I am not able to access the Hadoop URLs shown below. Is this something I can open from Windows, e.g. in a browser? I am unable to access these URLs on Windows. What am I missing?

10.3 Access Hadoop URLs

HDFS : http://localhost:50070/

RM UI: http://localhost:8088/
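As a quick sketch (assuming the default Hadoop 2.x ports from the document), the two web UI URLs can be built and printed from the shell like this; replace the host with your Ubuntu machine's IP when browsing from another machine:

```shell
# Sketch: construct the Hadoop web UI URLs (Hadoop 2.x default ports assumed).
HADOOP_HOST=localhost                      # replace with the Ubuntu machine's IP from Windows
HDFS_UI="http://${HADOOP_HOST}:50070/"     # NameNode web UI
RM_UI="http://${HADOOP_HOST}:8088/"        # ResourceManager web UI
echo "$HDFS_UI"
echo "$RM_UI"
# On the Ubuntu machine itself you could check reachability without a browser:
#   curl -s -o /dev/null -w "%{http_code}\n" "$HDFS_UI"
```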

 

Thanks

Lokesh


14 Answer(s)


0

Hi Lokesh,

You have to enter that URL in a browser on the Linux (Ubuntu) machine where Hadoop is installed. If you want to access HDFS from Windows, enter the following in your web browser:

URL:  <Ubuntu/Hadoop machine IP>:50070

Hope this helps.

Thanks.


0

Hello Abhijit,

Thanks for your response. This did not work.

Can you please describe exactly what needs to be entered on Ubuntu Linux and what needs to be entered on Windows? Please do not include any placeholders or special characters like "URL" or "<>". I just need the exact command so I can copy and paste it on Windows and Ubuntu Linux.

Thanks


0

Hi Lokesh,

Please enter your Ubuntu (Hadoop) machine IP followed by the port number 50070. The URL will look like this:

192.168.31.128:50070

Please replace 192.168.31.128 with your Linux machine's IP. You can check the Linux machine's IP using the "ifconfig" command in a Linux terminal.
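As a sketch, the address can be pulled out of the ifconfig output automatically; the sample line below is inlined for illustration, and on your machine you would pipe the real command (`ifconfig | grep 'inet addr'`) instead:

```shell
# Sketch: extract the inet address from an ifconfig-style line and build the URL.
line="inet addr:192.168.31.128  Bcast:192.168.31.255  Mask:255.255.255.0"
ip=$(echo "$line" | sed -n 's/.*inet addr:\([0-9.]*\).*/\1/p')
echo "${ip}:50070"
```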

Hope this helps.

Thanks.


0

This did not work.

I entered my machine IP like this in the browser on Windows and it did not work:

192.168.149.129:50070 

For Ubuntu/Linux, what should I enter? Is there some command like xdg-open that I should use?

Maybe Hadoop is not installed. I did not get any error messages while following the instructions in the document.


0

When I ran the HDFS daemons, the output looks the same as in the instruction guide:

-------------

user@ubuntu:~/hadoop-2.7.3/sbin$ start-dfs.sh
17/08/03 15:40:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/user/hadoop-2.7.3/logs/hadoop-user-namenode-ubuntu.out
localhost: starting datanode, logging to /home/user/hadoop-2.7.3/logs/hadoop-user-datanode-ubuntu.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/user/hadoop-2.7.3/logs/hadoop-user-secondarynamenode-ubuntu.out
17/08/03 15:40:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
--------------

ifconfig:

inet addr:192.168.149.129  Bcast:192.168.149.255  Mask:255.255.255.0
 

--------------

I put the following URL in the browser on Windows and it did not connect:

192.168.149.129:50070

--------------

Is there any way I can troubleshoot?
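One hedged troubleshooting step: on the Ubuntu machine, check whether anything is actually listening on the web UI port. The netstat-style sample line below is inlined for illustration; on your machine you would run `netstat -tln | grep 50070` instead:

```shell
# Sketch: test whether port 50070 shows up as LISTEN in netstat-style output.
sample="tcp        0      0 0.0.0.0:50070           0.0.0.0:*               LISTEN"
if echo "$sample" | grep -q ':50070.*LISTEN'; then
  msg="port 50070 is listening"
else
  msg="port 50070 is NOT listening - is the NameNode running?"
fi
echo "$msg"
```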


0

I did some troubleshooting myself. In the /etc/hosts file, there are the following lines:

127.0.0.1       localhost
127.0.1.1       ubuntu

------------

Is it OK for these lines to be there or should these be removed?
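A hedged note: in my understanding those two lines are the Ubuntu defaults and are normally fine for a single-node setup, so rather than removing them you could just verify the localhost entry resolves. A sketch, with the hosts content inlined as a sample:

```shell
# Sketch: confirm /etc/hosts has a localhost entry (sample content inlined;
# on your machine use: grep -w localhost /etc/hosts).
hosts="127.0.0.1       localhost
127.0.1.1       ubuntu"
if echo "$hosts" | grep -qw localhost; then
  echo "localhost entry present"
fi
```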


0

I downloaded the 32-bit Ubuntu and installed it on my 64-bit machine. Do you think this is the issue?


0

Hi Lokesh,

As I see you are using Ubuntu 16+, you need not worry about the 32-bit or 64-bit version.

Could you please share the output of the following commands:

Command-1: $ start-dfs.sh

Command-2: $ start-yarn.sh

Command-3: $ jps
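The jps check above can be automated: list the daemons a single-node Hadoop should run and flag any that are missing. The jps listing below is inlined as a sample for illustration; on your machine you would capture the real `jps` output instead:

```shell
# Sketch: verify expected Hadoop daemons against jps output (sample inlined).
jps_out="3954 Jps
3650 NodeManager
3191 DataNode
3528 ResourceManager
3372 SecondaryNameNode"
missing=""
for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
  # -w avoids 'NameNode' matching inside 'SecondaryNameNode'
  echo "$jps_out" | grep -qw "$d" || missing="$missing $d"
done
echo "missing daemons:$missing"
```

With this sample the output flags the NameNode as missing, which is exactly the failure mode behind an unreachable port 50070 web UI.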

Thanks.


0

The output of "jps" has an extra line for NodeManager, which is not in the document you provided.

Here are the outputs of the 3 commands:

-----------------------------------------------------------------

Output of start-dfs.sh 

17/08/04 23:53:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/user/hadoop-2.7.3/logs/hadoop-user-namenode-ubuntu.out
localhost: starting datanode, logging to /home/user/hadoop-2.7.3/logs/hadoop-user-datanode-ubuntu.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/user/hadoop-2.7.3/logs/hadoop-user-secondarynamenode-ubuntu.out
17/08/04 23:53:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
------------------------------------

Output of start-yarn.sh


starting yarn daemons
starting resourcemanager, logging to /home/user/hadoop-2.7.3/logs/yarn-user-resourcemanager-ubuntu.out
localhost: starting nodemanager, logging to /home/user/hadoop-2.7.3/logs/yarn-user-nodemanager-ubuntu.out
----------------------

Output of jps


3954 Jps
3650 NodeManager
3191 DataNode
3528 ResourceManager
3372 SecondaryNameNode
-----------------------------------------------

 


0

Hi Lokesh,

Looks like your NameNode is down. Please check core-site.xml.

Thanks.


0

Core-site.xml contents:

<configuration>
   <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:9000</value>
   </property>
</configuration>

Does it have everything it is supposed to? I changed 9000 to 50070 and it still does not work. Can you please call me?
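A hedged note on that port change: in Hadoop 2.x, the 9000 in fs.default.name is the NameNode's RPC endpoint, while 50070 is a separate port serving the web UI, so the config value should stay at 9000. A small sketch with the value inlined:

```shell
# Sketch: fs.default.name holds the NameNode RPC endpoint (hdfs://host:9000);
# the web UI is served on a different port (50070 by default), so the
# config value should not be changed to 50070.
value="hdfs://localhost:9000"                       # as in core-site.xml
host=$(echo "$value" | sed -n 's|hdfs://\([^:/]*\).*|\1|p')
echo "RPC endpoint: $value"
echo "web UI:       http://${host}:50070/"
```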


0

Hello Abhijit,

Thanks for working with me on this. I did all the steps recommended by you and also started Ubuntu from my directory. This time, when I run the jps command, I don't see the DataNode. Here's the output of jps:

3511 Jps
3063 NameNode
3399 SecondaryNameNode
 

Let me know if you have any suggestions.

Thanks

Lokesh

 


0

Abhijit, Thanks for the great help! It works now!


0

@Lokesh thanks for the confirmation. It was a pleasure talking to you on the call.
