Hue and Cloudera Manager not opening


6 Answer(s)


Hi Sirajuddin,
Please try this and let me know the output.
Open the Cloudera Manager URL and wait for 10-15 minutes. If nothing comes up, go to the terminal and run the command: hadoop fs -ls /
If you get the expected output, it means Cloudera Manager is just running slowly.
In that case we will certainly look into it and try to fix it.
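
To see quickly whether the two web UIs are even listening, you can also probe them from the terminal. This is only a sketch and assumes the QuickStart VM defaults of port 7180 for Cloudera Manager and port 8888 for Hue:

# Probe the Cloudera Manager UI (default port 7180 is an assumption)
curl -I http://quickstart.cloudera:7180

# Probe the Hue UI (default port 8888 is an assumption)
curl -I http://quickstart.cloudera:8888

If curl gets an HTTP response, the service is up but slow; if the connection is refused, the service itself is down.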

Thanks

The terminal works fine: I am able to get output for hadoop fs -ls /, but Cloudera Manager and Hue do not work.


Hi Sirajuddin,
HDFS is working, which means you can perform all your tasks through the terminal, but for some reason Cloudera Manager and Hue are taking too long to load in the browser.
First clear the browser cache and restart the Cloudera Manager and Hue servers.

Since the browser UIs are not working, please use the terminal to do it (a command sketch follows the links below):
1. How to restart the cloudera manager:
http://www.cloudera.com/documentation/archive/manager/4-x/4-5-2/Cloudera-Manager-Enterprise-Edition-User-Guide/cmeeug_topic_14_3.html

2. How to restart the HUE
http://www.cloudera.com/documentation/archive/cdh/4-x/4-3-0/CDH4-Installation-Guide/cdh4ig_topic_15_7.html
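
On the QuickStart VM, a rough sketch of the same restarts from the terminal looks like the following. The service names (cloudera-scm-server, cloudera-scm-agent, hue) are the packaged defaults, so treat them as assumptions and adjust to whatever the linked guides show for your release:

# Restart the Cloudera Manager server and its agent (assumed packaged service names)
sudo service cloudera-scm-server restart
sudo service cloudera-scm-agent restart

# Restart the Hue server (assumed packaged service name)
sudo service hue restart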


After restarting both, shut down the system and restart it.
This will clear any in-memory cache.

Then try to log in to Cloudera Manager.

Hope this helps.
Thanks.

Tried all the above-mentioned options; it still does not work.

 

HUE ERROR

*******

Checking current configuration
Configuration files located in /etc/hue/conf.empty

Potential misconfiguration detected. Fix and restart Hue.

Hive Editor     The application won't work without a running HiveServer2.

Cloudera Manager Error:

*****************

 

Attempting to connect to Cloudera Manager...

By default, the Cloudera QuickStart VM runs Cloudera's Distribution including Apache Hadoop (CDH) under Linux's service and configuration management. If you wish to migrate to Cloudera Manager, you must run one of the following commands.

To use Cloudera Express (free), run Launch Cloudera Express on the Desktop. This requires at least 8 GB of RAM and at least 2 virtual CPUs.

To begin a 60-day trial of Cloudera Enterprise with advanced management features, run Launch Cloudera Enterprise (trial) on the Desktop. This requires at least 10 GB of RAM and at least 2 virtual CPUs.

Be aware that after rebooting, it may take several minutes before Cloudera Manager has started all of the services it manages and is ready to accept connections from clients.

 

Hive error from the terminal

*************

[cloudera@quickstart ~]$ hive

Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.net.ConnectException: Call From quickstart.cloudera/127.0.0.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:512)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.ConnectException: Call From quickstart.cloudera/127.0.0.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
    at org.apache.hadoop.ipc.Client.call(Client.java:1476)
    at org.apache.hadoop.ipc.Client.call(Client.java:1403)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2095)
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1214)
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1210)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1210)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1409)
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:588)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:546)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:498)
    ... 8 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:609)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:708)

I am able to work with Pig from the terminal.

I even uninstalled and reinstalled CDH 5; my RAM is 16 GB.

 

Please help.

Hi Sirajuddin,

Check this line in your error:

quickstart.cloudera/127.0.0.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused 

It means your HDFS is not up. Please check whether all the Hadoop services are running properly and HDFS is up.

To test: try the NameNode web UI, and check the logs.

If your Cloudera Manager is working properly, check the HDFS service there. If it has not started properly, restart it.
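
If you would rather check from the terminal, a minimal sketch looks like this. It assumes the QuickStart VM's packaged init-script service name hadoop-hdfs-namenode (used when the services run under Linux service management rather than Cloudera Manager), so adjust if your setup differs:

# List the running Hadoop daemons; the NameNode should appear here
sudo jps

# Check and, if needed, restart the NameNode (assumed packaged service name)
sudo service hadoop-hdfs-namenode status
sudo service hadoop-hdfs-namenode restart

# Once the NameNode is up, confirm HDFS is reachable
sudo -u hdfs hdfs dfsadmin -report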

If it shows an error, try to fix it and try again.

Hope this helps.

Thanks.