Hadoop daemons not running


1 Answer(s)


Hi Patrice,
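To see which daemons are actually running on the node, you can run the jps command; on a healthy single-node cluster it should list processes such as NameNode, DataNode and SecondaryNameNode:

   jps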
If you are using Cloudera, go to Cloudera Manager and start all the services again, or restart the whole cluster.
If you have installed Apache Hadoop on your system manually, you can follow these steps to recover the cluster state (an example command sequence is shown after the notes below):
1. Delete (rm) the HDFS data folder
2. Recreate (mkdir) the HDFS data folder
3. Assign ownership (chown) of the HDFS folder to the user that runs Hadoop
4. Format the NameNode (hadoop namenode -format)
5. Start the Hadoop services (start-all.sh)
Note: Before doing this, make sure the Hadoop services are stopped (stop-all.sh), and be aware that deleting the HDFS folder and reformatting the NameNode erases any data currently stored in HDFS.
The system's IP address must also match the IP/hostname given in the configuration files (for example, the NameNode address in core-site.xml).
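For a single-node setup, the whole sequence might look like the sketch below, assuming the HDFS data directory is /usr/local/hadoop/hdfs and Hadoop runs as user hduser in group hadoop (these are placeholder names; substitute the directory, user and group from your own installation):

   stop-all.sh                                    # stop all Hadoop daemons first
   rm -rf /usr/local/hadoop/hdfs                  # delete the old HDFS data folder
   mkdir -p /usr/local/hadoop/hdfs                # recreate it
   chown -R hduser:hadoop /usr/local/hadoop/hdfs  # give ownership to the Hadoop user
   hadoop namenode -format                        # format the NameNode
   start-all.sh                                   # start the daemons again
   jps                                            # verify that the daemons are now running

On newer Hadoop releases, hdfs namenode -format and the start-dfs.sh / start-yarn.sh scripts are the preferred equivalents of hadoop namenode -format and start-all.sh.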

Hope this helps.
Thanks