DataNode Dies Periodically
5 Answer(s)
Hi Sachin,
It's hard to pinpoint the cause without more information.
Please share your two most recent DataNode logs and a snapshot of the NameNode UI.
Thanks.
Hi Abhijit - Please find the requested information attached.
Today I deleted my virtual machine and reinstalled everything from scratch, but still no luck.
The moment I open Browse File System from my NameNode at 192.168.230.128:50070
and click into the usr folder or the data folder, my DataNode dies and I get the error I included in my initial e-mail.
I need this fixed, as I cannot make any progress with the classes or execute anything. A quick response would be much appreciated, as I need this resolved before my class this evening.
May 30 2016 09:16 PM
Hi Sachin,
This happens because of leftover metadata: the DataNode's storage directory still holds the ID of a previously formatted namespace, so the DataNode exits when it tries to register with the NameNode. To resolve it, please do the following:
1. Stop all the services -
$ stop-all.sh
2. Power-off the machine.
3. Delete the HDFS storage directory -
$ sudo rm -r /usr/local/hadoop/hdfs
4. Recreate the HDFS directory -
$ sudo mkdir /usr/local/hadoop/hdfs
5. Assign ownership and permissions -
$ sudo chown hadoop /usr/local/hadoop/hdfs
$ sudo chmod 755 /usr/local/hadoop/hdfs
6. Format the namenode -
$ hadoop namenode -format
7. Start the Hadoop services -
$ start-all.sh
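For context on why the steps above work: the "leftover metadata" is typically a clusterID recorded in the DataNode's old storage directory that no longer matches the freshly formatted NameNode, and wiping both directories makes the IDs regenerate consistently. Below is a small diagnostic sketch (not from the thread) that compares the two IDs so you can confirm the mismatch before deleting anything. The `VERSION` file paths shown in the comments are assumptions based on a typical single-node layout under /usr/local/hadoop/hdfs; adjust them to your own dfs.namenode.name.dir and dfs.datanode.data.dir settings.

```shell
#!/bin/sh
# Sketch: detect the clusterID mismatch that makes a DataNode exit on startup.
# Each HDFS storage directory keeps a small "VERSION" properties file with a
# line such as:  clusterID=CID-xxxx-....

check_cluster_ids() {
    nn_version="$1"   # assumed path: /usr/local/hadoop/hdfs/namenode/current/VERSION
    dn_version="$2"   # assumed path: /usr/local/hadoop/hdfs/datanode/current/VERSION

    # Extract the value after "clusterID=" from each file.
    nn_id=$(grep '^clusterID=' "$nn_version" | cut -d= -f2)
    dn_id=$(grep '^clusterID=' "$dn_version" | cut -d= -f2)

    if [ "$nn_id" = "$dn_id" ]; then
        echo "MATCH: $nn_id"
    else
        echo "MISMATCH: namenode=$nn_id datanode=$dn_id"
    fi
}
```

If the IDs do match and the DataNode still dies, the problem lies elsewhere and the DataNode log will be the better guide; if they mismatch, the wipe-and-reformat steps above are the straightforward fix on a test machine with no data worth keeping.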
Hope this helps.
Thanks.