I was going through the steps for this in Hadoop FAQ.pdf.
1. It has instructions only for decommissioning (on Google Drive) - where are the instructions for commissioning?
2. For decommission, I find this:
"Create a file named dfs.exclude and add the hostnames of machines that need to be decommissioned line by line."
What node name will I decommission, given that I have only a pseudo-distributed Hadoop setup with one node? Anyone else in the class who has done this - please clarify.
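For reference, the decommission step quoted above maps to a NameNode setting like the following (a minimal sketch - the file path here is an assumption; use wherever your Hadoop config directory actually lives). In a pseudo-distributed setup, the single DataNode is usually registered as `localhost` or the machine's hostname, which you can confirm with `hdfs dfsadmin -report`:

```xml
<!-- hdfs-site.xml: point the NameNode at the exclude file.
     The path below is an assumed example, not a required location. -->
<property>
  <name>dfs.hosts.exclude</name>
  <value>/usr/local/hadoop/etc/hadoop/dfs.exclude</value>
</property>
```

`dfs.exclude` itself is a plain text file with one hostname per line. After editing it, run `hdfs dfsadmin -refreshNodes` so the NameNode re-reads the list; the node should then show as "Decommission in progress" in `hdfs dfsadmin -report`.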