Unable to connect to Ubuntu terminal from PuTTY and WinSCP


Hi Team,

I am facing an issue connecting to the terminal from PuTTY and WinSCP.

The error is "Connection timed out". I tried to resolve it in different ways but could not overcome the issue. Please find the attached error message.

Can you please help here?



6 Answer(s)


Hi Ramesh,

Please try the following solution:

1. Execute the following commands:

$ sudo apt-get update
$ sudo apt-get install openssh-server -y

2. Make sure the above commands run successfully without any errors.

3. Check the IP address:

$ ifconfig

4. Open PuTTY/WinSCP and enter the following:

Host: the VM's IP address (from step 3)    Port: 22

5. Done
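If the connection still times out after these steps, a quick way to check whether anything is actually listening on port 22 is a small shell probe using bash's built-in `/dev/tcp` redirection. The `HOST` value below is a placeholder; substitute the IP address from step 3 when testing from another machine:

```shell
# Probe a host/port to see whether the SSH port is reachable.
# HOST and PORT are placeholders - use the IP reported by `ifconfig`.
HOST=127.0.0.1
PORT=22
if timeout 2 bash -c "cat < /dev/null > /dev/tcp/$HOST/$PORT" 2>/dev/null; then
  ssh_status="open"
else
  ssh_status="closed or filtered"
fi
echo "Port $PORT on $HOST is $ssh_status"
```

If the port shows as closed on the VM itself, the SSH service is not running; if it is open locally but PuTTY still times out, the problem is usually the VM's network mode (NAT vs. bridged) or a firewall on the path.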

Please let me know if this works for you.




Thanks for your reply.

I have followed the above steps, but I am still facing the same issue.

If you can help today before my next Big Data class session (tonight at 8 PM IST), it would be great so that I can follow the class.


Hi Abhijit,

As discussed, I created a new virtual machine and installed Ubuntu, but I am still not able to connect to the SSH terminal from PuTTY and WinSCP.



I have a 30-minute lunch break in my session. Can you please call me now if you are available to work on this issue?


Hi Abhijit,

As discussed, I uninstalled and reinstalled VMware, and I can now connect to the SSH terminal from PuTTY and WinSCP.

But while configuring the cluster in pseudo-distributed mode, the DataNode is not starting, while all the other daemons start (NameNode, SecondaryNameNode, and JobTracker).

The error in the DataNode log file is:

Invalid directory in dfs.data.dir: can not create directory: /usr/local/hadoop/hdfs/dfs/data
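This error usually means the user running the DataNode cannot create the `dfs.data.dir` path. A quick check is to try creating it as the same user (the path below is the one from the log; the `hadoop:hadoop` owner in the hint is an assumption based on a typical single-user setup):

```shell
# Try to create the dfs.data.dir path as the current user; if this fails,
# a DataNode running as the same user will fail with the same error.
DATA_DIR=/usr/local/hadoop/hdfs/dfs/data
if mkdir -p "$DATA_DIR" 2>/dev/null; then
  dir_check="writable: $DATA_DIR exists or was created"
else
  # Show who owns the parent directory - root ownership is the usual culprit.
  ls -ld "$(dirname "$DATA_DIR")" 2>/dev/null
  dir_check="not writable: fix ownership, e.g. sudo chown -R hadoop:hadoop /usr/local/hadoop/hdfs"
fi
echo "$dir_check"
```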

Could you please help here? I will be available from 8 PM onwards today.


Hi Ramesh,

It looks like your VM got a new IP address after the reinstall, so Hadoop can no longer connect using the old one. Could you please update all the config files with the new IP?

Please follow the steps below to resolve it:

  1. Stop all running services - $ stop-all.sh
  2. Update the IP addresses in the conf files with the new IP.
  3. Delete the HDFS directory - $ sudo rm -r /usr/local/hadoop/hdfs
  4. Create a new HDFS directory - $ sudo mkdir /usr/local/hadoop/hdfs
  5. Assign permissions on the HDFS folder - $ sudo chown -R hadoop:hadoop /usr/local/hadoop/hdfs
  6. Format the NameNode - $ hadoop namenode -format
  7. Start all Hadoop services - $ start-all.sh
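After step 7, you can verify that the DataNode actually came up with `jps`, which lists the running JVM daemons. A small sketch (it assumes the Hadoop 1.x daemon names mentioned in this thread, and that `jps` from the JDK is on the PATH):

```shell
# List running JVM daemons and confirm the DataNode is among them.
# Expected daemons for a Hadoop 1.x pseudo-distributed setup:
# NameNode, SecondaryNameNode, DataNode, JobTracker, TaskTracker.
running=$(jps 2>/dev/null || true)
if printf '%s\n' "$running" | grep -q "DataNode"; then
  datanode_status="DataNode running"
else
  datanode_status="DataNode missing - check the datanode log under the Hadoop logs directory"
fi
echo "$datanode_status"
```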

Please let me know whether this works for you.