How to open a terminal in VMware?
3 Answer(s)
Ram
The localhost and cloudera-vm are hostnames. When you install Cloudera VM, the default host name is set to "localhost". It looks like the host name has been changed to "cloudera-vm" by your tutor.
You can check your hostname by typing the following command: vi /etc/sysconfig/network
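For a quick check without opening an editor, you can also print the hostname directly. Note that /etc/sysconfig/network exists on CentOS/RHEL-style systems such as the Cloudera VM, but may be absent on other distributions, so the second command is guarded:

```shell
# Print the hostname the system is currently using
hostname

# Show the configured hostname on CentOS/RHEL-style systems;
# the guard skips the file silently where it does not exist
[ -f /etc/sysconfig/network ] && cat /etc/sysconfig/network || true
```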
Dec 05 2014 05:31 AM
Deepthi
Yes, I typed the command you mentioned and it displays: HOSTNAME=localhost.localdomain
Can I execute the same commands with the localhost hostname as the tutor does with the cloudera-vm hostname?
I ask this because when I type the hadoop command on localhost it displays a different output than what was shown in the class. Also, when I open the terminal it shows "[cloudera@localhost ~]$".
In there I created a directory named dezyre, within which I created another directory named deepthi. Now when I execute the command "hadoop fs -put -/dezyre/deepthi/file1.txt" it says illegal option.
What is wrong? I am new to Linux commands. Please help!
Dec 05 2014 05:57 AM
Ram
Can I execute the same commands with the localhost hostname as the tutor does with the cloudera-vm hostname?
Yes.
Now when I execute the command "hadoop fs -put -/dezyre/deepthi/file1.txt" it says illegal option.
The "hadoop fs -put" command needs a source file path and a destination file path as parameters. Also, your path begins with "-" ("-/dezyre/..."), so it is parsed as an option flag rather than a path, which is why you get the "illegal option" error. The correct form is:
hadoop fs -put file1.txt /dezyre/deepthi/file1.txt
Make sure the file you are trying to copy to HDFS exists in your local file system.
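Putting it together, a minimal session might look like this (file and directory names follow the example above; adjust them to your setup):

```shell
# Create a small sample file in the local (Linux) file system
echo "hello hadoop" > file1.txt

# Copy it into the HDFS directory created earlier:
# first argument = local source, second = HDFS destination
hadoop fs -put file1.txt /dezyre/deepthi/file1.txt

# Verify the file arrived in HDFS
hadoop fs -ls /dezyre/deepthi
```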
hadoop [GENERIC_OPTIONS] fs
[-cat <src>]
[-chgrp [-R] GROUP PATH...]
[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
[-chown [-R] [OWNER][:[GROUP]] PATH...]
[-copyFromLocal <localsrc> ... <dst>]
[-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
[-count [-q] <path>]
[-cp <src> <dst>]
[-df <path>]
[-du <path>]
[-dus <path>]
[-expunge]
[-get [-ignoreCrc] [-crc] <src> <localdst>]
[-getmerge <src> <localdst> [addnl]]
[-help [cmd]]
[-ls <path>]
[-lsr <path>]
[-mkdir <path>]
[-moveFromLocal <localsrc> ... <dst>]
[-moveToLocal <src> <localdst>]
[-mv <src> <dst>]
[-put <localsrc> ... <dst>]
[-rm [-skipTrash] <path>]
[-rmr [-skipTrash] <path>]
[-stat [format] <path>]
[-tail [-f] <file>]
[-test -[ezd] <path>]
[-text <src>]
[-touchz <path>]
http://hadoop.apache.org/docs/r1.0.4/commands_manual.html
Dec 05 2014 06:27 AM