Monday, July 01, 2013

Hadoop points at file:/// not hdfs:///

When I install CDH 4.5 using Whirr 0.8.2, Hadoop points at file:/// instead of hdfs:///. To fix this, ssh to the client node (the last node Whirr ssh'es into) and update Hadoop's core-site.xml, located in /etc/hadoop/conf, by adding the namenode address:


 <property>
   <name>fs.defaultFS</name>
   <value>hdfs://hadoop-namenode:8020</value>
 </property>

For example, if the namenode's local IP address is 10.80.221.129:

 <property>
   <name>fs.defaultFS</name>
   <value>hdfs://10.80.221.129:8020</value>
 </property>

Now you can see the HDFS directories and files. If you ssh to the other nodes, you have to change their core-site.xml as well; a sketch follows below.
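
To confirm the change, listing the HDFS root should now show HDFS paths such as /user and /tmp rather than the local filesystem. Below is a minimal sketch for checking this and pushing the same core-site.xml to the other nodes; the hostnames hadoop-datanode1 and hadoop-datanode2 are placeholders for whatever nodes Whirr launched, not names from this setup.

 # Verify that the client now talks to HDFS instead of the local filesystem
 hadoop fs -ls /

 # Copy the updated core-site.xml to the other nodes
 # (hadoop-datanode1/2 are placeholder hostnames)
 for host in hadoop-datanode1 hadoop-datanode2; do
   scp /etc/hadoop/conf/core-site.xml $host:/tmp/core-site.xml
   ssh $host 'sudo mv /tmp/core-site.xml /etc/hadoop/conf/core-site.xml'
 done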

Reference
1. http://stackoverflow.com/questions/16008486/after-installing-hadoop-via-cloudera-manager-4-5-hdfs-only-points-to-the-local-f

