FAQ
Hi,
I am using Hadoop version 0.19. I set up a Hadoop cluster for testing
purposes with one master and one slave. I set up passwordless SSH between
the master node and all the slave nodes, and modified /etc/hosts on all
nodes so that hostname lookup works.

I am using IP addresses for the namenode and datanode rather than
hostnames, because with hostnames the datanode was not coming up.

But when I try to execute any job or program, it fails with the
following exception:

FAILED
Error initializing
java.lang.IllegalArgumentException: Wrong FS:
hdfs://172.16.6.102:21011/user/root/test
expected: hdfs://namnodemc:21011
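For context, the check behind this message is essentially a comparison of the path's scheme and authority (host:port) against the filesystem's configured URI. The sketch below is a hypothetical simplification in plain Java, not Hadoop's actual code; `WrongFsCheck` and `checkPath` are illustrative names only:

```java
import java.net.URI;

public class WrongFsCheck {
    // Hypothetical helper: rejects a path whose scheme/authority does not
    // match the filesystem's URI (e.g. an IP address vs. a hostname).
    static void checkPath(URI fsUri, URI path) {
        String expected = fsUri.getScheme() + "://" + fsUri.getAuthority();
        String actual = path.getScheme() + "://" + path.getAuthority();
        if (!expected.equalsIgnoreCase(actual)) {
            throw new IllegalArgumentException(
                "Wrong FS: " + path + ", expected: " + fsUri);
        }
    }

    public static void main(String[] args) {
        URI fs = URI.create("hdfs://namnodemc:21011");
        URI ipPath = URI.create("hdfs://172.16.6.102:21011/user/root/test");
        try {
            checkPath(fs, ipPath); // authorities differ: IP vs. hostname
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // prints the Wrong FS message
        }
    }
}
```

Even though 172.16.6.102 and namnodemc may resolve to the same machine, the two authority strings are not equal, so the comparison fails.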

Can you please help me figure out where it is taking the hostname from?
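For anyone reading the archive: the "expected" hostname generally comes from the fs.default.name property in the cluster configuration (hadoop-site.xml in the 0.19 line). A minimal sketch, assuming the NameNode host is namnodemc on port 21011 as in the error above:

```xml
<!-- hadoop-site.xml (Hadoop 0.19; later releases split this into core-site.xml) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- Clients compare the scheme and authority of every HDFS path
         against this value, so paths must use the same form
         (IP address vs. hostname) as configured here. -->
    <value>hdfs://namnodemc:21011</value>
  </property>
</configuration>
```

Using the same host form (all IPs or all hostnames) in both this property and the paths passed to jobs avoids the mismatch.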



Regards,
Snehal Nagmote
IIIT-H

Discussion Overview
group: common-dev @ hadoop
posted: Mar 23, '09 at 6:19p
active: Mar 23, '09 at 10:34p
posts: 2
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion: Snehal Nagmote (1 post), Raghu Angadi (1 post)
