Nov 11, 2009 at 2:01 am
I am trying to run the example WordCount application on a cluster in my university's lab. Since I don't have write access to the /etc/hosts file (and the admin won't allow me to add entries for each node in the cluster), I am using the IP address of each node in all of Hadoop's configuration files. Copying the input files into HDFS works fine, but when I start the job I get this message:
Error initializing attempt_200911102009_0001_m_000002_1:
java.lang.IllegalArgumentException: Wrong FS: hdfs://18.104.22.168:54310/var/work/aselvan1/hadoop-tmp/mapred/system/job_200911102009_0001/job.xml, expected: hdfs://node22.cs.binghamton.edu:54310
I am using release 0.18.3. I tried to track down the cause of the error, and I think the URI authority check is failing, probably because the configuration files specify IP addresses while the cluster resolves to hostnames. I am stuck here; any help is highly appreciated.
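For context, here is a sketch of the relevant entry in my hadoop-site.xml. The property name `fs.default.name` is the standard one for the 0.18.x line, and the IP and port are taken from the error message above:

```xml
<!-- hadoop-site.xml (0.18.x): NameNode address given as a raw IP,
     since I cannot add hostname entries to /etc/hosts -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://18.104.22.168:54310</value>
</property>
```

My guess is that the "expected: hdfs://node22.cs.binghamton.edu:54310" part of the error means some node is resolving its own hostname via DNS and comparing it against this IP-based URI, so the two authorities never match.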