  • Giridharan Kesavan at Nov 10, 2009 at 10:03 pm
    The Hadoop-Common-trunk-Commit and Hadoop-Hdfs-trunk-Commit jobs on Hudson are configured to publish the core, core-test, hdfs, and hdfs-test jars, respectively, to the Apache Nexus snapshot repository.

    This means hdfs will always be built against the latest published common jars available in the Apache Nexus snapshot repo.

    Thanks,
    Giri
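To make the consumption side concrete: a build that wants those jars simply resolves them from the Apache Nexus snapshot repository. The fragment below is an illustration, not something from the thread; the hdfs build of that era used Ivy rather than Maven, and 0.21.0-SNAPSHOT is a placeholder for whatever trunk version is current, but the coordinates show the idea.

    <!-- Illustrative Maven fragment: resolve the snapshot jars that the
         Hudson commit jobs publish to the Apache Nexus snapshot repo. -->
    <repository>
      <id>apache.snapshots</id>
      <name>Apache Nexus snapshot repository</name>
      <url>https://repository.apache.org/content/repositories/snapshots/</url>
      <snapshots><enabled>true</enabled></snapshots>
    </repository>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>0.21.0-SNAPSHOT</version>  <!-- placeholder trunk version -->
    </dependency>

Each resolution picks up the newest published core jar, which is exactly why hdfs always builds against the latest common snapshot.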


  • Arun kumar at Nov 11, 2009 at 2:01 am
    Hi All,

    I am trying to execute the example wordcount application on a cluster in my university's lab. Since I don't have write access to the /etc/hosts file (and the admin won't allow me to add entries for each node in the cluster), I am using the IP address of each node in all of Hadoop's configuration files. Copying the input files into HDFS works fine, but when I start the application I get this message:
    Error initializing attempt_200911102009_0001_m_000002_1:
    java.lang.IllegalArgumentException: Wrong FS: hdfs://128.226.118.98:54310/var/work/aselvan1/hadoop-tmp/mapred/system/job_200911102009_0001/job.xml, expected: hdfs://node22.cs.binghamton.edu:54310
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:327)

    I am using release 0.18.3. I tried to find the cause of the error, and I think the authority check fails (probably because I specify IP addresses in the configuration files). I am stuck here; any help is highly appreciated.

    Thank you,
    Arun
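The authority check Arun refers to is a literal string comparison of the path's scheme and authority against the filesystem's own URI; it does no DNS resolution, so an IP address and a hostname that point at the same machine still fail to match. A minimal standalone Java sketch of that comparison (simplified from what FileSystem.checkPath does in 0.18.3, not the actual Hadoop source; the demo class is made up):

    import java.net.URI;

    public class WrongFsDemo {
        // Simplified rendition of the scheme/authority check behind the
        // "Wrong FS" IllegalArgumentException; the real checkPath also
        // handles paths that carry no authority of their own.
        static void checkPath(URI fsUri, URI path) {
            boolean sameScheme =
                fsUri.getScheme().equalsIgnoreCase(path.getScheme());
            boolean sameAuthority =
                fsUri.getAuthority().equalsIgnoreCase(path.getAuthority());
            if (!sameScheme || !sameAuthority) {
                throw new IllegalArgumentException(
                        "Wrong FS: " + path + ", expected: " + fsUri);
            }
        }

        public static void main(String[] args) {
            URI fsUri = URI.create("hdfs://node22.cs.binghamton.edu:54310");
            // The job files were written under the IP-based authority, so
            // the textual comparison fails even though both names reach
            // the same host.
            checkPath(fsUri, URI.create(
                    "hdfs://128.226.118.98:54310/mapred/system/job.xml"));
        }
    }

The usual consequence is that fs.default.name has to be one consistent literal string, all IPs or all hostnames, across every configuration file in the cluster.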

Discussion Overview
group: common-dev
categories: hadoop
posted: Nov 10, '09 at 10:03p
active: Nov 11, '09 at 2:01a
posts: 2
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion
Arun kumar: 1 post
Giridharan Kesavan: 1 post
