FAQ
I've built all the various dependencies:

hadoop-common (branch 0.21)
hadoop-hdfs (branch 0.21)
hadoop (trunk == 0.21?? there is no 0.21 branch)

and then

hadoop-mapreduce (branch 0.21)

Using it, I get the following exception:

Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.hadoop.conf.Configuration.addDeprecation(Ljava/lang/String;[Ljava/lang/String;)V
at org.apache.hadoop.mapreduce.util.ConfigUtil.addDeprecatedKeys(ConfigUtil.java:49)
at org.apache.hadoop.mapreduce.util.ConfigUtil.loadResources(ConfigUtil.java:40)
at org.apache.hadoop.mapreduce.Cluster.<clinit>(Cluster.java:64)

What's going on?
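
For what it's worth, the JVM descriptor in the error decodes to the Java
signature void addDeprecation(String, String[]). One way to check whether
a given hadoop-common jar on the classpath actually provides that method
(the jar file name here is just an example):

    javap -classpath hadoop-common-0.21.0-SNAPSHOT.jar \
        org.apache.hadoop.conf.Configuration | grep addDeprecation

If grep prints nothing, that jar predates the method and is the stale one.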

cheers
--
Torsten

  • Tom White at Jun 8, 2010 at 10:39 pm
    Perhaps you're not getting up-to-date libraries? Try

    common> ant clean jar mvn-install
    hdfs> ant veryclean jar mvn-install -Dresolvers=internal
    mapreduce> ant veryclean jar -Dresolvers=internal

    This works for me with the 0.21 branches.
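
    As I understand it, -Dresolvers=internal makes the Ivy-based build
    resolve the Hadoop artifacts from what mvn-install published into your
    local Maven repository, instead of pulling published snapshots. A quick
    sanity check that the install step actually produced something (the
    repository path and artifact name are assumptions):

        ls ~/.m2/repository/org/apache/hadoop/hadoop-common/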

    Cheers,
    Tom
  • Torsten Curdt at Jun 8, 2010 at 11:09 pm
    Finally got it working!

    The problem was that I thought I also needed a core jar, so I had
    4 jars instead of 3. That resulted in a clash. On top of that it
    picked the wrong resolver.
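
    A quick way to spot that kind of clash is to check which jars on the
    classpath ship the same class, e.g. (assuming the jars sit in lib/):

        for j in lib/*.jar; do
          unzip -l "$j" | grep -q 'org/apache/hadoop/conf/Configuration.class' \
            && echo "$j"
        done

    If more than one jar shows up, whichever comes first on the classpath
    wins, and you get exactly this kind of NoSuchMethodError.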

    That was kind of painful. Shouldn't the build just look like this?

    common> ant clean install
    hdfs> ant clean install
    mapreduce> ant clean install

    I think the modularity makes total sense, but to me this actually
    looks more like a multi-module build. So a single

    mapreduce> ant clean install

    should also build the other two.
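
    Until the build supports that, a small driver script along these
    lines approximates it (directory names are assumptions; the commands
    mirror Tom's sequence):

        #!/bin/sh
        # Build common, hdfs, and mapreduce in dependency order,
        # stopping at the first failure.
        (cd common    && ant clean jar mvn-install) &&
        (cd hdfs      && ant veryclean jar mvn-install -Dresolvers=internal) &&
        (cd mapreduce && ant veryclean jar -Dresolvers=internal)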

    WDYT?

    cheers
    --
    Torsten

Discussion Overview
group: mapreduce-dev
categories: hadoop
posted: Jun 8, '10 at 4:49p
active: Jun 8, '10 at 11:09p
posts: 3
users: 3
website: hadoop.apache.org...
irc: #hadoop
