FAQ
Finally got it working!

The problem was that I thought I also needed a core jar, so I had 4
jars instead of 3. That resulted in a clash, and it also picked the
wrong resolver.

That was kind of painful. Shouldn't the build be just like this?

common> ant clean install
hdfs> ant clean install
mapreduce> ant clean install

I think the modularity makes total sense, but to me this actually
looks more like a multi-module build. So a single

mapreduce> ant clean install

should also build the other two.
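One way to get that could be a top-level aggregator build that invokes the three module builds in dependency order. A minimal sketch, assuming sibling directories common/, hdfs/, and mapreduce/ each with their own build.xml exposing clean and install targets (the file layout and target names here are assumptions, not the actual Hadoop build files):

```xml
<!-- Hypothetical aggregator build.xml placed above the three module dirs -->
<project name="hadoop-aggregate" default="install" basedir=".">

  <!-- Build the modules in dependency order: common, then hdfs, then mapreduce -->
  <target name="install">
    <ant dir="common" target="install" inheritAll="false"/>
    <ant dir="hdfs" target="install" inheritAll="false"/>
    <ant dir="mapreduce" target="install" inheritAll="false"/>
  </target>

  <!-- Clean all three modules -->
  <target name="clean">
    <ant dir="common" target="clean" inheritAll="false"/>
    <ant dir="hdfs" target="clean" inheritAll="false"/>
    <ant dir="mapreduce" target="clean" inheritAll="false"/>
  </target>

</project>
```

With something like this in place, a single `ant clean install` at the top level would rebuild all three modules in order.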

WDYT?

cheers
--
Torsten

Discussion Overview
group: mapreduce-dev
categories: hadoop
posted: Jun 8, '10 at 4:49p
active: Jun 8, '10 at 11:09p
posts: 3
users: 3
website: hadoop.apache.org...
irc: #hadoop
