FAQ
Hello everyone.
We ran into a bunch of issues building and deploying Hadoop 0.21.
It would be great to get some answers about how things should work, so
we can try to fix them.

1. When checking out the repositories, each of them builds fine on
its own. But hdfs ships with mapreduce libraries, and mapreduce ships
with hdfs libraries, so there is a circular dependency between the
projects.
Q: Is this dependency necessary? Can we get rid of it?
Q: If it is necessary, how does one build the jars from the latest
source? How are the jars in the SCM repository
(hadoop-hdfs/lib/hadoop-mapred-0.21-dev.jar) created, given the
cross-reference?
2. There are issues with the jar files and the webapps (dfshealth.jsp,
etc.). Right now, the only way to get a functioning Hadoop system is
to build hdfs and mapreduce, then copy everything from hdfs/build and
mapreduce/build into common/build.
Q: Is there a better way of doing this? What needs to be fixed to
have the webapps bundled in the jar files (as in 0.20)? Are there
JIRA issues logged for this?

We would really appreciate some answers, at least about where Hadoop
is going with this build step, so we can help with patches and fixes.

Thank you,
Andrei Dragomir


  • Aaron Kimball at Nov 9, 2009 at 4:14 am

    On Thu, Nov 5, 2009 at 2:34 AM, Andrei Dragomir wrote:

    Hello everyone.
    We ran into a bunch of issues building and deploying Hadoop 0.21.
    It would be great to get some answers about how things should work, so
    we can try to fix them.

    1. When checking out the repositories, each of them builds fine on
    its own. But hdfs ships with mapreduce libraries, and mapreduce ships
    with hdfs libraries, so there is a circular dependency between the
    projects.
    Q: Is this dependency necessary? Can we get rid of it?
    Those are build-time dependencies. Ideally you'll ignore them post-build.
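One way to read this: the checked-in jars bootstrap the cycle, and after building either project you can copy the fresh jar into the other project's lib/ directory and rebuild. A hedged sketch, assuming the stock Ant "jar" target and the directory layout from the question (the exact jar version suffix will depend on your checkout):

```shell
# Sketch: refresh the cross-referenced jars by hand.
# The checked-in lib/ jars break the chicken-and-egg problem; after a
# build, overwrite them with your freshly built ones.
cd hadoop-mapreduce
ant jar                                  # produces build/hadoop-mapred-*-dev.jar
cp build/hadoop-mapred-*.jar ../hadoop-hdfs/lib/

cd ../hadoop-hdfs
ant jar                                  # produces build/hadoop-hdfs-*-dev.jar
cp build/hadoop-hdfs-*.jar ../hadoop-mapreduce/lib/
```

Since these are build-time-only dependencies, staleness mostly matters when the cross-project interfaces change; otherwise the checked-in jars are good enough to compile against.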

    Q: If it is necessary, how does one build the jars from the latest
    source? How are the jars in the SCM repository
    (hadoop-hdfs/lib/hadoop-mapred-0.21-dev.jar) created, given the
    cross-reference?
    2. There are issues with the jar files and the webapps (dfshealth.jsp,
    etc.). Right now, the only way to get a functioning Hadoop system is
    to build hdfs and mapreduce, then copy everything from hdfs/build and
    mapreduce/build into common/build.
    Yup.
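    The build-and-overlay workaround confirmed above can be sketched roughly
    like this (paths follow the question's layout; the common build target
    name is an assumption, so adjust to whatever populates common/build):

```shell
# Sketch: squash the three build trees into one runnable layout.
# Assumes sibling checkouts of hadoop-common, hadoop-hdfs, and
# hadoop-mapreduce that have each already been built.
cd hadoop-common
ant compile                              # assumed target; populates build/

# overlay hdfs and mapreduce output (jars, webapps, etc.) on top
cp -R ../hadoop-hdfs/build/* build/
cp -R ../hadoop-mapreduce/build/* build/
```

    The daemons can then be started from the combined tree, since the
    webapps and jars from all three projects end up in one place.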


    Q: Is there a better way of doing this? What needs to be fixed to
    have the webapps bundled in the jar files (as in 0.20)? Are there
    JIRA issues logged for this?
    I have created a Makefile and some associated scripts that will build
    everything and squash it together for you; see
    https://issues.apache.org/jira/browse/HADOOP-6342

    There is also a longer-term effort to use Maven to coordinate the three
    subprojects, and use a local repository for inter-project development on a
    single machine; see https://issues.apache.org/jira/browse/HADOOP-5107 for
    progress there.
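    For context, the local-repository workflow that HADOOP-5107 points toward
    typically looks like the following. This is a generic Maven sketch, not
    the actual Hadoop build; the commands illustrate the mechanism only:

```shell
# Generic Maven local-repository workflow (illustrative, not the
# Hadoop build): install the upstream project into ~/.m2 so sibling
# projects resolve it like any other dependency.
cd hadoop-common
mvn install -DskipTests

# Downstream projects declare a dependency on the installed SNAPSHOT
# artifact and pick up local changes on their next build.
cd ../hadoop-hdfs
mvn package
```

    The appeal over the jar-copying approach is that the circular lib/
    jars disappear: each project consumes the other's artifacts through
    the local repository instead of checked-in binaries.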


    We would really appreciate some answers, at least about where Hadoop
    is going with this build step, so we can help with patches and fixes.

    Thank you,
    Andrei Dragomir

Discussion Overview
group: common-dev
categories: hadoop
posted: Nov 5, '09 at 10:35a
active: Nov 9, '09 at 4:14a
posts: 2
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion

Aaron Kimball: 1 post
Andrei Dragomir: 1 post
