FAQ: how to run Hadoop after the project split?
I have to admit that I don't know the official answer. The hack below seems to work:
- compile all 3 sub-projects;
- copy everything in hdfs/build and mapreduce/build to common/build;
- then run Hadoop with the scripts in common/bin as before (a minimal sketch follows).
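For example (an untested sketch; directory names follow the steps above, the default ant target may differ, and the fuller script at the end of this thread is more complete):

#!/bin/bash
set -e
#compile each sub-project (adjust the ant target if necessary)
for p in common hdfs mapreduce; do
  (cd "$p" && ant)
done
#merge the build outputs into common
cp -R hdfs/build/* mapreduce/build/* common/build
#then start the daemons with the usual scripts, e.g.
# common/bin/start-dfs.sh
# common/bin/start-mapred.sh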

Any better idea?

Nicholas Sze


  • Arun C Murthy at Aug 11, 2009 at 12:50 am
    I'd hazard a guess and say we should hitch our wagon to https://issues.apache.org/jira/browse/HADOOP-5107.

    Arun
  • Todd Lipcon at Aug 11, 2009 at 5:38 am
    Hey Nicholas,

    Aaron gave a presentation with his best guess at the HUG last month. His
    slides are here: http://www.cloudera.com/blog/2009/07/17/the-project-split/
    (starting at slide 16)
    (I'd let him reply himself, but he's out of the office this afternoon ;-) )

    Hopefully we'll get towards something better soon :-/

    -Todd
  • Tsz Wo (Nicholas), Sze at Aug 11, 2009 at 12:59 am
    Hi Todd,

    Two problems:
    - The patch in HADOOP-6152 cannot be applied.

    - I have tried an approach similar to the one described in the slides, but it did not work since Jetty cannot find the webapps directory. See below:
    2009-08-10 17:54:41,671 WARN org.mortbay.log: Web application not found file:/D:/@sze/hadoop/common/c2/build/webapps/hdfs
    2009-08-10 17:54:41,671 WARN org.mortbay.log: Failed startup of context org.mortbay.jetty.webapp.WebAppContext@1884a40{/,file:/D:/@sze/hadoop/common/c2/build/webapps/hdfs}
    java.io.FileNotFoundException: file:/D:/@sze/hadoop/common/c2/build/webapps/hdfs
        at org.mortbay.jetty.webapp.WebAppContext.resolveWebApp(WebAppContext.java:959)
        at org.mortbay.jetty.webapp.WebAppContext.getWebInf(WebAppContext.java:793)
        at org.mortbay.jetty.webapp.WebInfConfiguration.configureClassLoader(WebInfConfiguration.java:62)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:456)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
        at org.mortbay.jetty.Server.doStart(Server.java:222)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.apache.hadoop.http.HttpServer.start(HttpServer.java:464)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:362)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.activate(NameNode.java:309)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:300)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:405)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:399)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1165)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1174)

    Thanks,
    Nicholas



  • Jay Booth at Aug 11, 2009 at 1:24 am
    Yeah, I'm hitting the same issues. The patch problems weren't really an
    issue (a same-line-for-same-line conflict on my checkout), but not having the
    webapps is sort of a pain.

    Looks like ant bin-package puts the webapps dir in
    HDFS_HOME/build/hadoop-hdfs-0.21.0-dev/webapps, while the daemon's expecting
    build/webapps/hdfs. Anyone know off the top of their heads where this is
    specified, or have a recommended solution? Otherwise I can hack away.
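    Something like this might work as a stopgap until the build is fixed (untested; it assumes the packaged webapps directory actually contains the hdfs subdirectory the daemon looks for):

    #copy the packaged webapps to the path the daemon expects
    cd "$HDFS_HOME"
    mkdir -p build/webapps
    cp -r build/hadoop-hdfs-0.21.0-dev/webapps/. build/webapps/
    ls build/webapps/hdfs   #should now exist for the daemon to find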
  • Philip Zeyliger at Aug 11, 2009 at 1:33 am
    FWIW, I've been using the following simple shell script:

    [0]doorstop:hadoop(149128)$cat damnit.sh
    #!/bin/bash

    set -o errexit
    set -x
    #needed for the recursive ** globs below (bash 4+)
    shopt -s globstar

    cd hadoop-common
    ant binary
    cd ..
    cd hadoop-hdfs
    ant binary
    cd ..
    cd hadoop-mapreduce
    ant binary
    cd ..

    mkdir -p all/bin all/lib all/contrib
    cp hadoop-common/bin/* all/bin
    cp **/build/*.jar all/lib || true
    cp **/build/*-dev/lib/* all/lib || true
    cp **/build/*-dev/contrib/**/*.jar all/contrib

    It may very well make sense to have a meta-ant target that aggregates these
    things together in a sensible way.
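    Something like this, maybe (untested; assumes the three checkouts are sibling directories and each exposes a "binary" target):

    <?xml version="1.0"?>
    <!-- build-all.xml: hypothetical aggregator build file -->
    <project name="hadoop-all" default="binary" basedir=".">
      <target name="binary" description="build binary artifacts of all three sub-projects">
        <ant dir="hadoop-common" target="binary" inheritAll="false"/>
        <ant dir="hadoop-hdfs" target="binary" inheritAll="false"/>
        <ant dir="hadoop-mapreduce" target="binary" inheritAll="false"/>
      </target>
    </project>

    and then: ant -f build-all.xml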

    -- Philip
  • Tsz Wo (Nicholas), Sze at Aug 11, 2009 at 2:05 am
    Hi Philip,

    Tried the script. It could start a cluster, but the web page did not work. I got the following error from the web interface:

    HTTP ERROR: 404
    /dfshealth.jsp
    RequestURI=/dfshealth.jsp
    Powered by Jetty://
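    My guess is that the script above copies bin, lib, and contrib but not the webapps directory that holds dfshealth.jsp, so the HTTP server has nothing to serve. If that is the cause, something like this might fix it (untested; the *-dev paths are assumed from Jay's message):

    mkdir -p all/webapps
    cp -r hadoop-hdfs/build/*-dev/webapps/. all/webapps/ || true
    cp -r hadoop-mapreduce/build/*-dev/webapps/. all/webapps/ || true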

    Thanks,
    Nicholas


  • Tsz Wo (Nicholas), Sze at Aug 11, 2009 at 7:05 pm
    The following script implements the hack mentioned in my previous message.

    Hope this helps.
    Nicholas


    #!/bin/bash
    set -e -x

    #Set homes for the sub-projects
    COMMON_HOME=/cygdrive/d/@sze/hadoop/a/common
    HDFS_HOME=/cygdrive/d/@sze/hadoop/a/hdfs
    MAPREDUCE_HOME=/cygdrive/d/@sze/hadoop/a/mapreduce

    #Compile each sub-project
    #Change the ant command if necessary
    cd $COMMON_HOME
    ant clean compile-core-test
    cd $HDFS_HOME
    ant clean compile-hdfs-test
    cd $MAPREDUCE_HOME
    ant clean compile-mapred-test examples

    #Copy everything to common
    cp -R $HDFS_HOME/build/* $MAPREDUCE_HOME/build/* $COMMON_HOME/build

    #Then, you may use the scripts in $COMMON_HOME/bin to run Hadoop as before.
    #For example:
    #
    # > cd $COMMON_HOME
    #
    # > ./bin/start-dfs.sh
    #
    # > ./bin/start-mapred.sh
    #
    # > ./bin/hadoop fs -ls
    #
    # > ./bin/hadoop jar build/hadoop-mapred-examples-0.21.0-dev.jar pi 10 10000





Discussion Overview
group: hdfs-dev @ hadoop.apache.org
categories: hadoop
posted: Aug 11, '09 at 12:25a
active: Aug 11, '09 at 7:05p
posts: 8
users: 5
website: hadoop.apache.org...
irc: #hadoop
