Hi,
Can anyone give the procedure for running the DistributedShell
example in Hadoop YARN, so that I can try to understand how the application
master really works?


  • Hitesh Shah at Dec 14, 2011 at 6:39 pm
    Assuming you have a non-secure cluster setup (the code does not handle security properly yet), the following command runs ls on 5 allocated containers.

    $HADOOP_COMMON_HOME/bin/hadoop jar <path to hadoop-yarn-applications-distributedshell-0.24.0-SNAPSHOT.jar> org.apache.hadoop.yarn.applications.distributedshell.Client --jar <path to hadoop-yarn-applications-distributedshell-0.24.0-SNAPSHOT.jar> --shell_command ls --num_containers 5 --debug

    The above uploads the jar containing the AppMaster class to HDFS, then submits an application request that launches the distributed shell application master in a container; the application master in turn runs the shell command on the specified number of containers.

    -- Hitesh
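    For concreteness, the same invocation can be written with the jar path held in a variable. DSHELL_JAR below is a placeholder for wherever the jar sits in your build layout, and the guard skips submission when no hadoop install is present:

    ```shell
    # DSHELL_JAR is a placeholder path; adjust it to your build layout.
    DSHELL_JAR="$HADOOP_COMMON_HOME/modules/hadoop-yarn-applications-distributedshell-0.24.0-SNAPSHOT.jar"

    # Only attempt the submission when a hadoop install is actually present.
    if [ -x "$HADOOP_COMMON_HOME/bin/hadoop" ]; then
      "$HADOOP_COMMON_HOME/bin/hadoop" jar "$DSHELL_JAR" \
        org.apache.hadoop.yarn.applications.distributedshell.Client \
        --jar "$DSHELL_JAR" \
        --shell_command ls \
        --num_containers 5 \
        --debug
    fi
    ```

    Note that the same jar is named twice on purpose: once for `hadoop jar` to find the Client class, and once via --jar so the client can ship the AppMaster code to the cluster.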
  • Raghavendhra rahul at Dec 15, 2011 at 4:36 am
    I get the following error with the given command to run distributed shell:
    hadoop1@master:~/hadoop/bin$ ./hadoop jar ../modules/hadoop-yarn-applications-distributedshell-0.23.0.jar org.apache.hadoop.yarn.applications.distributedshell.Client --jar ../modules/hadoop-yarn-applications-distributedshell-0.23.0.jar --shell_command ls --num_containers 5 --debug
    2011-12-15 10:04:41,605 FATAL distributedshell.Client (Client.java:main(190)) - Error running CLient
    java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/ipc/YarnRPC
    at org.apache.hadoop.yarn.applications.distributedshell.Client.<init>(Client.java:206)
    at org.apache.hadoop.yarn.applications.distributedshell.Client.main(Client.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:616)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:189)
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.ipc.YarnRPC
    at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
    ... 7 more

  • Hitesh Shah at Dec 15, 2011 at 5:10 am
    The yarn jars are likely missing from the classpath. Could you try creating the symlinks as per step 11 from
    http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-mapreduce-project/INSTALL?revision=1166955 ?

    -- Hitesh
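    The symlink step amounts to making the yarn jars visible on hadoop's classpath. Below is a minimal sketch of the mechanics, with throwaway temp directories standing in for the real locations; the actual source and target paths come from step 11 of the INSTALL doc:

    ```shell
    # Temp dirs stand in for the real locations: MODULES_DIR for the directory
    # holding the yarn jars, CLASSPATH_DIR for a directory already on hadoop's
    # classpath. On a real install, use the paths from the INSTALL doc instead.
    MODULES_DIR=$(mktemp -d)
    CLASSPATH_DIR=$(mktemp -d)

    touch "$MODULES_DIR/hadoop-yarn-common-0.23.0.jar"   # stand-in jar

    # Link every yarn jar into the classpath directory.
    for jar in "$MODULES_DIR"/hadoop-yarn-*.jar; do
      ln -sf "$jar" "$CLASSPATH_DIR/"
    done

    ls -l "$CLASSPATH_DIR"   # each jar now resolves through a symlink
    ```

    The point of using symlinks rather than copies is that a rebuilt jar in the modules directory is picked up without touching the classpath directory again.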
  • Raghavendhra rahul at Dec 15, 2011 at 5:49 am
    Should I link the hadoop-yarn-applications-distributedshell-0.23.0.jar as well?
    Without linking this jar it throws the same error.
    If linked, it shows:
    at org.apache.hadoop.util.RunJar.main(RunJar.java:130)
    Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:131)
    at java.util.jar.JarFile.<init>(JarFile.java:150)
    at java.util.jar.JarFile.<init>(JarFile.java:87)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:128)
  • Raghavendhra rahul at Dec 15, 2011 at 6:19 am
    Thanks for the help. I made the mistake of creating the symlinks within
    modules. Now everything is fine.


  • Raghavendhra rahul at Dec 15, 2011 at 6:27 am
    How do I run a script using this? When I tried, it shows the final status
    as failed.
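    For a script rather than a single command, the distributed shell client also takes a --shell_script option (flag names as of the 0.23-era client; verify against the client's help output, and treat all paths here as placeholders):

    ```shell
    # Write a trivial script to ship to the containers (path is illustrative).
    cat > /tmp/demo.sh <<'EOF'
    #!/bin/sh
    echo "running on $(hostname)"
    EOF
    chmod +x /tmp/demo.sh

    # DSHELL_JAR is a placeholder path; adjust it to your layout.
    DSHELL_JAR="$HADOOP_COMMON_HOME/modules/hadoop-yarn-applications-distributedshell-0.23.0.jar"

    # Only submit when a hadoop install is actually present.
    if [ -x "$HADOOP_COMMON_HOME/bin/hadoop" ]; then
      "$HADOOP_COMMON_HOME/bin/hadoop" jar "$DSHELL_JAR" \
        org.apache.hadoop.yarn.applications.distributedshell.Client \
        --jar "$DSHELL_JAR" \
        --shell_script /tmp/demo.sh \
        --num_containers 2 \
        --debug
    fi
    ```

    If the final status comes back as failed, running the script by hand on one of the nodes and checking the container's stderr log usually shows why.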

  • Raghavendhra rahul at Dec 15, 2011 at 6:53 am
    When we create a directory using distributed shell, any idea where it is
    created?

  • Hitesh Shah at Dec 15, 2011 at 10:53 pm
    The shell script is invoked within the context of a container launched by the NodeManager. If you are creating a directory using a relative path, it will be created relative to the container's working directory and cleaned up when the container completes.

    If you really want to see some output, one option is to have your script create some data on HDFS, or echo output to stdout, which will be captured in the container logs. The stdout/stderr logs generated by your script should be available wherever you have configured the node-manager's log dirs to point to.

    -- Hitesh
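    A concrete sketch of the pattern described above — write durable output somewhere outside the container's working directory and echo progress to stdout — with a local /tmp path standing in for HDFS:

    ```shell
    # Hypothetical container script. OUT_DIR stands in for a durable location;
    # on a real cluster you would write to HDFS instead of a local /tmp path.
    OUT_DIR=/tmp/distshell-demo
    mkdir -p "$OUT_DIR"

    # Anything created with a relative path would land in the container's
    # working directory and vanish on cleanup, so use the absolute OUT_DIR.
    echo "ran on host $(hostname)" > "$OUT_DIR/marker.txt"

    # stdout is captured into the container's log dir on the NodeManager.
    cat "$OUT_DIR/marker.txt"
    ```

    On a real cluster the stdout line would show up in the container's log directory under the node-manager's configured log dirs, alongside the stderr file.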
  • Raghavendhra rahul at Dec 16, 2011 at 2:23 am
    Thanks for the reply.
    Also, is there any other application master example besides
    DistributedShell?



Discussion Overview
group: mapreduce-user
categories: hadoop
posted: Dec 14, '11 at 9:07a
active: Dec 16, '11 at 2:23a
posts: 10
users: 3
website: hadoop.apache.org...
irc: #hadoop
