FAQ
Hi All

I started here:
http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html

I copied the source code into my cluster and found that the default
classpaths are not in the same place in CDH 4.3.0-1.
++++++++++++++++++++
Steps outlined in tutorial
++++++++++++++++++

Compile WordCount.java:

$ mkdir wordcount_classes

$ javac -cp *classpath* -d wordcount_classes WordCount.java

where *classpath* is:

    - CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/ <------ This
    does not exist in the stock default install of CDH 4.3.0-1


++++++++++++++++++++++++++++++++++++++++++++++++
My classpath is:
----> /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop*
and /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*

so when I compile, the command looks like this:
  javac -cp
/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\*
WordCount.java

but this leads to compilation errors, because javac can't find the packages
(or they have different names), as in this first error:

WordCount.java:8: package org.apache.hadoop.filecache does not exist
  import org.apache.hadoop.filecache.DistributedCache;
...
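As an aside, parcel installs also create a version-independent
/opt/cloudera/parcels/CDH symlink, so a hand-built classpath need not embed
the full version string. A minimal sketch of that, using the same directory
layout listed above (whether the client-0.20 directory actually exists on a
given install is not guaranteed):

```shell
# Build the classpath against the stable CDH symlink rather than the
# versioned parcel directory; the * entries are kept literal (quoted)
# so javac can expand them as classpath wildcards itself.
CDH_ROOT=/opt/cloudera/parcels/CDH
CP="$CDH_ROOT/lib/hadoop/*:$CDH_ROOT/lib/client-0.20/*"
printf '%s\n' "$CP"
```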

+++++++++++++++++++++++++++++++++++++
My question is how to correct this, as this is a default install using the
CDH manager etc. All nodes have the same classpath.

Thoughts?
Thanks
Tw

To unsubscribe from this group and stop receiving emails from it, send an email to scm-users+unsubscribe@cloudera.org.


  • Harsh J at Aug 31, 2013 at 4:22 am
    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java
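One detail worth noting with that form: double-quoting the substitution, as
in javac -cp "$(hadoop classpath)", keeps the shell from attempting glob
expansion on the * entries itself; quoted, the literal wildcards reach
javac, which expands dir/* classpath entries on its own. A tiny sketch,
where the path is a stand-in for one entry of real `hadoop classpath`
output:

```shell
# A single stand-in entry; real `hadoop classpath` output is a long
# colon-separated list of conf dirs, jar dirs and dir/* wildcards.
cp_entry='/opt/cloudera/parcels/CDH/lib/hadoop/*'
printf '%s\n' "$cp_entry"   # double-quoted, so the shell does not glob the *
```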

    --
    Harsh J

  • Tim Washburn at Sep 5, 2013 at 4:08 am
    Hi Harsh,

    Tested your suggestion and, well, no luck. I poked about on each of the
    servers in the cluster, and the only place I've found the hadoop libs is
    /opt/cloudera/parcels/CDH/lib/hadoop/, not /usr/lib/hadoop. This is a
    stock install of CDH 4.3 with 2 mgr nodes and 3 data nodes, so I'm kind
    of puzzled. I was able to compile the class files and such, but from
    within Hue the same basic error happens: CDH 4.3 can't find the
    classpaths. I suspect that Hue is looking for the classpath in
    /usr/lib/hadoop, which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during setup, or was something not copied from
    /opt/cloudera/parcels/CDH/lib/hadoop/ to /usr/lib/hadoop?

    Thoughts anyone?
    Regards
    Tim Washburn
  • Harsh J at Sep 5, 2013 at 5:05 am
    Hey Tim,

    The doc references package based paths (/usr), and needs to be updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    --
    Harsh J

  • Tim Washburn at Sep 5, 2013 at 4:20 pm
    Hi Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop classpath` WordCount.java

    WordCount.java:8: package org.apache.hadoop.filecache does not exist
      import org.apache.hadoop.filecache.DistributedCache;
                                        ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
        public static class Map extends MapReduceBase implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see, the system has no sense of the classpath, thus the
    errors. I suspect that the libraries for Hadoop should all be under
    /usr/lib/hadoop/, but they are not. Given this, should I just copy them
    there and fix the classpath, or is there a config item in the CDH 4.3
    manager that will allow me to correct the issue?

    Regards.
    Tim Washburn
  • Harsh J at Sep 5, 2013 at 4:34 pm
    Hello Tim,

    It isn't vital for the files to be under /usr/lib/, really. You can use
    the /opt/cloudera path as well. If you type "hadoop classpath", you'll
    notice a valid classpath structure using the parcel paths (/opt).

    However, the error is pretty odd. Can you send us the outputs of the following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME

    --
    Harsh J

  • Tim Washburn at Sep 5, 2013 at 6:37 pm
    Hi Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is expecting things
    in /usr/lib/hadoop-0.20-mapreduce, which does not exist.

    On the datanodes, $HADOOP_HOME, $HADOOP_PREFIX and $HADOOP_MAPRED_HOME
    display no entries, while the hadoop classpath command shows this:

    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*

    The secondary name node has the same output as the datanodes.
    ++++++++++++++++++

    Regards
    Tim Washburn
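
    Those long outputs can also be checked mechanically: split on colons and
    flag any entry that is neither /etc/hadoop/conf nor under the parcel
    root. On the NameNode output above, only the stale
    /usr/lib/hadoop-0.20-mapreduce entry falls out. A sketch, with a
    shortened stand-in string in place of the full output:

```shell
# Shortened stand-in for the NameNode's `hadoop classpath` output.
cp_out='/etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*'

# Split on ':' and print entries that are neither the conf dir nor parcel paths.
echo "$cp_out" | tr ':' '\n' | grep -v -e '^/etc/hadoop/conf$' -e '^/opt/cloudera'
```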

  • Harsh J at Sep 5, 2013 at 6:42 pm
    Hello,

    Thanks!

    Please unset HADOOP_HOME and HADOOP_MAPRED_HOME. We set them
    automatically for you beneath the layers:

    $ unset HADOOP_HOME
    $ unset HADOOP_MAPRED_HOME

    Please retry compilation after this.

    Alternatively, does the compilation succeed on USHERLXSDN1 by default,
    if you run that "javac -cp `hadoop classpath` WordCount.java" there?
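
    The unset step looks like this in a shell; the exported values below
    reproduce the stale NameNode settings purely for illustration:

```shell
# Reproduce the stale package-era values seen on the NameNode, then clear
# them so the parcel wrapper scripts compute their own paths on the next run.
export HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
unset HADOOP_HOME HADOOP_MAPRED_HOME
echo "HADOOP_HOME='${HADOOP_HOME:-}' HADOOP_MAPRED_HOME='${HADOOP_MAPRED_HOME:-}'"
```

    Note that unset only affects the current shell; if the variables come
    from a profile script (e.g. /etc/profile.d/ or root's .bashrc), they
    also need removing there to stay gone after a new login.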
    On Fri, Sep 6, 2013 at 12:07 AM, Tim Washburn wrote:
    HI Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is expecting things in
    /user/lib/hadoop-0.20-mapreduce which does not exist.

    On the datanodes $HADOOP_HOME, $HADOOP_PREFIX and $HADOOP_MAPRED_HOME
    display no entries where the hadoop classpath command shows this:

    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*

    The secondary name node has the saem output as the datanodes.
    ++++++++++++++++++

    Regards
    Tim Washburn

    On Thursday, September 5, 2013 9:33:57 AM UTC-7, Harsh J wrote:

    Hello Tim,

    It isn't vital for the files to be under /usr/lib/ really. You can use
    the /opt/cloudera path as well. If you type "hadoop classpath", you
    can notice a valid classpath structure using the parcel paths (/opt).

    However, the error is pretty odd. Can you send us the outputs of the
    following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME

    On Thu, Sep 5, 2013 at 9:50 PM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    Hi Hash J,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop classpath` WordCount.java

    WordCount.java:8: package org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
    public static class Map extends MapReduceBase implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see the system has no sense of the classpath. Thus the
    errors. I
    suspect that the libraries for Hadoop should all be under
    /usr/lib/hadoop/.
    But they are not. Given this should I just copy them there and fix the
    classpath or is there a config item in the cdh4.3 manager that will
    allow me
    to correct the issue?

    Regards.
    Tim Washburn
    On Wednesday, September 4, 2013 10:05:07 PM UTC-7, Harsh J wrote:

    Hey Tim,

    The doc references package based paths (/usr), and needs to be updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    On Thu, Sep 5, 2013 at 9:38 AM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

    Tested your suggestion and, well, no luck. I poked about on each of the
    servers in the cluster, and the only place I've found the hadoop libs is
    in /opt/cloudera/parcels/CDH/lib/hadoop/ and not in /usr/lib/hadoop.
    This is a stock install of CDH 4.3 with 2 mgr nodes and 3 data nodes, so
    I'm kind of puzzled. I was able to compile the class files and such, but
    from within Hue the same basic error happens, which is that CDH 4.3 can't
    find the classpaths. I suspect that Hue is looking for the classpath in
    /usr/lib/hadoop, which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during the setup or something not copied from
    /opt/cloudera/parcels/CDH/lib/hadoop/ to /usr/lib/hadoop???

    Thoughts anyone?
    Regards
    Tim Washburn

    On Friday, August 30, 2013 9:22:19 PM UTC-7, Harsh J wrote:

    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java

    On Sat, Aug 31, 2013 at 3:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi All

    I started here:



    http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html

    I copied the source code into my cluster and found out that the
    default classpaths are not in the same place with CDH 4.3.0-1
    ++++++++++++++++++++
    Steps outlined in tutorial
    ++++++++++++++++++

    Compile WordCount.java:

    $ mkdir wordcount_classes

    $ javac -cp classpath -d wordcount_classes WordCount.java

    where classpath is:

    CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/* <------ This
    does not exist in the stock default install of CDH 4.3.0-1

    ++++++++++++++++++++++++++++++++++++++++++++++++
    my class path is ---->
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop*
    and
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*

    so that when I compile it looks like this:
    javac -cp



    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\*
    WordCount.java

    but this leads to compilation errors, because javac can't find the
    packages or the packages have different names, as in this first error:

    WordCount.java:8: package org.apache.hadoop.filecache does not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ...

    +++++++++++++++++++++++++++++++++++++
    My question is how to correct this, as this is a default install using
    the CDH manager etc. All nodes have the same classpath.

    Thoughts?
    Thanks
    Tw

    To unsubscribe from this group and stop receiving emails from it, send an email to scm-users+...@cloudera.org.


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J

  • Tim Washburn at Sep 6, 2013 at 3:37 pm
    Hi Harsh,

    Un-setting the paths clears /usr/lib/ from being displayed, but the
    base issue still remains: the classpath is not being read when using
    the -cp 'hadoop classpath' shortcut. This is an issue in that I will
    eventually have other users who will be developing mapreduce jobs, and
    I'd like to be able to allow them to run this from the command line
    and/or from within Hue.

    +++++++++++++
    This works for compiling and creating the Jar file.
    +++++++++++++
    [root@USHERLXSNN1 javaCode]# mkdir wordcount_classes
    [root@USHERLXSNN1 javaCode]# javac -cp
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/client-0.20/\*
    -d wordcount_classes WordCount.java
    [root@USHERLXSNN1 javaCode]# jar -cvf WordCount.jar -C wordcount_classes/ .
    added manifest
    adding: org/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/WordCount$Reduce.class(in = 1611) (out= 648)(deflated 59%)
    adding: org/myorg/WordCount$Map$Counters.class(in = 983) (out=
    502)(deflated 48%)
    adding: org/myorg/WordCount.class(in = 2671) (out= 1288)(deflated 51%)
    adding: org/myorg/WordCount$Map.class(in = 4661) (out= 2214)(deflated 52%)

    +++++++++++++
    This fails
    +++++++++++++

    [root@USHERLXSNN1 javaCode]# javac -cp 'hadoop classpath' -d
    wordcount_classes WordCount.java
    WordCount.java:7: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.Path;
                                 ^
    WordCount.java:8: package org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
                                        ^
    WordCount.java:9: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.*;
    ^
    … more errors after this but of the same nature.

    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    This Fails as well when using the Jar file that was created
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

    [root@USHERLXSNN1 javaCode]# hadoop jar wordcount.jar org.myorg.WordCount
    /user/data/input /user/data/output
    Exception in thread "main" java.io.IOException: Error opening job jar:
    wordcount.jar
             at org.apache.hadoop.util.RunJar.main(RunJar.java:135)
    Caused by: java.util.zip.ZipException: error in opening zip file
             at java.util.zip.ZipFile.open(Native Method)
             at java.util.zip.ZipFile.<init>(ZipFile.java:127)
             at java.util.jar.JarFile.<init>(JarFile.java:135)
             at java.util.jar.JarFile.<init>(JarFile.java:72)
             at org.apache.hadoop.util.RunJar.main(RunJar.java:133)
    [root@USHERLXSNN1 javaCode]#
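    One detail worth checking in the failing run above (an observation, not confirmed anywhere in this thread): the jar was built as WordCount.jar, but the command opens wordcount.jar. On a case-sensitive filesystem those are two different names, and opening the nonexistent lowercase one would by itself produce exactly this ZipException. A minimal sketch:

    ```shell
    #!/bin/sh
    # Work in a scratch directory so nothing real is touched.
    tmp=$(mktemp -d)
    cd "$tmp" || exit 1

    touch WordCount.jar          # the file the jar command actually created
    # On a case-sensitive filesystem the lowercase name is a different,
    # nonexistent file -- the one "hadoop jar wordcount.jar ..." would open.
    if [ -e wordcount.jar ]; then
        echo "filesystem is case-insensitive"
    else
        echo "wordcount.jar not found; names are case-sensitive here"
    fi
    rm -rf "$tmp"
    ```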


    Thoughts?
    Regards
    Tim Washburn

    On Thursday, September 5, 2013 11:42:01 AM UTC-7, Harsh J wrote:

    Hello,

    Thanks!

    Please unset HADOOP_HOME and HADOOP_MAPRED_HOME. We set them
    automatically for you beneath the layers:

    $ unset HADOOP_HOME
    $ unset HADOOP_MAPRED_HOME

    Please retry compilation after this.
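    The two steps above can be verified in the same shell before retrying the compile; a minimal sketch (the starting value simulates the state reported earlier in the thread):

    ```shell
    #!/bin/sh
    # Simulate the reported state, then clear it as suggested.
    HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce
    HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
    export HADOOP_HOME HADOOP_MAPRED_HOME

    unset HADOOP_HOME
    unset HADOOP_MAPRED_HOME

    # Both should now print "<unset>", leaving the hadoop wrapper scripts
    # free to derive the parcel paths themselves.
    echo "HADOOP_HOME=${HADOOP_HOME:-<unset>}"
    echo "HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-<unset>}"
    ```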

    Alternatively, does the compilation succeed on USHERLXSDN1 by default,
    if you run that "javac -cp `hadoop classpath` WordCount.java" there?
    On Fri, Sep 6, 2013 at 12:07 AM, Tim Washburn wrote:
    HI Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*
    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is expecting things in
    /usr/lib/hadoop-0.20-mapreduce, which does not exist.

    On the datanodes $HADOOP_HOME, $HADOOP_PREFIX and $HADOOP_MAPRED_HOME
    display no entries where the hadoop classpath command shows this:

    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*
    The secondary name node has the same output as the datanodes.
    ++++++++++++++++++

    Regards
    Tim Washburn

    On Thursday, September 5, 2013 9:33:57 AM UTC-7, Harsh J wrote:

    Hello Tim,

    It isn't vital for the files to be under /usr/lib/ really. You can use
    the /opt/cloudera path as well. If you type "hadoop classpath", you
    can notice a valid classpath structure using the parcel paths (/opt).

    However, the error is pretty odd. Can you send us the outputs of the
    following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME

    On Thu, Sep 5, 2013 at 9:50 PM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh J,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop classpath`
    WordCount.java
    WordCount.java:8: package org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
    public static class Map extends MapReduceBase implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see, the system has no sense of the classpath, thus the
    errors. I suspect that the libraries for Hadoop should all be under
    /usr/lib/hadoop/, but they are not. Given this, should I just copy them
    there and fix the classpath, or is there a config item in the CDH 4.3
    manager that will allow me to correct the issue?

    Regards.
    Tim Washburn
    On Wednesday, September 4, 2013 10:05:07 PM UTC-7, Harsh J wrote:

    Hey Tim,

    The doc references package based paths (/usr), and needs to be
    updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    On Thu, Sep 5, 2013 at 9:38 AM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

    Tested your suggestion and, well, no luck. I poked about on each of the
    servers in the cluster, and the only place I've found the hadoop libs is
    in /opt/cloudera/parcels/CDH/lib/hadoop/ and not in /usr/lib/hadoop.
    This is a stock install of CDH 4.3 with 2 mgr nodes and 3 data nodes, so
    I'm kind of puzzled. I was able to compile the class files and such, but
    from within Hue the same basic error happens, which is that CDH 4.3 can't
    find the classpaths. I suspect that Hue is looking for the classpath in
    /usr/lib/hadoop, which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during the setup or something not copied from
    /opt/cloudera/parcels/CDH/lib/hadoop/ to /usr/lib/hadoop???

    Thoughts anyone?
    Regards
    Tim Washburn

    On Friday, August 30, 2013 9:22:19 PM UTC-7, Harsh J wrote:

    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java

    On Sat, Aug 31, 2013 at 3:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi All

    I started here:


    http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html
    I copied the source code into my cluster and found out that the
    default classpaths are not in the same place with CDH 4.3.0-1
    ++++++++++++++++++++
    Steps outlined in tutorial
    ++++++++++++++++++

    Compile WordCount.java:

    $ mkdir wordcount_classes

    $ javac -cp classpath -d wordcount_classes WordCount.java

    where classpath is:

    CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/* <------ This
    does not exist in the stock default install of CDH 4.3.0-1

    ++++++++++++++++++++++++++++++++++++++++++++++++
    my class path is ---->
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop*
    and
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*
    so that when I compile it looks like this:
    javac -cp


    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\*
    WordCount.java

    but this leads to compilation errors, because javac can't find the
    packages or the packages have different names, as in this first error:

    WordCount.java:8: package org.apache.hadoop.filecache does not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ...

    +++++++++++++++++++++++++++++++++++++
    My question is how to correct this, as this is a default install using
    the CDH manager etc. All nodes have the same classpath.

    Thoughts?
    Thanks
    Tw



    --
    Harsh J


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J
  • Harsh J at Sep 7, 2013 at 1:11 am
    Hi,

    Glad to know you have it working now.

    Regarding the remaining problem:
    That's because you are now using single quote characters, not the back tick
    characters I originally sent you. It is `hadoop classpath` if you want it
    to auto-expand, not 'hadoop classpath'. The back ticks make your shell
    execute it as a command inline.

    I'd also recommend eventually moving to Maven, as a javac command gets
    painful to run once your project gets larger.
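    The quoting distinction can be sketched with any command; a minimal illustration using `echo` as a stand-in for `hadoop classpath` (the paths here are made up):

    ```shell
    #!/bin/sh
    # Single quotes pass the text through literally -- no command runs, so
    # javac would receive the literal string as its -cp value.
    literal='echo /opt/lib/a.jar'
    echo "$literal"              # prints the text unchanged, echo included

    # Back ticks execute the command and substitute its output in place;
    # $(...) is the equivalent, easier-to-read modern form.
    expanded=`echo /opt/lib/a.jar`
    also_expanded=$(echo /opt/lib/a.jar)
    echo "$expanded"             # prints only the command's output
    echo "$also_expanded"
    ```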
    On Sep 6, 2013 9:07 PM, "Tim Washburn" wrote:

    Hi Harsh,

    Un-setting the paths clears /usr/lib/ from being displayed, but the
    base issue still remains: the classpath is not being read when using
    the -cp 'hadoop classpath' shortcut. This is an issue in that I will
    eventually have other users who will be developing mapreduce jobs, and
    I'd like to be able to allow them to run this from the command line
    and/or from within Hue.
    +++++++++++++
    This works for compiling and creating the Jar file.
    +++++++++++++
    [root@USHERLXSNN1 javaCode]# mkdir wordcount_classes
    [root@USHERLXSNN1 javaCode]# javac -cp
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/client-0.20/\*
    -d wordcount_classes WordCount.java
    [root@USHERLXSNN1 javaCode]# jar -cvf WordCount.jar -C wordcount_classes/ .
    added manifest
    adding: org/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/WordCount$Reduce.class(in = 1611) (out= 648)(deflated 59%)
    adding: org/myorg/WordCount$Map$Counters.class(in = 983) (out=
    502)(deflated 48%)
    adding: org/myorg/WordCount.class(in = 2671) (out= 1288)(deflated 51%)
    adding: org/myorg/WordCount$Map.class(in = 4661) (out= 2214)(deflated 52%)

    +++++++++++++
    This fails
    +++++++++++++

    [root@USHERLXSNN1 javaCode]# javac -cp 'hadoop classpath' -d
    wordcount_classes WordCount.java
    WordCount.java:7: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.Path;

    ^
    WordCount.java:8: package org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:9: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.*;
    ^
    … more errors after this but of the same nature.

    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    This Fails as well when using the Jar file that was created
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

    [root@USHERLXSNN1 javaCode]# hadoop jar wordcount.jar org.myorg.WordCount
    /user/data/input /user/data/output
    Exception in thread "main" java.io.IOException: Error opening job jar:
    wordcount.jar
    at org.apache.hadoop.util.RunJar.main(RunJar.java:135)
    Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.jar.JarFile.<init>(JarFile.java:135)
    at java.util.jar.JarFile.<init>(JarFile.java:72)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:133)
    [root@USHERLXSNN1 javaCode]#


    Thoughts?
    Regards
    Tim Washburn

    On Thursday, September 5, 2013 11:42:01 AM UTC-7, Harsh J wrote:

    Hello,

    Thanks!

    Please unset HADOOP_HOME and HADOOP_MAPRED_HOME. We set them
    automatically for you beneath the layers:

    $ unset HADOOP_HOME
    $ unset HADOOP_MAPRED_HOME

    Please retry compilation after this.

    Alternatively, does the compilation succeed on USHERLXSDN1 by default,
    if you run that "javac -cp `hadoop classpath` WordCount.java" there?
    On Fri, Sep 6, 2013 at 12:07 AM, Tim Washburn wrote:
    HI Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*
    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is expecting
    things in /usr/lib/hadoop-0.20-mapreduce, which does not exist.

    On the datanodes $HADOOP_HOME, $HADOOP_PREFIX and $HADOOP_MAPRED_HOME
    display no entries where the hadoop classpath command shows this:

    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*
    The secondary name node has the same output as the datanodes.
    ++++++++++++++++++

    Regards
    Tim Washburn

    On Thursday, September 5, 2013 9:33:57 AM UTC-7, Harsh J wrote:

    Hello Tim,

    It isn't vital for the files to be under /usr/lib/ really. You can
    use
    the /opt/cloudera path as well. If you type "hadoop classpath", you
    can notice a valid classpath structure using the parcel paths (/opt).

    However, the error is pretty odd. Can you send us the outputs of the
    following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME

    On Thu, Sep 5, 2013 at 9:50 PM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh J,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop classpath`
    WordCount.java
    WordCount.java:8: package org.apache.hadoop.filecache does not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
    public static class Map extends MapReduceBase implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see, the system has no sense of the classpath, thus the
    errors. I suspect that the libraries for Hadoop should all be under
    /usr/lib/hadoop/, but they are not. Given this, should I just copy them
    there and fix the classpath, or is there a config item in the CDH 4.3
    manager that will allow me to correct the issue?

    Regards.
    Tim Washburn
    On Wednesday, September 4, 2013 10:05:07 PM UTC-7, Harsh J wrote:

    Hey Tim,

    The doc references package based paths (/usr), and needs to be
    updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    On Thu, Sep 5, 2013 at 9:38 AM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

    Tested your suggestion and, well, no luck. I poked about on each of the
    servers in the cluster, and the only place I've found the hadoop libs is
    in /opt/cloudera/parcels/CDH/lib/hadoop/ and not in /usr/lib/hadoop.
    This is a stock install of CDH 4.3 with 2 mgr nodes and 3 data nodes, so
    I'm kind of puzzled. I was able to compile the class files and such, but
    from within Hue the same basic error happens, which is that CDH 4.3 can't
    find the classpaths. I suspect that Hue is looking for the classpath in
    /usr/lib/hadoop, which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during the setup or something not copied
    from
    /opt/cloudera/parcels/CDH/lib/hadoop/ to /usr/lib/hadoop???

    Thoughts anyone?
    Regards
    Tim Washburn

    On Friday, August 30, 2013 9:22:19 PM UTC-7, Harsh J wrote:

    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java

    On Sat, Aug 31, 2013 at 3:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi All

    I started here:


    http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html
    I copied the source code into my cluster and found out that the
    default classpaths are not in the same place with CDH 4.3.0-1
    ++++++++++++++++++++
    Steps outlined in tutorial
    ++++++++++++++++++

    Compile WordCount.java:

    $ mkdir wordcount_classes

    $ javac -cp classpath -d wordcount_classes WordCount.java

    where classpath is:

    CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/* <------ This
    does not exist in the stock default install of CDH 4.3.0-1

    ++++++++++++++++++++++++++++++++++++++++++++++++
    my class path is ---->
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop*
    and
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*
    so that when I compile it looks like this:
    javac -cp


    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\*
    WordCount.java

    but this leads to compilation errors, because javac can't find the
    packages or the packages have different names, as in this first error:

    WordCount.java:8: package org.apache.hadoop.filecache does
    not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ...

    +++++++++++++++++++++++++++++++++++++
    My question is how to correct this, as this is a default install using
    the CDH manager etc. All nodes have the same classpath.

    Thoughts?
    Thanks
    Tw



    --
    Harsh J


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J

  • Harsh J at Sep 7, 2013 at 1:12 am
    Sorry for the typos, sent those from my mobile. Back ticks and Shell*
    On Sep 7, 2013 6:41 AM, "Harsh J" wrote:

    Hi,

    Glad to know you have it working now.

    Regarding the remaining problem:
    That's because you are now using single quote characters, not the back
    tick characters I originally sent you. It is `hadoop classpath` if you
    want it to auto-expand, not 'hadoop classpath'. The back ticks make your
    shell execute it as a command inline.

    I'd also recommend eventually moving to Maven, as a javac command gets
    painful to run once your project gets larger.
    On Sep 6, 2013 9:07 PM, "Tim Washburn" wrote:

    Hi Harsh,

    Un-setting the paths clears /usr/lib/ from being displayed, but the
    base issue still remains: the classpath is not being read when using
    the -cp 'hadoop classpath' shortcut. This is an issue in that I will
    eventually have other users who will be developing mapreduce jobs, and
    I'd like to be able to allow them to run this from the command line
    and/or from within Hue.
    +++++++++++++
    This works for compiling and creating the Jar file.
    +++++++++++++
    [root@USHERLXSNN1 javaCode]# mkdir wordcount_classes
    [root@USHERLXSNN1 javaCode]# javac -cp
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/client-0.20/\*
    -d wordcount_classes WordCount.java
    [root@USHERLXSNN1 javaCode]# jar -cvf WordCount.jar -C
    wordcount_classes/ .
    added manifest
    adding: org/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/WordCount$Reduce.class(in = 1611) (out= 648)(deflated 59%)
    adding: org/myorg/WordCount$Map$Counters.class(in = 983) (out=
    502)(deflated 48%)
    adding: org/myorg/WordCount.class(in = 2671) (out= 1288)(deflated 51%)
    adding: org/myorg/WordCount$Map.class(in = 4661) (out= 2214)(deflated 52%)
    +++++++++++++
    This fails
    +++++++++++++

    [root@USHERLXSNN1 javaCode]# javac -cp 'hadoop classpath' -d
    wordcount_classes WordCount.java
    WordCount.java:7: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.Path;

    ^
    WordCount.java:8: package org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:9: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.*;
    ^
    … more errors after this but of the same nature.

    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    This Fails as well when using the Jar file that was created
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

    [root@USHERLXSNN1 javaCode]# hadoop jar wordcount.jar
    org.myorg.WordCount /user/data/input /user/data/output
    Exception in thread "main" java.io.IOException: Error opening job jar:
    wordcount.jar
    at org.apache.hadoop.util.RunJar.main(RunJar.java:135)
    Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.jar.JarFile.<init>(JarFile.java:135)
    at java.util.jar.JarFile.<init>(JarFile.java:72)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:133)
    [root@USHERLXSNN1 javaCode]#


    Thoughts?
    Regards
    Tim Washburn

    On Thursday, September 5, 2013 11:42:01 AM UTC-7, Harsh J wrote:

    Hello,

    Thanks!

    Please unset HADOOP_HOME and HADOOP_MAPRED_HOME. We set them
    automatically for you beneath the layers:

    $ unset HADOOP_HOME
    $ unset HADOOP_MAPRED_HOME

    Please retry compilation after this.

    Alternatively, does the compilation succeed on USHERLXSDN1 by default,
    if you run that "javac -cp `hadoop classpath` WordCount.java" there?
    On Fri, Sep 6, 2013 at 12:07 AM, Tim Washburn wrote:
    HI Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*
    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is expecting
    things in /usr/lib/hadoop-0.20-mapreduce, which does not exist.

    On the datanodes $HADOOP_HOME, $HADOOP_PREFIX and
    $HADOOP_MAPRED_HOME
    display no entries where the hadoop classpath command shows this:

    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*
The secondary namenode has the same output as the datanodes.
    ++++++++++++++++++

    Regards
    Tim Washburn

    On Thursday, September 5, 2013 9:33:57 AM UTC-7, Harsh J wrote:

    Hello Tim,

    It isn't vital for the files to be under /usr/lib/ really. You can
    use
    the /opt/cloudera path as well. If you type "hadoop classpath", you
    can notice a valid classpath structure using the parcel paths
    (/opt).
    However, the error is pretty odd. Can you send us the outputs of the
    following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME
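    [Editor's note: the four checks above can be collapsed into one loop; a
    standalone POSIX-shell sketch, with HADOOP_LIBEXEC_DIR added since it
    also turns out to be set on the affected node later in the thread:]

    ```shell
    # Print each Hadoop-related variable, marking unset ones explicitly so
    # a stale /usr/lib export stands out at a glance.
    check_env() {
      for v in HADOOP_HOME HADOOP_PREFIX HADOOP_MAPRED_HOME HADOOP_LIBEXEC_DIR; do
        eval "echo \"$v=\${$v:-<unset>}\""
      done
    }
    check_env
    ```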

    On Thu, Sep 5, 2013 at 9:50 PM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
Hi Harsh J,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop classpath`
    WordCount.java
    WordCount.java:8: package org.apache.hadoop.filecache does not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
    public static class Map extends MapReduceBase implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see the system has no sense of the classpath. Thus the
    errors. I
    suspect that the libraries for Hadoop should all be under
    /usr/lib/hadoop/.
    But they are not. Given this should I just copy them there and fix
    the
    classpath or is there a config item in the cdh4.3 manager that
    will
    allow me
    to correct the issue?

    Regards.
    Tim Washburn
    On Wednesday, September 4, 2013 10:05:07 PM UTC-7, Harsh J wrote:

    Hey Tim,

    The doc references package based paths (/usr), and needs to be
    updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    On Thu, Sep 5, 2013 at 9:38 AM, Tim Washburn <
    tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

    Tested your suggestion and well no luck. I poked about on each
    of
    the
servers in the cluster and the only place that I have found
    the
    hadoop
    libs is in /opt/cloudera/parcels/CDH/lib/hadoop/ and not in
    /usr/lib/hadoop.
    This is a stock install of CDH 4.3 with 2 mgr nodes and 3 data
    nodes.
    So
    I'm
    kind of puzzled. I was able to compile the class files and such
    but
    from
    within Hue the same basic error happens which is that CDH43
    can't
    find
    the
    classpaths. I suspect that Hue is looking for the classpath in
    /usr/lib/hadoop which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during the setup or something not copied
    from
    /opt/cloudera/parcels/CDH/lib/hadoop/ to /usr/lib/hadoop???

    Thoughts anyone?
    Regards
    Tim Washburn

    On Friday, August 30, 2013 9:22:19 PM UTC-7, Harsh J wrote:

    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java

    On Sat, Aug 31, 2013 at 3:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi All

    I started here:


    http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html
    I copied the source code into my cluster and found out that
    the
    default
classpaths are not in the same place with CDH 4.3.0-1
    ++++++++++++++++++++
    Steps outlined in tutorial
    ++++++++++++++++++

    Compile WordCount.java:

    $ mkdir wordcount_classes

    $ javac -cp classpath -d wordcount_classes WordCount.java

    where classpath is:

    CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/
    <------
    This
    does
    not
    exist in the stock default install of CDH 4.3.01

    ++++++++++++++++++++++++++++++++++++++++++++++++
    my class path is ---->
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop*
    and
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*
    so that when I compile it looks like this:
    javac -cp


    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\*
    WordCount.java

    but this leads to compilation errors because the javac can't
    find
    or
    the
    packages have different names
    as in this the first error:

    WordCount.java:8: package org.apache.hadoop.filecache does
    not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ...

    +++++++++++++++++++++++++++++++++++++
My question is how to correct this, as this is a default install
    using
    the
    CDH
    manager etc. All nodes have the same classpath.

    Thoughts?
    Thanks
    Tw

    To unsubscribe from this group and stop receiving emails
    from it,
    send
    an
    email to scm-users+...@cloudera.org.


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J
  • Tim Washburn at Sep 9, 2013 at 11:06 pm
    Hi Harsh,

    Thanks for pointing out the back-tick characters. I re-tested on my
    secondary namenode using --> javac -cp `hadoop classpath` -d
    wordcount_classes WordCount.java.
This compiled correctly. This led me to run printenv on the primary
    namenode. Many other hadoop environment settings were tied to /usr/lib

    HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec
    HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
    HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce

    Un-setting these seems to have fixed this issue. I'm a bit puzzled as to
    how the above variables got set with a stock install of CDH4 using the
    manager when the other cluster members seem to be ok.

Thanks for your time and thoughts.
    Regards
    TimW
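    [Editor's note: for anyone hitting the same thing, `unset` only fixes
    the current shell session; the lasting fix is to delete the stale
    exports from whichever startup file set them (commonly ~/.bashrc,
    ~/.bash_profile, or a file under /etc/profile.d/; the exact location is
    an assumption, check your own setup). A standalone illustration:]

    ```shell
    # Simulate the stale variable, then clear it; the effect of `unset` is
    # limited to this shell, so the export's source file must also be edited.
    export HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce
    echo "before: HADOOP_HOME=${HADOOP_HOME:-<unset>}"
    unset HADOOP_HOME HADOOP_MAPRED_HOME HADOOP_LIBEXEC_DIR
    echo "after:  HADOOP_HOME=${HADOOP_HOME:-<unset>}"
    ```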
    On Friday, September 6, 2013 6:12:58 PM UTC-7, Harsh J wrote:

    Sorry for the typos, sent those from my mobile. Back ticks and Shell*
On Sep 7, 2013 6:41 AM, "Harsh J" <ha...@cloudera.com>
    wrote:
    Hi,

    Glad to know you have it working now.

    Regd. remaining problem:
    That's because you are now using single quote characters, not the back
    tick characters I originally sent you. It is `hadoop classpath` if you want
it to auto expand, not 'hadoop classpath'. The back ticks make your shell
execute it as a command inline.

    I'd also recommend eventually moving to use maven, as a javac command
    gets painful to run once your project gets larger.
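    [Editor's note: the quoting difference Harsh describes can be shown with
    any command; a standalone sketch using echo in place of hadoop:]

    ```shell
    # Single quotes pass the literal text through as the classpath;
    # backticks (or the easier-to-read $(...) form) substitute the
    # command's output instead.
    literal='echo demo'        # javac would see the classpath "echo demo"
    substituted=`echo demo`    # javac would see the command output "demo"
    modern=$(echo demo)        # equivalent modern form
    echo "$literal"
    echo "$substituted"
    echo "$modern"
    ```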

On Sep 6, 2013 9:07 PM, "Tim Washburn" <tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

Un-setting the paths clears /usr/lib/ from being displayed. But the
base issue still remains: the classpath is not being read when using
the -cp 'hadoop classpath' shortcut. This is an issue in that I will
eventually have other users who will be developing mapreduce jobs. I'd like
to be able to allow them to run this from the command line and/or from
within Hue.
    +++++++++++++
    This works for compiling and creating the Jar file.
    +++++++++++++
    [root@USHERLXSNN1 javaCode]# mkdir wordcount_classes
    [root@USHERLXSNN1 javaCode]# javac -cp
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/client-0.20/\*
    -d wordcount_classes WordCount.java
    [root@USHERLXSNN1 javaCode]# jar -cvf WordCount.jar -C
    wordcount_classes/ .
    added manifest
    adding: org/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/WordCount$Reduce.class(in = 1611) (out= 648)(deflated 59%)
    adding: org/myorg/WordCount$Map$Counters.class(in = 983) (out=
    502)(deflated 48%)
    adding: org/myorg/WordCount.class(in = 2671) (out= 1288)(deflated 51%)
    adding: org/myorg/WordCount$Map.class(in = 4661) (out= 2214)(deflated 52%)
    +++++++++++++
    This fails
    +++++++++++++

    [root@USHERLXSNN1 javaCode]# javac -cp 'hadoop classpath' -d
    wordcount_classes WordCount.java
    WordCount.java:7: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.Path;

    ^
    WordCount.java:8: package org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:9: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.*;
    ^
    … more errors after this but of the same nature.

    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    This Fails as well when using the Jar file that was created
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

    [root@USHERLXSNN1 javaCode]# hadoop jar wordcount.jar
    org.myorg.WordCount /user/data/input /user/data/output
    Exception in thread "main" java.io.IOException: Error opening job jar:
    wordcount.jar
    at org.apache.hadoop.util.RunJar.main(RunJar.java:135)
    Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.jar.JarFile.<init>(JarFile.java:135)
    at java.util.jar.JarFile.<init>(JarFile.java:72)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:133)
    [root@USHERLXSNN1 javaCode]#


    Thoughts?
    Regards
    Tim Washburn

    On Thursday, September 5, 2013 11:42:01 AM UTC-7, Harsh J wrote:

    Hello,

    Thanks!

    Please unset HADOOP_HOME and HADOOP_MAPRED_HOME. We set them
    automatically for you beneath the layers:

    $ unset HADOOP_HOME
    $ unset HADOOP_MAPRED_HOME

    Please retry compilation after this.

    Alternatively, does the compilation succeed on USHERLXSDN1 by default,
    if you run that "javac -cp `hadoop classpath` WordCount.java" there?

    On Fri, Sep 6, 2013 at 12:07 AM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    HI Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*
    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is expecting
    things in
/usr/lib/hadoop-0.20-mapreduce, which does not exist.

    On the datanodes $HADOOP_HOME, $HADOOP_PREFIX and
    $HADOOP_MAPRED_HOME
    display no entries where the hadoop classpath command shows this:

    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*
The secondary namenode has the same output as the datanodes.
    ++++++++++++++++++

    Regards
    Tim Washburn

    On Thursday, September 5, 2013 9:33:57 AM UTC-7, Harsh J wrote:

    Hello Tim,

    It isn't vital for the files to be under /usr/lib/ really. You can
    use
    the /opt/cloudera path as well. If you type "hadoop classpath", you
    can notice a valid classpath structure using the parcel paths
    (/opt).
    However, the error is pretty odd. Can you send us the outputs of
    the
    following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME

    On Thu, Sep 5, 2013 at 9:50 PM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
Hi Harsh J,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop classpath`
    WordCount.java
    WordCount.java:8: package org.apache.hadoop.filecache does not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
    public static class Map extends MapReduceBase implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see the system has no sense of the classpath. Thus the
    errors. I
    suspect that the libraries for Hadoop should all be under
    /usr/lib/hadoop/.
    But they are not. Given this should I just copy them there and
    fix the
    classpath or is there a config item in the cdh4.3 manager that
    will
    allow me
    to correct the issue?

    Regards.
    Tim Washburn
    On Wednesday, September 4, 2013 10:05:07 PM UTC-7, Harsh J wrote:

    Hey Tim,

    The doc references package based paths (/usr), and needs to be
    updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    On Thu, Sep 5, 2013 at 9:38 AM, Tim Washburn <
    tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

    Tested your suggestion and well no luck. I poked about on
    each of
    the
servers in the cluster and the only place that I have found
    the
    hadoop
    libs is in /opt/cloudera/parcels/CDH/lib/hadoop/ and not in
    /usr/lib/hadoop.
    This is a stock install of CDH 4.3 with 2 mgr nodes and 3 data
    nodes.
    So
    I'm
    kind of puzzled. I was able to compile the class files and
    such but
    from
    within Hue the same basic error happens which is that CDH43
    can't
    find
    the
    classpaths. I suspect that Hue is looking for the classpath in
    /usr/lib/hadoop which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during the setup or something not copied
    from
    /opt/cloudera/parcels/CDH/lib/hadoop/ to /usr/lib/hadoop???

    Thoughts anyone?
    Regards
    Tim Washburn

    On Friday, August 30, 2013 9:22:19 PM UTC-7, Harsh J wrote:

    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java

    On Sat, Aug 31, 2013 at 3:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi All

    I started here:


    http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html
    I copied the source code into my cluster and found out that
    the
    default
classpaths are not in the same place with CDH 4.3.0-1
    ++++++++++++++++++++
    Steps outlined in tutorial
    ++++++++++++++++++

    Compile WordCount.java:

    $ mkdir wordcount_classes

    $ javac -cp classpath -d wordcount_classes WordCount.java

    where classpath is:

    CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/
    <------
    This
    does
    not
    exist in the stock default install of CDH 4.3.01

    ++++++++++++++++++++++++++++++++++++++++++++++++
    my class path is ---->
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop*
    and
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*
    so that when I compile it looks like this:
    javac -cp


    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\*
    WordCount.java

    but this leads to compilation errors because the javac
    can't find
    or
    the
    packages have different names
    as in this the first error:

    WordCount.java:8: package org.apache.hadoop.filecache does
    not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ...

    +++++++++++++++++++++++++++++++++++++
My question is how to correct this, as this is a default
    install using
    the
    CDH
    manager etc. All nodes have the same classpath.

    Thoughts?
    Thanks
    Tw



    --
    Harsh J


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J
  • Shashi Ranjan at Nov 6, 2013 at 8:11 pm
    Hi Harsh ,

Can you please help me with this... I am getting this error when I try to
run the application through the jar file:

    hadoop jar wordcount.jar org.myorg.WordCount /usr/cloudera/wordcount/input
    /usr/cloudera/wordcount/output
    Exception in thread "main" java.lang.ClassNotFoundException:
    org.myorg.WordCount
             at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
             at java.security.AccessController.doPrivileged(Native Method)
             at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
             at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
             at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
             at java.lang.Class.forName0(Native Method)
             at java.lang.Class.forName(Class.java:266)
             at org.apache.hadoop.util.RunJar.main(RunJar.java:201)
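    [Editor's note: a common cause of this particular ClassNotFoundException
    (not confirmed for this jar, just the usual suspect): `hadoop jar
    wordcount.jar org.myorg.WordCount` looks for the jar entry
    org/myorg/WordCount.class, and if WordCount.java lacks a
    `package org.myorg;` declaration, javac -d writes the class to the
    output root instead. The name-to-entry mapping can be sketched as:]

    ```shell
    # A fully-qualified class name maps to its jar entry by replacing dots
    # with slashes and appending .class; `jar -tf wordcount.jar` should
    # list exactly this path for the class to be loadable by name.
    CLASS="org.myorg.WordCount"
    ENTRY="$(echo "$CLASS" | tr '.' '/').class"
    echo "$ENTRY"   # org/myorg/WordCount.class
    ```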

    On Tuesday, 10 September 2013 04:35:59 UTC+5:30, Tim Washburn wrote:

    Hi Harsh,

    Thanks for pointing out the back-tick characters. I re-tested on my
    secondary namenode using --> javac -cp `hadoop classpath` -d
    wordcount_classes WordCount.java.
This compiled correctly. This led me to run printenv on the primary
    namenode. Many other hadoop environment settings were tied to /usr/lib

    HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec
    HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
    HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce

    Un-setting these seems to have fixed this issue. I'm a bit puzzled as to
    how the above variables got set with a stock install of CDH4 using the
    manager when the other cluster members seem to be ok.

Thanks for your time and thoughts.
    Regards
    TimW
    On Friday, September 6, 2013 6:12:58 PM UTC-7, Harsh J wrote:

    Sorry for the typos, sent those from my mobile. Back ticks and Shell*
    On Sep 7, 2013 6:41 AM, "Harsh J" wrote:

    Hi,

    Glad to know you have it working now.

    Regd. remaining problem:
    That's because you are now using single quote characters, not the back
    tick characters I originally sent you. It is `hadoop classpath` if you want
it to auto expand, not 'hadoop classpath'. The back ticks make your shell
execute it as a command inline.

    I'd also recommend eventually moving to use maven, as a javac command
    gets painful to run once your project gets larger.
    On Sep 6, 2013 9:07 PM, "Tim Washburn" wrote:

    Hi Harsh,

Un-setting the paths clears /usr/lib/ from being displayed. But
the base issue still remains: the classpath is not being read when
using the -cp 'hadoop classpath' shortcut. This is an issue in that I will
eventually have other users who will be developing mapreduce jobs. I'd like
to be able to allow them to run this from the command line and/or from
within Hue.
    +++++++++++++
    This works for compiling and creating the Jar file.
    +++++++++++++
    [root@USHERLXSNN1 javaCode]# mkdir wordcount_classes
    [root@USHERLXSNN1 javaCode]# javac -cp
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/client-0.20/\*
    -d wordcount_classes WordCount.java
    [root@USHERLXSNN1 javaCode]# jar -cvf WordCount.jar -C
    wordcount_classes/ .
    added manifest
    adding: org/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/WordCount$Reduce.class(in = 1611) (out=
    648)(deflated 59%)
    adding: org/myorg/WordCount$Map$Counters.class(in = 983) (out=
    502)(deflated 48%)
    adding: org/myorg/WordCount.class(in = 2671) (out= 1288)(deflated 51%)
    adding: org/myorg/WordCount$Map.class(in = 4661) (out= 2214)(deflated 52%)
    +++++++++++++
    This fails
    +++++++++++++

    [root@USHERLXSNN1 javaCode]# javac -cp 'hadoop classpath' -d
    wordcount_classes WordCount.java
    WordCount.java:7: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.Path;

    ^
    WordCount.java:8: package org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:9: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.*;
    ^
    … more errors after this but of the same nature.

    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    This Fails as well when using the Jar file that was created
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

    [root@USHERLXSNN1 javaCode]# hadoop jar wordcount.jar
    org.myorg.WordCount /user/data/input /user/data/output
    Exception in thread "main" java.io.IOException: Error opening job jar:
    wordcount.jar
    at org.apache.hadoop.util.RunJar.main(RunJar.java:135)
    Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.jar.JarFile.<init>(JarFile.java:135)
    at java.util.jar.JarFile.<init>(JarFile.java:72)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:133)
    [root@USHERLXSNN1 javaCode]#


    Thoughts?
    Regards
    Tim Washburn

    On Thursday, September 5, 2013 11:42:01 AM UTC-7, Harsh J wrote:

    Hello,

    Thanks!

    Please unset HADOOP_HOME and HADOOP_MAPRED_HOME. We set them
    automatically for you beneath the layers:

    $ unset HADOOP_HOME
    $ unset HADOOP_MAPRED_HOME

    Please retry compilation after this.

    Alternatively, does the compilation succeed on USHERLXSDN1 by
    default,
    if you run that "javac -cp `hadoop classpath` WordCount.java" there?

    On Fri, Sep 6, 2013 at 12:07 AM, Tim Washburn <tim_wa...@bio-rad.com>
    wrote:
    HI Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*
    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is expecting
    things in
/usr/lib/hadoop-0.20-mapreduce, which does not exist.

    On the datanodes $HADOOP_HOME, $HADOOP_PREFIX and
    $HADOOP_MAPRED_HOME
    display no entries where the hadoop classpath command shows this:

    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*
The secondary namenode has the same output as the datanodes.
    ++++++++++++++++++

    Regards
    Tim Washburn

    On Thursday, September 5, 2013 9:33:57 AM UTC-7, Harsh J wrote:

    Hello Tim,

    It isn't vital for the files to be under /usr/lib/ really. You can
    use
    the /opt/cloudera path as well. If you type "hadoop classpath",
    you
    can notice a valid classpath structure using the parcel paths
    (/opt).
    However, the error is pretty odd. Can you send us the outputs of
    the
    following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME

    On Thu, Sep 5, 2013 at 9:50 PM, Tim Washburn <
    tim_wa...@bio-rad.com>
    wrote:
Hi Harsh J,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop classpath`
    WordCount.java
    WordCount.java:8: package org.apache.hadoop.filecache does not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
    public static class Map extends MapReduceBase implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see the system has no sense of the classpath. Thus
    the
    errors. I
    suspect that the libraries for Hadoop should all be under
    /usr/lib/hadoop/.
    But they are not. Given this should I just copy them there and
    fix the
    classpath or is there a config item in the cdh4.3 manager that
    will
    allow me
    to correct the issue?

    Regards.
    Tim Washburn

    On Wednesday, September 4, 2013 10:05:07 PM UTC-7, Harsh J
    wrote:
    Hey Tim,

    The doc references package based paths (/usr), and needs to be
    updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    On Thu, Sep 5, 2013 at 9:38 AM, Tim Washburn <
    tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

    Tested your suggestion and well no luck. I poked about on
    each of
    the
servers in the cluster and the only place that I have
found the
    hadoop
    libs is in /opt/cloudera/parcels/CDH/lib/hadoop/ and not in
    /usr/lib/hadoop.
    This is a stock install of CDH 4.3 with 2 mgr nodes and 3
    data nodes.
    So
    I'm
    kind of puzzled. I was able to compile the class files and
    such but
    from
    within Hue the same basic error happens which is that CDH43
    can't
    find
    the
    classpaths. I suspect that Hue is looking for the classpath
    in
    /usr/lib/hadoop which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during the setup or something not copied
    from
    /opt/cloudera/parcels/CDH/lib/hadoop/ to /usr/lib/hadoop???

    Thoughts anyone?
    Regards
    Tim Washburn

    On Friday, August 30, 2013 9:22:19 PM UTC-7, Harsh J wrote:

    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java

    On Sat, Aug 31, 2013 at 3:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi All

    I started here:


    http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html
    I copied the source code into my cluster and found out
    that the
    default
classpaths are not in the same place with CDH 4.3.0-1
    ++++++++++++++++++++
    Steps outlined in tutorial
    ++++++++++++++++++

    Compile WordCount.java:

    $ mkdir wordcount_classes

    $ javac -cp classpath -d wordcount_classes WordCount.java

    where classpath is:

    CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/
    <------
    This
    does
    not
    exist in the stock default install of CDH 4.3.01

    ++++++++++++++++++++++++++++++++++++++++++++++++
    my classpath is ---->
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop* and
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*

    so that when I compile it looks like this:

    javac -cp /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\* WordCount.java

    but this leads to compilation errors, because javac can't find the
    packages or the packages have different names, as in this first
    error:

    WordCount.java:8: package org.apache.hadoop.filecache does
    not
    exist
    import org.apache.hadoop.filecache.DistributedCache;
    ...

    +++++++++++++++++++++++++++++++++++++
    My question is how to correct this, as this is a default install
    using the CDH manager etc. All nodes have the same classpath.

    Thoughts?
    Thanks
    Tw

    To unsubscribe from this group and stop receiving emails from it,
    send an email to scm-users+...@cloudera.org.


    --
    Harsh J
  • Chris Conner at Nov 7, 2013 at 7:44 pm
    Hey,

    In your WordCount.java do you specify it as "package org.myorg;"?

    Thanks
    Chris
    On 11/6/13, 3:11 PM, Shashi Ranjan wrote:
    Hi Harsh,

    can you please help me with this... I am getting this error when I
    try to run the application through the jar file:

    hadoop jar wordcount.jar org.myorg.WordCount
    /usr/cloudera/wordcount/input /usr/cloudera/wordcount/output
    Exception in thread "main" java.lang.ClassNotFoundException:
    org.myorg.WordCount
    at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:266)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:201)


    On Tuesday, 10 September 2013 04:35:59 UTC+5:30, Tim Washburn wrote:

    Hi Harsh,

    Thanks for pointing out the back-tick characters. I re-tested on
    my secondary namenode using --> javac -cp `hadoop classpath` -d
    wordcount_classes WordCount.java
    This compiled correctly. This led me to run printenv on the
    primary namenode. Many other Hadoop environment settings were
    tied to /usr/lib:

    HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec
    HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
    HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce

    Un-setting these seems to have fixed the issue. I'm a bit puzzled
    as to how the above variables got set with a stock install of CDH4
    using the manager when the other cluster members seem to be ok.

    Thanks for your time and thoughts.
    Regards
    TimW
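    [Editor's sketch, not from the thread: the steps Tim describes, with
    the `export` line only simulating the stale value so the effect of
    `unset` is visible; the final recompile needs a real CDH node and is
    left commented out.]

    ```shell
    # Simulate the stale setting printenv revealed (value as shown in
    # the thread), then clear all three variables in one go.
    export HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce
    unset HADOOP_HOME HADOOP_MAPRED_HOME HADOOP_LIBEXEC_DIR

    # Confirm they are gone; each prints "<unset>" once cleared.
    echo "HADOOP_HOME=${HADOOP_HOME:-<unset>}"
    echo "HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-<unset>}"

    # On a CDH node one would now recompile (not run here):
    # javac -cp `hadoop classpath` -d wordcount_classes WordCount.java
    ```

    Note that `unset` only affects the current shell session; if the
    variables come from a profile or init script, the export lines there
    would need removing as well.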

    On Friday, September 6, 2013 6:12:58 PM UTC-7, Harsh J wrote:

    Sorry for the typos, sent those from my mobile. Back ticks and
    Shell*

    On Sep 7, 2013 6:41 AM, "Harsh J" wrote:

    Hi,

    Glad to know you have it working now.

    Regd. remaining problem:
    That's because you are now using single quote characters,
    not the back tick characters I originally sent you. It is
    `hadoop classpath` if you want it to auto expand, not
    'hadoop classpath'. The back ticks make your shell execute
    it as a command inline.

    I'd also recommend eventually moving to Maven, as a javac
    command gets painful to run once your project gets larger.
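    [The quoting distinction Harsh describes can be demonstrated with
    any command. An editor's sketch, using `echo hello` in place of
    `hadoop classpath`, which needs a live cluster:]

    ```shell
    # Single quotes pass the text through literally: javac would
    # receive the words "hadoop classpath" themselves as its -cp value.
    literal='echo hello'            # holds the string "echo hello"

    # Back ticks -- or the equivalent, more readable $(...) form -- run
    # the command first and substitute its output into the command line.
    substituted=`echo hello`        # holds the string "hello"

    echo "literal=$literal"         # prints: literal=echo hello
    echo "substituted=$substituted" # prints: substituted=hello
    ```

    So javac -cp `hadoop classpath` ... expands to the full parcel
    classpath before javac runs, while the single-quoted form hands
    javac the literal words 'hadoop classpath' as a nonexistent path.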

    On Sep 6, 2013 9:07 PM, "Tim Washburn"
    wrote:
    Hi Harsh,

    Un-setting the paths cleared /usr/lib/ from being displayed,
    but the base issue still remains: the classpath is not being
    read when using the -cp 'hadoop classpath' shortcut. This is
    an issue in that I will eventually have other users who will
    be developing MapReduce jobs. I'd like to be able to allow
    them to run this from the command line and/or from within Hue.
    +++++++++++++
    This works for compiling and creating the Jar file.
    +++++++++++++
    [root@USHERLXSNN1 javaCode]# mkdir wordcount_classes
    [root@USHERLXSNN1 javaCode]# javac -cp
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/client-0.20/\*
    -d wordcount_classes WordCount.java
    [root@USHERLXSNN1 javaCode]# jar -cvf WordCount.jar -C
    wordcount_classes/ .
    added manifest
    adding: org/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/(in = 0) (out= 0)(stored 0%)
    adding: org/myorg/WordCount$Reduce.class(in = 1611)
    (out= 648)(deflated 59%)
    adding: org/myorg/WordCount$Map$Counters.class(in = 983)
    (out= 502)(deflated 48%)
    adding: org/myorg/WordCount.class(in = 2671) (out=
    1288)(deflated 51%)
    adding: org/myorg/WordCount$Map.class(in = 4661) (out=
    2214)(deflated 52%)
    +++++++++++++
    This fails
    +++++++++++++

    [root@USHERLXSNN1 javaCode]# javac -cp 'hadoop
    classpath' -d wordcount_classes WordCount.java
    WordCount.java:7: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.Path;

    ^
    WordCount.java:8: package org.apache.hadoop.filecache
    does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:9: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.*;
    ^
    … more errors after this but of the same nature.

    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    This Fails as well when using the Jar file that was created
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    [root@USHERLXSNN1 javaCode]# hadoop jar wordcount.jar
    org.myorg.WordCount /user/data/input /user/data/output
    Exception in thread "main" java.io.IOException: Error
    opening job jar: wordcount.jar
    at
    org.apache.hadoop.util.RunJar.main(RunJar.java:135)
    Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.jar.JarFile.<init>(JarFile.java:135)
    at java.util.jar.JarFile.<init>(JarFile.java:72)
    at
    org.apache.hadoop.util.RunJar.main(RunJar.java:133)
    [root@USHERLXSNN1 javaCode]#
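    [Editor's observation, not raised in the thread: the jar above was
    built as WordCount.jar but invoked as wordcount.jar. On a
    case-sensitive filesystem those are different files, and `hadoop
    jar` on a missing or empty file can fail with exactly this kind of
    ZipException. A minimal sketch, assuming a case-sensitive
    filesystem:]

    ```shell
    # Work in a scratch directory; create only the capitalized name,
    # as the `jar -cvf WordCount.jar ...` step above did.
    cd "$(mktemp -d)"
    touch WordCount.jar

    # Looking for the lower-case name fails on a case-sensitive
    # filesystem.
    ls wordcount.jar 2>/dev/null || echo "wordcount.jar: no such file"
    ```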


    Thoughts?
    Regards
    Tim Washburn


    On Thursday, September 5, 2013 11:42:01 AM UTC-7, Harsh
    J wrote:
    Hello,

    Thanks!

    Please unset HADOOP_HOME and HADOOP_MAPRED_HOME. We set
    them
    automatically for you beneath the layers:

    $ unset HADOOP_HOME
    $ unset HADOOP_MAPRED_HOME

    Please retry compilation after this.

    Alternatively, does the compilation succeed on
    USHERLXSDN1 by default,
    if you run that "javac -cp `hadoop classpath`
    WordCount.java" there?
    On Fri, Sep 6, 2013 at 12:07 AM, Tim Washburn
    wrote:
    HI Harsh,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/.//*
    [root@USHERLXSNN1 javaCode]# echo $HADOOP_HOME
    /usr/lib/hadoop-0.20-mapreduce

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_PREFIX

    [root@USHERLXSNN1 javaCode]# echo $HADOOP_MAPRED_HOME
    /usr/lib/hadoop-0.20-mapreduce

    +++++++++++++++++
    $HADOOP_PREFIX yielded nothing. But it looks like it is
    expecting things in /usr/lib/hadoop-0.20-mapreduce, which does
    not exist.

    On the datanodes, $HADOOP_HOME, $HADOOP_PREFIX and
    $HADOOP_MAPRED_HOME display no entries, whereas the hadoop
    classpath command shows this:
    [root@USHERLXSDN1 ~]# hadoop classpath
    /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*
    The secondary namenode has the same output as the datanodes.
    ++++++++++++++++++
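    [Since the primary namenode's output differs from the other hosts',
    one systematic way to find the drift is to capture the output per
    host and diff it. An editor's sketch; the hostnames come from the
    thread, and the file contents are shortened stand-ins for the real
    `hadoop classpath` output:]

    ```shell
    # Simulate captured `hadoop classpath` output from two hosts.
    cd "$(mktemp -d)"
    printf '%s\n' '/opt/cloudera/parcels/CDH/lib/hadoop/*' \
                  '/usr/lib/hadoop-0.20-mapreduce/*' > USHERLXSNN1.txt
    printf '%s\n' '/opt/cloudera/parcels/CDH/lib/hadoop/*' > USHERLXSDN1.txt

    # diff pinpoints entries present on one host but not the other.
    if diff USHERLXSNN1.txt USHERLXSDN1.txt >/dev/null; then
        echo "classpaths match"
    else
        echo "classpaths differ"
    fi
    ```

    In practice one would run `hadoop classpath > $(hostname).txt` on
    each node and diff the collected files.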

    Regards
    Tim Washburn


    On Thursday, September 5, 2013 9:33:57 AM UTC-7,
    Harsh J wrote:
    Hello Tim,

    It isn't vital for the files to be under /usr/lib/
    really. You can use
    the /opt/cloudera path as well. If you type "hadoop
    classpath", you
    can notice a valid classpath structure using the
    parcel paths (/opt).
    However, the error is pretty odd. Can you send us
    the outputs of the
    following?

    $ hadoop classpath
    $ echo $HADOOP_HOME
    $ echo $HADOOP_PREFIX
    $ echo $HADOOP_MAPRED_HOME

    On Thu, Sep 5, 2013 at 9:50 PM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi Hash J,

    Here's the output:

    [root@USHERLXSNN1 javaCode]# javac -cp `hadoop
    classpath` WordCount.java
    WordCount.java:8: package
    org.apache.hadoop.filecache does not exist
    import org.apache.hadoop.filecache.DistributedCache;
    ^
    WordCount.java:16: cannot find symbol
    symbol : class MapReduceBase
    location: class org.myorg.WordCount
    public static class Map extends MapReduceBase
    implements
    Mapper<LongWritable, Text, Text, IntWritable> {

    (more of the same is sent to console)


    As you can see, the system has no sense of the classpath,
    thus the errors. I suspect that the libraries for Hadoop
    should all be under /usr/lib/hadoop/, but they are not.
    Given this, should I just copy them there and fix the
    classpath, or is there a config item in the CDH 4.3 manager
    that will allow me to correct the issue?

    Regards.
    Tim Washburn

    On Wednesday, September 4, 2013 10:05:07 PM UTC-7,
    Harsh J wrote:
    Hey Tim,

    The doc references package based paths (/usr),
    and needs to be updated
    to also reference parcel based paths (/opt).

    However, what does the below exact command display?

    javac -cp `hadoop classpath` WordCount.java

    On Thu, Sep 5, 2013 at 9:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi Harsh,

    Tested your suggestion and, well, no luck. I poked about on
    each of the servers in the cluster and the only place that I
    have found the hadoop libs is in
    /opt/cloudera/parcels/CDH/lib/hadoop/ and not in
    /usr/lib/hadoop. This is a stock install of CDH 4.3 with 2
    mgr nodes and 3 data nodes. So I'm kind of puzzled. I was
    able to compile the class files and such, but from within
    Hue the same basic error happens, which is that CDH 4.3
    can't find the classpaths. I suspect that Hue is looking for
    the classpath in /usr/lib/hadoop, which doesn't exist.

    From Cloudera's web site the following is referenced: CDH4 -
    /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*

    Was something missed during the setup, or was something not
    copied from /opt/cloudera/parcels/CDH/lib/hadoop/ to
    /usr/lib/hadoop?
    Thoughts anyone?
    Regards
    Tim Washburn


    On Friday, August 30, 2013 9:22:19 PM UTC-7,
    Harsh J wrote:
    Hi Tim,

    You can alternatively try:

    javac -cp `hadoop classpath` WordCount.java

    On Sat, Aug 31, 2013 at 3:38 AM, Tim Washburn
    <tim_wa...@bio-rad.com>
    wrote:
    Hi All

    I started here:


    http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_7_1.html
    I copied the source code into my cluster and found out
    that the default classpaths are not in the same place
    with CDH 4.3.0-1.
    ++++++++++++++++++++
    Steps outlined in tutorial
    ++++++++++++++++++

    Compile WordCount.java:

    $ mkdir wordcount_classes

    $ javac -cp classpath -d wordcount_classes WordCount.java

    where classpath is:

    CDH4 - /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/
    <------ This does not exist in the stock default install
    of CDH 4.3.0-1.
    ++++++++++++++++++++++++++++++++++++++++++++++++
    my classpath is ---->
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop*
    and
    /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20*

    so that when I compile it looks like this:

    javac -cp /opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/\*:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/client-0.20/\* WordCount.java

    but this leads to compilation errors, because javac can't
    find the packages or the packages have different names, as
    in this first error:

    WordCount.java:8: package
    org.apache.hadoop.filecache does not
    exist
    import
    org.apache.hadoop.filecache.DistributedCache;
    ...

    +++++++++++++++++++++++++++++++++++++
    My question is how to correct this, as this is a default
    install using the CDH manager etc. All nodes have the
    same classpath.

    Thoughts?
    Thanks
    Tw



    --
    Harsh J

Discussion Overview
group: scm-users
categories: hadoop
posted: Aug 30, '13 at 10:08p
active: Nov 7, '13 at 7:44p
posts: 14
users: 4
website: cloudera.com
irc: #hadoop
