Hi!

I am trying to use Pig with HBase, but I keep running into a
ClassNotFoundException. I have tried a few things, but none of them have
worked.

I am using Pig 0.10.1, HBase 0.94.1, and Hadoop 1.0.4. I have updated my
HADOOP_CLASSPATH in hadoop-env.sh as per this post [0].

After updating the classpath, when I run '/opt/hadoop/bin/hadoop
classpath' I can see the protobuf jar file in the output.

Yet when I run the Pig script that loads from HBase, I keep getting this
error in the map tasks (Error: java.lang.ClassNotFoundException:
com.google.protobuf.Message).

Every data node has the protobuf jar file in its Hadoop classpath. I have
also tried adding the jar file on the command line like this:

    /opt/pig-0.10.1/bin/pig /opt/pig_programs/testHbase.pig
        -Dpig.additional.jars=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar
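In many setups Pig only picks up -D properties that appear before the script
name; assuming that applies here (this ordering detail is not confirmed in the
thread), the equivalent invocation would be sketched as:

    /opt/pig-0.10.1/bin/pig \
        -Dpig.additional.jars=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar \
        /opt/pig_programs/testHbase.pig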

I still keep running into this error.

Can anyone please let me know how to solve this issue?


Many Thanks,
Kiran.

[0] -
http://mail-archives.apache.org/mod_mbox/pig-user/201211.mbox/%3CCANBTPCHb5+kFyew+rG8EvyEXTOutX2p4gxkyFV1LuC94qDTviQ@mail.gmail.com%3E

--
Kiran Chitturi


  • Harsha at Feb 9, 2013 at 1:44 am
    Kiran,
    If you are trying to access protobuf from inside a Pig script, put the
    jar in PIG_CLASSPATH.
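    For reference, a minimal sketch of that suggestion using the paths from
    this thread (whether other HBase dependency jars also need to be added is
    an assumption):

        # Make the protobuf jar visible when launching Pig; the path is taken
        # from the message above. Other missing jars could be appended the same way.
        export PIG_CLASSPATH=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar:$PIG_CLASSPATH
        /opt/pig-0.10.1/bin/pig /opt/pig_programs/testHbase.pig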

    --
    Harsha

  • Kiran chitturi at Feb 9, 2013 at 1:47 am
    Thanks for the reply!

    Yes, I did add the HBase dependencies to PIG_CLASSPATH. I am using the
    lines below in my Pig script:

    field = LOAD 'hbase://documents'
        USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:collection', '-loadKey false')
        AS (fieldOL);
    STORE field INTO 'results/extract' USING PigStorage(';');

    Thanks,
    Kiran.


  • Ramakrishna Nalam at Feb 9, 2013 at 6:21 am
    Also, 'REGISTER' the jars that will be used in the MR tasks in the script.
    Otherwise, add them to 'pig.additional.jars' in pig.properties.
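    For example, a minimal sketch of the first option at the top of the script
    (the protobuf jar path is taken from this thread; whether additional HBase
    jars also need to be registered is an assumption):

        -- Ship this jar to the map/reduce tasks so the class is found there.
        REGISTER '/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar';

    The pig.properties alternative would be a line such as
    pig.additional.jars=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar, with
    multiple jars separated by colons.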

    Regards,
    Rama.


  • Kiran chitturi at Feb 9, 2013 at 6:58 pm
    Thank you very much. Using 'REGISTER' worked for me. I gave the full path
    of the jar file in 'REGISTER' and the program ran successfully.
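    For the record, the working script presumably looked something like the
    sketch below (only the protobuf jar discussed in this thread is shown; any
    other jars that may have been registered are not confirmed here):

        REGISTER '/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar';

        field = LOAD 'hbase://documents'
            USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:collection', '-loadKey false')
            AS (fieldOL);
        STORE field INTO 'results/extract' USING PigStorage(';');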

    Thanks again,
    Kiran.

