I upgraded my CDH3u4 cluster, which had Snappy working, to CDH4. The
cluster is 32-bit, on some older hardware running Fedora 14. I installed CDH4
using yum, which does not include the Snappy libraries (the 64-bit
distro does). I compiled the Snappy libraries from source and linked those in
(using 1.1.3; the one included in the yum repo for 64-bit is 1.1.1).
I successfully got Sqoop to pull dumps from an Oracle database into Hive and
compress them using Snappy. But when I run any Hive query against that data,
this exception is thrown:

Exception in thread "main" java.lang.UnsatisfiedLinkError:
org.apache.hadoop.io.compress.snappy.SnappyDecompressor.decompressBytesDirect()I

Any pointers on this would be helpful.
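
That UnsatisfiedLinkError usually means the JVM loaded a libhadoop.so that was
built without Snappy support, rather than that libsnappy.so itself is missing.
One rough way to confirm this (the library path below is an assumption; adjust
it to your install layout) is to look for the Snappy JNI entry points in the
native Hadoop library:

    # If libhadoop.so was compiled with Snappy, the JNI symbols appear in its
    # dynamic symbol table. No matches means the native library was built
    # without Snappy, and installing libsnappy.so alone will not fix the error.
    nm -D /usr/lib/hadoop/lib/native/libhadoop.so | grep -i snappy

    # Also confirm the dynamic loader can resolve the Snappy library itself:
    ldconfig -p | grep libsnappy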

--


  • James Pettyjohn at Jul 29, 2012 at 10:43 pm
    Update on this - it doesn't look like it's actually compressing on any of
    the nodes, except maybe the one 64-bit node I have. An earlier Sqoop job
    with one mapper finished by what looks like pure chance - a job with 50
    mappers bails out with the same compression exception mentioned above.

    I have no idea which subproject is calling this code, i.e. whether it's
    the MapReduce job or the datanode etc., so I'm shooting in the dark here.
    I've downgraded the Snappy installation back to the 1.0.3 release, which
    has the same libsnappy.so.1.1.1 as what came with the 64-bit CDH4 release,
    but no change. Any tips would be appreciated.
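
    For context, the compression codec runs inside the MapReduce task JVMs
    (the Sqoop mappers here), not in the datanode - which is why a
    single-mapper job can succeed by luck when its one task lands on the node
    with working native Snappy. A Sqoop import that requests Snappy output
    compression looks roughly like this (connection details are placeholders,
    not the actual values used):

        sqoop import \
            --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
            --username scott --table EMPLOYEES \
            --hive-import \
            --compress \
            --compression-codec org.apache.hadoop.io.compress.SnappyCodec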

    --
  • Todd Lipcon at Jul 30, 2012 at 3:52 am
    Hi James,

    You'll need to re-compile the Hadoop native libraries as well, with
    the Snappy libraries present. Otherwise, the Snappy support isn't
    compiled into the Hadoop native libraries, as far as I know.
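
    As a rough sketch, rebuilding the native libraries from the CDH4 source
    tree with Snappy enabled looks something like the following. The Maven
    profile and property names here are assumptions based on the Hadoop 2.x
    BUILDING.txt; verify them against the source tree you download.

        # build prerequisites (package names may differ on Fedora 14)
        yum install -y gcc gcc-c++ make cmake zlib-devel

        # with the source-built Snappy installed under /usr/local (assumed):
        cd hadoop-2.0.0-cdh4       # hypothetical unpacked source directory
        mvn package -Pnative -DskipTests \
            -Drequire.snappy -Dsnappy.prefix=/usr/local

        # the rebuilt libhadoop.so* ends up under hadoop-common's
        # target/native directory; deploy it to every node in the cluster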

    It also seems like a mistake that Snappy support is not provided in
    our 32-bit RPMs. Would you mind filing a bug on our "DISTRO" jira if
    this is indeed the case?

    Thanks
    -Todd
    --
    Todd Lipcon
    Software Engineer, Cloudera

    --
  • James Pettyjohn at Jul 30, 2012 at 5:44 am
    Hi Todd,

    Thanks a lot for the reply. The .so files are the only things that seem
    to be missing. I added those and started getting the UnsatisfiedLinkError.
    Even with the Snappy jars included in the RPMs, I'll need to recompile?
    There aren't any tricky RPM install checks that disabled Snappy when it
    didn't find the libraries already installed?

    If I need to recompile this from source, do you have any suggestions?
    Mixing RPMs and source compiles sounds a little dicey, but workable I
    guess.
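
    One way to layer a source-built libhadoop.so over the RPM install without
    overwriting any packaged files is to keep it in a separate directory and
    point the JVMs at it. The directory below is a hypothetical example, and
    whether task JVMs pick it up depends on how the cluster sets
    mapred.child.java.opts - treat this as a sketch, not a recipe:

        mkdir -p /opt/hadoop-native-32
        cp libhadoop.so* libsnappy.so* /opt/hadoop-native-32/

        # for the daemons, e.g. in hadoop-env.sh:
        export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/opt/hadoop-native-32"

        # MapReduce task JVMs take their flags from mapred.child.java.opts,
        # so the same -Djava.library.path may need to be added there as well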

    Best,
    James


    --
