FAQ
I am trying to use SequenceFiles with LZO compression outside the
context of a MapReduce application. However, when I try to use the LZO
codec, I get the following errors in the log:

08/09/30 11:09:56 DEBUG conf.Configuration: java.io.IOException: config()
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:157)
at com.rapleaf.formats.stream.TestSequenceFileStreams.setUp(TestSequenceFileStreams.java:22)
at junit.framework.TestCase.runBare(TestCase.java:125)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:81)
at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:36)
at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:421)
at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:912)
at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:766)

08/09/30 11:09:56 DEBUG security.UserGroupInformation: Unix Login: nathan,staff,_lpadmin,com.apple.sharepoint.group.1,_appserveradm,_appserverusr,admin,com.apple.access_ssh
08/09/30 11:09:56 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
08/09/30 11:09:56 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
08/09/30 11:09:56 DEBUG util.NativeCodeLoader: java.library.path=.:/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
08/09/30 11:09:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
08/09/30 11:09:56 ERROR compress.LzoCodec: Cannot load native-lzo without native-hadoop


What is the native hadoop library and how should I configure things to
use it?



Thanks,

Nathan Marz
RapLeaf

  • Arun C Murthy at Sep 30, 2008 at 6:33 pm
    Nathan,

    You need to add libhadoop.so to your java.library.path. libhadoop.so
    is available in the corresponding release in the lib/native directory.
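    For example, a minimal sketch (the platform subdirectory name and the
    application class below are assumptions; list lib/native in your
    release to find the right directory):

```shell
# Add the directory containing libhadoop.so to the JVM's native
# library search path. "Linux-amd64-64" is an assumed platform name;
# check what your release actually ships under lib/native.
java -Djava.library.path=/path/to/hadoop-0.18.1/lib/native/Linux-amd64-64 \
     -cp conf:hadoop-0.18.1-core.jar:build/classes \
     com.example.YourApp   # hypothetical main class
```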

    Arun
  • Doug Cutting at Sep 30, 2008 at 6:47 pm

    Arun C Murthy wrote:
    You need to add libhadoop.so to your java.library.path. libhadoop.so
    is available in the corresponding release in the lib/native directory.
    I think he needs to first build libhadoop.so, since he appears to be
    running on OS X and we only provide Linux builds of this in releases.

    Doug
  • Colin Evans at Sep 30, 2008 at 6:55 pm
    There's a patch to get the native targets to build on Mac OS X:

    http://issues.apache.org/jira/browse/HADOOP-3659

    You probably will need to monkey with LDFLAGS as well to get it to work,
    but we've been able to build the native libs for the Mac without too
    much trouble.


  • Nathan Marz at Sep 30, 2008 at 7:18 pm
    Thanks for the help. I was able to get past my previous issue, but the
    native build is still failing. Here is the end of the log output:

    [exec] then mv -f ".deps/LzoCompressor.Tpo" ".deps/LzoCompressor.Plo"; else rm -f ".deps/LzoCompressor.Tpo"; exit 1; fi
    [exec] mkdir .libs
    [exec] gcc -DHAVE_CONFIG_H -I. -I/Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo -I../../../../../../.. -I/Library/Java/Home//include -I/Users/nathan/Downloads/hadoop-0.18.1/src/native/src -g -Wall -fPIC -O2 -m32 -g -O2 -MT LzoCompressor.lo -MD -MP -MF .deps/LzoCompressor.Tpo -c /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c -fno-common -DPIC -o .libs/LzoCompressor.o
    [exec] /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c: In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
    [exec] /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:135: error: syntax error before ',' token
    [exec] make[2]: *** [LzoCompressor.lo] Error 1
    [exec] make[1]: *** [all-recursive] Error 1
    [exec] make: *** [all] Error 2


    Any ideas?


  • Colin Evans at Sep 30, 2008 at 7:35 pm
    Hi Nathan,
    You probably need to add the Java headers to your build path as well - I
    don't know why the Mac doesn't ship with this as a default setting:

    export CPATH="/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
    export CPPFLAGS="-I/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
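    With those set, a clean native rebuild might look like this (a
    sketch; the -Dcompile.native flag is the one used elsewhere in this
    thread):

```shell
# Point the C toolchain at the JDK headers, then rebuild from scratch.
export CPATH="/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
export CPPFLAGS="-I/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
cd hadoop-0.18.1
ant clean
ant -Dcompile.native=true
```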





  • Nathan Marz at Sep 30, 2008 at 8:53 pm
    Unfortunately, setting those environment variables did not help my
    issue. It appears that the "HADOOP_LZO_LIBRARY" variable is not
    defined in either LzoCompressor.c or LzoDecompressor.c. Where is this
    variable supposed to be set?


  • Colin Evans at Oct 1, 2008 at 12:22 am
    Hi Nathan,
    This is defined in build/native/<architecture>/config.h. It is
    generated by autoconf during the build, and if it is missing or
    incorrect then you probably need to make sure that the LZO libraries and
    headers are in your search paths and then do a clean build.
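    For reference, the relevant generated line looks roughly like this (a
    sketch; autoconf computes the actual value for your platform, so the
    library name below is an assumption, not what your build necessarily
    contains):

```c
/* build/native/<architecture>/config.h -- generated by autoconf */
/* Name of the LZO shared library the codec loads at runtime.
   "liblzo2.so" is an assumed value; on OS X it would be a .dylib. */
#define HADOOP_LZO_LIBRARY "liblzo2.so"
```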

    -Colin


  • Amareshwari Sriramadasu at Oct 1, 2008 at 3:52 am
    Are you seeing HADOOP-2009?

    Thanks
    Amareshwari
  • Nathan Marz at Oct 1, 2008 at 7:55 pm
    Yes, this is exactly what I'm seeing. To be honest, I don't know which
    LZO native library it should be looking for. The LZO install dropped
    "liblzo2.la" and "liblzo2.a" in my /usr/local/lib directory, but not a
    file with a ".so" extension. Hardcoding would be fine as a temporary
    solution, but I don't know what to hardcode.

    Thanks,
    Nathan

  • Arun C Murthy at Oct 1, 2008 at 8:36 pm

    On Oct 1, 2008, at 12:54 PM, Nathan Marz wrote:

    Yes, this is exactly what I'm seeing. To be honest, I don't know
    which LZO native library it should be looking for. The LZO install
    dropped "liblzo2.la" and "liblzo2.a" in my /usr/local/lib directory,
    but not a file with a ".so" extension. Hardcoding would be fine as a
    temporary solution, but I don't know what to hardcode.
    You do need liblzo2.so for this to work.

    The hardcoded value has to be liblzo2.so too ...
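    If the LZO install only produced static archives, rebuilding it with
    shared-library support should yield the missing file (a sketch; the
    version number and install prefix are assumptions):

```shell
# Rebuild LZO so a shared library is installed alongside liblzo2.a.
# On Linux this produces liblzo2.so; on OS X, liblzo2.dylib.
cd lzo-2.03                                   # assumed version
./configure --prefix=/usr/local --enable-shared
make
sudo make install
```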

    Arun
  • Arun C Murthy at Sep 30, 2008 at 7:01 pm

    On Sep 30, 2008, at 11:46 AM, Doug Cutting wrote:

    Arun C Murthy wrote:
    You need to add libhadoop.so to your java.library.path. libhadoop.so
    is available in the corresponding release in the lib/native directory.
    I think he needs to first build libhadoop.so, since he appears to be
    running on OS X and we only provide Linux builds of this in releases.
    Ah, good point.

    Unfortunately the work on getting native libs on Mac OS X stalled... http://issues.apache.org/jira/browse/HADOOP-3659

    Arun
  • Songting Chen at Oct 10, 2008 at 7:32 am
    It seems that I encountered a similar problem:

    Zlib and LZO are installed.

    Running ant -Dcompile.native=true gave the following error.

    [exec] /server/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c: In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
    [exec] /server/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:135: error: expected expression before ',' token

    * I updated HADOOP_LZO_LIBRARY in build/native/<platform>/config.h to "liblzo2.so"
    * However, I don't know how to "hardcode" liblzo2.so ...

    Thanks
    -Songting

    -----Original Message-----
    From: Arun C Murthy
    Sent: Wednesday, October 01, 2008 1:35 PM
    To: core-user@hadoop.apache.org
    Subject: Re: LZO and native hadoop libraries


Discussion Overview
group: common-user
categories: hadoop
posted: Sep 30, '08 at 6:15p
active: Oct 10, '08 at 7:32a
posts: 13
users: 6
website: hadoop.apache.org...
irc: #hadoop
