Hi,

I'm a new Hadoop user, so if this question is blatantly obvious, I
apologize. I'm trying to load a native shared library using the
DistributedCache as outlined in HADOOP-1660:
https://issues.apache.org/jira/browse/HADOOP-1660?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
However, when I call System.load(), I continually get an
"UnsatisfiedLinkError: can't load library" error. I've checked the
java.library.path and LD_LIBRARY_PATH variables, and all seems to be in
order. I've also tried using System.loadLibrary(), but that call doesn't
even appear to find the library.

I have a feeling that I'm not properly creating the symlinks to the
library within the DistributedCache. Could someone who has successfully
loaded a native library using this functionality provide a code snippet
of how this is done? Currently my code for loading the library looks
like this:

DistributedCache.addCacheFile(libPath.toUri(), conf);
DistributedCache.setLocalFiles(conf, "lib.so"); // I'm not really sure if this is necessary
DistributedCache.createSymlink(conf);

and then within the M/R classes:

Path[] path = DistributedCache.getLocalCacheFiles(conf);

System.load(new File("lib.so").getAbsolutePath()); // I've also tried System.load(path[0].toString()) but that didn't work either.

Is this incorrect? Any help would be greatly appreciated.

Thanks,
Mike
--
View this message in context: http://www.nabble.com/Issue-loading-a-native-library-through-the-DistributedCache-tp17800388p17800388.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.


  • Arun C Murthy at Jun 12, 2008 at 4:04 pm

    On Jun 12, 2008, at 6:47 AM, montag wrote:
    [...] I'm trying to load a native shared library using the
    DistributedCache as outlined in HADOOP-1660. [...]
    The DistributedCache will use the 'fragment' of the URI as the name
    of the symlink:
    hdfs://namenode:port/lib.so.1#lib.so

    Thus in the above case you will find:
    lib.so -> lib.so.1

    Then in your main:
    DistributedCache.addCacheFile(new URI("hdfs://namenode:port/lib.so.1#lib.so"), conf);
    DistributedCache.createSymlink(conf);

    In the map/reduce task:
    System.loadLibrary("lib.so");

    Hope that helps...

    Arun
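
    The fragment rule Arun describes can be checked with plain
    java.net.URI; the namenode host, port, and file names below are
    placeholders, not values from this thread:

    ```java
    import java.net.URI;
    import java.net.URISyntaxException;

    public class CacheUriDemo {
        public static void main(String[] args) throws URISyntaxException {
            // The text after '#' is the URI fragment; with createSymlink(conf)
            // set, DistributedCache uses it as the symlink name in the task's
            // working directory. Host, port, and path are placeholders.
            URI cacheUri = new URI("hdfs://namenode:8020/libs/lib.so.1#lib.so");
            System.out.println("file in HDFS: " + cacheUri.getPath());
            System.out.println("symlink name: " + cacheUri.getFragment());
        }
    }
    ```

    So the cached file keeps its real name in HDFS, and only the task-local
    symlink uses the fragment.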
  • Arun C Murthy at Jun 12, 2008 at 4:12 pm
    I assume you are trying to load a JNI-based library (since you refer
    to System.load/System.loadLibrary) ...

    I've opened https://issues.apache.org/jira/browse/HADOOP-3547 to fix
    the documentation with better examples.

    Arun
  • Montag at Jun 12, 2008 at 5:00 pm
    Hi Arun,

    Thanks for your reply! Yes, I'm trying to load a JNI-based library. I
    tried what you suggested, but I'm still receiving an UnsatisfiedLinkError.
    I noticed that if I use System.loadLibrary("lib.so") I get the
    following error:

    java.lang.UnsatisfiedLinkError: no lib.so in java.library.path

    But if I use System.load(new File("lib.so").getAbsolutePath()):

    java.lang.UnsatisfiedLinkError: Can't load library: /local/path/to/lib.so

    Thus, the symlink is working properly, since the call to getAbsolutePath()
    is returning the local directory, but it still isn't loading the library.

    Am I still loading it into the cache wrong? Do I need to point
    conf.setLocalFiles at the localized directory or the directory in HDFS?

    Thanks!
    Mike
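
    One likely reason for the loadLibrary() failure above (my diagnosis, not
    stated in the thread): System.loadLibrary() takes a bare library name and
    decorates it per platform, so passing "lib.so" makes the JVM search
    java.library.path for a doubly-decorated file name.
    System.mapLibraryName() shows the mapping:

    ```java
    public class LibNameDemo {
        public static void main(String[] args) {
            // loadLibrary("foo") searches java.library.path for the decorated
            // name printed below ("libfoo.so" on Linux, "libfoo.dylib" on macOS).
            System.out.println(System.mapLibraryName("foo"));
            // Passing "lib.so" yields a doubly-decorated name (e.g.
            // "liblib.so.so" on Linux), which is why loadLibrary("lib.so")
            // finds nothing.
            System.out.println(System.mapLibraryName("lib.so"));
            // System.load(), by contrast, takes an absolute path and performs
            // no name mapping, so it can target the DistributedCache symlink
            // directly.
        }
    }
    ```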

  • Montag at Jun 12, 2008 at 5:51 pm
    Never mind! The library loads fine. Using:

    DistributedCache.addCacheFile(new URI("path/to/lib.so#lib.so"), conf);

    did the trick. Granted, I'm getting new JNI-based errors now, but I have a
    feeling they are unrelated (if they are, I will post here again).

    Thanks,
    Mike
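
    For later readers, the working pattern from this thread can be sketched
    as follows. This is a reconstruction against the old
    org.apache.hadoop.filecache.DistributedCache API; the HDFS host, port,
    and paths are placeholders, and the rest of the job wiring (Mapper,
    JobConf setup) is omitted:

    ```java
    import java.io.File;
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;

    public class NativeLibCacheSketch {
        // Driver side: register the library with a '#' fragment naming the
        // symlink, and enable symlinks in the task working directory.
        public static void configure(Configuration conf) throws Exception {
            DistributedCache.addCacheFile(
                new URI("hdfs://namenode:8020/libs/lib.so.1#lib.so"), conf);
            DistributedCache.createSymlink(conf);
        }

        // Task side (e.g. in the Mapper's configure method): load through the
        // symlink's absolute path; System.loadLibrary() would re-decorate the
        // name and fail to find it.
        public static void loadNative() {
            System.load(new File("lib.so").getAbsolutePath());
        }
    }
    ```

    This is job-configuration code and needs a running Hadoop cluster, so it
    is a sketch rather than a standalone program.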


  • Chang Hu at Jun 12, 2008 at 9:01 pm
    Out of curiosity: what happens if the slave nodes are running a different
    OS, or are just missing the libraries that the native library needs? Is
    this why using native libraries in Hadoop is not recommended?

    - Chang

    --
    ---------------
    Überstehen ist alles. ("Surviving is everything.")


    Chang Hu
    Ph.D. student
    Computer Science Department
    University of Maryland
  • Allen Wittenauer at Jun 13, 2008 at 4:14 pm

    On 6/12/08 2:00 PM, "Chang Hu" wrote:
    Out of curiosity: what happens if the slave nodes are running a different
    os, or just missing the libraries the native library needs?
    The same thing that happens when you try to run any compiled program on
    a foreign OS: it fails to execute.

Discussion Overview
Group: common-user
Category: hadoop
Posted: Jun 12, '08 at 1:48p
Active: Jun 13, '08 at 4:14p
Posts: 7
Users: 4
Website: hadoop.apache.org...
IRC: #hadoop
