FAQ
Hi, I am trying to run a JNI application in standalone mode and I keep getting
the result below.
I've looked at every website I could find but have not been able to find a
solution.
Please help me figure out what's wrong.

I have two Java files (LangAnal.java and freeparser.java).
LangAnal.java belongs to the package "org.etri.mp2893".
freeparser.java belongs to the package "org.knu.freeparser".

freeparser.java loads "libetri_lmi.so", which is in "/home/qa/lang/dll4jni/",
and I believe "libetri_lmi.so" in turn uses another library, "libparser.so",
which is in "/home/qa/lang/lib".

I have compiled LangAnal.java and freeparser.java. The resulting classes are
placed respectively at "/home/qa/lang/dll4jni/org/etri/mp2893/" and
"/home/qa/lang/dll4jni/org/knu/freeparser/".

Then I archived those classes with

"jar -cvef org.etri.mp2893.LangAnal ./LangAnal.jar ./org/*"

from the directory "/home/qa/lang/dll4jni/".

Then I tried to run

hadoop jar LangAnal.jar ./news.txt ./output

from the same directory, and I get the result shown below.

I tried setting LD_LIBRARY_PATH to
".:/home/qa/lang/dll4jni:/home/qa/lang/lib".
I also tried System.setProperty("java.library.path", "/home/qa/lang/dll4jni");
in freeparser.java.
So far nothing has worked.
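
(As an aside, java.library.path is normally read only once, when the class
loader first loads a native library, so changing it from application code
usually has no effect on System.loadLibrary(). A possible workaround, using the
path from this thread, is to load the file by its absolute name instead;
libparser.so would still have to be resolvable by the OS dynamic linker, e.g.
via LD_LIBRARY_PATH or ldconfig:)

static {
    // Bypasses java.library.path entirely by giving the full file name.
    System.load("/home/qa/lang/dll4jni/libetri_lmi.so");
}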

But the funny thing is, if I run

hadoop -Djava.library.path=/home/qa/lang/dll4jni org.etri.mp2893.LangAnal
./news.txt ./output

from "/home/qa/lang/dll4jni/" it works just fine. So I guess the problem is
just about jar file.
And I intend to run this application in Fully-Distributed mode soon, I have
to figure out how to run jar files.

Please someone help me.

-------------------------------------------------------------------------------------------------------------------

[qa@qa128 dll4jni]$ hadoop jar LangAnal.jar ./news.txt ./output
10/06/01 01:19:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
10/06/01 01:19:39 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
10/06/01 01:19:39 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
10/06/01 01:19:39 INFO input.FileInputFormat: Total input paths to process : 1
10/06/01 01:19:39 INFO mapred.JobClient: Running job: job_local_0001
10/06/01 01:19:39 INFO input.FileInputFormat: Total input paths to process : 1
10/06/01 01:19:39 WARN mapred.LocalJobRunner: job_local_0001
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:569)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
... 3 more
Caused by: java.lang.UnsatisfiedLinkError: no etri_lmi in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1709)
at java.lang.Runtime.loadLibrary0(Runtime.java:823)
at java.lang.System.loadLibrary(System.java:1030)
at org.knu.freeparser.freeparser.<clinit>(freeparser.java:9)
at org.etri.mp2893.LangAnal$LangAnalMapper.<init>(LangAnal.java:26)
... 8 more
10/06/01 01:19:40 INFO mapred.JobClient: map 0% reduce 0%
10/06/01 01:19:40 INFO mapred.JobClient: Job complete: job_local_0001
10/06/01 01:19:40 INFO mapred.JobClient: Counters: 0

  • Alex Kozlov at May 31, 2010 at 7:36 pm
    Try

    1. export JAVA_LIBRARY_PATH=/home/qa/lang/dll4jni/
    2. mkdir lib; mv LangAnal.jar lib; jar -cvef org.etri.mp2893.LangAnal
    ./lib ./org

    In distributed mode you'll have to copy your dynamic libs to some
    directory throughout the cluster and point JAVA_LIBRARY_PATH to it (or you
    can use the distributed cache).

    Let me know if you have other config problems.

    Alex K
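
    A rough sketch of the distributed-cache route mentioned above, assuming the
    .so files have already been copied to HDFS and using the Hadoop 0.20-era
    API (the HDFS paths and the driver class name are illustrative, not from
    this thread):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.mapreduce.Job;

    public class LangAnalDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "LangAnal");
            // Ship the shared objects to every task and symlink them (using
            // the name after '#') into the task's working directory.
            DistributedCache.createSymlink(job.getConfiguration());
            DistributedCache.addCacheFile(
                new URI("/user/qa/lib/libetri_lmi.so#libetri_lmi.so"),
                job.getConfiguration());
            DistributedCache.addCacheFile(
                new URI("/user/qa/lib/libparser.so#libparser.so"),
                job.getConfiguration());
            // The task working directory is on the child JVM's library path,
            // so System.loadLibrary("etri_lmi") in the mapper should find it.
            // ... set mapper class, input/output paths, then
            // job.waitForCompletion(true);
        }
    }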
  • Edward choi at Jun 1, 2010 at 1:20 am
    Alex,

    Thanks for your advice.
    I tried both of them and neither worked.
    I am still getting the UnsatisfiedLinkError when I try to run my application
    in standalone mode.

    I also made LangAnal.java belong to the package "org.knu.freeparser" just
    in case, but still no luck.
    I put libetri_lmi.so in "/usr/lib" and still no luck.

  • Hemanth Yamijala at Jun 1, 2010 at 3:47 am
    Edward,

    If it's an option to copy the libraries to a fixed location on all the
    cluster nodes, you could do that and configure them in the library
    path via mapred.child.java.opts. Please look at http://bit.ly/ab93Z8
    (MapReduce tutorial on Hadoop site) to see how to use this config
    option for setting paths to native libraries.

    Thanks
    Hemanth
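
    A minimal sketch of what Hemanth describes, assuming the libraries have
    been copied to the same directories on every node (this goes in the job
    driver; the -Xmx200m part simply preserves the property's default heap
    setting):

    Configuration conf = new Configuration();
    // Options set here are passed to the child JVMs forked for map and reduce
    // tasks, so the native library directories end up on each task's
    // java.library.path.
    conf.set("mapred.child.java.opts",
        "-Xmx200m -Djava.library.path=/home/qa/lang/dll4jni:/home/qa/lang/lib");
    Job job = new Job(conf, "LangAnal");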
  • Alex Kozlov at Jun 1, 2010 at 3:08 pm
    Hi Edward,

    Can you provide the output of 'ps -aef -u hadoop | grep Child' on a
    tasktracker node while your job is running?

    -- Alex K
  • Edward Choi at Jun 2, 2010 at 1:12 am
    Hi Alex,
    I haven't even succeeded in running jobs in standalone mode yet. I will
    post the result of your "ps -aef" command as soon as I can get my job
    running in fully distributed mode, though. Thanks for the support.

Discussion Overview
group: common-user
categories: hadoop
posted: May 31, '10 at 4:49p
active: Jun 2, '10 at 1:12a
posts: 6
users: 3
website: hadoop.apache.org...
irc: #hadoop
