FAQ
I had difficulty upgrading applications from Hadoop 0.20.2 to Hadoop
0.20.203.0.

Standalone mode runs without problems, but in real cluster mode the
program freezes at map 0% reduce 0%, and there is only one attempt file
in the log directory. The only information is in the stdout file:

#
# An unexpected error has been detected by Java Runtime Environment:
#
# SIGFPE (0x8) at pc=0x00002ae751a87b83, pid=5801, tid=1076017504
#
# Java VM: Java HotSpot(TM) 64-Bit Server VM (1.6.0-b105 mixed mode)
# Problematic frame:
# C [ld-linux-x86-64.so.2+0x7b83]
#
# An error report file with more information is saved as hs_err_pid5801.log
#
# If you would like to submit a bug report, please visit:
# http://java.sun.com/webapps/bugreport/crash.jsp


(stderr and syslog are empty)

So what is the problem in ld-linux-x86-64.so.2+0x7b83?

The program I was testing uses the identity Mapper and Reducer, and the
input is a single 1 MB plain text file. I then found several hs_err
logs under Hadoop's default directory; I attach the log file to
this email.
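The crash header above already carries the two facts that matter for diagnosis: the signal (SIGFPE) and the faulting frame (the dynamic linker, not application code). As an illustration only, a small script along these lines could pull those fields out of an hs_err_pid*.log; the parsing patterns are assumptions based on the header format shown above, not something from the original thread.

```python
import re

def parse_hs_err(text):
    """Extract the signal and the problematic frame from the header
    of a HotSpot fatal-error log (hs_err_pid*.log)."""
    signal = re.search(r"#\s+(SIG\w+)\s+\((0x[0-9a-fA-F]+)\)", text)
    frame = re.search(r"# Problematic frame:\s*\n#\s+(.*)", text)
    return {
        "signal": signal.group(1) if signal else None,
        "code": signal.group(2) if signal else None,
        "frame": frame.group(1).strip() if frame else None,
    }

# Header copied from the crash report in this thread.
header = """\
#
# An unexpected error has been detected by Java Runtime Environment:
#
# SIGFPE (0x8) at pc=0x00002ae751a87b83, pid=5801, tid=1076017504
#
# Java VM: Java HotSpot(TM) 64-Bit Server VM (1.6.0-b105 mixed mode)
# Problematic frame:
# C [ld-linux-x86-64.so.2+0x7b83]
#
"""

info = parse_hs_err(header)
print(info["signal"], info["code"], info["frame"])
```

A frame inside ld-linux-x86-64.so.2 points at the runtime linker rather than user code, which is consistent with the JVM/C-library mismatch diagnosed below.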

The reason I upgraded from 0.20.2 is that I hit lots of disk check errors
when processing TB-scale data even though the disks still had plenty of
space. But then I was stuck getting a simple toy problem to work in 0.20.203.0.

Shi


  • Edward Capriolo at Jul 1, 2011 at 11:42 pm
    That looks like an ancient version of Java. Get 1.6.0_u24 or _25 from Oracle.

    Upgrade to a recent Java and possibly update your C libs.

    Edward
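On a 0.20.x cluster, Edward's advice amounts to pointing JAVA_HOME in conf/hadoop-env.sh at the newer JDK on every node and restarting the daemons. A minimal sketch; the JDK install path below is an example, not something stated in the thread.

```shell
# Check which Java the nodes are actually running
# (1.6.0-b105 is the ancient build from the crash header)
java -version

# In conf/hadoop-env.sh, point Hadoop at the newer JDK
# (path is an assumed example; adjust per node):
export JAVA_HOME=/usr/java/jdk1.6.0_26

# Restart the daemons so the TaskTrackers pick up the new JVM
bin/stop-all.sh && bin/start-all.sh
```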

  • Shi Yu at Jul 2, 2011 at 12:09 am
    Thanks Edward! I upgraded to 1.6.0_26 and it worked.

Discussion Overview
group: common-user @ hadoop.apache.org...
categories: hadoop
posted: Jul 1, '11 at 11:24p
active: Jul 2, '11 at 12:09a
posts: 3
users: 2 (Shi Yu: 2 posts, Edward Capriolo: 1 post)
irc: #hadoop
