FAQ
I have a simple map-reduce program which runs fine under Eclipse. However,
when I execute it with Hadoop, it gives me an out-of-memory error.
HADOOP_HEAPSIZE is 2000 MB.

Not sure what the problem is.


  • Yin Lou at Oct 19, 2010 at 4:28 pm
    Hi,

    You can increase the per-task heap size with -D mapred.child.java.opts="-d64 -Xmx4096m" (see the sketch below).

    Hope it helps.
    Yin
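    For what it's worth, a minimal sketch of how that would look at job
    submission (my-job.jar, MyDriver, and the input/output paths are
    placeholders, and the -D generic option is only picked up if the driver
    runs through ToolRunner/GenericOptionsParser; -d64 requests a 64-bit JVM):

        hadoop jar my-job.jar MyDriver \
            -D mapred.child.java.opts="-d64 -Xmx4096m" \
            /input/path /output/path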
  • Shrijeet Paliwal at Oct 19, 2010 at 5:07 pm
    Where exactly is it failing? Are the map/reduce tasks failing, or is it something else?
  • Juwei Shi at Oct 20, 2010 at 2:08 am
    You should increase the heap size of the child JVMs that run your map and
    reduce tasks, not the heap of the daemon processes such as the job tracker
    and task tracker. By default, Hadoop allocates 1000 MB of memory to each
    daemon it runs; this is controlled by the HADOOP_HEAPSIZE setting in
    hadoop-env.sh. Note that this value does not apply to the child JVMs that
    run map and reduce tasks.

    The memory given to each of these child JVMs can be changed by setting the
    mapred.child.java.opts property. The default setting is -Xmx200m, which
    gives each task only 200 MB of memory; see the sketch below.

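    A minimal sketch of the two settings side by side (the 512 MB value and
    the conf/ file locations are illustrative; mapred.child.java.opts can also
    be passed per job as in the earlier reply):

        # conf/hadoop-env.sh -- heap for the Hadoop daemons
        # (NameNode, JobTracker, TaskTracker, ...), not for tasks:
        export HADOOP_HEAPSIZE=2000

        <!-- conf/mapred-site.xml -- heap for each child task JVM -->
        <property>
          <name>mapred.child.java.opts</name>
          <value>-Xmx512m</value>
        </property>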


Discussion Overview
group: mapreduce-user (hadoop)
posted: Oct 19, 2010 at 4:03 pm
active: Oct 20, 2010 at 2:08 am
posts: 4
users: 4
website: hadoop.apache.org...
irc: #hadoop
