FAQ
hi,
One of my programs creates a huge Python dictionary, and the reducers fail with a
MemoryError every time.

Is there a way to set the reducer memory to a bigger value so the reducers
can succeed?

I know we should not have this requirement in the first place and should not
create this kind of dictionary, but can I still finish this job by giving it
more memory in the jar command?


Thanks,
JJ

  • Harsh J at Jul 29, 2012 at 8:24 am
    Hi,

    You may raise the heap size via mapred.child.java.opts (or
    mapred.reduce.child.java.opts for reducers alone), and further raise
    the virtual-memory limit via mapred.child.ulimit (try setting it to
    2x or 3x the heap size, in KB, or higher). I think it's the latter
    you're running out of, since there's a subprocess involved.

    Let us know if that helps.
    On Sun, Jul 29, 2012 at 1:47 PM, Mapred Learn wrote:

    --
    Harsh J
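As a concrete illustration of the properties named above, a streaming job might pass them at submit time like this. This is a sketch only: the jar path, HDFS paths, and script names are placeholders, and the values assume a 2 GB reducer heap with the ulimit set to 2x the heap, in KB, as suggested.

```shell
# Sketch: jar path, HDFS paths, and script names are placeholders.
# Generic -D options must precede the streaming-specific options.
hadoop jar hadoop-streaming.jar \
  -D mapred.reduce.child.java.opts=-Xmx2048m \
  -D mapred.child.ulimit=4194304 \
  -input /user/jj/in \
  -output /user/jj/out \
  -mapper mapper.py \
  -reducer reducer.py \
  -file mapper.py -file reducer.py
```

Note that mapred.child.ulimit bounds the total virtual memory of the child and any subprocess it forks (such as a streaming Python script), which is why it may need to be well above the Java heap alone.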
  • Mapred Learn at Jul 29, 2012 at 8:25 am
    Hi Harsh,
    I tried all of these, but it still fails.


    Sent from my iPhone
    On Jul 29, 2012, at 1:23 AM, Harsh J wrote:

  • Mapred Learn at Jul 29, 2012 at 8:24 am
    + CDH users

    Sent from my iPhone
    On Jul 29, 2012, at 1:17 AM, Mapred Learn wrote:

  • Harsh J at Jul 29, 2012 at 6:46 pm
    ML,

    Up to what mapred.child.ulimit values have you tried submitting with?
    How large of a dict do you build in your program?
    On Sun, Jul 29, 2012 at 1:54 PM, Mapred Learn wrote:

    --
    Harsh J

Discussion Overview
group: mapreduce-user @
categories: hadoop
posted: Jul 29, '12 at 8:17a
active: Jul 29, '12 at 6:46p
posts: 5
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion

Mapred Learn: 3 posts Harsh J: 2 posts


site design / logo © 2022 Grokbase