Hello,

Does anyone know how to increase heap allocation for Hadoop QA runs, or at
least check the available amount of memory?

Thanks,
--Mikhail

  • Ted Yu at Feb 10, 2012 at 8:51 pm
    This should do:

    Index: pom.xml
    ===================================================================
    --- pom.xml (revision 1242915)
    +++ pom.xml (working copy)
    @@ -350,7 +350,7 @@
       <configuration>
         <forkedProcessTimeoutInSeconds>900</forkedProcessTimeoutInSeconds>
    -    <argLine>-enableassertions -Xmx1900m -Djava.security.egd=file:/dev/./urandom</argLine>
    +    <argLine>-d32 -enableassertions -Xmx2300m -Djava.security.egd=file:/dev/./urandom</argLine>
         <redirectTestOutputToFile>true</redirectTestOutputToFile>
       </configuration>
     </plugin>
  • Mikhail Bautin at Feb 10, 2012 at 9:06 pm
    @Ted: thanks for the suggestion.

    Maybe I should have worded my question differently. I am interested in the
    actual amount of memory available on Hadoop QA machines, because I see
    out-of-memory errors in native memory allocation (not part of Java heap)
    that only happen in Hadoop QA.

    Perhaps we should define a "reference configuration" for the HBase test
    suite, e.g. do we expect all unit tests to pass on a 2 GB box, a 4 GB box,
    etc.?

    Thanks,
    --Mikhail
  • Ted Yu at Feb 10, 2012 at 9:16 pm
    Mikhail:
    Would this help:
    http://stackoverflow.com/questions/6878883/how-do-i-determine-maxdirectmemorysize-on-a-running-jvm

    I tried to set -XX:MaxDirectMemorySize. According to
    http://stackoverflow.com/questions/3773775/default-for-xxmaxdirectmemorysize,
    the default is 64 MB.

    But even when I set -XX:MaxDirectMemorySize=64m, I got the following on my
    MacBook:

    Error occurred during initialization of VM
    Could not reserve enough space for object heap
    Could not create the Java virtual machine.

    So some expert advice is needed :-)
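
    A minimal sketch of the approach the first link describes -- querying the
    MaxDirectMemorySize flag from inside the running JVM. This assumes a
    HotSpot JVM (the diagnostic MXBean is HotSpot-specific) and is not code
    from this thread:

    import java.lang.management.ManagementFactory;
    import com.sun.management.HotSpotDiagnosticMXBean;

    public class DirectMemoryCheck {
      public static void main(String[] args) throws Exception {
        // HotSpot-specific bean, registered under this well-known ObjectName.
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
            ManagementFactory.getPlatformMBeanServer(),
            "com.sun.management:type=HotSpotDiagnostic",
            HotSpotDiagnosticMXBean.class);
        // "0" means the flag was not set on the command line and the JVM
        // falls back to its built-in default.
        System.out.println("MaxDirectMemorySize = "
            + bean.getVMOption("MaxDirectMemorySize").getValue());
        // Heap ceiling (-Xmx) for comparison.
        System.out.println("maxMemory = " + Runtime.getRuntime().maxMemory());
      }
    }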
  • N Keywal at Feb 10, 2012 at 9:43 pm
    Hi,

    If you want to check the resources available during test execution, you
    can enhance org.apache.hadoop.hbase.ResourceChecker and log a message if
    something looks wrong. There's a UnixOperatingSystemMXBean from which you
    can read metrics such as free physical memory and open file descriptor
    counts. This rule is executed before and after each test method; a sketch
    follows below.

    Cheers,

    N.
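
    An illustrative sketch of the kind of check being described (the actual
    hook points in org.apache.hadoop.hbase.ResourceChecker differ). The cast
    assumes a Unix platform and a Sun/Oracle JVM:

    import java.lang.management.ManagementFactory;
    import com.sun.management.UnixOperatingSystemMXBean;

    public class ResourceLogger {
      public static void logResources(String phase) {
        // On non-Unix platforms or non-Sun JVMs this cast fails.
        UnixOperatingSystemMXBean os = (UnixOperatingSystemMXBean)
            ManagementFactory.getOperatingSystemMXBean();
        System.out.println(phase
            + ": freePhysicalMemory=" + os.getFreePhysicalMemorySize()
            + " openFds=" + os.getOpenFileDescriptorCount()
            + "/" + os.getMaxFileDescriptorCount()
            + " heapMax=" + Runtime.getRuntime().maxMemory());
      }
    }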
  • Ted Yu at Feb 11, 2012 at 4:51 am
    I found that it was -Xmx2300m that caused the JVM error, not
    -XX:MaxDirectMemorySize=200m (with -d32, the 32-bit process address space
    cannot accommodate a 2300m heap).

    The following setting allows the unit tests to run on a MacBook:
    -d32 -XX:MaxDirectMemorySize=200m -enableassertions -Xmx1900m

    FYI
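
    For background on why direct-memory failures are separate from the heap
    setting: buffers from ByteBuffer.allocateDirect live outside the Java heap
    and are capped by -XX:MaxDirectMemorySize, not -Xmx. An illustrative
    snippet (not from TestHFileBlock) that triggers
    "java.lang.OutOfMemoryError: Direct buffer memory" when run with e.g.
    -XX:MaxDirectMemorySize=200m, regardless of how large -Xmx is:

    import java.nio.ByteBuffer;
    import java.util.ArrayList;
    import java.util.List;

    public class DirectOom {
      public static void main(String[] args) {
        List<ByteBuffer> pin = new ArrayList<ByteBuffer>(); // keep buffers reachable
        while (true) {
          // Each allocation charges 64 MB against the direct-memory limit,
          // so the loop fails once that limit is exhausted.
          pin.add(ByteBuffer.allocateDirect(64 * 1024 * 1024));
        }
      }
    }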
  • Stack at Feb 11, 2012 at 4:53 am

    I committed a patch which sets hadoopqa to run with a 3g heap. Should I
    have set maxdirectmemorysize too?
    St.Ack
  • Ted Yu at Feb 11, 2012 at 5:00 am
    That may not be necessary. Mikhail has found the cause of the
    out-of-memory error in TestHFileBlock.

    Cheers
