Rob Randall at Dec 8, 2010 at 2:44 pm
I am trying to understand how much memory is available to a 64 bit python
process running under Windows XP 64 bit.

When I run tests just creating a series of large dictionaries containing
string keys and float values I do not seem to be able to grow the process
beyond the amount of RAM present.

For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
around 2GB.

On another machine with 16GB RAM and 24GB pagefile the process stalls at
16GB.
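
Roughly, the test amounts to something like this minimal sketch (the key names
and chunk size here are illustrative, not the exact script):

    import itertools

    dicts = []
    for chunk in itertools.count():
        # One million string-keyed float entries per dictionary -- well over
        # 100MB of CPython objects each time around the loop. The loop runs
        # until the process can no longer get memory.
        d = dict(("key-%d-%d" % (chunk, i), float(i)) for i in range(1000000))
        dicts.append(d)
        print("allocated chunk %d" % (chunk + 1))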

In other tests where a C++ program loads and runs the python DLL, if C++
based operations are performed the memory usage will grow to 40GB, but if
python is used to grab the memory it can still only grow to 16GB. With this
program if the memory usage is grown to over 16GB on the C++ side,
attempting to grab any from python crashes the process.

I was under the impression that python could grab as much memory as other
programs.

Can anyone tell me what is happening or where I may be going wrong?

Thanks,
Rob Randall


  • Ian Kelly at Dec 9, 2010 at 7:40 am

    On 12/8/2010 11:42 PM, Dennis Lee Bieber wrote:
    The page file can be larger than physical memory because it contains
    memory "images" for multiple processes. However, all those "images" have
    to map into the physically addressable memory -- so a process is likely
    limited to physical memory, but you can have multiple processes adding
    up to physical + pagefile in total.
    Only those pages that are currently paged in need be mapped to physical
    memory. The rest are not mapped to anything at all (other than a
    location in the page file) -- once a page is paged out, it need not be
    put back in its original page frame when it is paged in again.

    Since a process need not have all its pages in physical memory
    simultaneously, there is no reason to suppose that a single process
    could not consume the entirety of the available virtual memory (minus
    what is used by the operating system) on a 64-bit system (the same
    cannot be said of a 32-bit system, where the total virtual memory
    available may well be larger than the addressable space).
  • Heather Brown at Dec 9, 2010 at 1:18 pm

    On 12/8/2010, Dennis Lee Bieber wrote:
    On Wed, 8 Dec 2010 14:44:30 +0000, Rob Randall<rob.randall2 at gmail.com>
    declaimed the following in gmane.comp.python.general:
    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.

    For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
    around 2GB.

    On another machine with 16GB RAM and 24GB pagefile the process stalls at
    16GB.
    Probably need to ask M$, but I can understand this behavior as a
    hypothetical...

    The page file can be larger than physical memory because it contains
    memory "images" for multiple processes. However, all those "images" have
    to map into the physically addressable memory -- so a process is likely
    limited to physical memory, but you can have multiple processes adding
    up to physical + pagefile in total.
    It's plausible that MS did that, but it's not reasonable. An
    application's entire data space is never in physical memory, except for
    trivial applications. When new pages are needed, old ones are swapped
    out, in an LRU manner. If the application is the only thing "running,"
    it'll eventually be mapped into most of physical memory, but even then,
    the swapper keeps some back.

    The limit in the 32-bit world was 4GB, not whatever RAM happened to be in
    the machine. That limit came from the address space (or linear space, as MS
    calls it), not from the amount of RAM. It's only in recent years that those
    numbers have tended to be close.

    DaveA
  • Nobody at Dec 9, 2010 at 4:17 pm

    Rob Randall wrote:

    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.

    For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
    around 2GB.
    What do you mean by "stalls"? Do you get an exception, or does the program
    just slow to a crawl?
  • Antoine Pitrou at Dec 9, 2010 at 4:54 pm

    On Wed, 8 Dec 2010 14:44:30 +0000 Rob Randall wrote:
    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.

    For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
    around 2GB.

    On another machine with 16GB RAM and 24GB pagefile the process stalls at
    16GB.
    How is it surprising? When you go past the available RAM, your process
    starts swapping and everything becomes incredibly slower.

    Regards

    Antoine.
  • Rob Randall at Dec 9, 2010 at 5:18 pm
    But the C++ program using up memory does not slow up.
    It has gone to 40GB without much trouble.

    Does anyone have a 64 bit python application that uses more than 2GB?
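
    As a sanity check, one quick way to confirm an interpreter really is a
    64-bit build (a 32-bit build is capped at a 2-4GB address space no matter
    how much RAM the machine has) is something like:

        import struct, sys

        # 8 bytes per pointer means a 64-bit build; 4 bytes means 32-bit.
        print("%d-bit Python %s" % (struct.calcsize("P") * 8, sys.version.split()[0]))
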
    On 9 December 2010 16:54, Antoine Pitrou wrote:

    On Wed, 8 Dec 2010 14:44:30 +0000
    Rob Randall wrote:
    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.

    For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
    around 2GB.

    On another machine with 16GB RAM and 24GB pagefile the process stalls at
    16GB.
    How is it surprising? When you go past the available RAM, your process
    starts swapping and everything becomes incredibly slower.

    Regards

    Antoine.


  • Benjamin Kaplan at Dec 9, 2010 at 5:29 pm

    On Thursday, December 9, 2010, Rob Randall wrote:
    But the C++ program using up memory does not slow up.
    It has gone to 40GB without much trouble.
    Your C++ program probably doesn't have a garbage collector traversing
    the entire allocated memory looking for reference cycles.
    Does anyone have a 64 bit python application that uses more than 2GB?

    On 9 December 2010 16:54, Antoine Pitrou wrote:
    On Wed, 8 Dec 2010 14:44:30 +0000
    Rob Randall wrote:
    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.

    For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
    around 2GB.

    On another machine with 16GB RAM and 24GB pagefile the process stalls at
    16GB.
    How is it surprising? When you go past the available RAM, your process
    starts swapping and everything becomes incredibly slower.

    Regards

    Antoine.


  • Rob Randall at Dec 9, 2010 at 5:56 pm
    I will give it a try with the garbage collector disabled.
    On 9 December 2010 17:29, Benjamin Kaplan wrote:
    On Thursday, December 9, 2010, Rob Randall wrote:
    But the C++ program using up memory does not slow up.
    It has gone to 40GB without much trouble.
    Your C++ program probably doesn't have a garbage collector traversing
    the entire allocated memory looking for reference cycles.
    Does anyone have a 64 bit python application that uses more than 2GB?

    On 9 December 2010 16:54, Antoine Pitrou wrote:
    On Wed, 8 Dec 2010 14:44:30 +0000
    Rob Randall wrote:
    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.

    For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
    around 2GB.

    On another machine with 16GB RAM and 24GB pagefile the process stalls at
    16GB.
    How is it surprising? When you go past the available RAM, your process
    starts swapping and everything becomes incredibly slower.

    Regards

    Antoine.


  • Rob Randall at Dec 9, 2010 at 5:18 pm
    Basically the process runs at around 1% CPU and never seems to grow in size
    again.
    When running the C++-with-Python app, the process slows when a new 'page' is
    required but then goes back to 'full' speed. It does this until basically
    all the virtual memory is used.

    I have had memory exceptions when running the same sort of stuff on 32 bit,
    but never 64 bit.
    On 9 December 2010 16:54, Antoine Pitrou wrote:

    On Wed, 8 Dec 2010 14:44:30 +0000
    Rob Randall wrote:
    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.

    For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at
    around 2GB.

    On another machine with 16GB RAM and 24GB pagefile the process stalls at
    16GB.
    How is it surprising? When you go past the available RAM, your process
    starts swapping and everything becomes incredibly slower.

    Regards

    Antoine.


  • Antoine Pitrou at Dec 9, 2010 at 8:01 pm

    On Thu, 9 Dec 2010 17:18:58 +0000 Rob Randall wrote:
    Basically the process runs at around 1% and it never seems to grow in size
    again.
    When running the C++ with python app the process slows when a new 'page' is
    required but then goes back to 'full' speed. It does this until basically
    all the virtual memory is used.
    Intuitively, Benjamin Kaplan had the right answer: Python will
    periodically walk the heap of objects to look for dead reference cycles
    to collect; if your working set is larger than the available RAM, then
    this will thrash the pagefile to death.

    So try gc.disable() before doing your tests.
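
    A minimal sketch of what that looks like; run_test() here just stands in
    for whatever builds the large dictionaries, and note that gc.disable()
    only switches off the cyclic collector, so reference counting still frees
    objects as usual:

        import gc

        def run_test():
            # placeholder for the real allocation-heavy code
            return [dict((("k%d" % i), float(i)) for i in range(100000))
                    for _ in range(10)]

        gc.disable()        # stop the periodic cycle-detection passes
        try:
            result = run_test()
        finally:
            gc.enable()
            gc.collect()    # sweep up any cycles created while the collector was off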

    I would stress that, of course, you will still have performance problems
    as soon as you start using all those areas you are allocating. And if
    you don't use them, I guess there's no point allocating them either. So
    I don't know what exactly you're trying to do (is this an actual
    application? or just some random test you're doing?), but relying on
    the pagefile to have more available memory than the system RAM is a
    very bad idea IMO.

    Regards

    Antoine.
  • John Nagle at Dec 9, 2010 at 5:23 pm

    On 12/8/2010 11:40 PM, Ian Kelly wrote:
    Since a process need not have all its pages in physical memory
    simultaneously, there is no reason to suppose that a single process
    could not consume the entirety of the available virtual memory (minus
    what is used by the operating system) on a 64-bit system (the same
    cannot be said of a 32-bit system, where the total virtual memory
    available may well be larger than the addressable space).
    Actually, the "32-bit" x86 machines since the Pentium Pro
    are really 36 to 48-bit machines. They only offer 32-bit flat address
    spaces to user programs, but the MMU and memory interface support a
    larger address space. The page table design supports 64-bit
    physical memory, but most of the bits beyond 36 usually aren't
    implemented. Linux fully supported this; Windows tried, but
    older drivers were a problem. That's why there are 32 bit
    machines with more than 4GB of RAM.

    None of the real 64-bit architectures, from AMD64 to SPARC
    to Itanium, need this hack.

    John Nagle
  • John Nagle at Dec 9, 2010 at 10:44 pm

    On 12/8/2010 10:42 PM, Dennis Lee Bieber wrote:
    On Wed, 8 Dec 2010 14:44:30 +0000, Rob Randall<rob.randall2 at gmail.com>
    declaimed the following in gmane.comp.python.general:
    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.
    If you get to the point where you need multi-gigabyte Python
    dictionaries, you may be using the wrong tool for the job.
    If it's simply that you need to manage a large amount of data,
    that's what databases are for.
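
    For instance, a rough sketch using the standard-library sqlite3 module to
    keep string-keyed float values on disk instead of in one giant dict (the
    file, table and key names below are made up for illustration):

        import sqlite3

        conn = sqlite3.connect("stress_results.db")   # illustrative file name
        conn.execute("CREATE TABLE IF NOT EXISTS data (key TEXT PRIMARY KEY, value REAL)")

        def put(key, value):
            conn.execute("INSERT OR REPLACE INTO data VALUES (?, ?)", (key, value))

        def get(key):
            row = conn.execute("SELECT value FROM data WHERE key = ?", (key,)).fetchone()
            return row[0] if row is not None else None

        put("load_case_17/max_stress", 1.25)
        print(get("load_case_17/max_stress"))
        conn.commit()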

    If this is some super high performance application that needs to
    keep a big database in memory for performance reasons, CPython
    is probably too slow. For that, something like Google's BigTable
    may be more appropriate, and will scale to terabytes if necessary.

    John Nagle
  • Rob Randall at Dec 10, 2010 at 1:54 pm
    You guys are right. If I disable the gc it will use all the virtual RAM in
    my test.

    The application I have been running these tests for is a port of a program
    written in a LISP-based tool running on Unix.
    It does a mass of stress calculations.

    The port has been written using a python-based toolkit I am responsible for.
    This toolkit offers much of the same functionality as the LISP tool.
    It is based around the use of demand-driven/declarative programming.

    When the porting project started no one realised just how much memory the
    heaviest of the test cases used.
    It uses 40+ GB on an HP Unix machine.

    It is easy to see now that the port should have been written differently,
    but it is essentially complete now.

    This has led me to see if a hardware solution can be found using 64 bit
    Windows machines.

    I will try running one of the tests next to see what impact disabling the gc
    will have.

    Thanks,
    Rob.

    On 9 December 2010 22:44, John Nagle wrote:
    On 12/8/2010 10:42 PM, Dennis Lee Bieber wrote:

    On Wed, 8 Dec 2010 14:44:30 +0000, Rob Randall<rob.randall2 at gmail.com>
    declaimed the following in gmane.comp.python.general:

    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.
    If you get to the point where you need multi-gigabyte Python
    dictionaries, you may be using the wrong tool for the job.
    If it's simply that you need to manage a large amount of data,
    that's what databases are for.

    If this is some super high performance application that needs to keep a
    big database in memory for performance reasons, CPython
    is probably too slow. For that, something like Google's BigTable
    may be more appropriate, and will scale to terabytes if necessary.

    John Nagle

  • Rob Randall at Dec 10, 2010 at 7:03 pm
    I managed to get my python app past 3GB on a smaller 64 bit machine.
    On a test to check memory usage, running with the gc disabled used only an
    extra 6MB: 1693MB versus 1687MB.

    This is great.

    Thanks again for the help.

    On 10 December 2010 13:54, Rob Randall wrote:

    You guys are right. If I disable the gc it will use all the virtual RAM in
    my test.

    The application I have been running these tests for is a port of a program
    written in a LISP-based tool running on Unix.
    It does a mass of stress calculations.

    The port has been written using a python-based toolkit I am responsible
    for. This toolkit offers much of the same functionality as the LISP tool.
    It is based around the use of demand-driven/declarative programming.

    When the porting project started no one realised just how much memory the
    heaviest of the test cases used.
    It uses 40+ GB on an HP Unix machine.

    It is easy to see now that the port should have been written differently,
    but it is essentially complete now.

    This has led me to see if a hardware solution can be found using 64 bit
    Windows machines.

    I will try running one of the tests next to see what impact disabling the gc
    will have.

    Thanks,
    Rob.


    On 9 December 2010 22:44, John Nagle wrote:
    On 12/8/2010 10:42 PM, Dennis Lee Bieber wrote:

    On Wed, 8 Dec 2010 14:44:30 +0000, Rob Randall<rob.randall2 at gmail.com>
    declaimed the following in gmane.comp.python.general:

    I am trying to understand how much memory is available to a 64 bit python
    process running under Windows XP 64 bit.

    When I run tests just creating a series of large dictionaries containing
    string keys and float values I do not seem to be able to grow the process
    beyond the amount of RAM present.
    If you get to the point where you need multi-gigabyte Python
    dictionaries, you may be using the wrong tool for the job.
    If it's simply that you need to manage a large amount of data,
    that's what databases are for.

    If this is some super high performance application that needs to keep a
    big database in memory for performance reasons, CPython
    is probably too slow. For that, something like Google's BigTable
    may be more appropriate, and will scale to terabytes if necessary.

    John Nagle

  • Steve Holden at Dec 11, 2010 at 2:32 pm

    On 12/10/2010 2:03 PM, Rob Randall wrote:
    I managed to get my python app past 3GB on a smaller 64 bit machine.
    On a test to check memory usage with gc disabled only an extra 6MB was
    used.
    The figures were 1693MB to 1687MB.

    This is great.

    Thanks again for the help.
    Do remember, though, that with the GC turned off you will "lose" memory
    if you accidentally create cyclic data structures, since they will never
    be reclaimed. It doesn't sound like this is an issue, but I wanted this
    to act as a warning to others who might come across your solution but
    have programmed less carefully.
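
    A small self-contained illustration of that failure mode (not from this
    thread): with the collector off, objects that only reference each other
    are never freed until a manual collection runs:

        import gc

        class Node(object):
            pass

        gc.disable()
        for _ in range(100000):
            a, b = Node(), Node()
            a.other, b.other = b, a   # each pair forms a reference cycle

        # The cycles above are unreachable but still occupy memory, because only
        # the (now disabled) cyclic collector can reclaim them.
        print(gc.collect())           # a manual collection still works; it returns the number found
        gc.enable()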

    regards
    Steve
    --
    Steve Holden +1 571 484 6266 +1 800 494 3119
    PyCon 2011 Atlanta March 9-17 http://us.pycon.org/
    See Python Video! http://python.mirocommunity.org/
    Holden Web LLC http://www.holdenweb.com/
