Allocating more than 2 GB of memory

Discussion in 'Apple' started by Mats Weber, Sep 7, 2004.

  1. Mats Weber

    Mats Weber Guest

    I need to allocate more than 2 GB of memory in a C++ program with no
    user interface (no Carbon or Cocoa).

    When I try to cross the 2 GB barrier, I get the following messages:

    *** malloc: vm_allocate(size=1069056) failed (error code=3)
    *** malloc[15141]: error: Can't allocate region
    *** malloc: vm_allocate(size=1069056) failed (error code=3)
    *** malloc[15141]: error: Can't allocate region

    and the program stops.

    This is on a dual G5 with 4 GB of RAM and enough hard disk space,
    running Mac OS X 10.3.5. I checked ulimit; there is no limit.
     
    Mats Weber, Sep 7, 2004
    #1

  2. In article <>,
    Mats Weber <> wrote:

    > I need to allocate more than 2 GB of memory in a C++ program with no
    > user interface (no Carbon or Cocoa).
    >
    > When I try to cross the 2 GB barrier, I get the following messages:
    >
    > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    > *** malloc[15141]: error: Can't allocate region
    > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    > *** malloc[15141]: error: Can't allocate region
    >
    > and the program stops.
    >
    > This is on a dual G5 with 4 GB of RAM and enough hard disk space,
    > running Mac OS X 10.3.5. I checked ulimit; there is no limit.


    That's a documented limitation in OS X. I can't remember if it's known
    that this will change in 10.4.

    --
    Standard output is like your butt. Everyone has one. When using a bathroom,
    they all default to going into a toilet. However, a person can redirect his
    "standard output" to somewhere else, if he so chooses. - Jeremy Nixon
     
    Gregory Weston, Sep 7, 2004
    #2

  3. Eric Albert

    Eric Albert Guest

    In article <>,
    Gregory Weston <> wrote:

    > In article <>,
    > Mats Weber <> wrote:
    >
    > > I need to allocate more than 2 GB of memory in a C++ program with no
    > > user interface (no Carbon or Cocoa).
    > >
    > > When I try to cross the 2 GB barrier, I get the following messages:
    > >
    > > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    > > *** malloc[15141]: error: Can't allocate region
    > > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    > > *** malloc[15141]: error: Can't allocate region
    > >
    > > and the program stops.
    > >
    > > This is on a dual G5 with 4 GB of RAM and enough hard disk space,
    > > running Mac OS X 10.3.5. I checked ulimit; there is no limit.

    >
    > That's a documented limitation in OS X. I can't remember if it's known
    > that this will change in 10.4.


    I don't think it's strictly a 2 GB limit -- rather, it has to do with
    how things are laid out in memory and a very carefully designed program
    may be able to allocate a little bit more.

    That said, you can access more memory on a G5 today with certain messy
    techniques involving something like mapping and unmapping large blocks
    of memory. Check the archives of Apple's darwin-development mailing
    list or ask DTS for details.
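
    A minimal sketch of that kind of mapping/unmapping approach (not the
    exact technique from the darwin-development archives, just an
    illustration of asking the VM system for a large anonymous region
    directly instead of going through malloc):

    #include <sys/mman.h>
    #include <cstddef>
    #include <cstdio>

    int main()
    {
        // Reserve a 1 GB anonymous region straight from the VM system.
        const std::size_t oneGB = 1024UL * 1024UL * 1024UL;

        void* block = mmap(0, oneGB, PROT_READ | PROT_WRITE,
                           MAP_ANON | MAP_PRIVATE, -1, 0);
        if (block == MAP_FAILED)
        {
            std::perror("mmap");
            return 1;
        }

        // ... use the block ...

        // Unmapping gives the address range back, so it can be reused
        // for a different large block later.
        munmap(block, oneGB);
        return 0;
    }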

    -Eric

    --
    Eric Albert
    http://rescomp.stanford.edu/~ejalbert/
     
    Eric Albert, Sep 7, 2004
    #3
  4. In article <ejalbert-866509.10272407092004@localhost>,
    Eric Albert <> wrote:

    > In article <>,
    > Gregory Weston <> wrote:
    >
    > > In article <>,
    > > Mats Weber <> wrote:
    > >
    > > > I need to allocate more than 2 GB of memory in a C++ program with no
    > > > user interface (no Carbon or Cocoa).
    > > >
    > > > When I try to cross the 2 GB barrier, I get the following messages:
    > > >
    > > > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    > > > *** malloc[15141]: error: Can't allocate region
    > > > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    > > > *** malloc[15141]: error: Can't allocate region
    > > >
    > > > and the program stops.
    > > >
    > > > This is on a dual G5 with 4 GB of RAM and enough hard disk space,
    > > > running Mac OS X 10.3.5. I checked ulimit; there is no limit.

    > >
    > > That's a documented limitation in OS X. I can't remember if it's known
    > > that this will change in 10.4.

    >
    > I don't think it's strictly a 2 GB limit -- rather, it has to do with
    > how things are laid out in memory and a very carefully designed program
    > may be able to allocate a little bit more.
    >
    > That said, you can access more memory on a G5 today with certain messy
    > techniques involving something like mapping and unmapping large blocks
    > of memory. Check the archives of Apple's darwin-development mailing
    > list or ask DTS for details.
    >
    > -Eric


    Sorry. Wetware memory failure. OS X has a limit of _four_ GB per
    process. I'm not sure where I lost a bit.

    --
    Standard output is like your butt. Everyone has one. When using a bathroom,
    they all default to going into a toilet. However, a person can redirect his
    "standard output" to somewhere else, if he so chooses. - Jeremy Nixon
     
    Gregory Weston, Sep 7, 2004
    #4
  5. Dave Seaman

    Dave Seaman Guest

    On Tue, 07 Sep 2004 21:43:40 GMT, Gregory Weston wrote:
    > In article <ejalbert-866509.10272407092004@localhost>,
    > Eric Albert <> wrote:


    >> In article <>,
    >> Gregory Weston <> wrote:
    >>
    >> > In article <>,
    >> > Mats Weber <> wrote:
    >> >
    >> > > I need to allocate more than 2 GB of memory in a C++ program with no
    >> > > user interface (no Carbon or Cocoa).
    >> > >
    >> > > When I try to cross the 2 GB barrier, I get the following messages:
    >> > >
    >> > > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    >> > > *** malloc[15141]: error: Can't allocate region
    >> > > *** malloc: vm_allocate(size=1069056) failed (error code=3)
    >> > > *** malloc[15141]: error: Can't allocate region


    I have successfully allocated over 2 GB using malloc, but the size to be
    allocated must be an unsigned value, since anything over 2 GB no longer
    fits in a signed 32-bit integer.
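
    A small sketch of what that looks like in practice (assuming the size is
    computed as a size_t; the same arithmetic done in a signed 32-bit int
    would overflow before malloc ever sees it):

    #include <cstdlib>
    #include <cstdio>

    int main()
    {
        // 2.5 GB expressed as an unsigned value; as a signed 32-bit int
        // this would wrap around to a negative number.
        std::size_t want = 2560UL * 1024UL * 1024UL;

        void* p = std::malloc(want);
        std::printf("malloc(%lu) -> %p\n", (unsigned long)want, p);
        std::free(p);
        return 0;
    }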

    --
    Dave Seaman
    Judge Yohn's mistakes revealed in Mumia Abu-Jamal ruling.
    <http://www.commoncouragepress.com/index.cfm?action=book&bookid=228>
     
    Dave Seaman, Sep 8, 2004
    #5
  6. Mats Weber

    Mats Weber Guest

    In article <ejalbert-866509.10272407092004@localhost>,
    Eric Albert <> wrote:

    >I don't think it's strictly a 2 GB limit -- rather, it has to do with
    >how things are laid out in memory and a very carefully designed program
    >may be able to allocate a little bit more.


    You are right. I did some more testing, and with a very simple program I
    was able to allocate 3.5 GB, so there is no 2 GB barrier:

    #include <iostream>
    #include <unistd.h>   // for sleep()

    int main()
    {
        for (int i = 0; ; ++i)
        {
            char* p = new char[1024 * 1024];

            // make sure each page actually gets mapped by writing to it.
            for (int j = 0; j < 1024; ++j)
                p[1024 * j] = 77;

            if (i % 100 == 0)
            {
                std::cout << i << "\n";
                sleep(1);
            }
        }
    }

    Now if I modify the program to allocate 1 KB blocks (much more
    realistic, at least for what I am doing), the maximum I can allocate is
    only 1.7 GB!

    How can the memory allocator be so bad (note that there is absolutely no
    fragmentation here)?

    Is there any way around this?
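
    For reference, the 1 KB variant is essentially the same loop with a
    smaller block size (a sketch only; the exact program was not posted):

    #include <iostream>
    #include <unistd.h>

    int main()
    {
        for (long i = 0; ; ++i)
        {
            char* p = new char[1024];   // 1 KB per allocation

            // touch the block so its page really gets mapped
            p[0] = 77;

            // report progress every 100 MB (i.e. every 102400 blocks)
            if (i % (100 * 1024) == 0)
            {
                std::cout << (i / 1024) << " MB\n";
                sleep(1);
            }
        }
    }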
     
    Mats Weber, Sep 8, 2004
    #6
  7. Mats Weber

    Mats Weber Guest

    BTW I just tried this on Linux (kernel version 2.4.18), and I was able to
    allocate 3 GB in both cases: 1 KB blocks and 1 MB blocks.
     
    Mats Weber, Sep 8, 2004
    #7
  8. Thomas Jahns

    Thomas Jahns Guest

    Mats Weber <> writes:

    [...]

    > Now if I modify the program to allocate 1 KB blocks (much more
    > realistic, at least for what I am doing), the maximum I can allocate is
    > only 1.7 GB!
    >
    > How can the memory allocator be so bad (note that there is absolutely no
    > fragmentation here)?
    >
    > Is there any way around this?


    The problem with allocating a large total amount of memory in small
    blocks exists because memory is segmented (though not in the old x86
    sense). One place in memory holds your code and constant statically
    allocated data (the text segment); in most cases that is of negligible
    size compared to the total address space. Then there is the bss segment,
    containing uninitialized statically allocated data, and the data
    segment, containing initialized statically allocated data from the
    executable. Next there is a region holding both the heap (growing
    towards higher addresses) and the stack (growing in the opposite
    direction).

    If these were the only segments, it would be easy to allocate nearly
    4 GB on a 32-bit system, but there are still two more players in the
    memory competition:

    there is space reserved for mmap-allocated memory (which malloc also
    uses for very large chunks and which is also used for shared memory),
    and the kernel is also mapped into every process for communication
    purposes. Shared libraries go into the mmap space, BTW.

    Thus for every binary there are a number of restrictions on where the
    different parts of the program can be placed in memory. On IRIX the
    sizes of the heap/stack and mmap segments can be adjusted (by enlarging
    one while making the other smaller), but I don't know whether that is
    possible on Mac OS X.

    The placement of large chunks in the mmap space explains why your
    program can allocate more memory with large blocks, and you might
    consider using a layer above malloc that splits large blocks into the
    smaller ones you need.
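
    A minimal sketch of such a layer, assuming fixed-size small records, no
    need to free them individually, and requests never larger than one
    chunk (the names are made up for illustration):

    #include <cstddef>
    #include <new>

    // Grab memory from the system in large chunks (which the system
    // allocator places in the mmap region for big requests) and carve out
    // small blocks ourselves. No free(), no alignment handling, no thread
    // safety -- a sketch only.
    class ChunkAllocator
    {
    public:
        explicit ChunkAllocator(std::size_t chunkSize = 16UL * 1024UL * 1024UL)
            : chunkSize_(chunkSize), current_(0), remaining_(0) {}

        void* allocate(std::size_t n)
        {
            if (n > remaining_)
            {
                // start a new large chunk; only these hit the system allocator
                current_ = static_cast<char*>(::operator new(chunkSize_));
                remaining_ = chunkSize_;
            }
            void* p = current_;
            current_ += n;
            remaining_ -= n;
            return p;
        }

    private:
        std::size_t chunkSize_;
        char*       current_;
        std::size_t remaining_;
    };

    int main()
    {
        ChunkAllocator pool;

        // hand out 1 KB records carved from 16 MB chunks
        for (int i = 0; i < 1000; ++i)
        {
            char* rec = static_cast<char*>(pool.allocate(1024));
            rec[0] = 0;   // touch it so the page is really used
        }
        return 0;
    }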

    However complicated the above description might sound, things are
    really even more involved. You might consider reading
    <>
    by Randolph J. Herber for more enlightenment (or confusion, depending
    on the time you can afford) about the relationship of virtual and real
    memory (IMO a quite complex subject).

    Thomas Jahns
    --
    "Computers are good at following instructions,
    but not at reading your mind."
    D. E. Knuth, The TeXbook, Addison-Wesley 1984, 1986, 1996, p. 9
     
    Thomas Jahns, Sep 9, 2004
    #8
