[erlang-questions] GC memory consumption

Ryan Zezeski rzezeski@REDACTED
Fri Jul 1 01:29:17 CEST 2011


Dmitriy,

If you know the amount you need ahead of time you could set the initial heap size at spawn and avoid GC altogether.
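Roughly along these lines (a sketch only; min_heap_size is given in words, not bytes, and big_worker/build_big_structure are just placeholder names for whatever builds the large structure):

    -module(big_worker).
    -export([start/0]).

    %% Sketch: pre-size the heap so the process never has to grow (and
    %% therefore garbage-collect) while it builds its large structure.
    %% min_heap_size is in words; on a 32-bit VM a word is 4 bytes,
    %% so 16M words is roughly 64 MB.
    start() ->
        HeapWords = 16 * 1024 * 1024,
        spawn_opt(fun build_big_structure/0, [{min_heap_size, HeapWords}]).

    %% Placeholder for the code that builds and holds the data.
    build_big_structure() ->
        receive stop -> ok end.

process_flag(min_heap_size, Words) can do the same for a process that is already running.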

-Ryan

[Sent from my iPhone]

On Jun 23, 2011, at 11:49 PM, Dmitriy Kargapolov <dmitriy.kargapolov@REDACTED> wrote:

> Hello,
> 
> The Erlang application we develop has run into an out-of-memory problem (it's a 32-bit Unix system, so memory is limited to 2 GB per application). Analysing crash dumps, I have found a consistent scenario for the issue. Important facts:
> 
> 1. Slogan: eheap_alloc: Cannot allocate 298930300 bytes of memory (of type "heap").
> 
> 2. All the processes are in the 'waiting' state except one, which is in the 'Garbing' state.
> 
> 3. The 'Garbing' process's 'Stack+heap' is 59786060 bytes, OldHeap is 0 bytes, and some fragment data is present (174663 bytes or so).
> 
> 4. The system also has pretty large mnesia tables, which explains why there is not enough memory to allocate a 300 MB segment at some point during execution.
> 
> Looking at the GC code, I found that it always allocates a new heap segment to do its work. So in our consistent pattern the GC tries to allocate 298930300 bytes to perform garbage collection for a process whose heap is 59786060 bytes.
> 
> It looks ridiculous: to de-fragment ~60 MB of process heap, the GC needs an additional ~300 MB segment... This makes it very difficult to design processes which have to keep a large amount of data in their own heap (for example a big tree-like structure or an in-memory graph).
> 
> Is there any known workaround for this (except moving to a 64-bit OS)?
> 
> Thanks.
> _______________________________________________
> erlang-questions mailing list
> erlang-questions@REDACTED
> http://erlang.org/mailman/listinfo/erlang-questions
