[erlang-questions] High memory consumption inside docker

Michał Muskała michal@REDACTED
Fri May 20 08:00:20 CEST 2016


Hello,

  

It took me some time to finally get around to looking at this again.

  

Running the instrument module turned out to be very helpful; thank you, Lukas,
for the suggestion.

  

I got very high usage for drv_tab, fd_tab and port_tab:

{drv_tab,[{sizes,50331648,50331648,50331648},
          {blocks,1,1,1}]},
{fd_tab,[{sizes,33554432,33554432,33554432},{blocks,1,1,1}]},
{port_tab,[{sizes,12582975,12582975,12582975},
           {blocks,1,1,1}]}
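
Those tables are evidently sized up front from the maximum number of file
descriptors and ports the emulator is allowed, which it derives from the
nofile limit, and docker containers tend to run with a far higher limit than
the usual 1024. A quick way to see what the VM picked up (a rough check;
output will of course vary per system):

erl -noshell -eval 'io:format("port_limit: ~p~nmax_fds: ~p~n",
    [erlang:system_info(port_limit),
     proplists:get_value(max_fds, erlang:system_info(check_io))]),
    init:stop().'

Presumably capping ports explicitly with erl's +Q flag would shrink port_tab
in a similar way, though I haven't tried that.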

  

Running docker with a decreased ulimit via the --ulimit nofile=1024:1024 flag
seems to fix this issue.
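
For anyone who wants to reproduce it, that's just the earlier command with
the flag added:

docker run --rm -it --ulimit nofile=1024:1024 erlang:18-slim \
    erl -eval 'io:format("~w~n", [erlang:memory()]), init:stop().' -noshell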

There is still increased binary usage, but it's nowhere near as severe as
this was.
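
For chasing that down, recon (the same library that pointed me at ll_alloc in
the thread below) can break the numbers out per allocator, and
recon:bin_leak/1 helps spot processes holding on to refc binaries. A sketch,
assuming a recent recon version:

1> recon_alloc:memory(allocated_types).
2> recon:bin_leak(10).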

  

Thank you for your help,

Michał.

On Mar 29 2016, at 6:44 pm, Lukas Larsson <lukas@REDACTED> wrote:

> Hello,
>
> You want to start the emulator with the "+Mim true +Mis true" flags and then
> use instrument (<http://erlang.org/doc/man/instrument.html>) to inspect what
> is going on. The most interesting for you should be something like:
>
> lists:reverse(lists:keysort(2,instrument:memory_status(types))).
>
> For example:
>
> erl +Mis true +Mim true
> Erlang/OTP 19 [DEVELOPMENT] [erts-8.0] [source-9946a17] [64-bit] [smp:16:16]
> [async-threads:10] [hipe] [kernel-poll:false]
>
> 1> lists:reverse(lists:keysort(2,instrument:memory_status(types))).
> [{timer_wheel,[{sizes,8391792,8391792,8391792},
>                {blocks,16,16,16}]},
>  {code,[{sizes,3463220,3463220,3469945},{blocks,84,84,84}]},
>  {proc_tab,[{sizes,3145791,3145791,3145791},{blocks,1,1,1}]},
>  {heap,[{sizes,1882920,1932208,3456120},{blocks,26,28,28}]},
>  {pre_alloc_data,[{sizes,1032303,1032303,1032303},
>                   {blocks,17,17,17}]},
>  {port_tab,[{sizes,786495,786495,786495},{blocks,1,1,1}]},
>  {old_heap,[{sizes,679872,682880,2941656},{blocks,10,11,11}]},
>  {scheduler_data,[{sizes,673141,673141,673141},
>                   {blocks,3,3,3}]},
>  {export_entry,[{sizes,462528,462528,462528},
>                 {blocks,2628,2628,2628}]},
>  {literal,[{sizes,379648,379648,379648},{blocks,84,84,84}]},
>  {ethread_long_lived,[{sizes,326016,326016,326016},
>                       {blocks,283,283,283}]},
>  {atom_entry,[{sizes,304840,304840,304840},
>               {blocks,7621,7621,7621}]},
>  {beam_register,[{sizes,264544,264544,264544},
>                  {blocks,32,32,32}]},
>  {export_tab,[{sizes,162888,162888,162888},
>               {blocks,15,15,15}]},
>  {hipe_data,[{sizes,148464,148464,148464},{blocks,18,18,18}]},
>  {ethread_standard,[{sizes,133487,133487,133487},
>                     {blocks,6,6,6}]},
>  {db_term,[{sizes,119568,119568,119568},
>            {blocks,388,388,388}]},
>  {atom_tab,[{sizes,112232,112232,121472},{blocks,10,10,10}]},
>  {atom_text,[{sizes,98328,98328,98328},{blocks,3,3,3}]},
>  {db_tabs,[{sizes,82023,82023,82023},{blocks,3,3,3}]},
>  {drv_internal,[{sizes,79352,79352,79608},{blocks,4,4,19}]},
>  {db_segment,[{sizes,51744,51744,53824},{blocks,19,19,...}]},
>  {fun_entry,[{sizes,51568,51568,...},{blocks,586,...}]},
>  {drv_binary,[{sizes,47774,...},{blocks,...}]},
>  {driver_event_state,[{sizes,...},{...}]},
>  {module_tab,[{...}|...]},
>  {module_entry,[...]},
>  {proc,...},
>  {...}|...]
>
> So in the above we can see that the timer wheel structures take the most
> memory, followed by code, proc_tab, and so on.
>
> Lukas
>
> On Tue, Mar 29, 2016 at 2:01 PM, Michał Muskała <michal@REDACTED> wrote:

>> Hello everybody,
>>
>> This is a follow-up from the elixir mailing list:
>> <https://groups.google.com/forum/#!msg/elixir-lang-talk/TqIcSVkHBxs/N1nWRWW9BwAJ>
>> There were several issues involved in that particular problem, but one
>> of them might be of interest to the people here as well, as it's
>> generally erlang-related rather than tied to elixir or any package in
>> particular.
>>
>> The issue comes down to unusually high memory consumption for erlang,
>> specifically the system part of erlang:memory/0. The output below
>> shows the values when running outside docker and inside:
>>
>> Outside docker:
>> erl -eval 'io:format("~w~n", [erlang:memory()]), init:stop().' -noshell
>> [{total,15182640},{processes,3799768},{processes_used,3797720},
>>  {system,11382872},{atom,194289},{atom_used,169621},{binary,78344},
>>  {code,3868184},{ets,229416}]
>>
>> Inside docker's official erlang image (debian-based):
>> docker run --rm -it erlang:18-slim erl -eval 'io:format("~w~n",
>> [erlang:memory()]), init:stop().' -noshell
>> [{total,111303160},{processes,3799768},{processes_used,3797720},
>>  {system,107503392},{atom,194289},{atom_used,169584},{binary,595608},
>>  {code,3860673},{ets,239464}]
>>
>> Inside a minimal alpine linux erlang image:
>> docker run --rm -it msaraiva/erlang erl -eval 'io:format("~w~n",
>> [erlang:memory()]), init:stop().' -noshell
>> [{total,111432888},{processes,3748312},{processes_used,3746312},
>>  {system,107684576},{atom,194289},{atom_used,169869},{binary,596568},
>>  {code,3847723},{ets,185296}]
>>
>> As you can see, all the values except for system are very similar, with
>> system being almost 10x as large. The issue is also consistent across
>> different images. While outside docker a simple shell uses around
>> 15MB, inside docker it grows to 80MB.
>> I tried to investigate the issue further using recon, and it looks
>> like the excessive memory is allocated with ll_alloc, although I have
>> to admit I'm out of my depth here.
>>
>> I'd be thankful for any help or pointers on where to dig deeper.
>>
>> Michał.
>> _______________________________________________
>> erlang-questions mailing list
>> erlang-questions@REDACTED
>> <http://erlang.org/mailman/listinfo/erlang-questions>



