[erlang-questions] How to dig into why gen_tcp/port allocates so much binary memory?

叶少波 <>
Thu Sep 22 11:41:14 CEST 2016

Hi experts,

I wrote a server that accepts TCP connections. The listen socket is opened with the options below:
  -define(TCP_OPTS, [
    {backlog, 256},
    {packet, 0},
    {active, false},
    {reuseaddr, true},
    {nodelay, false},
    {delay_send, true},
    {keepalive, true},
    {send_timeout, 60000},
    {exit_on_close, true}
  ]).
On the server node, every new connection spawns a new gen_server to handle it.
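For reference, here is a minimal sketch of the accept loop described above. The module name, the conn_handler gen_server, and the one-acceptor structure are my assumptions; only ?TCP_OPTS comes from the post:

```erlang
%% Hypothetical acceptor sketch: listen with ?TCP_OPTS and hand each
%% accepted socket to a fresh gen_server (conn_handler is assumed).
-module(acceptor).
-export([start/1]).

-define(TCP_OPTS, [
    {backlog, 256},
    {packet, 0},
    {active, false},
    {reuseaddr, true},
    {nodelay, false},
    {delay_send, true},
    {keepalive, true},
    {send_timeout, 60000},
    {exit_on_close, true}
  ]).

start(Port) ->
    {ok, LSock} = gen_tcp:listen(Port, ?TCP_OPTS),
    accept_loop(LSock).

accept_loop(LSock) ->
    {ok, Sock} = gen_tcp:accept(LSock),
    %% One gen_server per connection, as described above.
    {ok, Pid} = conn_handler:start_link(Sock),
    %% Transfer socket ownership so the handler receives its messages.
    ok = gen_tcp:controlling_process(Sock, Pid),
    accept_loop(LSock).
```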

Then I spawn 5000 gen_servers on another Erlang node (I call it the client node); every gen_server connects to the server via TCP.
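The client side can be sketched similarly (module name, use of plain processes instead of gen_servers, and connect options are assumptions for illustration):

```erlang
%% Hypothetical client sketch: open N TCP connections to the server,
%% each held by its own process, and keep them idle.
-module(client).
-export([start/3]).

start(Host, Port, N) ->
    [spawn_link(fun() -> connect(Host, Port) end) || _ <- lists:seq(1, N)].

connect(Host, Port) ->
    {ok, Sock} = gen_tcp:connect(Host, Port, [{active, false}]),
    loop(Sock).

loop(Sock) ->
    %% Keep the connection open until told to stop.
    receive
        stop -> gen_tcp:close(Sock)
    end.
```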

It is a really simple case.

After setting up the 5000 connections I found that binary memory on the server node had grown to 17 GB,
while binary memory on the client node was only 42 MB. That is a huge difference.

Then I rebooted the Erlang node with "+Mim true" and "+Mis true"; after setting up the 5000 connections again, I used
instrument:memory_status(types) to check the memory status, and found that drv_binary had 17 GB allocated:
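For anyone wanting to reproduce the check: the VM has to be started with the instrumentation flags before the measurement is possible. A sketch (the exact shape of the returned list depends on the OTP version):

```erlang
%% Start the node with allocation instrumentation enabled, e.g.:
%%   erl +Mim true +Mis true
%%
%% Then, in the shell, after the connections are established:
instrument:memory_status(types).
%% Scan the returned list for the drv_binary entry, which covers
%% binaries allocated by drivers such as the TCP driver.
```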


My question is: how can I decrease the drv_binary memory? Which parameter caused the server to use so much memory?

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://erlang.org/pipermail/erlang-questions/attachments/20160922/22134504/attachment.html>
