[erlang-questions] How to dig into why gen_tcp/port allocates so much binary memory?
Thu Sep 22 11:41:14 CEST 2016
I wrote a server that accepts TCP connections. The listen socket is opened with the options below:
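The actual option list did not survive archiving, so the snippet below is a purely illustrative reconstruction of a typical `gen_tcp:listen/2` call; the option values shown are assumptions, not the poster's settings.

```erlang
%% Hypothetical listen setup for illustration only -- the original
%% options were not preserved in the archived post.
{ok, LSock} = gen_tcp:listen(Port, [binary,
                                    {packet, 0},
                                    {active, once},
                                    {reuseaddr, true},
                                    {backlog, 1024}]).
```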
On the server node, every new connection spawns a new gen_server to handle it.
I then spawn 5000 gen_servers on another Erlang node (the client node); each gen_server connects to the server via TCP.
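The setup described above can be sketched as follows; the module names `conn_handler` and `client_conn` are hypothetical stand-ins for the poster's gen_server modules, not code from the original post.

```erlang
%% Server side: an acceptor loop that hands each new socket to a
%% freshly spawned gen_server (conn_handler is a hypothetical module).
accept_loop(LSock) ->
    {ok, Sock} = gen_tcp:accept(LSock),
    {ok, Pid} = conn_handler:start_link(Sock),
    %% Make the new gen_server the owner so it receives socket messages.
    ok = gen_tcp:controlling_process(Sock, Pid),
    accept_loop(LSock).

%% Client node: spawn 5000 gen_servers, each opening one TCP connection
%% to the server (client_conn is likewise a hypothetical module).
start_clients(Host, Port) ->
    [client_conn:start_link(Host, Port) || _ <- lists:seq(1, 5000)].
```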
It is a really simple case.
After setting up the 5000 connections I found that binary memory on the server node had grown to 17 GB,
while on the client node it was only 42 MB. That is a huge difference.
I then rebooted the Erlang node with "+Mim true" and "+Mis true"; after re-establishing the 5000 connections, I ran
instrument:memory_status(types) to check the memory status and found that drv_binary had allocated 17 GB:
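For reference, the inspection step above looks roughly like the session below (the exact output did not survive archiving, so none is shown); `drv_binary` covers binaries allocated by drivers such as the inet (TCP) driver, which is why it is the type to watch here.

```erlang
%% Start the node with allocation instrumentation enabled:
%%   erl +Mim true +Mis true
%% Then, in the shell, ask for a per-type allocation summary.
%% Look for the drv_binary entry in the returned list.
1> instrument:memory_status(types).
```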
My question is: how can I decrease the drv_binary memory? Which parameter caused the server to use so much memory?