Transferring large data

Behdad Forghani <>
Fri Aug 13 00:09:28 CEST 2010



I am writing remote collection agent software in Erlang. I have a
gen_server application that reads binary data (protocol traces) and then
sends the contents to another gen_server. The way I have designed it is that
I process a file and write it to an output file. Then I read the output file
using file:read_file and send it to the central server with
gen_server:call. The reasons I chose this model are:


1 - If I send the data packet by packet with gen_server:call, the resulting
"stop and go" protocol would be slower than sending one big chunk.

2 - This is an atomic, one-shot operation, so I will not have to keep track
of which collection agent is sending the file, keep a table of processes
and file handles, or take care of closing file handles if a process dies.
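For concreteness, a minimal sketch of the model described above; the module name, the message shape, and the registered name central_server are my assumptions, not part of any existing code:

```erlang
%% Sketch of the one-shot transfer model. Names (collector,
%% central_server, {trace_file, ...}) are illustrative assumptions.
-module(collector).
-export([send_trace_file/1]).

%% Read the whole processed output file into a single binary and ship
%% it to the central server in one synchronous call.
send_trace_file(Path) ->
    {ok, Bin} = file:read_file(Path),
    %% One gen_server:call carries the entire file. For large binaries
    %% the default 5000 ms call timeout is easy to exceed, so pass an
    %% explicit (here infinite) timeout.
    gen_server:call({global, central_server},
                    {trace_file, Path, Bin},
                    infinity).
```

Note that when the receiving gen_server is on another node, the binary is copied over the distribution channel as part of the message, so the call blocks for the full transfer time.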


My question is:

1 - Should I be worried if the file is too large? Will I run into a
limit on how much data gen_server:call can carry? Is there a limit?

2 - Is there a better way of doing this in Erlang?


I appreciate your comments.



