[erlang-questions] Maximize Client TCP Sockets
Sun Jan 17 23:33:16 CET 2016
I have an application where I need to make tens of millions of short-lived
plain TCP (gen_tcp) or HTTP/HTTPS (httpc/gun/hackney) requests to different
servers. The parameters for the requests are stored in a database. My plan
is for a controlling gen_server on each node in the cluster to:
* grab a group of requests from the database
* spawn a process for each request
* each spawned process connects, makes the request, processes the
response, and shuts down the connection
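The spawn-per-request step above might look roughly like this. This is a
minimal sketch, not a tuned implementation: the request shape
({Host, Port, Payload}), the module name req_worker, and the 5-second
timeouts are all assumptions for illustration. Using spawn_monitor/1 lets
the controlling gen_server learn of completion (or crashes) via 'DOWN'
messages instead of requiring workers to report back explicitly.

```erlang
%% Sketch of one worker per request, assuming a hypothetical
%% request tuple {Host, Port, Payload} fetched from the database.
-module(req_worker).
-export([start/1]).

%% Spawn a monitored worker; the controller later receives
%% {'DOWN', Ref, process, Pid, Reason} when the worker exits.
start(Req) ->
    spawn_monitor(fun() -> run(Req) end).

run({Host, Port, Payload}) ->
    %% Connect, send, read one response, close.
    %% Socket options and timeouts are illustrative values.
    {ok, Sock} = gen_tcp:connect(Host, Port,
                                 [binary, {active, false}], 5000),
    ok = gen_tcp:send(Sock, Payload),
    {ok, _Resp} = gen_tcp:recv(Sock, 0, 5000),
    gen_tcp:close(Sock).
```

A 'DOWN' message with reason normal means the request completed; any
other reason means the worker crashed somewhere in connect/send/recv.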
What is the best strategy to maximize the number of concurrent clients? My
current plan is to maintain a list of pending requests in the controlling
process (gen_server). As processes complete, they remove themselves from
the list. If a request errors out during initialization, I'd apply some
sort of back-off timer and retry, pushing any requests that still haven't
executed after a predefined time limit back into the database.
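The back-off-and-requeue step could be sketched as below. This is one
possible scheme (capped exponential back-off), not the only one; the
module name, base/maximum delays, and the 60-second requeue limit are
all illustrative assumptions.

```erlang
%% Capped exponential back-off for failed request initialization.
-module(backoff).
-export([delay/1, should_requeue/2]).

-define(BASE_MS, 100).     %% first retry delay (assumed value)
-define(MAX_MS, 30000).    %% cap on any single delay (assumed value)
-define(LIMIT_MS, 60000).  %% total time before requeueing (assumed value)

%% Delay before retry N: 100ms, 200ms, 400ms, ... capped at 30s.
delay(Attempt) when Attempt >= 1 ->
    min(?MAX_MS, ?BASE_MS bsl (Attempt - 1)).

%% Push the request back into the database once the total elapsed
%% time since first attempt exceeds the predefined limit.
should_requeue(StartMs, NowMs) ->
    NowMs - StartMs > ?LIMIT_MS.
```

The controller would call delay/1 to schedule the next attempt (e.g. via
erlang:send_after/3) and should_requeue/2 to decide when to give up and
return the request to the database.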
Is there a better way to maximize the number of TCP client connections per
node in Erlang?