[erlang-questions] Measuring message queue delay
Roger Lipscombe
roger@REDACTED
Wed Apr 29 11:12:11 CEST 2015
For various reasons, I want a metric that measures how long messages
spend in a process message queue. The process is a gen_server, if that
has any bearing on the solution. Also, this is for production, so it
will be always-on and must be low-impact.
I could do this by timestamping _every_ message sent to the process
and then reporting the deltas, but I think that's ugly.
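Roughly this, for the sake of argument (ts_send/2, report_queue_delay/1 and
do_handle/2 are made-up names, and I'm showing a raw send rather than a
gen_server call, just to show the shape):

    %% Sketch only: sender wraps every message with a timestamp...
    ts_send(Pid, Msg) ->
        Pid ! {ts, os:timestamp(), Msg}.

    %% ...and the gen_server reports the delta before handling it.
    handle_info({ts, Sent, Msg}, State) ->
        DelayUs = timer:now_diff(os:timestamp(), Sent),
        report_queue_delay(DelayUs),
        do_handle(Msg, State).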
I thought of posting a message to self() with a timestamp, measuring the
delta when it arrives, and then posting another one as soon as that
message is processed. Obviously, this leaves the process continually
looping, which is not good.
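In code, the idea looks roughly like this (report_queue_delay/1 is a made-up
metrics hook); the re-post at the end is what makes it spin:

    handle_info({queue_probe, Sent}, State) ->
        report_queue_delay(timer:now_diff(os:timestamp(), Sent)),
        self() ! {queue_probe, os:timestamp()},   %% re-post immediately
        {noreply, State}.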
So, instead, I could use erlang:send_after, but that requires two
messages: a delayed one triggered by erlang:send_after, and an
immediate one for measuring the delay.
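Roughly this (the names and the 10-second interval are placeholders):

    %% Sketch of the two-message variant.
    handle_info(start_probe, State) ->
        self() ! {queue_probe, os:timestamp()},         %% immediate message
        {noreply, State};
    handle_info({queue_probe, Sent}, State) ->
        report_queue_delay(timer:now_diff(os:timestamp(), Sent)),
        erlang:send_after(10000, self(), start_probe),  %% delayed trigger
        {noreply, State}.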
That's a bit complicated.
Would it be sensible to send a message with erlang:send_after containing
the _expected_ timestamp, and then compute the delta from that?
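I'm imagining something like this (placeholder names and interval; here the
message carries the send time and the interval is a constant, which amounts
to the same thing as carrying the expected arrival time):

    -define(PROBE_MS, 10000).   %% arbitrary sampling interval

    %% Anything beyond the timer interval is queue delay (plus timer skew).
    handle_info({queue_probe, Sent}, State) ->
        DelayUs = timer:now_diff(os:timestamp(), Sent) - ?PROBE_MS * 1000,
        report_queue_delay(max(0, DelayUs)),
        erlang:send_after(?PROBE_MS, self(), {queue_probe, os:timestamp()}),
        {noreply, State}.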
Alternatively, what other ways could I measure how long a process is
taking to work through its message queue?
Thanks,
Roger.