[erlang-questions] System timers, now/0, and instrumentation

Amy Lear <>
Tue May 24 06:06:24 CEST 2011


While getting used to working in Erlang and running some tests on various
data constructs, I attempted some performance measurements -- mostly out
of curiosity, not out of an attempt to actually optimize -- when I started
noticing some irregularities on my Windows 7 machine.

After some research, I've discovered I'm not the first to hit problems of
strange variance in latency; others in the past (example here:
http://erlang.2086793.n4.nabble.com/How-to-make-the-Erlang-VM-predictible-when-spawning-processes-td2242097.html)
have reported mysterious latencies in the teens of milliseconds on Windows.
After doing some more tests and eventually resorting to MSDN, it became
clear that the issue is one of clock granularity, but it catches people off
guard because of the guaranteed uniqueness of now/0 results, which creates
the appearance of time passing when it hasn't.

The issue here, as I understand it, is that the system clock itself can't
report things in the microsecond range (MSDN states the granularity is
between 10ms and 15ms:
http://msdn.microsoft.com/en-us/library/system.datetimeoffset.utcnow.aspx ),
but the real values are being perturbed to provide those guaranteed unique
results. Given that now/0 is the obvious means of doing measurement for an
Erlang user -- and in fact the instrumentation tools that ship with Erlang
also appear to rely on it -- this results in puzzling and undesired
behavior.
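To make the symptom concrete, here is a minimal sketch (the module name
now_demo is my own, purely for illustration) showing how the uniqueness
guarantee manifests: two back-to-back calls to erlang:now/0 always differ,
even inside a single 10-15 ms clock tick, so the difference looks like
real microseconds elapsing when it may just be the VM bumping the
microsecond field to keep results unique.

```erlang
%% now_demo: illustrate that now/0 results are strictly increasing
%% even when the underlying OS clock has not advanced.
-module(now_demo).
-export([run/0]).

run() ->
    T1 = erlang:now(),
    T2 = erlang:now(),
    %% Guaranteed by now/0's uniqueness: the second sample is strictly
    %% larger, regardless of actual clock granularity.
    true = (T2 > T1),
    %% "Elapsed" time in microseconds -- on a coarse clock this is
    %% often a tiny artificial increment rather than real wall time.
    timer:now_diff(T2, T1).
```

Running run/0 in a tight loop on a system with coarse clock granularity
typically shows many 1-microsecond "durations" punctuated by occasional
jumps of 10-15 ms when the OS clock actually ticks.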

My question is: Is there a way to do viable instrumentation in Erlang on
Windows systems, or is one expected to use another system for
instrumentation?
