While getting used to Erlang and running some tests on various data constructs, I tried to take some performance measurements -- mostly out of curiosity, not in any real attempt to optimize -- and started noticing some irregularities on my Windows 7 machine.<br>
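<br>For context, the measurements were along these lines -- a minimal sketch, where <code>timed_run/1</code> wraps whatever zero-arity fun I happened to be timing:<br>
<pre><code>%% Time a fun with now/0, the obvious tool for the job.
timed_run(Fun) ->
    Start = now(),
    Fun(),
    %% Elapsed wall-clock time in microseconds.
    timer:now_diff(now(), Start).
</code></pre>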
<br>Some research showed I'm not the first to hit strange variance in latency; others have reported mysterious latencies in the teens of milliseconds on Windows (example here: <a href="http://erlang.2086793.n4.nabble.com/How-to-make-the-Erlang-VM-predictible-when-spawning-processes-td2242097.html">http://erlang.2086793.n4.nabble.com/How-to-make-the-Erlang-VM-predictible-when-spawning-processes-td2242097.html</a> ). After doing some more tests and eventually resorting to MSDN, it became clear that the issue is one of clock granularity, but it catches people off guard because of the guaranteed uniqueness of now/0 results, which creates the appearance of time passing when it hasn't.<br>
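<br>The uniqueness guarantee is easy to see in isolation -- a small sketch, nothing machine-specific about it:<br>
<pre><code>%% now/0 promises strictly increasing results, so two back-to-back
%% calls differ by at least 1 microsecond even if the underlying
%% Windows clock hasn't actually ticked in between.
fake_elapsed() ->
    A = now(),
    B = now(),
    timer:now_diff(B, A).  %% always >= 1: "time passing" that isn't real
</code></pre>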
<br>The issue, as I understand it, is that the system clock itself can't report anything in the microsecond range (MSDN states the granularity is between 10ms and 15ms: <a href="http://msdn.microsoft.com/en-us/library/system.datetimeoffset.utcnow.aspx">http://msdn.microsoft.com/en-us/library/system.datetimeoffset.utcnow.aspx</a> ), but the real values are being adjusted to provide those guaranteed-unique results. Given that now/0 is the obvious means of measurement for an Erlang user -- and in fact the instrumentation tools that ship with Erlang also appear to rely on it -- this leads to puzzling and undesired behavior.<br>
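<br>One way to make the granularity itself visible -- a sketch that assumes <code>os:timestamp/0</code> reflects the raw OS clock without the uniqueness adjustment -- is to spin until the reported time changes and measure the size of the jump:<br>
<pre><code>%% Busy-wait until os:timestamp/0 reports a new value, then return
%% the size of the jump in microseconds. Per the MSDN page above,
%% this should land somewhere in the 10-15ms range on Windows.
tick_size() ->
    tick_size(os:timestamp()).

tick_size(T0) ->
    case os:timestamp() of
        T0 -> tick_size(T0);
        T1 -> timer:now_diff(T1, T0)
    end.
</code></pre>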
<br>My question is: Is there a way to do viable instrumentation in Erlang on Windows systems, or am I expected to use another system for instrumentation?<br>