<html>
<head>
<meta http-equiv="Content-Type" content="text/html;
charset=windows-1252">
</head>
<body>
<div class="moz-cite-prefix">On 2/27/21 8:52 AM, Max Lapshin wrote:<br>
</div>
<blockquote type="cite"
cite="mid:CAMxVRxAjXmOJH2Txb09jwGfBC_vB1rBhCZ3zDhaAL8YaRa_aEw@mail.gmail.com">
<pre class="moz-quote-pre" wrap="">I've read this article:
<a class="moz-txt-link-freetext" href="https://www.erlang-solutions.com/blog/performance-testing-the-jit-compiler-for-the-beam-vm/">https://www.erlang-solutions.com/blog/performance-testing-the-jit-compiler-for-the-beam-vm/</a>
and I see that perf top shows functions that look like $gen_server2_loop.
When I launch my flussonic with erl 24, I do not see any functions
with this pattern, only functions from C.
So this is why I want to understand: is the JIT working, or have I not
launched it at all?</pre>
</blockquote>
If <code>erlang:system_info(emu_flavor)</code> returns <code>jit</code>, it should be
working. You should also see a difference if you profile the execution
with and without the JIT to determine how it changes performance.<br>
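For example, a quick check from the shell of the running node (a minimal
sketch assuming an OTP 24 emulator; a build without the JIT returns
<code>emu</code> instead):<br>
<pre>
1&gt; erlang:system_info(emu_flavor).
jit
</pre>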
<br>
Best Regards,<br>
Michael<br>
<br>
<br>
<blockquote type="cite"
cite="mid:CAMxVRxAjXmOJH2Txb09jwGfBC_vB1rBhCZ3zDhaAL8YaRa_aEw@mail.gmail.com">
<pre class="moz-quote-pre" wrap="">
On Wed, Feb 24, 2021 at 10:43 PM Max Lapshin <a class="moz-txt-link-rfc2396E" href="mailto:max.lapshin@gmail.com"><max.lapshin@gmail.com></a> wrote:
</pre>
<blockquote type="cite">
<pre class="moz-quote-pre" wrap="">
23.77% beam.smp [.] ac_find_all_non_overlapping
13.11% libc-2.27.so [.] __memmove_avx_unaligned_erms
I think it is rather clear that we need to do something with
binary:split(..., [global]).
On Wed, Feb 24, 2021 at 8:57 PM Dan Gudmundsson <a class="moz-txt-link-rfc2396E" href="mailto:dgud@erlang.org"><dgud@erlang.org></a> wrote:
</pre>
<blockquote type="cite">
<pre class="moz-quote-pre" wrap="">
All code is JIT-compiled, so there is nothing to check.
The JIT helps sequential code only; if that is not your bottleneck you will not see any speedup,
for example if your current code uses a lot of BIFs/NIFs and message passing.
And I guess there is still minor stuff that can be improved.
On Wed, Feb 24, 2021 at 6:01 PM Max Lapshin <a class="moz-txt-link-rfc2396E" href="mailto:max.lapshin@gmail.com"><max.lapshin@gmail.com></a> wrote:
</pre>
<blockquote type="cite">
<pre class="moz-quote-pre" wrap="">
Hi.
I've tried to launch our flussonic under the JIT.
Thank you very much for this work. We got zero speedup, but perf top has
shown where we can achieve up to a 20% boost (or maybe more).
Is it possible to introspect the current JIT status? Something like:
</pre>
<blockquote type="cite">
<pre class="moz-quote-pre" wrap="">jit:info().
</pre>
</blockquote>
<pre class="moz-quote-pre" wrap="">#{
calls => ...
misses => ...
...
}
or what do you use to check it at runtime?
</pre>
</blockquote>
</blockquote>
</blockquote>
</blockquote>
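As an aside on the binary:split(..., [global]) hotspot quoted above: the
call presumably passes the raw delimiter list on every invocation. Below
is a small sketch of reusing a precompiled pattern via
binary:compile_pattern/1 (the module and function names and the delimiter
set are made up for the example); note this only removes the per-call
pattern compilation, not the Aho-Corasick search that
ac_find_all_non_overlapping performs.<br>
<pre>
%% split_example.erl -- illustrative sketch only; the names and the
%% delimiter set are assumptions, not flussonic's actual code.
-module(split_example).
-export([run/0]).

run() ->
    %% Compile the delimiter set once...
    Pattern = binary:compile_pattern([&lt;&lt;"\r\n"&gt;&gt;, &lt;&lt;"\n"&gt;&gt;]),
    %% ...and reuse the compiled pattern for every binary:split/3 call,
    %% instead of handing it the raw delimiter list each time.
    [binary:split(Bin, Pattern, [global])
     || Bin &lt;- [&lt;&lt;"a\nb\r\nc"&gt;&gt;, &lt;&lt;"d\ne"&gt;&gt;]].
</pre>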
<br>
</body>
</html>