cross-module inlining - just in time?
Richard A. O'Keefe
ok@REDACTED
Wed Apr 5 06:43:08 CEST 2006
Roger Larsson <roger.larsson@REDACTED> wrote:
> Why do all cross-module inlining at compile time?
> Why not when (re)loading?
In this particular thread, it doesn't matter.
Ever since I read about Kistler's "Juice" implementation of Oberon,
I've liked the idea. Oberon is a Wirth language, sort of a cleaned up
and stripped down Modula-2 with a few extra goodies thrown in. One of
them is dynamically loadable and unloadable modules. Oberon was
implemented in a fairly conventional way, with a compiler generating
linkable object files, and a dynamic relocating linker pulling them in.
Kistler wrote an Oberon compiler that compiled a module into a
(rather cunningly) compressed abstract syntax tree. These compressed
ASTs were so much smaller than the usual sort of linkable object file,
that the cost of reading an AST from disc *and* dynamically generating
native object code was less than the cost of linking an ordinary
object file, and the code was better too. (And of course Juice files
were so much smaller than .class files that it really wasn't funny.
Oddly enough, we never heard about .jar files until _after_ Kistler
published...)
Now the Juice compiler for Oberon could be trusting. It could assume that
the compressed AST it was reading was a tree that described a *legal*
Oberon module (and variables had already been looked up &c). So the run-
time code generator didn't have to repeat the work the compiler had
already done.
I have been saying for years (mainly to people at SERC) that I couldn't
see any reason why the idea wouldn't work for Erlang. I have already
pointed out with measurements this year that if you take Erlang source
code and strip out comments and indentation and then compress, you get
something _much_ smaller than .beam files. An adaptation of the Juice
idea should yield a compressed representation that is smaller still AND
which makes code generation easy.
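A minimal sketch of that measurement, for anyone who wants to repeat it
(the comment stripping below is deliberately naive -- it ignores % inside
strings -- and Mod is whatever module you point it at):

    %% Strip comments and leading indentation from Mod.erl, compress,
    %% and compare the result with the size of Mod.beam.
    compare_sizes(Mod) ->
        {ok, Src}  = file:read_file(atom_to_list(Mod) ++ ".erl"),
        {ok, Beam} = file:read_file(atom_to_list(Mod) ++ ".beam"),
        Stripped   = strip(binary_to_list(Src)),
        Packed     = zlib:compress(list_to_binary(Stripped)),
        {packed_source, byte_size(Packed), beam, byte_size(Beam)}.

    strip(Text) ->
        Lines = string:tokens(Text, "\n"),
        string:join([strip_line(L) || L <- Lines], "\n").

    strip_line(Line) ->
        NoComment = lists:takewhile(fun(C) -> C =/= $% end, Line),
        lists:dropwhile(fun(C) -> C =:= $\s orelse C =:= $\t end, NoComment).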
So "compilation" could really be "static checking and compression"
and "loading" could really be "decompression-and-code-generation".
So I don't see "compiling" -vs- "loading" as a big deal.
In both cases, the crucial point is that code from the exporting module
winds up indissolubly tangled up with code from the importing module.
The problem this creates is that in the present system, if module M
is reloaded, *only* module M is affected. The amount of work is bounded
by some (hopefully linear) function of the size of M. But in a system
with cross-module inlining, a change to a 3-line module might require
300 MB of code from 30,000 modules to be recompiled. (Imagine the fun
if one of the affected modules is the recompiler...)
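To make the bookkeeping concrete, here is a minimal sketch (the names are
invented) of the dependency tracking such a system would need: every time
module A inlines code from module B, that fact must be recorded, so that
reloading B tells you exactly which modules have to be regenerated.

    %% Hypothetical dependency table for cross-module inlining.
    init() ->
        ets:new(inline_deps, [bag, named_table, public]).

    %% Record that Importer has inlined code from Exporter.
    record_inline(Importer, Exporter) ->
        ets:insert(inline_deps, {Exporter, Importer}).

    %% When Exporter is reloaded, these modules must be regenerated.
    needs_regeneration(Exporter) ->
        [Importer || {_, Importer} <- ets:lookup(inline_deps, Exporter)].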
> When a module is reloaded all such optimizations could be discarded.
Understand that "optimisations" here includes constant propagation and
dead code elimination. Suppose you have 200 modules that
-import(options, [debug/0]).
...
f(...) when debug() -> ...
f(...) -> ...
and you used to have
-module(options).
-export([debug/0, ...]).
debug() -> false.
Now you change the last line to
debug() -> true.
recompile, and reload. The code that needs to be "unoptimised" is a lot
of code that isn't _there_ any more. In general, there isn't much you
can do better than going back to the sources and running the whole code
generation process (including constant propagation and dead code
elimination) again.
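To spell out what "isn't there any more" means, here is a hypothetical
before-and-after (the guard call to debug() is schematic, as in the example
above; a real Erlang guard cannot call an ordinary function):

    %% Source of an importing module, conceptually:
    %%     f(X) when debug() -> log(X), handle(X);
    %%     f(X)              -> handle(X).
    %%
    %% Generated code with debug() -> false inlined and propagated:
    f(X) -> handle(X).
    %%
    %% The first clause has been removed as dead code.  When debug/0 is
    %% changed to return true and options is reloaded, that clause must
    %% reappear, but nothing in the loaded, optimised code remembers it;
    %% the only general recourse is to regenerate from source.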
> Wouldn't just in time inlining solve:
Nothing that any other kind of cross-module inlining wouldn't solve.
> - records
> - interface functions (with data validation, send, receive)
> - constant functions
It's a nice idea, although, as I pointed out above, this kind of thing is
anything but new. But "just in time inlining" doesn't seem to be relevantly different
from doing cross-module inlining at any other time; the issue is that
cross-module inlining
- creates dependencies between modules which have to be tracked
- may cause extremely large amounts of recompilation to be required
in response to what looks like a small change (and even if this is
deferred, it still has to happen SOME time)
- therefore needs to be under explicit programmer control.
> Maybe there should be two types of cross-module inlining.
> 1. Modules compiled together with other modules - subsystem
>    Contained and loaded together. ...
> 2. Modules loaded together at runtime - system
Several years ago, I proposed a distinction between
-import_late(Module[, Functions]).
    Late binding dependency on Module;
    the listed Functions must be provided by Module
    and may be used without Module: prefixes.
-import_early(Module[, Functions]).
    Early binding dependency on Module;
    the listed Functions must be provided by Module
    and may be used without Module: prefixes.
This would let you say
-import_early(lists, [map/2]).
or whatever, announcing that you are willing to pay the price of having
this module recompiled/regenerated/whatever if/when the lists module is
reloaded, or -import_late(...) if you wanted reloading to be cheap and
calls to be dear.
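A sketch of how a module using the two directives might read (both
directives are hypothetical, of course, and the module is invented):

    -module(report).
    -export([totals/1]).

    %% Willing to be regenerated whenever lists is reloaded,
    %% in exchange for inlined (cheap) calls:
    -import_early(lists, [map/2, sum/1]).

    %% Reloading options stays cheap; calls remain ordinary remote calls:
    -import_late(options, [debug/0]).

    totals(Rows) ->
        case debug() of
            true  -> io:format("summing ~p rows~n", [length(Rows)]);
            false -> ok
        end,
        sum(map(fun row_total/1, Rows)).

    row_total({_Item, Qty, Price}) -> Qty * Price.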
Because of the potential benefits of cross-module inlining, it should be
available. Because of the potential very high costs, it should not be
the default. I think something like the -import_early/-import_late
explicit indication is the way to go.