[erlang-questions] kicking my servers all day long...

Jason Dusek jsnx@REDACTED
Sun May 20 03:00:14 CEST 2007

I'm writing a distributed, finite-difference heat-diffusion
simulation in Erlang. My idea about how to do it goes like this:
    a) break the big grid into many little grids
    b) assign the grids to individual servers, and connect each
       server to those servers with adjacent grids.
    c) for each time step, have the servers evolve their state and
       then spawn a new server with the updated data
I have sample code which models (c) as updating an int every 50
milliseconds -- it's posted on pastie:


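In outline, the sample looks something like this -- a minimal sketch, not the pasted code itself; the module name `tick` and the starting value are illustrative:

```erlang
-module(tick).
-export([start/0, loop/1]).

%% Spawn a process holding an integer; every 50 ms it spawns a
%% successor carrying the updated value, mirroring step (c).
start() -> spawn(?MODULE, loop, [0]).

loop(N) ->
    receive
    after 50 ->
        spawn(?MODULE, loop, [N + 1])
    end.
```

In the real simulation, the integer stands in for a sub-grid, and the update would be the finite-difference step plus an exchange of boundary values with the neighboring servers.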
Although (c) is conceptually simple, I'm concerned it may be a source
of evil performance problems -- I have to spawn bazillions of servers,
over and over and over again! Is there another way to do it? What are
some other approaches to distributed, finite-difference computing in
Erlang?

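For comparison, the respawn-per-step in (c) could be collapsed into a single long-lived process per grid that advances its state by tail recursion -- a sketch under that assumption, with a hypothetical `evolve/1` standing in for the finite-difference update:

```erlang
-module(grid_loop).
-export([start/1]).

%% One long-lived process per sub-grid: the state advances via
%% tail recursion instead of spawning a successor each step.
start(Grid) -> spawn(fun() -> loop(Grid) end).

loop(Grid) ->
    receive
    after 50 ->
        loop(evolve(Grid))
    end.

%% Placeholder for the real finite-difference step.
evolve(Grid) ->
    Grid.
```

Whether that actually performs better than respawning is exactly the question.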
I tried posting this on comp.lang.functional and they steered me
toward shared memory concurrency and mutable state!

