what is a good way to do 'maintaining data distribution'
HP Wei
hp@REDACTED
Sat Apr 12 18:16:18 CEST 2003
One of my responsibilities is to make sure that all databases
generated on a few master machines get distributed to the other machines
in a timely and reliable fashion.
Scale: around 240 machines.
Dependencies: intertwined.
Currently, I have many Python scripts scattered here and there for doing this task.
As the number of machines grows and the interdependence rules get more
intertwined, it becomes a headache to keep on top of things.
Perhaps I have not made full use of Python to deal
with the ever-changing inter-dependences and inter-machine communication. :)
Anyway, I am looking into Erlang's way of doing this task, because of
the inter-machine communication built in at the language level.
The goal is to set up one system across the entire LAN, instead
of many isolated scripts here and there.
Here are two approaches that I can think of.
(1) Write down the 'dependence rules'
    --- analogous to an Erlang makefile.
    Then a sync engine (analogous to ermake.erl)
    gathers data and executes functions to maintain each rule's integrity.
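To make approach (1) concrete, here is a minimal sketch in the Python I already use; the names (rules, sync) and the three example targets are hypothetical, and a real engine would of course track timestamps and run real distribution actions:

```python
# Dependence rules as a make-style table, plus a sync engine that
# brings a target up to date only after all of its dependencies are.
# Targets and actions here are invented placeholders.

rules = {
    # target: (list of dependencies, action to run for this target)
    "db_a":    ([],               lambda: "built db_a"),
    "db_b":    (["db_a"],         lambda: "built db_b"),
    "mirror1": (["db_a", "db_b"], lambda: "pushed to mirror1"),
}

def sync(target, done=None):
    """Depth-first sync: rebuild every dependency exactly once, then the target."""
    if done is None:
        done = {}
    if target in done:          # already brought up to date in this run
        return done
    deps, action = rules[target]
    for dep in deps:
        sync(dep, done)
    done[target] = action()     # record the action's result, in sync order
    return done

# Syncing the top-level target visits its whole dependency subtree once.
result = sync("mirror1")
```

Since `done` is an insertion-ordered dict, `result` also records the order in which targets were brought up to date (here: db_a, db_b, mirror1), which is handy for logging.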
(2) A more general way (?).
    We can have a list of entities,
        [entity1, entity2, ...]
    Each entity has a value (representing its state)
    and an associated rule (one or more functions) that may
    depend on other entities
    ---> rather like a spreadsheet model.
    If the value of one entity changes, all other entities
    that depend on it are updated too.
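For approach (2), here is a minimal sketch of the spreadsheet model, again in Python for illustration only (the class Entity and its methods are invented for this sketch): each entity stores a value and a rule, and setting a value re-runs the rules of its dependents, transitively.

```python
# Spreadsheet-style change propagation: entities form a dependency
# graph, and an update to one entity recomputes everything downstream.

class Entity:
    def __init__(self, name, rule=None, deps=()):
        self.name, self.rule, self.deps = name, rule, list(deps)
        self.value = None
        self.dependents = []            # entities whose rule reads self.value
        for d in self.deps:
            d.dependents.append(self)

    def set_value(self, value):
        """Externally assign a value, then push the change downstream."""
        self.value = value
        self._notify()

    def _notify(self):
        for ent in self.dependents:
            ent._recompute()

    def _recompute(self):
        # Re-run this entity's rule on its dependencies' current values,
        # then propagate further, like a spreadsheet recalculation.
        self.value = self.rule(*[d.value for d in self.deps])
        self._notify()

# Example: c behaves like a spreadsheet cell "=A+B".
a = Entity("a")
b = Entity("b")
c = Entity("c", rule=lambda x, y: (x or 0) + (y or 0), deps=[a, b])

a.set_value(2)   # c recomputes to 2
b.set_value(3)   # c recomputes to 5
```

In Erlang, the natural translation would be one process per entity: a dependency sends an update message to its dependents, each of which recomputes its state and forwards its own update, so the propagation above becomes ordinary message passing. (This sketch ignores cycles in the dependency graph, which a real system would have to detect.)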
------------------------------------------------------------
(1) is straightforward to do, since ermake provides an example.
I don't know Erlang deeply enough to see
how to express (2) in it.
Any suggestions are appreciated.
If anyone on this list already has such modules
(either a ready-to-go (1) or an example of (2)) that you don't
mind sharing, it would spare me more time to play the game of GO :-)
Thanks,
HP