Hello Shahrdad,

Thanks.
The reason for concentrating on neuroevolution is that the goal of my research is to develop computational intelligence systems that are flexible, scalable, general, and capable of learning and self-modification. Seed computational intelligence... At this time we know with certainty of one approach that has worked and is capable of producing such systems: evolution, the evolution of neural networks, and the proof that it works is us, biological thinking machines. Our brains are the result of billions of years of evolution, which has carved our neural circuitry out of flesh through trial and error.
Because DXNN MK2 is fully decoupled, other search, selection, and mutation methods can be implemented and simply plugged in. As others begin using the system, they will most likely modify it, try out and create new modules and functions for search optimisation (artificial immune systems, ant colony, swarm, CMA-ES...), selection, mutation, and so on, and hopefully contribute those new functions and modules back, making them selectable as options within DXNN and making the system even more useful within the field, and in general.
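To give a rough idea of what "plugged in" means here, a selection algorithm can be packaged as a module exporting a single entry point, and the population monitor simply calls whichever module the researcher has configured. The module and function names below are purely illustrative, not the actual MK2 API:

%% Illustrative only, not the actual DXNN MK2 module or API: a selection
%% algorithm packaged as a standalone module so that the population monitor
%% can call any such module interchangeably.
-module(selection_truncation).
-export([select/2]).

%% Agents is a list of {AgentId, Fitness} pairs; SurvivalPercentage is the
%% fraction of the population allowed to act as parents of the next generation.
select(Agents, SurvivalPercentage) ->
    Sorted = lists:sort(fun({_, F1}, {_, F2}) -> F1 >= F2 end, Agents),
    Survivors = max(1, round(length(Sorted) * SurvivalPercentage)),
    lists:sublist(Sorted, Survivors).

The core never needs to know which strategy it is running; it just calls SelectionModule:select(Agents, SurvivalPercentage), so swapping in a new strategy means writing another module that exports the same function.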
Best regards,
-Gene

On Fri, Jun 15, 2012 at 9:52 AM, Shahrdad Shadab <shahrdad1@gmail.com> wrote:
Great work! I cannot wait to read your book.
I am also working on using Erlang in statistical machine learning. This requires mathematical/statistical library functions (linear algebra, statistics, and so on) to be implemented in Erlang, which is taking a lot of my time.
I wonder whether you ever looked at the statistical approach to AI, and why you didn't follow that path as opposed to the neuro-genetic approach.

Thanks a lot,
Best regards,
Shahrdad
On Tue, Jun 12, 2012 at 1:31 PM, G.S. <corticalcomputer@gmail.com> wrote:
Hello all,

DXNN [1,4] is a Topology and Parameter Evolving Universal Learning Network (TPEULN) system, similar to a topology and weight evolving artificial neural network but more general, and not constrained to neurons that use only sigmoid activation functions. Erlang was chosen because of its clean and complete mapping to the neural network architecture.
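To illustrate that mapping (this is a toy sketch, not DXNN's actual neuron module): each neural element can be its own concurrently executing process, and the activation function is just a fun handed to the neuron, so nothing ties it to sigmoid.

%% Toy illustration, not DXNN's neuron implementation: one Erlang process per
%% neuron, with the activation function supplied as a fun so that math:tanh/1,
%% a gaussian, fun math:sin/1... all plug in the same way as a sigmoid would.
-module(neuron_sketch).
-export([start/3]).

%% Weights: [{InputPid, Weight}], AF: fun(float()) -> float(),
%% OutputPids: the processes this neuron forwards its output to.
start(Weights, AF, OutputPids) ->
    spawn(fun() -> loop(Weights, AF, OutputPids, []) end).

loop(Weights, AF, OutputPids, Acc) when length(Acc) =:= length(Weights) ->
    %% All input signals received: weighted sum, activation, fan out, repeat.
    Dot = lists:sum([W * proplists:get_value(From, Acc) || {From, W} <- Weights]),
    Output = AF(Dot),
    [Pid ! {self(), forward, Output} || Pid <- OutputPids],
    loop(Weights, AF, OutputPids, []);
loop(Weights, AF, OutputPids, Acc) ->
    receive
        {From, forward, Signal} ->
            loop(Weights, AF, OutputPids, [{From, Signal} | Acc])
    end.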
DXNN is a TPEULN platform that uses both direct and indirect encoding (neural and substrate, respectively [5]), has a cross-validation system for experimentation, decoupled sensor/actuator systems, decoupled learning/selection/... algorithms (in MK2), and a built-in 2D world simulator called flatland for ALife experiments (all in gs()).
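As a rough picture of what "decoupled sensors/actuators" means (the module below is made up for the example, it is not one of the modules in the repository): the NN core only needs to know which function to call to obtain a percept vector of a known length, so moving an agent to a new problem domain means writing a new sensor/actuator pair rather than touching the core.

%% Made-up example of a decoupled sensor: the agent only knows that calling
%% sense/1 yields a list of floats; everything about the simulated world
%% (the scape) stays hidden behind this module.
-module(sensor_sketch).
-export([sense/1]).

%% Ask the scape process for the current percept vector.
sense(ScapePid) ->
    ScapePid ! {self(), sense},
    receive
        {ScapePid, percept, Vector} when is_list(Vector) ->
            Vector
    after 5000 ->
        exit(scape_timeout)
    end.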
The second generation (MK2) DXNN is available as a branch of the original project, and is a clean implementation of this computational intelligence evolving system. It is also the system explained and created in my Springer book, the Handbook of Neuroevolution Through Erlang [2,3], with a foreword written by Joe Armstrong. The book will go into print this September.
There are not a lot of comments within the source code on GitHub, but I will continue to add more as time permits.
Upcoming features:
1. Visualisation system.
2. New selection algorithm modules.
3. New speciation and diversification functions.
4. An improved cross-validation system for the experiment database.
5. Full population backup, so that all agents are saved and only deleted manually at the researcher's request (they don't take much space, and keeping them would allow for an interesting visualisation and the ability to traverse from the seed agent to the current agent).
-Gene

[1] https://github.com/CorticalComputer/DXNN (The first generation DXNN has a convoluted implementation. DXNN MK2 is a very clean implementation and currently lives on a non-master branch; it will eventually replace the master branch, but both have (almost) the same features at this time.)
[2] http://www.springer.com/computer/swe/book/978-1-4614-4462-6
[3] http://www.amazon.com/Handbook-Neuroevolution-through-Erlang-Gene/dp/1461444624/ref=sr_1_1?ie=UTF8&qid=1338163875&sr=8-1
[4] http://www.erlang-factory.com/conference/SFBay2012/speakers/GeneSher
[5] http://arxiv.org/abs/1111.5892
--
Software Architect & Computer Scientist