[erlang-questions] Machine Learning

Alex Alvarez eajam@REDACTED
Sun Mar 6 23:10:30 CET 2016


I'd say the opposite.  The more complicated the algorithm, the more 
you might be able to get out of Erlang, because it's a higher-level 
language and it was explicitly built to run large numbers of processes.  
Again, for this and any other problem, the key is to work out how to 
divide the problem into one or more types of units/functions that can 
be repeated, in order to truly get the most out of the platform.
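To illustrate that pattern, here is a minimal sketch in Erlang (the fanout 
module and the pmap/2 name are mine, purely hypothetical): one small unit 
of work, fanned out to one process per input, with the results collected 
back in the original order.

-module(fanout).
-export([pmap/2]).

%% Run F on each element of List in its own process, then collect
%% the results back in the original order.
pmap(F, List) ->
    Parent = self(),
    Pids = [spawn(fun() -> Parent ! {self(), F(X)} end) || X <- List],
    [receive {Pid, Result} -> Result end || Pid <- Pids].

For example, fanout:pmap(fun(X) -> X * X end, lists:seq(1, 10000)) squares 
ten thousand numbers across ten thousand short-lived processes.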


My biggest question regarding Erlang in general is whether it can 
currently make use of GPUs, say through OpenCL or other mechanisms.  
Sorry if this has already been covered somewhere else!


Thanks,
Alex


On 03/05/2016 06:12 PM, lloyd@REDACTED wrote:
> Hello,
>
> I can't claim anything but the shallowest machine-learning chops beyond a cursory exploration of the literature and what my data-scientist son has taught me. But it seems that the first and foremost step in deciding whether or not to proceed with a machine-learning project in Erlang is careful consideration of the problem you're trying to solve.
>
> My sense is that, as Alex points out, Erlang may work quite well for problems in the Machine Learning 101 domain, e.g. a limited number of perceptron elements. How do we define limited? That's exactly why it may be worthwhile to experiment. Certainly Gene Shor has done interesting work. Does anyone know of worthy follow-up?
>
> That said, big-data machine learning isn't the only game in town. This piece throws down a significant challenge:
>
> Neural modelling: Abstractions of the mind
> http://www.nature.com/nature/journal/v531/n7592_supp/full/531S16a.html
>
> All the best,
>
> LRP
>
> -----Original Message-----
> From: "Alex Alvarez" <eajam@REDACTED>
> Sent: Saturday, March 5, 2016 4:58pm
> To: erlang-questions@REDACTED
> Subject: Re: [erlang-questions] Machine Learning
>
> Sorry for my late response to this topic, but I do believe Erlang is
> actually a great language for the ML and statistics space.  Take a basic
> feed-forward NN with back-propagation, for example.  What you'd normally
> have in terms of mathematical computation is mainly addition and
> multiplication.  You only need to put together a perceptron, which is
> just its inputs (including a bias) multiplied by their respective
> weights; you add them up, pass the sum through a function like the
> sigmoid or hyperbolic tangent, and that's that.  Back-propagation, as a
> way to adjust the weights during the training phase, doesn't require
> anything more complicated math-wise.  You combine the perceptrons for
> the hidden and output layers and you've got yourself a NN.  In this
> configuration, deep learning is simply two or more hidden layers
> instead of one.  The key to maximizing the use of Erlang is certainly
> to distribute the load across processes, so each perceptron could be
> one individual process, for example.  Definitely not rocket science.
> Now, I concur that in some situations it might be advantageous to write
> a module, say for a perceptron, in C and make use of it within Erlang,
> but there is no reason why you couldn't start with Erlang and gradually
> move in that direction, if need be.
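To make the arithmetic in that description concrete, here is a minimal 
sketch of a single perceptron as an Erlang process (the perceptron module, 
its function names, and the weight-update message are hypothetical, purely 
for illustration): it holds its weights and bias, computes the weighted sum 
of its inputs, applies a sigmoid, and accepts new weights so a training 
step can adjust it.

-module(perceptron).
-export([start/2, activate/2, set_weights/3]).

%% Spawn a perceptron process that holds its weights and bias.
start(Weights, Bias) ->
    spawn(fun() -> loop(Weights, Bias) end).

%% Ask the perceptron for its activation on a list of inputs.
activate(Pid, Inputs) ->
    Pid ! {activate, self(), Inputs},
    receive
        {Pid, Output} -> Output
    end.

%% Replace the weights, e.g. after a back-propagation step.
set_weights(Pid, Weights, Bias) ->
    Pid ! {set_weights, Weights, Bias},
    ok.

loop(Weights, Bias) ->
    receive
        {activate, From, Inputs} ->
            Sum = lists:sum(lists:zipwith(fun(W, X) -> W * X end,
                                          Weights, Inputs)) + Bias,
            From ! {self(), sigmoid(Sum)},
            loop(Weights, Bias);
        {set_weights, NewWeights, NewBias} ->
            loop(NewWeights, NewBias)
    end.

%% Logistic (sigmoid) activation.
sigmoid(X) -> 1.0 / (1.0 + math:exp(-X)).

A hidden layer is then just a list of such pids; for example,
P = perceptron:start([0.4, -0.6], 0.1) followed by
perceptron:activate(P, [1.0, 0.5]) returns that neuron's output.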
>
> Cheers,
> Alex
>
>
> On 02/10/2016 06:31 AM, Jesper Louis Andersen wrote:
>> On Wed, Feb 10, 2016 at 10:34 AM, Samuel <samuelrivas@REDACTED
>> <mailto:samuelrivas@REDACTED>> wrote:
>>
>>      I am not aware of existing ML or linear algebra libraries in Erlang
>>      that can be used to quick-start an ML project, but just piping into
>>      tensorflow (or any other existing library/framework) isn't really
>>      doing ML with Erlang, is it? You might as well just use tensorflow
>>      directly.
>>
>>
>> The question is whether this is a practical problem which needs solving
>> or whether it is for research. If you are researching how to construct,
>> say, SVMs or NNs, then surely Erlang is a good vehicle. But in practice,
>> there are a number of things which make Erlang unsuitable:
>>
>> * ML is often CPU bound. You don't want a bytecode interpreter to be a
>> limiting factor here. Even if the interpreter in Erlang is
>> state-of-the-art and highly optimized, it is not far-fetched that an
>> FP-intensive program will be roughly a factor of 30 faster if compiled
>> in a lower-level language.
>>
>> * GPUs are popular in ML models for a reason: they speed up the FP
>> computations by a factor of 3000 or more. This in itself should hint
>> that you need something other than Erlang.
>>
>> * Erlang's per-process and per-term word overhead means a lower-level
>> model can pack many more entities into memory. This affects caching
>> behavior.
>>
>> Training of the model is often done off-line, while using the model is
>> done online in the system. How you train your model is less important,
>> which is why I'd just outsource this problem to the libraries built and
>> tuned for it. It is like solving LinAlg problems while forgetting
>> everything about the existing LAPACK and ATLAS routines in Fortran.
>>
>> A model which could be viable in Erlang is to use Erlang to produce
>> programs for lower-level consumption via compilation. But these are
>> problems for which languages such as Haskell and OCaml dominate for a
>> reason: their type systems make it far easier to pull off.
>>
>>
>>
>> -- 
>> J.
>>
>>
>> _______________________________________________
>> erlang-questions mailing list
>> erlang-questions@REDACTED
>> http://erlang.org/mailman/listinfo/erlang-questions
>
>
>
>


