[erlang-questions] wikipedia rendering engine
Mon Jun 30 13:36:25 CEST 2008
On Jun 30, 2008, at 13:23, Joe Armstrong wrote:
> Is there a REST interface so that I can retrieve the latest version of
> the MediaWiki markup for a specific page with, for example,
> a wget command?
You can get bulk dumps: http://en.wikipedia.org/wiki/Wikipedia:Database_download#Where_do_I_get
Why would you do individual scraping? In order to keep up to date with
changes that happened between the last dump and now()?
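For fetching an individual page rather than a bulk dump, a minimal sketch of what Joe asks for: standard MediaWiki installations serve a page's raw wikitext through index.php with the `action=raw` query parameter, which wget can fetch directly. The page title used here is only an illustrative example.

```shell
# Build the URL for the raw wikitext of one page.
# 'action=raw' asks MediaWiki for the markup source, not rendered HTML.
TITLE="Erlang_(programming_language)"
URL="http://en.wikipedia.org/w/index.php?title=${TITLE}&action=raw"
echo "$URL"

# Uncomment to actually download the markup into a local file:
# wget -q -O "${TITLE}.wiki" "$URL"
```

This keeps a local copy in sync without waiting for the next full dump, though polite scraping (rate limiting, a descriptive User-Agent) is expected by the site.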
> Has anybody made an erlang interface to scrape individual pages from
> the wikipedia - or to bulk convert the entire
> wikipedia to erlang terms :-)
> On Mon, Jun 30, 2008 at 11:39 AM, Joe Armstrong <>
>> I was at the erlang exchange and heard the *magnificent* talk
>> "Building a transactional distributed data store with Erlang", by
>> Alexander Reinefeld.
>> I'll be blogging this as soon as I have the URL of the video of the talk
>> (in advance of this there was talk at the google conference on
>> Oh, and they also seem to have won the SCALE 2008 prize at the
>> CCGrid conference in Lyon, but there is zero publicity about this
>> We (collectively) promised to help Alexander - I promised to
>> provide him with a
>> rendering engine (in Erlang) for the wikipedia markup language.
>> Before I start hacking has anybody done this before?
>> /Joe Armstrong
> [A copy of this message is being sent to FRA for surveillance purposes.
> They want to read my e-mail anyway.]
> [A copy of this mail has been sent to
> FRA for monitoring purposes. FRA wants to read all my e-mail and has
> been allowed to do so by the Swedish parliament, in violation of Article
> 12 of the UN Universal Declaration of Human Rights]
> erlang-questions mailing list