[erlang-questions] SQL Like Language To Crawl Websites

Timmy Turner timm.turn@REDACTED
Sun Dec 2 23:32:04 CET 2007


Hi everyone out there!

I'm still quite new to Erlang, so please try not to be mad, even if I say
or suggest something stupid...

I've been thinking of creating an SQL-like language to crawl websites. An
example of such a query could be:

GET_ME ALL_PAGES FROM en.wikipedia.org
WHERE uri LIKE wiki/.*;

This would crawl all articles of Wikipedia (the LIKE expression is meant
to be a regular expression).
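To make the intended semantics a bit more concrete, here is a minimal sketch of
what such a query might eventually boil down to on the Erlang side. It assumes
a reasonably recent OTP with the inets (httpc) and re applications, and the
names crawl_sketch, get_pages/2 and extract_links/1 are just made up for the
illustration:

-module(crawl_sketch).
-export([get_pages/2]).

%% Fetch the start page of Host and keep only the links whose URI
%% matches the LIKE pattern (treated as a regular expression).
get_pages(Host, UriRegexp) ->
    ok = application:ensure_started(inets),
    {ok, {_Status, _Headers, Body}} =
        httpc:request("http://" ++ Host ++ "/"),
    [L || L <- extract_links(Body), re:run(L, UriRegexp) =/= nomatch].

%% Very naive href extraction, good enough for an illustration.
extract_links(Html) ->
    case re:run(Html, "href=\"([^\"]+)\"",
                [global, {capture, all_but_first, list}]) of
        {match, Links} -> [L || [L] <- Links];
        nomatch        -> []
    end.

A real crawler would of course have to follow the matching links recursively
instead of stopping at the start page, but that part is independent of the
query language itself.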

I'm not quite sure how to approach this...

One idea would be to use macros:

-define("GET_PAGE", "get_pages(\"" ).
-define(";", "\").").

such that this query

?GET_ME ALL_PAGES FROM en.wikipedia.org
WHERE uri LIKE wiki/.* ?;

would be transformed into

get_pages("ALL_PAGES FROM en.wikipedia.org
WHERE uri LIKE wiki/.*");

1. Now, the define syntax isn't perfectly valid - I'm aware of that - so the
first question would be: is what I'm thinking of even possible?

2. Does it make sense to do something like this?

3. Are there any better approaches?

4. Is there any general introduction to meta-programming in Erlang?

5. Anything else you'd like to tell me?

Thanks!