[erlang-questions] Market data, trading strategies and dropping the last element of a list

Joel Reymont joelr1@REDACTED
Tue Dec 19 15:55:15 CET 2006


On Dec 19, 2006, at 2:35 PM, Christian S wrote:

> What information is associated with each quote?
>
> Only point-in-time and the price?
> Volume?
> Buyer and Seller?
> Transaction id?

Probably all of the above, although a single quote can be split into
multiple time series. I haven't figured it all out yet, but I suppose
I'll start simple.

> 100 quotes or even 1000 quotes doesn't strike me as being very
> much data.
> [...]
> I would use binaries. They are compact, and the data posts in question
> are fixed-size and do not change (more posts are added, though; that
> needs special casing). Random accesses into a binary can be O(1). The
> problem is to grow them.

Yes, that's what I'm leaning towards. I could keep a single day of
history in a binary, for example, and just calculate the offset to the
required element from the sampling frequency. The easiest way would be
to store quotes every second, every minute, and so on. It's much harder
if you store a timestamp with every quote, I think.
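
Something along these lines, a minimal sketch assuming one quote per
second and a hypothetical fixed record layout (an 8-byte float price):

-module(quote_bin).
-export([nth_quote/2]).

-define(RECORD_SIZE, 8).  %% bytes per stored quote (assumed layout)

%% O(1) lookup: the quote recorded SecondOfDay seconds into the day.
nth_quote(Bin, SecondOfDay) ->
    Offset = SecondOfDay * ?RECORD_SIZE,
    <<_:Offset/binary, Price:64/float, _/binary>> = Bin,
    Price.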

> You collect the most recent quotes in a list, compact the list of
> quotes into a binary every N quotes or so, and put those binaries of N
> quotes into a list of binaries, which you could in turn compact and/or
> push out to mnesia to be replicated to other nodes.

This could be an RA-list, as suggested by Vlad. The strategies can
help themselves to the required number of elements, so long as they
don't try to access more quotes than have been accumulated so far.
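
The compaction step could look something like this rough sketch,
assuming N = 100 and {Time, Price} quotes (all names are made up):

-module(quote_buf).
-export([new/0, add/2]).

-define(N, 100).

new() -> {[], 0, []}.  %% {RecentQuotes, Count, CompactedBinaries}

add(Quote, {Recent, Count, Bins}) when Count + 1 >= ?N ->
    %% Compact the newest N quotes into one binary, oldest first.
    Bin = << <<T:64, P:64/float>>
             || {T, P} <- lists:reverse([Quote | Recent]) >>,
    {[], 0, [Bin | Bins]};
add(Quote, {Recent, Count, Bins}) ->
    {[Quote | Recent], Count + 1, Bins}.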

> That way a request for the 100 last trades to the subscriber for that
> security would be answered as "here are the at most N last trades, and
> the rest can be obtained by calling this function that I supply (which
> makes a mnesia lookup)".

Right. For intra-day strategies it could be "here are the quotes
accumulated today; access as many as you want". This would allow me to
keep a single quote list for each security instead of generating custom
extracts for each strategy. Less work for the garbage collector equals
faster processing.
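
A hypothetical sketch of that hand-off: give the strategy today's
quotes plus a fun that pulls older days from a mnesia 'quote' table
keyed by {Security, Day} (table name and key layout are assumptions):

quotes_for(Security, TodaysQuotes) ->
    Older = fun(Day) ->
                {atomic, Recs} = mnesia:transaction(
                    fun() -> mnesia:read({quote, {Security, Day}}) end),
                Recs
            end,
    {TodaysQuotes, Older}.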

> Think a single replicated "security" mnesia table could take enough
> write-locked transactions to it to keep up with the number of trades
> on a large stock market like NASDAQ? Using table fragmentation one
> could break it down across several nodes if it is too much.

I'm not sure yet whether Mnesia wins over disk logs here, but I would
definitely use table fragmentation if I were to go with Mnesia.
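
For reference, a minimal fragmentation sketch: 8 fragments of a 'quote'
table spread over Nodes, with access routed through mnesia_frag (field
names and fragment count are just placeholders):

create_quote_table(Nodes) ->
    mnesia:create_table(quote,
        [{attributes, [key, price]},
         {frag_properties, [{n_fragments, 8},
                            {node_pool, Nodes}]}]).

%% Reads and writes must go through the mnesia_frag activity module
%% so they are routed to the correct fragment.
read_quote(Key) ->
    mnesia:activity(transaction,
                    fun() -> mnesia:read({quote, Key}) end,
                    [], mnesia_frag).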

	Thanks, Joel

--
http://wagerlabs.com/
