mnesia and binary/large files

Vladimir Ulogov gandalf@REDACTED
Wed Oct 13 21:26:10 CEST 1999


Michael Skowronski wrote:
> Hi there,
> hello,
> i just read some info about mnesia and the literature suggests that it
> might not be appropriate for text and binary data.  what would happen if
My feeling is that it will be very slow on write operations. Delete shouldn't
cause any problems. Read performance will depend on the blob size, and lookup
on the key size. So, do not use a BLOB as the key field (unless your computer
is a 20-megaelephant machine; take the same precautions as with a Pan Galactic
Gargle Blaster). You can speed up read and write operations using an old
trick - split the BLOB into parts, like so:
-record(blobs, {blob_id, sequence_number, blob_data}).
instead of
-record(blobs, {blob_id, blob_data}).
In this case the second record could be part of a set-type table, but the
first should definitely be in a bag (several chunks share the same blob_id).
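
For example, a minimal sketch of declaring the chunked table (the
disc_only_copies option and node list are my own assumptions, not part of
the original setup):

%% chunked variant: several rows share one blob_id key, hence type 'bag'
mnesia:create_table(blobs,
    [{type, bag},
     {attributes, [blob_id, sequence_number, blob_data]},
     {disc_only_copies, [node()]}]).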

You can write the functions that split a blob into parts and pack it back
into a binary object yourself. It should be easy. BTW, if you need to cache
the requests - for example, splitting a big image into a set of small
portions and sending it to the user portion-by-portion - you should take
care that these small BLOBs are fully functional images (if they require
some header or so on).
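
A rough sketch of such split/pack helpers, assuming the chunked blobs
record from above and a chunk size of my own choosing:

-define(CHUNK_SIZE, 65536).  %% arbitrary chunk size, tune for your data

%% Split a binary into numbered chunks and write them in one transaction.
write_blob(Id, Bin) ->
    mnesia:transaction(fun() -> write_chunks(Id, 1, Bin) end).

write_chunks(Id, Seq, Bin) when size(Bin) > ?CHUNK_SIZE ->
    {Chunk, Rest} = split_binary(Bin, ?CHUNK_SIZE),
    mnesia:write(#blobs{blob_id = Id, sequence_number = Seq, blob_data = Chunk}),
    write_chunks(Id, Seq + 1, Rest);
write_chunks(Id, Seq, Bin) ->
    mnesia:write(#blobs{blob_id = Id, sequence_number = Seq, blob_data = Bin}).

%% Read all chunks of a blob, order by sequence number and pack them back.
read_blob(Id) ->
    {atomic, Recs} = mnesia:transaction(fun() -> mnesia:read({blobs, Id}) end),
    Sorted = lists:keysort(#blobs.sequence_number, Recs),
    list_to_binary([R#blobs.blob_data || R <- Sorted]).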
> mike

-- 
%% Vladimir I. Ulogov (gandalf@REDACTED) AT&T Labs
"Where lands meets water. Where earth meets air. Where body meets
mind. Where space meets time. We like to be on one side, and look 
at the other." D.Adams "Mostly harmless"


