<div dir="ltr">Hi All,<div><br></div><div style>I need to implement a very large set of data, with the following requirements:</div><div style>- It will be populated EXCLUSIVELY by 64-bit integers.</div><div style>- The only operations will be: </div>
<div style> - add element,</div><div style> - get number of elements, and</div><div style> - fold/foreach over the SORTED dataset.</div><div style>- The invocation order will be strictly:</div><div style> - create data structure,</div>
<div style> - add elements sequentially,</div><div style> - run one or more iteration operations,</div><div style> - discard the data structure.</div><div style>- The dataset MUST scale to 500M elements, and ideally to billions.</div>
<div style>- The data does not have to reside in memory - however, 32 to 64 GB of RAM may be allocated. (Of course, this RAM will be put to use by the OS buffer cache if a file-based solution is chosen.)</div><div style><br></div>
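<div style>For context, the kind of approach I have in mind (though I'm not committed to it) is a classic external merge sort: buffer elements in RAM, spill sorted chunks to temporary files, and do a k-way merge for the ordered iteration. A minimal sketch in Python, where the class and parameter names are just illustrative:</div><div style><br></div>

```python
# Hypothetical sketch: spill sorted chunks of 64-bit ints to temp files,
# then k-way merge them for ordered iteration.
import heapq
import struct
import tempfile


class BigSortedBag:
    def __init__(self, chunk_size=1_000_000):
        self.chunk_size = chunk_size  # ints held in RAM before spilling
        self.buffer = []              # in-memory remainder
        self.files = []               # sorted spill files
        self.count = 0

    def add(self, x):
        self.buffer.append(x)
        self.count += 1
        if len(self.buffer) >= self.chunk_size:
            self._spill()

    def _spill(self):
        # write the current buffer to disk as one sorted run
        self.buffer.sort()
        f = tempfile.TemporaryFile()
        for x in self.buffer:
            f.write(struct.pack('<q', x))  # signed 64-bit, little-endian
        self.files.append(f)
        self.buffer = []

    def _read(self, f):
        # stream one spill file back as ints (rewind for repeated passes)
        f.seek(0)
        while True:
            b = f.read(8)
            if not b:
                return
            yield struct.unpack('<q', b)[0]

    def __len__(self):
        return self.count

    def __iter__(self):
        # merge all sorted runs with the sorted in-memory remainder;
        # one iteration at a time, since it consumes the file positions
        runs = [self._read(f) for f in self.files]
        runs.append(iter(sorted(self.buffer)))
        return heapq.merge(*runs)
```

<div style>Each spill is a sequential write and the merge is a sequential read of every run, so the disk access pattern should be friendly to the buffer cache; with ~1M-element chunks, 500M elements means merging a few hundred runs, which heapq handles fine. I'd be glad to hear if there is a better-suited structure or an existing library for this.</div>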
<div style>In summary: performance is not a must, but volume and the ability to iterate over the ordered values are.</div><div style><br></div><div style>Thanks in advance!!!</div><div style><br></div></div>