[Dub-pm] big data structures relative to memory size

David Cantrell david at cantrell.org.uk
Fri Apr 16 09:29:39 CDT 2004

On Fri, Apr 16, 2004 at 03:08:23PM +0100, Sean O'Riordain wrote:
> I've an analysis program with a couple of million records that i really 
> need to keep in memory as i need to scan back and forth etc... With 5 
> million odd records (written as a couple of independent 'arrays' or 
> should i say 'lists') ...

Can you change the code to use hashes instead of lists?  Then you could
tie them to dbm files, which keep the data on disk and read in only the
keys you actually touch.  dbm is VERY fast, certainly faster than
thrashing your swap partition.
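Not in the original post, but the idea can be sketched with Python's
stdlib `dbm` module, which gives the same hash-like, on-disk interface
(in Perl you would tie the hash with DB_File or SDBM_File).  The
filename and keys below are made up for illustration:

```python
import dbm

# Hypothetical: store records keyed by id in a dbm file on disk
# instead of holding millions of list elements in RAM.
with dbm.open("records.db", "c") as db:   # "c" = create if missing
    db[b"record:1"] = b"some data"
    db[b"record:2"] = b"more data"

# Reopen later; each lookup reads from disk rather than swap.
with dbm.open("records.db", "r") as db:
    value = db[b"record:1"].decode()
    print(value)  # -> some data
```

The trade-off is that dbm gives you keyed lookups, not positional
scanning, so "scan back and forth" code has to be rephrased in terms of
keys.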

Lord Protector David Cantrell  |  http://www.cantrell.org.uk/david

   I hear you asking yourselves "why?".  Hurd will be out in a
   year ...
        -- Linus Torvalds, in <1991Oct5.054106.4647 at klaava.Helsinki.FI>
