performance question

Joshua Keroes jkeroes at eli.net
Fri Feb 15 12:35:42 CST 2002


Why not store the sequence in memory and, instead of calling another program, call
a subroutine? That saves 200 file opens and 200 forks. That said, since it's
only 200 files and launches of the other program, the difference probably won't
amount to more than a few seconds, or minutes at the most.
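A minimal sketch of that in-memory approach, assuming the file holds plain sequence text and 1-based, inclusive start/stop pairs — the file name, `analyze_gene` body, and helper names are placeholders, not the poster's actual code:

```perl
use strict;
use warnings;

# Slurp the whole sequence into one scalar. 217 KB is small,
# so holding it in memory is cheap.
sub read_sequence {
    my ($path) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    local $/;                       # slurp mode: read the whole file
    my $seq = <$fh>;
    close $fh;
    $seq =~ s/[^acgt]//gi;          # keep only sequence characters
    return $seq;
}

# Stand-in for whatever the second program does, as a subroutine.
sub analyze_gene {
    my ($subseq) = @_;
    return length $subseq;          # ... real analysis would go here ...
}

# Pull one gene out of the big string by [start, stop] coordinates
# (1-based, inclusive), without writing any intermediate files.
sub extract_gene {
    my ($seq_ref, $start, $stop) = @_;
    return substr $$seq_ref, $start - 1, $stop - $start + 1;
}
```

Used roughly like: `my $seq = read_sequence('plasmid.seq'); analyze_gene( extract_gene(\$seq, $_->[0], $_->[1]) ) for @genes;` — one read, zero temp files, zero forks.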

Your call.
Joshua

PS When in doubt, benchmark.
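In that spirit, here's one way such a benchmark might look with the core Benchmark module, comparing "re-read the big file for every gene" against "slurp once and substr". The file contents and gene coordinates are synthetic stand-ins for the real data:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Synthetic 217,061-character sequence file, matching the size in the question.
my $file = 'bench_seq.txt';
unless (-e $file) {
    open my $out, '>', $file or die "Can't write $file: $!";
    print $out join '', map { (qw(a c g t))[ int rand 4 ] } 1 .. 217_061;
    close $out;
}

# 200 made-up [start, stop] pairs, like the 200 putative genes.
my @genes = map { my $s = int rand 200_000; [ $s, $s + 1_000 ] } 1 .. 200;

sub slurp {
    open my $fh, '<', $file or die "Can't open $file: $!";
    local $/;
    return scalar <$fh>;
}

cmpthese(-1, {
    # Strategy 1: open and read the large file once per gene.
    reread_each_time => sub {
        for my $g (@genes) {
            my $seq = slurp();
            my $sub = substr $seq, $g->[0], $g->[1] - $g->[0];
        }
    },
    # Strategy 2: read it once, then extract every gene from memory.
    slurp_once => sub {
        my $seq = slurp();
        for my $g (@genes) {
            my $sub = substr $seq, $g->[0], $g->[1] - $g->[0];
        }
    },
});
```

`cmpthese` prints a rate table for the two strategies, so the answer comes from your own disk and OS rather than guesswork.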

-----Original Message-----
From: Tom Keller <kellert at ohsu.edu>
To: PDX Perl List <pdx-pm-list at pm.org>
Date: Friday, February 15, 2002 10:12 AM
Subject: performance question


>I hope this isn't off topic. I'm using perl for the following:
>Parse a list of start and stop positions for a fairly large (217061
>chars) file of DNA sequence data (m/[acgt]+/i - hopefully Not
>alphabetized!). I have about 200 putative genes demarcated with these
>start and stop positions within that sequence that I wish to further
>analyze. From a performance pov, am I better off reading the same
>larger file 200 times or should I read the large file into memory,
>create 200 smaller files, then pass each of these to my next program?

TIMTOWTDI (There Is More Than One Way To Do It)
