Phoenix.pm: update of HUGE text file

sinck at ugive.com
Mon Oct 2 14:19:05 CDT 2000


\_ Is there a way in Perl to do an update on a huge (100MB+) text file without
\_ bringing the whole thing into memory?

sysread, syswrite, seek, flock.

Trust me.  Do flock first, or you're not gonna be happy.  Unless, of
course, you've already flocked the file, for which I would award
several brownie points.
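
Something like this is the shape of it -- a minimal sketch, with the
filename, offset, and replacement text made up for illustration.  One
wrinkle: plain seek works on Perl's buffered I/O, while sysread and
syswrite bypass that buffering, so sysseek is the safe partner here.

use strict;
use Fcntl qw(:flock SEEK_SET);

# Minimal sketch: patch a few bytes in place without slurping the file.
# Filename, offset, and replacement text are made up for illustration.
my $file   = 'huge.txt';
my $offset = 1_048_576;     # byte position of the text to patch
my $patch  = 'c123b';       # must be the SAME length as what it replaces

open my $fh, '+<', $file or die "open $file: $!";
flock $fh, LOCK_EX or die "flock: $!";    # lock BEFORE touching anything

# sysseek pairs with sysread/syswrite, which skip Perl's buffering
sysseek $fh, $offset, SEEK_SET or die "sysseek: $!";
my $old;
defined( sysread $fh, $old, length $patch ) or die "sysread: $!";

sysseek $fh, $offset, SEEK_SET or die "sysseek: $!";
syswrite $fh, $patch or die "syswrite: $!";

close $fh or die "close: $!";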

The biggest trick with data-already-existent is to write *same*-size
records.  If you need to s/a123b/c12345667d/, you're gonna need to
rewrite every byte in the file after the 'b' in 'a123b'.  Also, if
you need to cope with s/a123b/ab/ ... where do those three bytes go?
If you're not careful, you're liable to wind up with 'ab23b' in that
spot.

Happy joy.
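
One common way out, sketched below with a made-up record size and
filename, is to pad every record to a fixed width so record N always
starts at byte N * $RECLEN and an edit never moves its neighbors:

use strict;
use Fcntl qw(:flock SEEK_SET);

# Sketch: every record is exactly $RECLEN bytes, newline included,
# space-padded.  Record $recno then lives at byte $recno * $RECLEN.
# The size, filename, and values here are illustrative.
my $RECLEN = 64;
my ($file, $recno, $newtext) = ('huge.txt', 42, 'ab');

die "record too long\n" if length($newtext) >= $RECLEN;
my $record = sprintf "%-*s\n", $RECLEN - 1, $newtext;  # pad to width

open my $fh, '+<', $file or die "open $file: $!";
flock $fh, LOCK_EX or die "flock: $!";
sysseek $fh, $recno * $RECLEN, SEEK_SET or die "sysseek: $!";
syswrite $fh, $record or die "syswrite: $!";
close $fh or die "close: $!";

With that layout the s/a123b/ab/ case stops being a problem: 'ab'
just gets more padding, and nothing after it has to move.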


\_ I have a client that is using a text file as a makeshift database table.  The
\_ client needs to be able to add/modify/delete lines in the text file.  He does a
\_ search against the file, edits the results, and then needs to make the edits
\_ "stick".

At 100MB, it's probably time to take it to a SQL database and let it
worry about edits natively.
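
For comparison, here's the same add/modify/delete cycle through DBI --
a sketch assuming DBD::SQLite, though any DBI driver looks the same.
The table and column names are made up:

use strict;
use DBI;

# Sketch of the add/modify/delete workflow through DBI.
# Assumes DBD::SQLite; the 'records' table is illustrative.
my $dbh = DBI->connect('dbi:SQLite:dbname=records.db', '', '',
                       { RaiseError => 1 });

$dbh->do('CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, line TEXT)');

# add / modify / delete -- no record-size bookkeeping, no hand-rolled locking
$dbh->do('INSERT INTO records (line) VALUES (?)', undef, 'new line of data');
$dbh->do('UPDATE records SET line = ? WHERE id = ?', undef, 'edited line', 42);
$dbh->do('DELETE FROM records WHERE id = ?', undef, 17);

# the search step pulls only matching rows, not the whole 100MB
my $rows = $dbh->selectall_arrayref(
    'SELECT id, line FROM records WHERE line LIKE ?', undef, '%a123b%');
print "$_->[0]: $_->[1]\n" for @$rows;

$dbh->disconnect;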

David


