problem with opening up huge sized file

Rick J pisium at yahoo.com
Sat Jan 20 01:56:17 CST 2001


Hi, Perl Mongers!

I have a question about opening a very large file.

I was trying to open a 2.6 GB file on 32-bit Solaris
5.6. Perl barfed out "Variable value too large."

I tried different ways to open it, such as open(F,
"filename") and sysopen. Eventually I tried using cat
and redirecting it to the filehandle, as in open(F,
"cat filename |"), which worked.

I thought it was a memory problem, because I believed
that when Perl opens a file, it reads the whole
content into memory. Since memory is low, it can't
pull in all the lines, just as creating an array that
is too big, @array = (1..100_000_000), makes it
complain "out of memory". But someone thought it's an
OS problem: because my Solaris is 32-bit, it can only
handle files smaller than about 2 GB. If that's true,
why can I open the file with 'textedit' or 'vi', or
even with commands like 'more' or a 'grep' search?
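For what it's worth, the ceiling a 32-bit signed file offset would impose is easy to compute, and a 2.6 GB file is over it, which would fit that second theory:

```shell
# Largest offset a 32-bit signed off_t can represent: 2^31 - 1 bytes.
limit=$(( (1 << 31) - 1 ))
echo "$limit"    # prints 2147483647, i.e. about 2 GB; 2.6 GB exceeds it
```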

I wonder why I can use cat and a pipe in the open
function to open a huge file. Is it because when cat
pipes a file, it only sends about 64K(?) bytes at a
time?

Please advise on the reason, and on whether there are
other ways to open a large file.

Thanks,

Ricky



=================================================
Mailing list info:  If at any time you wish to (un|re)subscribe to
the list send the request to majordomo at hfb.pm.org.  All requests
should be in the body, and look like such
                  subscribe anchorage-pm-list
                  unsubscribe anchorage-pm-list


