[pm-h] Perl merging many large files into one

Uri Guttman uri at stemsystems.com
Sun Mar 30 21:20:19 PDT 2014


On 03/31/2014 12:00 AM, Michael R. Davis wrote:
> Perl Folks,
> Can anyone tell me if the diamond operator is optimized in a print
> statement or does it really read the file into memory then print it?


> use Path::Class qw{file};
> my @files=qw{X Y Z}; #really large files

how large is really large? in the olden days, 64k was large. today 1GB 
isn't large at all, even in ram.

> my $out=file("out.txt")->openw;
> foreach my $file (@files) {
>    my $fh=file($file)->openr;
>    print $out <$fh>; #does this read to memory then print or does it do something better?
> }

why would you want to copy the files line by line if you aren't doing
anything with them? for pure speed, this is one time i would shell
out to cat, as it will write all 3 files to the output file very
efficiently. File::Copy may work too. depending on the size of the
files, File::Slurp::append_file may be what you want, or even just
read_file (3 times) and write_file.
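
a rough sketch of the File::Copy route, just to make the idea concrete.
this isn't from the original mail; it just reuses the file names and
out.txt from the post above, and needs nothing beyond core perl plus
File::Copy:

    use strict;
    use warnings;
    use File::Copy qw(copy);

    my @files = qw(X Y Z);    # the really large inputs

    open my $out, '>', 'out.txt'
        or die "can't open out.txt for writing: $!";

    for my $file (@files) {
        # copy() streams each input to the already-open output handle
        # in buffered chunks, so no file is slurped into memory whole.
        copy($file, $out) or die "copy of $file failed: $!";
    }

    close $out or die "close of out.txt failed: $!";

the File::Slurp version would be roughly a one-liner, something like
append_file('out.txt', read_file($_)) for @files; but that one does
pull each whole file into ram first, which is why the size question
matters.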

so it all depends on my question above. how large is really large?

uri

-- 
Uri Guttman - The Perl Hunter
The Best Perl Jobs, The Best Perl Hackers
http://PerlHunter.com

