concatenating files together
Andrew Wilson
andrew at rivendale.net
Fri May 23 10:32:06 CDT 2003
On Fri, May 23, 2003 at 04:09:24PM +0100, Boyle Bernadette wrote:
> Andrew,
>
> I was using it in my perl script as exec `more /var/log/maillog* >
> /tmp/output` but would have preferred to use Perl syntax. Your suggestion
> is very cool!!
Thanks
> perl -p -e'' /var/log/maillog* > /tmp/output
In a script, I wouldn't do that. The @ARGV array in Perl is special: it
holds the command-line arguments. When you use the -p option, Perl
treats each entry in @ARGV as a filename, opens each in turn, gives you
one line at a time, then moves on to the next file. In a Perl script you
can use <> to get the same special behaviour, so you can do:
while (<>) {
    print;
}
And get a similar effect to the -p option. You can set the @ARGV array
at any point, so you could do:
{
    @ARGV = qw{file1 file2 file3};
    while (<>) {
        print;
    }
}
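One way to see the destruction: <> shifts each filename off @ARGV as it
opens it, so the array is empty once the loop finishes. A small
self-contained check (the temp file is just scaffolding for the example):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Write a throwaway file so the example doesn't depend on your system.
my ($fh, $file) = tempfile(UNLINK => 1);
print $fh "hello\n";
close $fh;

@ARGV = ($file);
while (<>) { }                 # <> shifts each name off @ARGV as it opens it
print scalar(@ARGV), "\n";     # prints 0 -- the list has been consumed
```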
But that destroys @ARGV. You can avoid that problem by localising
@ARGV:
{
    local @ARGV = qw{file1 file2 file3};
    while (<>) {
        print;
    }
}
And since you know what output file you want (let's say you assign it to
$maillog), you can do:
my $maillog = '/tmp/maillog';
{
    open OUT, '>', $maillog or die "Can't open $maillog: $!";
    local @ARGV = qw{file1 file2 file3};
    while (<>) {
        print OUT $_;
    }
    close OUT;
}
The only remaining problem is that the list of files you're going to
concatenate is fixed. You can get a list that changes, like the
/var/log/maillog* in your original example, using the glob function:
my $maillog = '/tmp/maillog';
{
    open OUT, '>', $maillog or die "Can't open $maillog: $!";
    local @ARGV = glob('/var/log/maillog*');
    while (<>) {
        print OUT $_;
    }
    close OUT;
}
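Putting the pieces together, here's a minimal self-contained sketch as a
reusable sub (the sub name and the guard against an empty glob are my
own additions, not from the thread):

```perl
use strict;
use warnings;

# concat_logs: copy every line from the files matching $pattern into
# $outfile, in glob order. Clobbers $outfile, like the shell's >.
sub concat_logs {
    my ($pattern, $outfile) = @_;
    open my $out, '>', $outfile
        or die "Can't open $outfile: $!";
    local @ARGV = glob($pattern);
    # If nothing matches, <> would fall back to reading STDIN, so bail out.
    die "Nothing matches $pattern\n" unless @ARGV;
    while (<>) {
        print $out $_;
    }
    close $out or die "Can't close $outfile: $!";
}

# e.g. concat_logs('/var/log/maillog*', '/tmp/output');
```

Localising @ARGV inside the sub means the caller's command-line
arguments are untouched when the sub returns.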
Hope that helps.
andrew
--
Virgo: (Aug. 23 - Sept. 22)
If there's one thing you should try to learn from next week's events,
it's the precise melting point of aluminium.
More information about the Belfast-pm mailing list