[Chicago-talk] writing log files to a DB

Richard Reina richard at rushlogistics.com
Sat Mar 26 17:10:49 PST 2005


I wrote a simple script that reads an application's
log file and dumps it into a database table. Like
this:

# $dbh is an already-connected DBI handle
@ARGV = ("log_files");
while (<>) {
    chomp;
    my ($src, $dsc, $clid, $chn, $PU, $HU, $timestamp, $dur)
        = split /,/, $_;

    my $q = "REPLACE INTO master_log VALUES (?,?,?,?,?,?,?,?)";
    my $sth = $dbh->prepare($q);
    $sth->execute($src, $dsc, $clid, $chn, $PU, $HU, $timestamp, $dur);
}

This seems inefficient because it reinserts the whole
file each time it's run, and the file keeps getting
larger. So I was going to change it so that it first
gets the timestamp value of the last record in the
table and then only inserts the lines/values that come
after that record in the file. Like this:

while (<>) {
    chomp;
    my ($src, $dsc, $clid, $chn, $PU, $HU, $timestamp, $dur)
        = split /,/, $_;

    # only insert lines newer than the last record already in the table
    if ($timestamp > $TS) {
        my $q = "REPLACE INTO master_log VALUES (?,?,?,?,?,?,?,?)";
        my $sth = $dbh->prepare($q);
        $sth->execute($src, $dsc, $clid, $chn, $PU, $HU, $timestamp, $dur);
    }
}
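
The part that actually fetches $TS, plus preparing the statement once
instead of on every line, might look roughly like this. It's just a
sketch: it assumes the timestamp column in master_log is literally
named timestamp, that the timestamps compare numerically, and that
$dbh is an open DBI handle.

# Timestamp of the newest record already in the table.
my ($TS) = $dbh->selectrow_array("SELECT MAX(timestamp) FROM master_log");
$TS = 0 unless defined $TS;          # empty table: load everything

# Prepare once, outside the read loop.
my $sth = $dbh->prepare("REPLACE INTO master_log VALUES (?,?,?,?,?,?,?,?)");

@ARGV = ("log_files");
while (<>) {
    chomp;
    my ($src, $dsc, $clid, $chn, $PU, $HU, $timestamp, $dur) = split /,/, $_;
    next unless $timestamp > $TS;    # skip lines already loaded
    $sth->execute($src, $dsc, $clid, $chn, $PU, $HU, $timestamp, $dur);
}

Preparing the statement once outside the loop should also help with the
efficiency concern, since the current version re-prepares it for every
line of the file.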

Is this the most efficient way of writing a script
like this?




 

