[Phoenix-pm] PERL DBI

Scott Walters scott at illogics.org
Mon Mar 13 12:28:37 PST 2006


Hi Peter,

Was it running slow before?  I can't imagine that this would make it slower...

You could get a little parallelism if you used threads and made a Thread::Queue
to pass rows from a reader thread to a writer thread... but Perl threads are
pretty grumpy.
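Something like this, roughly -- with dummy rows standing in for the real
statement handles, so it runs on its own (the row data and counts here are
made up; in real code the reader would call $s_sth->fetchrow_array and the
writer $d_sth->execute):

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $queue = Thread::Queue->new;

# Reader thread: stands in for fetching rows from the source handle.
my $reader = threads->create(sub {
    for my $row ( [1, 'a'], [2, 'b'], [3, 'c'] ) {
        $queue->enqueue($row);       # one array ref per row
    }
    $queue->end;                     # after this, dequeue returns undef
});

# Writer thread: stands in for $d_sth->execute(@$row) on the target.
my $writer = threads->create(sub {
    my $written = 0;
    while ( defined( my $row = $queue->dequeue ) ) {
        $written++;                  # real code: $d_sth->execute(@$row)
    }
    return $written;
});

$reader->join;
my $written = $writer->join;
print "$written rows passed through\n";   # prints "3 rows passed through"
```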

Just a thought.
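One more knob worth checking: with AutoCommit on, every execute() is its
own transaction, which gets expensive fast.  Turning it off and committing
every few thousand rows usually helps a lot.  A rough sketch, using an
in-memory SQLite table just so it runs standalone -- your connect string,
table, and columns will differ:

```perl
use strict;
use warnings;
use DBI;

# In-memory SQLite stands in for the real target database (assumption).
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, AutoCommit => 0 });
$dbh->do('create table target (id integer, name text)');

my $sth = $dbh->prepare('insert into target (id, name) values (?, ?)');

my $n = 0;
for my $row ( [1, 'a'], [2, 'b'], [3, 'c'] ) {
    $sth->execute(@$row);
    $dbh->commit unless ++$n % 1000;   # commit every 1000 rows
}
$dbh->commit;                          # flush the final partial batch

my ($count) = $dbh->selectrow_array('select count(*) from target');
print "$count rows inserted\n";        # prints "3 rows inserted"
```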

Cheers,
-scott

On  0, "Loo, Peter # PHX" <Peter.Loo at source.wolterskluwer.com> wrote:
> 
> Hi All,
> 
> Alright, I got it working, but it is running very slowly.  I am
> wondering if there is any way to speed this up?  Here is the syntax I
> am using:
> 
>     $bindVars = join ', ', ('?') x @column_names;
>     $d_sth = $d_dbh->prepare(qq{insert into dssppv.tmp_falcon_projections
>                                  ($columnNames) values ($bindVars)})
>       || die "$DBI::errstr\n";
>     $s_sth = $s_dbh->prepare("select * from dssppv.p_falcon_projections")
>       || die "$DBI::errstr\n";
>     $s_sth->execute() || die "$DBI::errstr\n";
>     while ( my @row = $s_sth->fetchrow_array ) {
>       $d_sth->execute(@row);
>     }
> 
> 
>  
> Peter Loo
> Wolters Kluwer Health
> (602) 381-9553
> 
> -----Original Message-----
> From: Scott Walters [mailto:scott at illogics.org] 
> Sent: Monday, March 13, 2006 10:30 AM
> To: Loo, Peter # PHX
> Cc: phoenix-pm at pm.org
> Subject: Re: [Phoenix-pm] PERL DBI
> 
> Hi Peter,
> 
> Use placeholders:
> 
>     my $question_marks = join ', ', ('?') x @column_names;  # one ? per value
>     my $sth = $dbh->prepare(qq{insert into foo (@column_names)
>                                values ($question_marks)}) or die $dbh->errstr;
>     while ( my @row = $old_database->fetchrow_array ) {  # read from old database
>         $sth->execute(@row);                             # insert into new one
>     }
> 
> If you use placeholders, you never need to quote or escape data, you
> don't suffer from SQL command length limits, and the database doesn't
> have to parse potentially megabytes of data just to parse the SQL
> command.  In fact, it doesn't have to parse the SQL more than once at
> all -- you prepare it once and then execute it as many times as you
> like, for a huge speed increase.  If you're loading a lot of data,
> you'll also want to drop indexes first and then rebuild them later.
> This is a huge time savings over incrementally updating them for
> millions of records.
> 
> I might have missed the point of your question -- if so, please
> clarify, and I apologize.
> 
> Cheers,
> -scott
> 
> On  0, "Loo, Peter # PHX" <Peter.Loo at source.wolterskluwer.com> wrote:
> > 
> >    Hi,
> >    
> >    
> >            $columnsToAssign = $columnsToAssign . ", \"" . @$d_arrayref[0] . "\"";
> >            }
> >          else {
> >            $columnsToAssign = $columnsToAssign . ", " . @$d_arrayref[0];
> >            }
> >    
> >    Is there an easier way?
> >    
> >    
> >    
> >    Thanks.
> >    
> >    
> >    
> >    Peter Loo
> >    
> >    
> 
> 
> This E-mail message is for the sole use of the intended recipient(s) and
> may contain confidential and privileged information.  Any unauthorized
> review, use, disclosure or distribution is prohibited.  If you are not
> the intended recipient, please contact the sender by reply E-mail, and
> destroy all copies of the original message.
> 

