SPUG: Perl Question?

Aaron W. West tallpeak at hotmail.com
Tue May 3 16:10:03 PDT 2005


This feels like about a dozen questions, many of them not asked directly.
It's hard to know where to start. You didn't say what performance problems
you were experiencing.

To begin with, in the design and testing phase, it would be good to
get a better understanding of your system's behavior through better
unit testing. That is, run each script on its own and measure how
long it "typically" takes, keeping in mind that runs may vary widely.

However, at this point, you could log the start and stop times of each
execution of each command, along with the start time of the batch file.
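
One approach is a small wrapper script. A minimal sketch (the wrapper
name, log path, and log format are all made up):

#!/usr/bin/perl
# timed_run.pl -- log start/stop times and elapsed time for one command
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

my $logfile = 'batch_timing.log';   # hypothetical log location
my @cmd = @ARGV or die "usage: timed_run.pl command [args...]\n";

open my $log, '>>', $logfile or die "can't append to $logfile: $!";
my $t0 = [gettimeofday];
printf $log "%s START %s\n", scalar localtime, "@cmd";

system(@cmd) == 0 or warn "'@cmd' exited nonzero: $?\n";

printf $log "%s STOP  %s (%.2f s)\n",
    scalar localtime, "@cmd", tv_interval($t0);
close $log;

Each line of the batch file then becomes something like
"perl timed_run.pl perl script1.pl arg1", and the log will show which
invocations are slow and whether runs overlap.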

You could use some sort of mutex, database-level lock, or file lock
to prevent two concurrent instances of the same script from running
with the same arguments, if that turns out to be the problem.
Advisory locks (flock) may not work across a network share, as I
understand it, so database locks might be the most reliable option.
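
That said, if everything runs on one machine, a flock-based guard is
simple. A minimal sketch (the lock-file naming scheme is naive and
made up; flock is advisory, so every instance must use it):

use strict;
use warnings;
use Fcntl qw(:flock);

# One lock file per argument set; this naive naming breaks if an
# argument contains '/'.
my $lockname = "/tmp/myscript-" . join('-', @ARGV) . ".lock";
open my $lock, '>', $lockname or die "can't open $lockname: $!";
flock($lock, LOCK_EX | LOCK_NB)
    or die "another instance with these arguments is still running\n";

# ... do the real work; the lock is released when the process exits ...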

I suspect (but have no way to know) that with database activity and
email in the mix, a good deal of time is spent waiting for I/O. If
the issue is one of latency, and CPU usage and memory consumption are
low, there may be no problem with executing multiple Perl
interpreters concurrently, e.g.:

script1 arg1 & 
script1 arg2 & 
script1 arg3

... as long as these instances of the script have no big contention
over a shared resource and no dependencies upon each other. However,
if one of the three instances takes over 15 minutes and the other two
take only a few seconds, this won't buy you much.
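
(If the "batch file" is a Windows .bat run under cmd.exe rather than
a shell script, the rough equivalent of the above is "start /b perl
script1.pl arg1" for the backgrounded calls; start /b launches the
command without waiting for it to finish.)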

You might try ODBC tracing, DBI's built-in tracing, or other means of
identifying performance problems in whatever database calls you are
making. Check the documentation for your database, for DBI, and for
the drivers you are using (e.g. DBD::ODBC, DBD::Sybase, or
DBD::Oracle).
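
A minimal sketch of turning on DBI's trace and timing a single query
(the DSN, credentials, and query are placeholders):

use strict;
use warnings;
use DBI;
use Time::HiRes qw(gettimeofday tv_interval);

DBI->trace(1, 'dbi_trace.log');   # write a trace of all DBI calls

my $dbh = DBI->connect('dbi:ODBC:MyDSN', 'user', 'password',
                       { RaiseError => 1 });

my $t0 = [gettimeofday];
my $rows = $dbh->selectall_arrayref('SELECT 1');   # placeholder query
printf STDERR "query took %.3f s\n", tv_interval($t0);

Raising the trace level (2, 3, and up) adds detail; combined with the
start/stop log above, that should narrow down which call is slow.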


-----Original Message-----
From: spug-list-bounces at pm.org [mailto:spug-list-bounces at pm.org] On Behalf
Of Mike Alba
Sent: Tuesday, May 03, 2005 3:28 PM
To: spug-list at pm.org
Subject: SPUG: Perl Question?

Hi,

  I have a question regarding Perl performance.
I have a batch file that executes the same Perl
program a few times but with different parameters.
In other words, different operations are being
performed: SQL reports are generated from the
database, files are moved around, files are
emailed, etc. This batch runs every 15 minutes
and occasionally there seem to be performance
problems. The thinking was that maybe some of the
Perl calls are bumping into each other; for
instance, a database call may be taking longer
than expected. Is there a way I can monitor this?
When I have tried to duplicate this, it seems
that each Perl call isn't made until the previous
one finishes, from the batch file. Am I incorrect
here, and might there be other factors at play as
to why performance issues are occurring? Are any
tools available that I can use to monitor each
running Perl call? Also, since it only happens
intermittently, I guess I am going to have to
capture a lot of data?

Thanks in advance for your help!!!
