[Omaha.pm] Self-monitoring a perl script's memory usage...
Jay Hannah
jay at jays.net
Wed Oct 21 13:40:31 PDT 2009
On Oct 21, 2009, at 2:17 PM, Dan Linder wrote:
> I have a script that usually maxes out around 100MB of memory (viewed
> via /proc/<PID>/status and watching the "VmPeak" value).
> In one test case, the complexity of the task grows, and the VmPeak
> tops out at over 300MB. The server this is running on has "ulimit -v
> 256000" (roughly 256MB), so once the script passes 256MB it is killed.
Wow. That's a big process. :) Do you really need all of that in memory
at once? Could you cache some of it out to disk or something?
(Cache::FileCache?)
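Cache::FileCache (CPAN) wraps this idea nicely; with only core modules,
Storable can do the same spill-to-disk trick. A minimal sketch (the
file name and data here are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Storable qw(nstore retrieve);
use File::Temp qw(tempdir);

# Spill a large intermediate structure to disk instead of holding it
# in RAM for the whole run.
my $dir = tempdir(CLEANUP => 1);
my $big = { rows => [ 1 .. 1000 ] };   # stand-in for a big structure

nstore($big, "$dir/chunk_1");          # write it out...
undef $big;                            # ...and free the in-RAM copy

# Later, pull it back only when it's actually needed:
my $back = retrieve("$dir/chunk_1");
print scalar @{ $back->{rows} }, " rows restored\n";
```

Cache::FileCache adds namespacing and expiry on top of the same
serialize-to-disk idea, so swapping it in later is straightforward.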
> Questions:
> 1: Is there a way to let a perl script query itself or the perl
> executable to determine how much memory it is currently using? I'd
> like to log the highwater mark at exit so we can watch this over time,
> or possibly exit when "250MB" is used and give the user instructions
> on upping the limit in a config file after upping the ulimit value.
I'm not aware of a built-in way. On Linux you could always fork a "ps
-ef" and find yourself in the output, or just read /proc/$$/status
directly, since you're already watching VmPeak there.
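For example, a minimal self-query on Linux (VmPeak is the same field
you've been watching by hand; this returns undef on systems without
/proc):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Return this process's peak virtual memory in kB, or undef if
# /proc isn't available (Linux only).
sub vm_peak_kb {
    open my $fh, '<', "/proc/$$/status" or return undef;
    while (my $line = <$fh>) {
        return $1 if $line =~ /^VmPeak:\s+(\d+)\s+kB/;
    }
    return undef;
}

my $kb = vm_peak_kb();
printf "VmPeak: %s kB\n", defined $kb ? $kb : 'unavailable';
```

You could call that in an END block to log the high-water mark at
exit, or poll it in your main loop and bail out gracefully near the
250MB mark.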
irc.perl.org #perl-help might be helpful.
> 2: Can the signal be captured in a handler? If I read chapter 17 of
> the O'Reilly "Learning Perl, 3rd Edition" correctly, it says that
> they can't be trapped... :-(
If the OS sends a catchable signal to perl, it's trivial to assign a
handler subroutine in %SIG and Do The Right Thing... on Linux, anyway.
(SIGKILL and SIGSTOP can never be caught, though.) I don't know
anything about Windows. :)
http://perldoc.perl.org/perlipc.html#Signals
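A minimal %SIG sketch (the handler here just flips a flag; note that
blowing past ulimit -v typically shows up as a failed allocation --
perl's "Out of memory!" -- rather than a catchable signal, so a
handler may never see that particular death):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Install a handler by assigning a subroutine into %SIG.
# SIGTERM, SIGINT, SIGUSR1, etc. are catchable; SIGKILL and
# SIGSTOP are not.
my $caught = 0;
$SIG{TERM} = sub { $caught = 1 };

kill 'TERM', $$;    # send ourselves a SIGTERM to demonstrate
print $caught ? "caught SIGTERM\n" : "missed it\n";    # caught SIGTERM
```

Modern perls use "safe signals", so the handler runs between opcodes
rather than interrupting one mid-flight; perlipc has the details.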
HTH,
j
More information about the Omaha-pm
mailing list