Cleaning Up on Edit from Perl

Scott Penrose scottp at
Tue Jun 10 05:51:39 CDT 2003


Hey Dudes,

We have a daemon which takes up over 120MB before exit. It has HEAPS of 
objects open. It performs adequately, and at least 80MB of that memory is 
shared, so when we fork we get the best economy of memory.

Like Apache, we tend to prefork and only allow each process to run for 
a while, to limit the impact of those occasional memory leaks (somewhere 
between 25 and 200 requests before exiting).

On exit the daemon flushes its log entries and dies. This sends a HUP to 
the parent, which starts a new child.
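For anyone following along, here is a minimal sketch of that prefork pattern. The names (`child_main`, `handle_request`, `$MAX_REQUESTS`) are hypothetical placeholders, not our real code:

```perl
#!/usr/bin/perl
# Minimal prefork sketch: a child serves a bounded number of requests,
# then exits; the parent reaps it and would normally fork a replacement.
use strict;
use warnings;

my $MAX_REQUESTS = 50;    # somewhere between 25 and 200, as above

sub child_main {
    for my $n ( 1 .. $MAX_REQUESTS ) {
        # handle_request();   # placeholder for the real work
    }
    exit 0;    # a normal exit -- this is where the expensive unwind happens
}

my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ( $pid == 0 ) {
    child_main();
}
else {
    waitpid( $pid, 0 );    # reap; a real parent loops and forks a new child
    print "child $pid exited, status ", ( $? >> 8 ), "\n";
}
```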

My problem is that on exit Perl unwinds every object it has created, 
calling DESTROY on each where possible. This takes a MASSIVE amount of 
time, during which even the normally read-only modules (with huge data) 
fall out of copy-on-write sharing as they are written to. So the time is 
huge and the memory is huge!
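To see the cost in miniature (a toy `Widget` class, not our real objects), every live object with a DESTROY gets it called as references drop, and at interpreter exit the same happens for everything still alive:

```perl
#!/usr/bin/perl
# Demonstration: Perl calls DESTROY once per object as each is unwound.
# With hundreds of thousands of live objects, this adds up at exit time.
use strict;
use warnings;

package Widget;
our $destroyed = 0;
sub new     { return bless {}, shift }
sub DESTROY { $destroyed++ }

package main;
{
    my @objs = map { Widget->new } 1 .. 1000;
}    # scope ends: all 1000 DESTROY calls run right here

print "destroyed: $Widget::destroyed\n";    # prints "destroyed: 1000"
```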

Two questions: can and should I do an ABSOLUTE EXIT? What I mean to say 
is: QUIT the process immediately (the equivalent of a kill -9).

Even if my code were perfect, Perl would still deallocate all its 
memory, which takes a while with such a large amount of code.

So... I know that I can send myself a KILL, and I know I can do an exit 
in C - but ...

	* Should I?

	* Is there a Perl way to do it (properly, not kill)?
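One Perl-level answer (offered as a sketch, not a recommendation for every case) is POSIX::_exit(), which calls the underlying _exit(2) directly: no END blocks, no DESTROY calls, no global destruction. You must flush anything important (logs, buffers) yourself first, since stdio buffers are abandoned too. Demonstrated here in a forked child so the parent can observe the status:

```perl
#!/usr/bin/perl
# POSIX::_exit() terminates immediately: it skips END blocks, object
# destructors, and Perl's global destruction phase entirely.
use strict;
use warnings;
use POSIX ();

my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ( $pid == 0 ) {
    # Child: flush anything that matters BEFORE the hard exit,
    # because buffered output is simply abandoned.
    POSIX::_exit(0);
}
waitpid( $pid, 0 );
my $status = $? >> 8;
print "child hard-exited with status $status\n";
```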

There are other good reasons to do this - e.g. the system is in an 
unstable state (databases are fragile or going down) and you don't want 
your objects running their cleanup code on exit.

-- 
Scott Penrose
Welcome to the Digital Dimension
scottp at

Disclaimer: Contents of this mail and signature are bound to change 
randomly. Whilst every attempt has been made to control said 
randomness, the author wishes to remain blameless for the number of 
eggs that damn chicken laid. Oh and I don't want to hear about 
butterflies either.


More information about the Melbourne-pm mailing list