[Chicago-talk] Limiting system impact.

Troy Denkinger tdenkinger at gmail.com
Wed May 9 13:13:03 PDT 2007


I would suggest profiling the code to see where the slowness is coming
from.  Perhaps you can do some caching or other optimizations to decrease
the size of the hashes you're building in memory.  For instance, if you're
building a bunch of hashes serially, consider using Storable to cache each
one to disk before beginning the next, roughly as in the sketch below.  If
you must keep them all in memory at the same time, then you'd need to try
something else.
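
A minimal sketch of that approach (the hash names and the build_hash_for()
helper are made up for illustration):

    use strict;
    use warnings;
    use Storable qw(store retrieve);

    # Build each hash in turn, write it to disk, and free the memory
    # before starting on the next one.
    for my $name (qw(hosts sessions errors)) {    # illustrative names
        my %data = build_hash_for($name);         # your existing parsing
        store(\%data, "/tmp/cache-$name.stor");   # serialize to disk
        undef %data;                              # release the memory
    }

    # Later, pull a hash back in only when it's actually needed:
    my $hosts = retrieve('/tmp/cache-hosts.stor');
    print scalar(keys %$hosts), " host entries\n";

The win is that only one hash is live at a time; retrieve() pays a
deserialization cost, but the working set stays small.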

On 5/9/07, Jim Jacobus <JJacobus at ponyx.com> wrote:
>
>
> Is there any way to limit, from within the Perl script, the amount of
> memory or CPU a Perl process utilizes? I've got a script on a Red Hat
> Linux system that crunches a lot of data. Script and output are OK,
> but the process can take up to 30 minutes of wall-clock time. As a
> result, the system appears to slow down considerably until the
> script finishes. I tried adding a few sleep calls in the script,
> but it didn't help much. The system reads some really large files,
> parses the info, and stores it in some hashes for later processing. So
> I'm using I/O, memory, and CPU. It would be good if I could lower the
> process priority of the script when it's running so the OS could
> give more weight to other processes.
>
> _______________________________________________
> Chicago-talk mailing list
> Chicago-talk at pm.org
> http://mail.pm.org/mailman/listinfo/chicago-talk
>
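
On the priority question: Perl's built-in setpriority() wraps the
setpriority(2) system call, so the script can lower its own nice value at
startup. A minimal sketch (the first 0 is PRIO_PROCESS, a "who" of 0 means
the current process; this assumes a Unix-like system such as the Red Hat
box described above):

    use strict;
    use warnings;

    # Move this process to nice 10 so the scheduler favors other work.
    # An unprivileged process can lower its own priority but can't raise
    # it back, so do this once, early in the script.
    setpriority(0, 0, 10)
        or warn "setpriority failed: $!";

POSIX::nice(10) is an equivalent one-liner, and renice(1) from the shell
does the same without touching the script. Note that this only affects
CPU scheduling; it won't cap memory use.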

