SPUG: Segmentation Fault in mod_perl
jcokos at iwebsys.com
Mon Mar 12 21:06:10 CST 2001
You're barking right up my tree here ... I do CGI
work for a living, and work with mod_perl quite a bit.
That being said, let me ask you a few preliminary questions:
1. The CGI script that actually runs: is it mod_perl
compliant, in terms of exit status, "use strict" compatibility,
and so on?
2. How do you have httpd.conf set up to accept mod_perl
requests?
3. Sounds like you're going through a lot of machinations
to get XML output. Seems to me that a "from scratch"
solution, or one that's designed from the outset to
work within the confines of mod_perl, would be better
suited than trying to shoehorn something else in.
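On question 2: a typical mod_perl 1.x setup runs CGI scripts through
Apache::Registry, with a section in httpd.conf something like this
(the paths here are placeholders, not your actual layout):

```apache
# Hand scripts under /perl/ to mod_perl instead of plain CGI
Alias /perl/ /usr/local/apache/perl/

<Location /perl>
    SetHandler  perl-script
    PerlHandler Apache::Registry
    PerlSendHeader On
    Options +ExecCGI
</Location>
```

If your search script is still running under mod_cgi rather than a
setup like this, you're debugging a different beast entirely.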
Generally, coredumps from mod_perl are the result of
"dirty" coding. A few variables out of place, or a bad
exit status can kill it pretty quickly.
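A sketch of the kind of "dirty coding" that bites under mod_perl:
Apache::Registry wraps your script body in a sub that gets called on
every request, so any variable you don't re-initialize keeps its old
value between hits. Simulated here with plain subs, no Apache needed:

```perl
use strict;
use warnings;

{
    my @results;                 # persists across "requests" (calls)
    sub handler_leaky {
        push @results, shift;    # BUG: never cleared between calls
        return scalar @results;
    }
}

sub handler_clean {
    my @results = ();            # re-initialized on every call
    push @results, shift;
    return scalar @results;
}

print handler_leaky('a'), "\n";  # 1
print handler_leaky('b'), "\n";  # 2 -- stale state from last request
print handler_clean('a'), "\n";  # 1
print handler_clean('b'), "\n";  # 1
```

The same reasoning applies to exit status: under Apache::Registry,
exit() is overridden so the child survives; calling CORE::exit() (or
letting a C library abort) takes the whole child process down with it.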
----- Original Message -----
From: Jonathan Gardner <gardner at sounddomain.com>
To: <spug-list at pm.org>
Sent: Monday, March 12, 2001 4:16 PM
Subject: SPUG: Segmentation Fault in mod_perl
> Okay, run while you still can!
> We're running mod_perl on Apache. I am working on a module to
> allow people to search our pages.
> I got ht://dig. This is a fine piece of software, and it works great. Ht://dig
> has a program that works as a CGI script, and using the configuration files and
> templates, you can configure almost anything you would like to.
> We want to really customize stuff, really really customize it. So I figured
> we'd trick htdig to output XML, and use XML parser to parse it out, and then
> take that and convert it into hashes and arrays and scalars we can use to fully
> customize everything in Perl.
> Okay, so I wrote a module that takes the search parameters and returns a hash
> with all the results stored inside. I wrote a test script that runs from the
> command line and everything works beautifully. I am so happy I beam for joy!
> I plug it into Apache mod_perl and I get the dreaded segmentation fault. Not
> just once, over and over again, almost every time I try to run it.
> The only hint I have is that perhaps it is an out-of-memory problem. This would
> make sense. I have the program that consumes a lot of memory (the search), and
> XML::Parser trying to pack everything into 'Tree' style. But it doesn't fail
> during that - it fails when I start working on the tree with a couple of subs,
> so if it WAS a memory problem shouldn't it fail during the XML parsing or
> during the searching? How do I check to see how much memory the command-line
> script uses?
> What do I do to find out the problem, short of putting a "warn"
> between every line of code? How do I use the core dump - if it is useful at
> all? What rotting nuggets of useful knowledge are lodged in the wisdom
> teeth of the gurus, and could they use the toothbrush of the transfer of
> knowledge to reveal those morsels to me? :-)
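To answer the memory and core-dump questions directly, this is the
rough approach; script name, httpd location, and tool availability are
guesses about your setup:

```shell
# Peak memory of the command-line run (GNU time, not the shell
# builtin; look for "Maximum resident set size"):
/usr/bin/time -v ./search_test.pl > /dev/null

# Let the server drop core, then run Apache in single-process mode
# (-X) so the one child that segfaults is the one you're watching:
ulimit -c unlimited
httpd -X

# Pull a backtrace out of the core file it leaves behind:
gdb /usr/local/apache/bin/httpd core
# (gdb) bt
```

The backtrace will at least tell you whether the crash is in the perl
interpreter, in expat (under XML::Parser), or somewhere in Apache
itself, which narrows things down considerably.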
> Jonathan Gardner
> gardner at sounddomain.com
> (425)820-2244 x123
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> POST TO: spug-list at pm.org PROBLEMS: owner-spug-list at pm.org
> Subscriptions; Email to majordomo at pm.org: ACTION LIST EMAIL
> Replace ACTION by subscribe or unsubscribe, EMAIL by your Email-address
> For daily traffic, use spug-list for LIST ; for weekly, spug-list-digest
> Seattle Perl Users Group (SPUG) Home Page: http://www.halcyon.com/spug/