SPUG: Segmentation Fault in mod_perl
wyllie at dilex.net
Tue Mar 13 01:54:21 CST 2001
This has been discussed to some extent on the mod_perl mailing list,
so you may want to check the archives (perl.apache.org). In a
nutshell, the expat that comes bundled with Apache should be disabled
when you build Apache with mod_perl. Depending on how you built
mod_perl, this may or may not have been done for you. Apache's bundled
expat and the version of expat used by XML::Parser are not the same,
and the two cross each other up, causing the segfaults.
My Apache configure command looks like this:
./configure --disable-rule=EXPAT \
I also found that using the DSO stuff together with expat gave me some
problems, but I did not investigate that very far - I don't really need the
DSO stuff.
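Spelled out, a rebuild looks something like this - a sketch only, assuming a
static (non-DSO) Apache 1.3 build; the source directory and the
--activate-module path are placeholders for whatever your tree uses, and any
other flags you normally pass stay as they were:

```shell
# Hypothetical rebuild of Apache 1.3 with mod_perl linked in statically.
# The key flag is --disable-rule=EXPAT, so httpd does not link its bundled
# expat and XML::Parser's expat is the only one in the process.
cd apache_1.3.x                    # placeholder: your Apache source directory
./configure \
  --disable-rule=EXPAT \
  --activate-module=src/modules/perl/libperl.a   # static mod_perl, path assumed
make && make install
```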
hope this helps
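On the memory and core-dump questions below: a couple of generic Unix commands
may help. This is a sketch - the httpd and core paths are placeholders, and
the ps snippet inspects the current shell only as a stand-in for your script's
PID:

```shell
# Sample the resident set size (RSS) of a running process with ps. Here we
# inspect the current shell ($$) as a stand-in; substitute the PID of the
# command-line test script while it runs to see how much memory it uses.
rss_kb=$(ps -o rss= -p $$)
echo "RSS: ${rss_kb} kB"

# For the segfault itself: enable core dumps before starting httpd, reproduce
# the crash, then load the core into gdb and ask for a backtrace (all paths
# below are placeholders for your installation):
#   ulimit -c unlimited
#   gdb /usr/local/apache/bin/httpd /usr/local/apache/core
#   (gdb) bt
```

The backtrace should show which library the fault happens in - with the expat
clash it typically points into one of the two XML parsers.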
On Mon, 12 Mar 2001, Jonathan Gardner wrote:
> Okay, run while you still can!
> We're running mod_perl on Apache. I am working on a module to
> allow people to search our pages.
> I got ht://dig. It is a fine piece of software, and it works great. ht://Dig
> ships a program that works as a CGI script, and with the configuration files
> and templates you can configure almost anything you would like.
> We want to really customize stuff, really really customize it. So I figured
> we'd trick htdig into outputting XML, use XML::Parser to parse it, and then
> convert the result into the hashes, arrays, and scalars we need to fully
> customize everything in Perl.
> Okay, so I wrote a module that takes the search parameters and returns a hash
> with all the results stored inside. I wrote a test script that runs from the
> command line and everything works beautifully. I am so happy I beam for joy!
> I plug it into Apache mod_perl and I get the dreaded segmentation fault. Not
> just once, over and over again, almost every time I try to run it.
> The only hint I have is that perhaps it is an out-of-memory problem. That
> would make sense: the search program consumes a lot of memory, and
> XML::Parser packs everything into 'Tree' style. But it doesn't fail during
> that - it fails when I start working on the tree with a couple of subs. So
> if it WAS a memory problem, shouldn't it fail during the XML parsing or
> during the searching? How do I check how much memory the command-line
> script uses?
> What do I do to find out the problem, short of putting a "warn"
> between every line of code? How do I use the core dump - if it is useful at
> all? What rotting nuggets of useful knowledge are lodged in the wisdom
> teeth of the gurus, and could they use the toothbrush of the transfer of
> knowledge to reveal those morsels to me? :-)
> Jonathan Gardner
> gardner at sounddomain.com
> (425)820-2244 x123
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> POST TO: spug-list at pm.org PROBLEMS: owner-spug-list at pm.org
> Subscriptions; Email to majordomo at pm.org: ACTION LIST EMAIL
> Replace ACTION by subscribe or unsubscribe, EMAIL by your Email-address
> For daily traffic, use spug-list for LIST ; for weekly, spug-list-digest
> Seattle Perl Users Group (SPUG) Home Page: http://www.halcyon.com/spug/
Andrew Wyllie <wyllie at omasum.com>      Open Source Integrator
v.206.729.7439   c.206.851.9876
__We can catify or stringify, separately or together!__   perl-5.005_03