SPUG: Segmentation Fault in mod_perl

John Cokos, CEO, iWeb, Inc. jcokos at iwebsys.com
Thu Mar 15 12:24:27 CST 2001


Although I'm not sure whether this is related to the original
problem, it's food for thought nonetheless...

I've noticed quite a serious problem when you have
PHP and mod_perl both compiled into Apache.  The problem
only arises when a mod_perl program attempts to
connect to a database.  At that point, mod_perl programs
start throwing segfaults, and PHP scripts start returning
no data at all.

Definitely a bug somewhere; the two modules seem to be stepping
on each other's memory (or on a shared library they both link against).

I've found that turning PHP "off" always fixes
the mod_perl problem.  It's not the best solution, but it's
the only one I've found when you're trying to run both.
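
For anyone who wants to try the same workaround: if PHP went in as a DSO
rather than being compiled in statically, it's just a matter of commenting
out its lines in httpd.conf and restarting Apache (the module and file names
below are the usual PHP 4 ones; check your own build).  A statically
compiled PHP means rebuilding Apache without it.

# httpd.conf: take PHP out of the picture
#LoadModule php4_module        libexec/libphp4.so
#AddModule  mod_php4.c
#AddType    application/x-httpd-php .php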

J-

========================================
  John Cokos, President / CEO: iWeb Inc.
  http://www.iwebsys.com
  jcokos at ccs.net
========================================
----- Original Message -----
From: "Richard Anderson" <Richard.Anderson at raycosoft.com>
To: "John Cokos" <jcokos at iwebsys.com>; "Jonathan Gardner" <gardner at sounddomain.com>; <spug-list at pm.org>
Cc: "Colin Meyer" <cmeyer at helvella.org>
Sent: Tuesday, May 15, 2001 9:14 AM
Subject: Re: SPUG: Segmentation Fault in mod_perl


> I've done a bit of work on search engines, and I have a couple of thoughts:
>
> John pointed us toward a few potential minor syntax problems.  The out-of-memory problem will pop up quite a bit if you are
> searching a large internet site like Yahoo, yellowpages.com, etc. and the search returns a lot of results.  Imagine searching
> the NYC phonebook for the string Smith!
>
> One approach might be to use a more memory-efficient XML parser routine.  Damian is probably a good resource on this (are you
> out there, mate?).
>
> Another approach is to limit your search to returning the first 50 results you find.  If the user wants more results, add a
> link at the top and bottom of the page that takes him to results 51-100, and so on.
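
Something along these lines is enough to window the results (a rough sketch,
not from the script under discussion; run_search and the parameter names are
made up):

use strict;
use CGI;

my $q         = CGI->new;
my @results   = run_search(scalar $q->param('words'));   # stand-in for the real htdig query
my $page_size = 50;
my $page      = $q->param('page') || 1;

my $first = ($page - 1) * $page_size;
my $last  = $first + $page_size - 1;
$last     = $#results if $last > $#results;

print $q->header;
for my $hit (@results[$first .. $last]) {
    print qq{<a href="$hit->{url}">$hit->{title}</a><br>\n};
}

# Link to the next window of results, if there is one.
if ($last < $#results) {
    my $words = CGI::escape(scalar $q->param('words'));
    my $next  = $page + 1;
    print qq{<a href="search.cgi?words=$words&page=$next">results },
          $last + 2, '-', $last + 1 + $page_size, qq{</a>\n};
}

sub run_search {
    # Fake results so the sketch runs on its own; one hashref per hit.
    return map { { url => "http://example.com/$_", title => "Result $_" } } 1 .. 137;
}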
>
> In future versions, you might consider ranking the results by some goodness-of-fit criterion based on the search string.
> Word stemming and language dependencies are some more goodies to add.
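
Even a crude goodness-of-fit measure (count how many of the query words each
document contains) goes a long way.  A minimal sketch with invented sample data:

use strict;

# Invented sample data: result URL => the text that came back for it.
my %doc_text = (
    'http://example.com/a.html' => 'Smith Plumbing, serving Seattle since 1979',
    'http://example.com/b.html' => 'A history of the Smith family',
    'http://example.com/c.html' => 'Seattle restaurant listings',
);
my @query = qw(smith plumbing seattle);

# Score each document by how many distinct query words it contains.
my %score;
for my $url (keys %doc_text) {
    my $text = lc $doc_text{$url};
    $score{$url} = grep { index($text, $_) >= 0 } @query;
}

# Best-fitting documents first.
for my $url (sort { $score{$b} <=> $score{$a} } keys %score) {
    print "$score{$url}  $url\n";
}
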
> Richard Anderson, Ph.D.          www.unixscripts.com
> Perl / Oracle / Unix                Richard.Anderson at raycosoft.com
> Raycosoft, LLC                        Seattle, WA, USA
> ----- Original Message -----
> From: "John Cokos" <jcokos at iwebsys.com>
> To: "Jonathan Gardner" <gardner at sounddomain.com>; <spug-list at pm.org>
> Sent: Monday, March 12, 2001 8:06 PM
> Subject: Re: SPUG: Segmentation Fault in mod_perl
>
>
> > You're barking right up my tree here .... I do CGI
> > work for a living, and work with mod_perl quite
> > a bit.
> >
> > That being said, let  me ask you a few preliminary
> > questions:
> >
> > 1.  The CGI script that actually runs: is it mod_perl
> >      compliant, in terms of exit status, "use strict" compatibility,
> >      etc.?
> >
> > 2.  How do you have httpd.conf set up to accept mod_perl
> >      programs?
> >
> > 3.  Sounds like you're going through a lot of machinations
> >      to get XML output.  Seems to me that a "from scratch"
> >      solution, or one designed from the outset to work within
> >      the confines of mod_perl, would serve you better than
> >      trying to shoehorn something else in.
> >
> > Generally, coredumps from mod_perl are the result of
> > "dirty" coding.  A few variables out of place, or a bad
> > exit status can kill it pretty quickly.
> >
> > John
> >
> > ----- Original Message -----
> > From: Jonathan Gardner <gardner at sounddomain.com>
> > To: <spug-list at pm.org>
> > Sent: Monday, March 12, 2001 4:16 PM
> > Subject: SPUG: Segmentation Fault in mod_perl
> >
> >
> > > Okay, run while you still can!
> > >
> > > We're running mod_perl on Apache. I am working on a module to
> > > allow people to search our pages.
> > >
> > > I got ht://Dig. This is a fine piece of software, and it works great. ht://Dig
> > > has a program that runs as a CGI script, and using the configuration files and
> > > templates, you can customize almost anything you would like.
> > >
> > > We want to really customize stuff, really really customize it. So I figured
> > > we'd trick htdig into outputting XML, use XML::Parser to parse it, and then
> > > convert that into hashes, arrays, and scalars we can use to fully customize
> > > everything in Perl.
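
For reference, this is roughly the shape XML::Parser's 'Tree' style hands back,
and one way to flatten it into plain hashes (the <results>/<hit> element names
are invented stand-ins for whatever the htdig template actually emits):

use strict;
use XML::Parser;

my $xml = <<'END_XML';
<results>
  <hit url="http://example.com/a.html" score="42">First match</hit>
  <hit url="http://example.com/b.html" score="17">Second match</hit>
</results>
END_XML

# Style => 'Tree' returns nested arrayrefs:
#   [ 'results', [ {attrs}, '0', "\n  ", 'hit', [ {attrs}, '0', 'First match' ], ... ] ]
# where the pseudo-tag '0' marks character data.
my $tree = XML::Parser->new(Style => 'Tree')->parse($xml);

my ($root_tag, $content) = @$tree;
my $root_attrs = shift @$content;      # attribute hash of <results>
my @hits;
while (@$content) {
    my ($tag, $element) = splice @$content, 0, 2;
    next unless $tag eq 'hit';         # skip the whitespace-only text nodes
    my %hit = %{ $element->[0] };      # url and score attributes
    $hit{text} = $element->[2]
        if defined $element->[1] && $element->[1] eq '0';
    push @hits, \%hit;
}

for my $hit (@hits) {
    print "$hit->{score}  $hit->{url}  $hit->{text}\n";
}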
> > >
> > > Okay, so I wrote a module that takes the search parameters and returns a hash
> > > with all the results stored inside. I wrote a test script that runs from the
> > > command line and everything works beautifully. I am so happy I beam for joy!
> > >
> > > I plug it into Apache mod_perl and I get the dreaded segmentation fault. Not
> > > just once; it happens over and over again, almost every time I try to run it.
> > >
> > > The only hint I have is that perhaps it is an out-of-memory problem. This would
> > > make sense: I have a program that consumes a lot of memory (the search), plus
> > > XML::Parser trying to pack everything into 'Tree' style. But it doesn't fail
> > > during that - it fails when I start working on the tree with a couple of subs.
> > > If it WAS a memory problem, shouldn't it fail during the XML parsing or
> > > during the searching? How do I check how much memory the command-line
> > > script uses?
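
On the how-much-memory question: from the command line the quickest check on
Linux is to have the script read its own /proc entry at the interesting points
(the label strings are made up; on other platforms, ps or the GTop module
reports the same numbers):

use strict;

# Print this process's current memory usage (Linux-specific: reads /proc).
sub report_memory {
    my ($label) = @_;
    open my $fh, '<', "/proc/$$/status" or return;
    while (my $line = <$fh>) {
        print STDERR "$label  $line" if $line =~ /^Vm(Size|RSS|Data):/;
    }
    close $fh;
}

report_memory('before parse:');
# ... run the search and build the XML::Parser tree here ...
report_memory('after parse: ');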
> > >
> > > What do I do to find out the problem, short of putting a "warn"
> > > between every line of code? How do I use the core dump - if it is useful at
> > > all? What rotting nuggets of useful knowledge are lodged in the wisdom
> > > teeth of the gurus, and could they use the toothbrush of the transfer of
> > > knowledge to reveal those morsels to me? :-)
> > >
> > > --
> > > Jonathan Gardner
> > > gardner at sounddomain.com
> > > (425)820-2244 x123
> > >
> > >


 - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
     POST TO: spug-list at pm.org       PROBLEMS: owner-spug-list at pm.org
      Subscriptions; Email to majordomo at pm.org:  ACTION  LIST  EMAIL
  Replace ACTION by subscribe or unsubscribe, EMAIL by your Email-address
 For daily traffic, use spug-list for LIST ;  for weekly, spug-list-digest
  Seattle Perl Users Group (SPUG) Home Page: http://www.halcyon.com/spug/




