SPUG: Web interface problem

Jim Ludwig jsl at blarg.net
Wed Mar 6 15:03:51 CST 2002


Additional note on this:

  system( "/usr/bin/at -q b -f " . $global{Command} . " now" );

$global{Command} names an on-the-fly constructed
shell or Perl script, saved out to a file (which
is what `at -f` expects).
Where I've run into this, the processing which
needs to occur inside that script could take
anywhere from 10 seconds to 10 hours.

So, for the problem in question (large downloads),
one could capture the user's email address and
have the on-the-fly script, as its last step, send
the user an email once the download is complete.
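A minimal sketch of that approach, assuming a typical Unix box with
wget and mail available; the paths, URL, and address are illustrative,
not from the original post:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Illustrative values -- in the real CGI these would come from the form.
my $email   = 'user@example.com';
my $url     = 'http://example.com/big.tar.gz';
my $outfile = '/tmp/big.tar.gz';
my $script  = "/tmp/fetch-$$.sh";

# Write the on-the-fly script: fetch the file, then notify the user.
open my $fh, '>', $script or die "open $script: $!";
print $fh <<"EOS";
#!/bin/sh
wget -q -O '$outfile' '$url'
echo "Your download of $url is complete." | mail -s 'Download done' $email
EOS
close $fh;
chmod 0700, $script;

# Hand the script to at(1) as a batch job.  Using the list form of
# system() avoids shell interpolation of the filename, and checking
# the exit status catches a failed submission.
system('/usr/bin/at', '-q', 'b', '-f', $script, 'now') == 0
    or warn "at submission failed: $?";
```

The list form of system() is a small hardening over the string
concatenation in the one-liner above, since the filename never passes
through a shell.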

-----Original Message-----
From: Jim Ludwig <jsl at blarg.net>
To: Seattle Perl Users Group <spug-list at pm.org>
Subject: Re: SPUG: Web interface problem

When I've run into this problem in the past, I
also could only find a reasonable solution by
handing the task off to the OS as a batch job:

system( "/usr/bin/at -q b -f " . $global{Command} . " now" );

-----Original Message-----
Date: Wed, 6 Mar 2002 11:26:49 -0800 (PST)
From: Daryn Nakhuda <daryn at marinated.org>
To: Mikel Tidwell <dragon at dreamhaven.net>
cc: <spug-list at pm.org>
Subject: Re: SPUG: Web interface problem


I'd love to hear a good way to do this as well.  Fork seemed to keep the 
little globe spinning in IE, but maybe I was doing it wrong.  My hack to 
make it seem okay to the end user was just to schedule an at job (+10 
seconds) from the CGI, then tell them the job was being processed.  I then 
had another page where you could check the status of jobs (the script 
being run by at updated the database as it ran).
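A sketch of that two-part scheme, with a status file standing in for
the database mentioned above (all names are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# One status record per job; a database row would serve the same role.
my $status_file = "/tmp/job-status-$$";

sub set_status {
    my ($status) = @_;
    open my $fh, '>', $status_file or die "open $status_file: $!";
    print $fh "$status\n";
    close $fh;
}

sub get_status {
    open my $fh, '<', $status_file or return 'unknown';
    chomp(my $status = <$fh>);
    close $fh;
    return $status;
}

# In the CGI: record the job as pending, schedule the worker, and
# return to the browser immediately.  (The at call is shown commented
# out; the worker script would call set_status() as it ran.)
set_status('pending');
# system('/usr/bin/at', '-q', 'b', '-f', '/tmp/worker.sh', 'now');
print "Your download is being processed.\n";

# The separate status page just reads the record back:
print "Current status: ", get_status(), "\n";
```

The point of the split is that the CGI finishes in well under the
server's time limit, while the slow work happens entirely outside the
request.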



On Wed, 6 Mar 2002, Mikel Tidwell wrote:

> Hi everyone,
> 
> I'm stumped on how to proceed on a page for a web interface I'm creating
> for my web site.  I want to be able to download a typically large file off
> another web server, and store it on the local server.  By large, I mean an
> average of 20 MB or so, but up to 100 MB at a typical rate of 30K/s.
> 
> Here's how I expect it to work:
> 
>  * User locates the file on another site, comes to the interface, and
> commands the file be downloaded.
>  * Script uses either LWP or `wget` to fetch file. <Problem>
>  * Script moves on to allow the person to name the file, etc.
> 
> Problem: Downloading a file usually takes more than two minutes.  If the
> client doesn't time out by then, the web server will terminate the process
> anyway (cgi time limit in Roxen).  I could simply change this limit, but
> I'm looking for a smarter solution.
> 
> I thought about creating a child process, but I know very little about
> fork(), or what happens to the parent in this scenario.  There is also
> the issue of notifying the user once the child has finished downloading
> the file, so he/she can continue.
> 
> If this still makes any sense, please help.  Thanks... ^^
> 
>  _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _
>   -- Mikel Tidwell   President: RPGamer -- http://www.rpgamer.com/
>    MSNM: FireMyst    Personal Home Page -- http://dragon.rpgamer.com/
>  - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ -
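For the fork() question in the quoted message, a minimal sketch (all
names are illustrative; this is one common pattern, not Mikel's actual
code):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# The CGI parent answers the browser immediately while a child does
# the slow download.  In a real CGI the child should also call
# POSIX::setsid() and reopen STDIN/STDOUT/STDERR on /dev/null so the
# web server stops waiting on it -- an inherited open connection is
# usually why the browser's "globe" keeps spinning after a bare fork.

defined(my $pid = fork()) or die "fork failed: $!";

if ($pid == 0) {
    # Child: the long-running fetch would go here, e.g.
    # LWP::Simple::getstore($url, $file), and on completion it would
    # update a status record or mail the user.  Simulated as a no-op.
    exit 0;
}

# Parent (the CGI): tell the user and finish.  A real CGI would exit
# here without waiting; waitpid() is included only so this sketch
# runs cleanly on its own.
print "Download started; check the status page later.\n";
waitpid($pid, 0);
```

With the child fully detached, the parent's response reaches the
browser right away, and completion is reported out of band (status
page or email) as discussed earlier in the thread.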

 - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
     POST TO: spug-list at pm.org       PROBLEMS: owner-spug-list at pm.org
      Subscriptions; Email to majordomo at pm.org:  ACTION  LIST  EMAIL
  Replace ACTION by subscribe or unsubscribe, EMAIL by your Email-address
 For daily traffic, use spug-list for LIST ;  for weekly, spug-list-digest
     Seattle Perl Users Group (SPUG) Home Page: http://seattleperl.org




