SPUG: Web interface problem
dragon at dreamhaven.net
Wed Mar 6 12:39:12 CST 2002
I'm stumped on how to proceed with a page for a web interface I'm creating
for my web site. I want to be able to download a typically large file from
another web server and store it on the local server. By large, I mean an
average of 20 MB or so, but up to 100 MB, at a typical rate of 30 KB/s.
Here's how I expect it to work:
* User locates the file on another site, comes to the interface, and
commands the file be downloaded.
* Script uses either LWP or `wget` to fetch the file. <This is where the
problem is.>
* Script moves on to let the user name the file, etc.
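For the fetch step, here's a minimal LWP sketch (the URL and destination
path are made up for illustration). The `:content_file` option streams the
response body straight to disk, so even a 100 MB file never has to fit in
memory:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $url  = 'http://www.example.com/big-file.zip';  # user-supplied in practice
my $dest = '/tmp/big-file.zip';                    # where to store it locally

my $ua = LWP::UserAgent->new(timeout => 60);

# ':content_file' writes the body directly to $dest instead of
# buffering it in the response object.
my $res = $ua->get($url, ':content_file' => $dest);

if ($res->is_success) {
    print "Saved ", -s $dest, " bytes to $dest\n";
} else {
    warn "Download failed: ", $res->status_line, "\n";
}
```

Note this still blocks for the whole transfer, which is exactly where the
timeout problem below kicks in.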
Problem: downloading a file usually takes more than two minutes. Even if
the client doesn't time out by then, the web server will terminate the
process anyway (the CGI time limit in Roxen). I could simply raise this
limit, but I'm looking for a smarter solution.
I thought about creating a child process, but I know very little about
fork() or what happens to the parent in this scenario. I also need a way,
assuming the child manages to download the file, to notify the user that
the download is complete, so he/she can continue.
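One common shape for this (a sketch, with hypothetical file names): the CGI
script forks, the child detaches from the web server with setsid() so the
CGI time limit can't kill it, does the long download, and writes a small
status file when it finishes; the parent returns immediately and a separate
"status" page just checks whether that file exists yet:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Hypothetical status file the child uses to report completion.
my $status_file = "/tmp/download.$$.status";

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # --- child: does the slow work ---
    # Start a new session and drop the inherited handles, so the web
    # server tearing down the request doesn't take the child with it.
    setsid();
    close STDIN; close STDOUT; close STDERR;

    # ... the long-running fetch (LWP or system 'wget') would go here ...

    open my $fh, '>', $status_file or exit 1;
    print $fh "done\n";
    close $fh;
    exit 0;
}

# --- parent: the CGI script, returns right away ---
print "Download started (child pid $pid).\n";
print "Reload the status page to see when it finishes.\n";
```

The parent can then exit normally; a second CGI page that does
`-e $status_file` gives the user their "download complete" notification
when they reload. (Some setups double-fork so init reaps the child and no
zombie is left behind, but the single fork above shows the basic idea.)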
If this still makes any sense, please help. Thanks... ^^
_ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _
-- Mikel Tidwell President: RPGamer -- http://www.rpgamer.com/
MSNM: FireMyst Personal Home Page -- http://dragon.rpgamer.com/
- _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ - _ -
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
POST TO: spug-list at pm.org PROBLEMS: owner-spug-list at pm.org
Subscriptions; Email to majordomo at pm.org: ACTION LIST EMAIL
Replace ACTION by subscribe or unsubscribe, EMAIL by your Email-address
For daily traffic, use spug-list for LIST ; for weekly, spug-list-digest
Seattle Perl Users Group (SPUG) Home Page: http://seattleperl.org