SSH, FTP, HTTP, etc all from one

Scott Penrose scottp at
Fri May 17 22:35:01 CDT 2002


Hey Dudes,

I often find a need for a single URL format for downloading files. 
Think of wget, where you can type things like
	wget http://user:password@host/directory/file.ext
	wget ftp://user:password@host/directory/file.ext

Now add to that the ability to do things like
	wget ssh://user:password@host/directory/file.ext

and I am sure there are more...
	wget dbi://user:password@dsn/query
or something inventive

The thing is that parsing the URL and then turning it into either a get 
with LWP, an ftp query with Net::FTP, or even Net::SSH::Perl is fine, 
but I don't want to have to re-implement it every time. Does anyone know 
of a module that does any of this generically?
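To make the idea concrete, here is a minimal core-Perl sketch of the parse-then-dispatch approach. The regex, the parse_url/myget names and the %fetch table are all mine, not an existing module, and the regex is a rough cut rather than a full RFC 2396 parser (the URI module does this properly):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Split scheme://user:password@host/path into its parts.
sub parse_url {
    my ($url) = @_;
    my ($scheme, $user, $pass, $host, $path) =
        $url =~ m{^(\w+)://(?:([^:@/]+)(?::([^@/]*))?@)?([^/]+)(/.*)?$}
        or return;
    return {
        scheme => $scheme,
        user   => $user,
        pass   => $pass,
        host   => $host,
        path   => defined $path ? $path : '/',
    };
}

# Dispatch table: scheme => handler.  The handlers here are stubs;
# real ones would wrap LWP, Net::FTP, Net::SSH::Perl, DBI, etc.
my %fetch = (
    http => sub { die "http handler not implemented\n" },
    ftp  => sub { die "ftp handler not implemented\n" },
);

# One front door: parse the URL, look up the scheme, hand off.
sub myget {
    my ($url, $fh) = @_;
    my $u = parse_url($url) or die "can't parse $url\n";
    my $handler = $fetch{ $u->{scheme} }
        or die "no handler for scheme $u->{scheme}\n";
    return $handler->($u, $fh);
}
```

Adding a new scheme is then just one more entry in the table, which is the part I don't want to re-implement every time.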

I would prefer to separate out the URL from the login and password.

Something like...

	use MyGet;
	use IO::File;
	my $f = new IO::File "> /tmp/outfile.$$";

	my $g = MyGet->new();
	$g->get($url, $f);	# hypothetical interface; credentials kept separate

Is LWP the right place to start?
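For the http/ftp half, LWP looks like the right start: LWP::UserAgent picks its protocol handler from the URL scheme (via LWP::Protocol), so the same code path serves http:// and ftp:// URLs. A sketch, assuming libwww-perl is installed; fetch_to_file is my name, not part of LWP:

```perl
use strict;
use warnings;
use LWP::UserAgent;

# One routine for any scheme LWP understands (http, ftp, file, ...).
# The :content_file option streams the response body straight to disk.
sub fetch_to_file {
    my ($url, $path) = @_;
    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get($url, ':content_file' => $path);
    die $res->status_line, "\n" unless $res->is_success;
    return $path;
}

# e.g. fetch_to_file('ftp://user:password@host/directory/file.ext',
#                    "/tmp/outfile.$$");
```

That still leaves ssh:// and dbi:// to bolt on, which is where a dispatch layer on top would come in.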

Many years ago now I wrote 'fcp'. It is essentially the equivalent of 
'rcp' and 'scp' but for FTP. Meaning I can basically do:

	fcp *.html scottp at

and it uploads the file. Thus I could download and upload, and do things 
like

	fcp scottp at*.html scottp at

Things like that - but it was REALLY hard not only to parse the URL 
correctly but also to work out things like recursive directories etc. 
wget is great, but it is download only, and only ftp and http.

Any ideas? Do you find that you do this a lot yourself?

---
Scott Penrose
Open source and Linux Developer
scottp at

