FW: Lynx???
Joshua Keroes
joshua_keroes at eli.net
Tue Jul 9 16:37:33 CDT 2002
If that's all you're doing, a shell script will suffice:
#!/bin/sh
lynx -dump "http://www.moreover.com/cgi-local/page?feed=239&o=xml" > news.xml
If you intend to do something else with the data, Perl is handy. Here's
a one-liner that does the same thing, more or less. This one accepts the URL as an
argument and lets you redirect the output to a file of your choosing:
perl -MLWP::Simple -e "getprint(shift)" \
"http://www.moreover.com/cgi-local/page?feed=239&o=xml" \
> news.xml
(N.B. the URL contains an '&' and so must be quoted. Unquoted, the shell
treats the '&' as a background operator: it truncates the command at
'feed=239', runs it in the background, and treats 'o=xml' as a separate
word. That is why your system() call produces a zero-length news.xml.)
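You can see the quoting issue without touching the network at all; this small sketch just prints the URL to show that quoting keeps the '&' from being interpreted by the shell (the URL is the one from the original post):

```shell
#!/bin/sh
# The '&' in the URL is a shell metacharacter: unquoted, it would end
# the command (backgrounding it) and leave 'o=xml' as a stray word, so
# lynx would never see the full URL. Quoting passes it through intact.
url='http://www.moreover.com/cgi-local/page?feed=239&o=xml'
printf '%s\n' "$url"
```

The same rule applies inside Perl's system(): the string is handed to /bin/sh, so the '&' needs quoting (or escaping) there too.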
-Joshua
On (Tue, Jul 09 14:29), Aaron Kilbey wrote:
> Aloha everybody -
>
> I'm attempting to issue an http request through lynx to save an XML file
> to a local server.
> Here's the code:
>
> ------------------------------------------------------------------------
> newsdump.pl
> ------------------------------------------------------------------------
> #!/usr/bin/perl
>
> system("lynx -dump http://www.moreover.com/cgi-local/page?feed=239&o=xml
> > news.xml");
> ------------------------------------------------------------------------
>
>
> 1. When I do: "bash$ perl newsdump.pl" I get a zero length news.xml
>
> 2. When I do: "bash$ perl newsdump.pl > news.xml" I get a news.xml that
> is a numbered list of the news links which is not XML.
>
> 3. When I run http://www.moreover.com/cgi-local/page?feed=239&o=xml in a
> web browser, I get the XML document.
>
> I need to have the ability to call this script to automatically retrieve
> and save the new file. Please help.
>
>
> -Aaron Kilbey
>
> TIMTOWTDI
More information about the Pdx-pm-list mailing list