LWP::UserAgent and referer?
Keary Suska
aksuska at webflyer.com
Mon Nov 26 16:19:27 CST 2001
Yes, you can have them automatically stored in a file. I don't recall the
syntax offhand, but http://www.perldoc.com/ has the docs on it, or you can
run: $ perldoc HTTP::Cookies
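Something like the following should be close — a minimal sketch, and the
file name "cookies.txt" is just an example; the jar is loaded from and saved
back to that file automatically, and scan() will walk whatever cookies the
server has set:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

# Cookies are reloaded from and autosaved back to cookies.txt.
my $jar = HTTP::Cookies->new(
    file     => "cookies.txt",
    autosave => 1,
);

my $ua = LWP::UserAgent->new;
$ua->cookie_jar($jar);

# After making requests with $ua, scan() visits every cookie in
# the jar; the callback gets version, key, value, path, domain, etc.
$jar->scan( sub {
    my ( $version, $key, $val, $path, $domain ) = @_;
    print "$domain$path: $key=$val\n";
} );
```

You can also just print $jar->as_string to dump the whole jar at once.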
A perhaps simpler way is getting the complete response using
$response->as_string. This will include the headers and all, and you can
inspect it for Set-Cookie lines. Like:
# $response is the acquired response object
$page = $response->as_string;
print join( "\n", $page =~ /^(Set-Cookie:.*)/img ); # /m so ^ matches each line
Should do the trick.
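For what it's worth, you can also skip the regex and ask the response object
for the headers directly; here I build a response by hand just to show the
accessor (the cookie values are made up), but normally $response would come
back from $ua->request(...):

```perl
use strict;
use warnings;
use HTTP::Response;

# A hand-built response standing in for one returned by LWP.
my $response = HTTP::Response->new( 200, "OK" );
$response->push_header( "Set-Cookie" => "session=abc123; path=/" );
$response->push_header( "Set-Cookie" => "pref=blue; path=/" );

# In list context, header() returns every Set-Cookie value separately.
my @cookies = $response->header("Set-Cookie");
print "$_\n" for @cookies;
```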
Keary Suska
Esoteritech, Inc.
"Leveraging Open Source for a better Internet"
> From: "Robert L. Harris" <Robert.L.Harris at rdlg.net>
> Date: Mon, 26 Nov 2001 13:58:52 -0700
> To: Keary Suska <aksuska at webflyer.com>
> Cc: John Evans <evansj at kilnar.com>, "Robert L. Harris"
> <Robert.L.Harris at rdlg.net>, Pikes-Peak Perl Mongers
> <pikes-peak-pm-list at happyfunball.pm.org>
> Subject: Re: LWP::UserAgent and referer?
>
>
> Is there a way to find out what cookies it's setting with HTTP::Cookies?
> Got a good example of some code I can poke?
>
>
> Thus spake Keary Suska (aksuska at webflyer.com):
>
>> HTTP::Cookies handles cookies rather easily and seamlessly.
>>
>> Keary Suska
>> Esoteritech, Inc.
>> "Leveraging Open Source for a better Internet"
>>
>>> From: John Evans <evansj at kilnar.com>
>>> Date: Mon, 26 Nov 2001 15:19:07 -0500 (EST)
>>> To: "Robert L. Harris" <Robert.L.Harris at rdlg.net>
>>> Cc: Pikes-Peak Perl Mongers <pikes-peak-pm-list at happyfunball.pm.org>
>>> Subject: Re: LWP::UserAgent and referer?
>>>
>>> On Mon, 26 Nov 2001, Robert L. Harris wrote:
>>>
>>>> Hmm, I did this one. Looks nice. Problem is now I'm getting a 404.
>>>> I put a few prints in. If I go to the index.html and select "save as"
>>>> it downloads fine. If I go directly to the link I'm trying to get I
>>>> get a 403, denied. When I run this segment of code I get a 404. If I
>>>> check the URL I'm accessing in "$url" below against my save as,
>>>> they're identical.
>>>
>>>
>>> Keary already suggested that http://www.foo.com/ and
>>> http://www.foo.com/index.html are not the same and that you may need to
>>> use HTTP 1.1 instead of HTTP 1.0. He also mentioned cookies and I'm
>>> starting to wonder if that's where the problem lies.
>>>
>>> I know that you can emulate cookies with Perl, but I'm not sure what
>>> module does it or even where to start looking for something like that.
>>> Check your system to see if you have any cookies from the host that you
>>> are contacting for the patches and see if they have set one.
>>>
>>>
>>> --
>>> John Evans
>>> http://evansj.kilnar.com/
>>>
>>> -----BEGIN GEEK CODE BLOCK-----
>>> Version: 3.1
>>> GCS d- s++:- a- C+++>++++ ULSB++++$ P+++$ L++++$
>>> E--- W++ N+ o? K? w O- M V PS+ !PE Y+ PGP t(--) 5-- X++(+++)
>>> R+++ tv+ b+++(++++) DI+++ D++>+++ G+ e h--- r+++ y+++
>>> ------END GEEK CODE BLOCK------
>>>
>>>
>
>
>
> :wq!
> ---------------------------------------------------------------------------
> Robert L. Harris | Micros~1 :
> Senior System Engineer | For when quality, reliability
> at RnD Consulting | and security just aren't
> \_ that important!
> DISCLAIMER:
> These are MY OPINIONS ALONE. I speak for no-one else.
> FYI:
> perl -e 'print $i=pack(c5,(41*2),sqrt(7056),(unpack(c,H)-2),oct(115),10);'
>
>