From dan at concolor.org Fri Aug 1 09:08:49 2003 From: dan at concolor.org (Dan Sabath) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Open Sauce Lunch Today Message-ID: There will be an open sauce lunch today in the International District at 12pm. The Blue Angels will be performing in the skies over town at that time. I-90 will be closed, so expect heavy traffic. Sign up at: http://spugwiki.perlocity.org/index.cgi?FriAug01ChineseInInternationalDistrict -dan Open Sauce Lunch Chinese (Dim Sum) food in International District Friday, 8/1, 12:00pm House of Hong 409 8th Ave S Seattle, Washington 98104 (206) 622-7997 MAP Website Look for a guy wearing a Hawaiian shirt and (no) goatee; that will be this lunch's Convener, Dan Sabath. From tim at consultix-inc.com Fri Aug 1 17:07:37 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: pod2man: changing point size? Message-ID: <20030801150737.A6061@timji.consultix-inc.com> It seems hard to imagine, but there doesn't seem to be any way to specify what point-size and vertical-spacing you want when using pod2man to convert POD format documentation into troff. There's no reason why this should be disallowed, because POD documents don't have page-breaks in the first place, and therefore don't need their page layout preserved by regulating the amount of text per page. I can get the (slightly larger) size settings I want in the resulting PS (via groff -Tps file.man>file.ps) or PDF (via pod2pdf file.ps>file.pdf) files, but only by doing a lot of editing of file.man, to insert

.ps 13
.vs 15

in many places. I could certainly change the macro file that's being sourced, but this seems like a problem that somebody should already have solved -- many years ago! 8-} Anybody have any suggestions?
-Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From devnull at devnullsoftware.com Fri Aug 1 16:49:14 2003 From: devnull at devnullsoftware.com (Lee Wilson) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: RE / Split Question In-Reply-To: Message-ID: On Wed, 30 Jul 2003, Orr, Chuck (NOC) wrote: > Thank You all for the excellent responses. > > I tried several of these because I felt guilty wasting all the thought > and code. The solution that seems to fit my life best is the one from > Lee Wilson. Lee, thanks a million. If you're ever in the Bothell area, > I owe you a lunch. I appreciate everyone's efforts on this one. No problems. I enjoyed the momentary distraction from a boring day =) Personally, I liked many of the other solutions a lot better. I consider myself pretty good at regex, but I learned a lot of new things from the other solutions. I think people should post more problems like this. The solutions are interesting because there are so MANY of them, all different. ============================================================================== Lee Wilson - INTP http://www.devnullsoftware.com Software Developer / RealNetworks http://www.realarcade.com ============================================================================== There are 10 kinds of people in the world: The people who understand ternary, The people who don't, but care, and the people who don't understand or care. 
============================================================================== From christopher.w.cantrall at boeing.com Fri Aug 1 17:29:09 2003 From: christopher.w.cantrall at boeing.com (Cantrall, Christopher W) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: pod2man: changing point size? Message-ID: > groff -Tps file.man>file.ps So you're the one making the TPS reports. We're putting cover-sheets on those reports now. Did you get that memo? Chris PS Yes, I've been watching Office Space ( http://us.imdb.com/Title?0151804 ) - a great movie. ____________________________________________ Chris Cantrall Structural Engineer, 767 Fuselage, Boeing Christopher.W.Cantrall@Boeing.com chris@cantrall.org http://perlmonks.org/index.pl?node=Louis_Wu > -----Original Message----- > From: Tim Maher [mailto:tim@consultix-inc.com] > Sent: Friday, August 01, 2003 3:08 PM > To: SPUG Announcements > Subject: SPUG: pod2man: changing point size? > > > It seems hard to imagine, but there doesn't seem to be any way to > specify what point-size and vertical-spacing you want when using > pod2man to convert POD format documentation into troff. There's > no reason why this should be disallowed, because POD documents > don't have page-breaks in the first place, and therefore don't > need preservation by regulating the amount of text per page. > > I can get the (slightly larger) size settings I want in the > resulting PS (via groff -Tps file.man>file.ps) or PDF > (via pod2pdf file.ps>file.pdf) files, but only by doing a lot of > editing of file.man, to insert > .ps 13 > .vs 15 > in many places. > > I could certainly change the macro file that's being sourced, but > this seems like a problem that somebody should already have solved -- > many years ago! 8-} > > Anybody have any suggestions? 
> > -Tim > *------------------------------------------------------------* > | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | > | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | > *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* > | Watch for my Book: "Minimal Perl for Shell Programmers" | > *------------------------------------------------------------* > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org Wiki: spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: www.seattleperl.org > > From tim at consultix-inc.com Fri Aug 1 17:39:22 2003 From: tim at consultix-inc.com (SPUG-list-owner) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: pod2man: changing point size? In-Reply-To: References: Message-ID: <20030801153922.A6153@timji.consultix-inc.com> On Fri, Aug 01, 2003 at 03:29:09PM -0700, Cantrall, Christopher W wrote: > > groff -Tps file.man>file.ps > > So you're the one making the TPS reports. We're putting > cover-sheets on those reports now. Did you get that memo? -Tim > > Chris > > PS > Yes, I've been watching Office Space ( > http://us.imdb.com/Title?0151804 ) - a great movie. Chris, I'm sure that remark is very funny, but alas, I don't get it. Is it an Office Space thing? 8-} -Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From christopher.w.cantrall at boeing.com Fri Aug 1 17:47:35 2003 From: christopher.w.cantrall at boeing.com (Cantrall, Christopher W) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: pod2man: changing point size? 
Message-ID: > alas, I don't get it. > Is it an Office Space thing? 8-} Yes. It's a minor plot point. Note to self: Movie references should be restricted to the Geek Canon: Star Wars/Trek, Matrix, Buck Rogers, Pink Panther, Mary Poppins, ... Errrm, that's not right. Where's my Geek Certification? It's got the "cheat codes" on the back. ____________________________________________ Chris Cantrall Structural Engineer, 767 Fuselage, Boeing Christopher.W.Cantrall@Boeing.com chris@cantrall.org http://perlmonks.org/index.pl?node=Louis_Wu > -----Original Message----- > From: SPUG-list-owner [mailto:tim@consultix-inc.com] > Sent: Friday, August 01, 2003 3:39 PM > To: Cantrall, Christopher W > Cc: Tim Maher; SPUG Announcements > Subject: Re: SPUG: pod2man: changing point size? > > > On Fri, Aug 01, 2003 at 03:29:09PM -0700, Cantrall, > Christopher W wrote: > > > groff -Tps file.man>file.ps > > > > So you're the one making the TPS reports. We're putting > > cover-sheets on those reports now. Did you get that memo? > -Tim > > > > Chris > > > > PS > > Yes, I've been watching Office Space ( > > http://us.imdb.com/Title?0151804 ) - a great movie. > > Chris, I'm sure that remark is very funny, but alas, I don't get it. > Is it an Office Space thing? 8-} > > -Tim > *------------------------------------------------------------* > | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | > | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | > *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* > | Watch for my Book: "Minimal Perl for Shell Programmers" | > *------------------------------------------------------------* > From ben at reser.org Fri Aug 1 17:59:20 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: pod2man: changing point size? 
In-Reply-To: References: Message-ID: <20030801225920.GI1862@titanium.brain.org> On Fri, Aug 01, 2003 at 03:47:35PM -0700, Cantrall, Christopher W wrote: > Note to self: Movie references should be restricted to the Geek Canon: > Star Wars/Trek, Matrix, Buck Rogers, Pink Panther, Mary Poppins, ... > Errrm, that' not right. Where's my Geek Certification, it' got the > "cheat codes" on the back. As far as I knew Office Space was in the geek canon now. I'm not sure how Tim missed it. -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From tim at consultix-inc.com Fri Aug 1 19:02:19 2003 From: tim at consultix-inc.com (SPUG-list-owner) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: pod2man: changing point size? In-Reply-To: <20030801150737.A6061@timji.consultix-inc.com> References: <20030801150737.A6061@timji.consultix-inc.com> Message-ID: <20030801170219.A6428@timji.consultix-inc.com> On Fri, Aug 01, 2003 at 03:07:37PM -0700, Tim Maher wrote: I found a fix; see below. -Tim > It seems hard to imagine, but there doesn't seem to be any way to > specify what point-size and vertical-spacing you want when using > pod2man to convert POD format documentation into troff. There's > no reason why this should be disallowed, because POD documents > don't have page-breaks in the first place, and therefore don't > need preservation by regulating the amount of text per page. > > I can get the (slightly larger) size settings I want in the > resulting PS (via groff -Tps file.man>file.ps) or PDF > (via pod2pdf file.ps>file.pdf) files, but only by doing a lot of > editing of file.man, to insert in many places. 
> .ps 13
> .vs 15

I figured it out; in addition to issuing the directives, I have to set the number registers too, because changes made via directives are undone on the next .RT (reset) call, used by many macros, whereas values established through the register settings are what subsequent resets restore, so they remain in effect. So here's the solution:

=begin man

.ps 13
.nr PS 13p
.vs 15
.nr VS 15p

=end man

Believe it or not, I like troff! I just implemented a table-formatting mechanism for Magicpoint using "tbl | groff -mm" to get the columns aligned, and headings spanned, etc. 8-} -Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From andrew at sweger.net Fri Aug 1 20:36:48 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Office Space, the movie Message-ID: Maybe, if we ever become desperate for a meeting topic, we can watch Office Space on the projector screen. 
I can't believe Tim hasn't seen this > cult classic. I just need to figure out how to wire the line out on my Mac > to the speaker system in the auditorium and we'll have movie night, SPUG > style. I sure haven't seen it. And apparently I should feel deprived about that! (But I *do* have Ravi Shankar's autograph on the back of my VietNam-War-era draft card! ) My guess is that it wasn't a Hollywood or Merchant/Ivory movie, which are the kinds I usually see, but perhaps just a Slashdot-distribution. Am I on the right track? -Tim > > -- > Andrew B. Sweger -- The great thing about multitasking is that several > things can go wrong at once. > > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org Wiki: spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: www.seattleperl.org -- -Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From creede at penguinsinthenight.com Sat Aug 2 00:44:25 2003 From: creede at penguinsinthenight.com (Creede Lambard) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Office Space, the movie In-Reply-To: <20030801203745.A7031@timji.consultix-inc.com> References: <20030801203745.A7031@timji.consultix-inc.com> Message-ID: <1059803056.8145.6.camel@boris> On Fri, 2003-08-01 at 20:37, SPUG-list-owner wrote: > My guess is that it wasn't a Hollywood or Merchant/Ivory movie, which > are the kinds I usually see, but perhaps just a Slashdot-distribution. > Am I on the right track? > No, it was a Hollywood movie all right. 20th Century Fox. 
http://us.imdb.com/Title?0151804 I get the definite impression that, like many cult favorites, it didn't get a lot of exposure the first time around, but its reputation has grown since then. -- * .~. `( ---------------------------------------------------------- ` / V \ . Creede Lambard : Never rush a miracle man. /( )\ creede@penguinsinthenight.com : You get rotten miracles. ^^-^^ ---------------------------------------------------------- GPG key at http://www.penguinsinthenight.com/creede_public_key.asc -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part Url : http://mail.pm.org/pipermail/spug-list/attachments/20030802/1ea4f3c5/attachment.bin From kirbyk at idiom.com Sat Aug 2 01:12:36 2003 From: kirbyk at idiom.com (Kirby Krueger) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Office Space, the movie In-Reply-To: <1059803056.8145.6.camel@boris> Message-ID: > No, it was a Hollywood movie all right. 20th Century Fox. > > http://us.imdb.com/Title?0151804 > > I get the definite impression that, like many cult favorites, it didn't > get a lot of exposure the first time around, but its reputation has > grown since then. > I think a lot of folks were turned off by the promos, which seemed to focus on the fact that it was from the creator of Beavis and Butthead. While this certainly appeals to a certain crowd, that's not exactly a selling point to an older or more intellectual audience, which covers a lot of the geek crowd. But, this movie turns out to be a far cry from a fart-joke on wheels - really, there's very little of the lowbrow B&B-style humor. Instead, you get something that's more akin to Dilbert at its peak - the eerie combination of absurdity and familiarity to anyone who's ever worked in something similar to a software house. 
It manages to pack in a lot of memorable characters and moments, and isn't the one-note show I'd have expected. Worth seeing for anyone here just to pick up on a common source of pop-culture references, and almost everyone I've talked to actually likes the film. Er, not to overhype it or anything. It's not The Princess Bride or anything. It did indeed bomb at the box office, with only 4.2M on its opening weekend, and just over 10M overall. That's terrible. Came out in February 1999. The trailer is available on IMDB, and watching it, it's like I'm watching all the _worst_ scenes of the movie, how bizarre. But, at least it's found its audience a bit later in life. -- Kirby From perl at pryan.org Sat Aug 2 01:33:55 2003 From: perl at pryan.org (Patrick Ryan) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Office Space, the movie In-Reply-To: References: <1059803056.8145.6.camel@boris> Message-ID: <20030802063355.GJ17904@stingray.velvet> On Fri, Aug 01, 2003 at 11:12:36PM -0700, Kirby Krueger wrote: > It manages to pack in a lot of memorable characters and moments, and > isn't the one-note show I'd have expected. Worth seeing for anyone > here just to pick up on a common source of pop-culture references, and > almost everyone I've talked to actually likes the film. It is a great movie, mainly because it is so easy to identify with, but also because elements from the movie have become part of geek culture. It has had such an impact that Swingline brought back its discontinued red staplers just because of the demand caused by the movie. That's pretty impressive if you ask me. I'm not sure if they made that exact stapler in red and then later discontinued it, but Swingline has made red heavy metal staplers in the past. 
http://www.mdac.net/weeklyphoto/archives/00000025.shtml http://www.virtualstapler.com/office_space/ http://www.swingline.com/html/1695.html Buy one if you like (I haven't): http://www.techcomedy.com/store/products/show_product.php?product_type=Red+Staplers http://www.thinkgeek.com/cubegoodies/toys/61b7/ - Patrick From jay at scherrer.com Sun Aug 3 12:07:38 2003 From: jay at scherrer.com (Jay Scherrer) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: CGI.pm cross site scripting vulnerability Message-ID: <200308031007.38405.jay@scherrer.com> Just in case you haven't heard. Security advisory for Perl SUMMARY : CGI.pm cross site scripting vulnerability DATE : 2003-07-29 14:53:00 more Info: Jay -- Personalized e-mail and domain names: Jay wrote: >Just in case you haven't heard. > >Security advisory for Perl >SUMMARY : CGI.pm cross site scripting vulnerability >DATE : 2003-07-29 14:53:00 >more Info: I wonder what they changed the default behavior to? Somehow, I always figured that sort of behavior was "by design". Let's take a close look at the scenario here: 1. Someone creates a page with malware. The malicious tag could execute a script, directly. Almost any page that allows (ab)users to create HTMLized markup is by definition such an accident waiting to happen. (Wikis, anyone?). It might not even need to be a tag, depending on what kind of markup the site allows, JavaScript (or any other kind of client script) could be entered and run in the browser, simply by loading the page. Let's take that a final step further, and what if the site post-processes the content on the server, think maybe I can get a little Perl or PHP in there? Indeed. 2. User submits a malicious URL, from a malicious page. Why? I guess people will just click on any old link, anywhere. Yup, no guessing about it, they will! The (ab)user could have manually created a malicious URL. No, that would never happen.. yeah, right! 3. The server gets the URL and doesn't validate the parameters. 
That appeals to SQO cowboys. Doesn't appeal to me. Hey, maybe they do validate, but not particularly closely. Taint? Formmail? 4. The server creates a page containing the payload from step 1 and sends it back to the user. The subtle issue here is that the page comes back not from the attacker's site (site A) but from some other site (site B). Furthermore, the user "trusts" site B, but not site A. The trust is the issue here: consequently, they allow site B to do things they wouldn't allow site A to do. The user is trusting a site which allows, intentionally or otherwise, unvalidated input to become part of its content. That's really smart, I'm sure we agree. Or, could it be that site B is using cookies or other client fluff in maybe ways that it shouldn't oughta? Is there some dirty underwear in those cookies, maybe, that site B doesn't want site A (or some other site C) to get ahold of? Other than somebody legitimate like doubleclick, of course (gee, I never visited doubleclick.com, why do I have doubleclick cookies? duuuh). Several questions and observations need to be made about all of this: Cookies are just so stupid for persistence of sensitive information. Most people don't spend a whole lot of time evaluating which sites to trust... they don't even know how! Even if they did, they don't have much choice if their idiot bank or online merchant ("A" word, anyone?) ties identity to cookies or any other mechanism which operates without user intervention. Even if they do know how, do they know how to adjust the settings in their browser (presuming it's possible), and is the browser trustworthy? Given the above, rationally, is the issue really about protecting users, or protecting sites which do stupid things? Obviously, the users aren't running CGI.pm (well, they are in an OO/Smalltalkish/message passing sense), but that's not their choice. 
Given that the site made a stupid decision as a matter of policy to begin with, are they really going to revisit those decisions, or are they just going to apply a bandaid? Think there are any sites out there which allow third parties (advertisers) to place JavaScript on pages where they exchange sensitive information with the user? I bet there are! Ya think they personally vet that code, and accept liability for its being there? I bet they don't want to, but it might be an interesting court case. ObPerl: Remember a few months ago when I asked about escaping HTML? While researching this to make sure I wasn't losing my mind, I found the really obvious answer. Check out Apache::Util::escape_html(). Ingy: I like wikis. The web is not just for passive lusers, it's what we make of it. Still, whom do we trust, why, and are we making good decisions? -- Fred Morris m3047@inwa.net From cjcollier at colliertech.org Sun Aug 3 16:53:38 2003 From: cjcollier at colliertech.org (C.J. Collier) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Module Dist Message-ID: <1059947091.785.308.camel@karma.localnet> Hey all, I'm working on a cute little hack for what's becoming Seattle Wireless' favorite new router, the WRT54G. This little box runs Linux 2.4.5-mips, and we've found a way to get a shell on it, among other things. I'm writing a perl module and a set of scripts that use it to automate a bunch of the things required to get said shell. I'm thinking of releasing the module on the CPAN if anyone thinks it's a useful tool. Could I get some of you to review it? I release new versions when I think they're ready, so I won't give you an exact version ;) Check this directory and get the one with the biggest number: http://cj.colliertech.org/swn/ Thanks much! C.J. 
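[Archive note: the kind of HTML escaping Fred Morris mentions above (Apache::Util::escape_html) can be approximated in plain Perl. This is a minimal illustrative sketch, not the Apache::Util implementation; the function name and sample input here are made up for the example.]

```perl
use strict;
use warnings;

# Map the characters that are significant in HTML to their entities.
my %escape = (
    '&' => '&amp;',
    '<' => '&lt;',
    '>' => '&gt;',
    '"' => '&quot;',
);

# Illustrative stand-in for Apache::Util::escape_html(); escapes user
# input before it is echoed back into a page, defusing injected markup.
sub escape_html {
    my ($text) = @_;
    # Single pass; s///g does not rescan the replacement text,
    # so the '&' in the inserted entities is left alone.
    $text =~ s/([&<>"])/$escape{$1}/g;
    return $text;
}

my $param = q{<script>alert("xss")</script>};
print escape_html($param), "\n";
# Prints: &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```

Escaping at output time like this is what keeps step 4 of the scenario above from handing the attacker's payload back to the browser as live markup.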
From ekahklen-chhip at qwest.net Mon Aug 4 12:16:23 2003 From: ekahklen-chhip at qwest.net (Eric Kahklen) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: mod_perl Newbie Help Message-ID: <000f01c35aac$21a8ef20$5d00000a@chhip2.local> This was originally posted on a Linux list, but it was suggested that I look here for further help on the issue. I've also included some suggestions from others and the resulting errors. Sorry for the crude descriptions, but I am still quite new to Perl in general. I am trying to learn about mod_perl and seem to be having problems trying to get it to work correctly. I've been struggling with this for a few days and am a beginner, so please keep that in mind. I am reading a book on Open Source Web Development and am following the directions to configure mod_perl. The suggested Perl script fails and says it cannot find Registry.pm. From my understanding, the script is called from the httpd.conf file to load modules needed for mod_perl. In addition, the settings also allow the CGI scripts from the previous chapter to be run without being re-written. Here is what is being suggested by the author:

#!/usr/bin/perl
# startup.pl
# Tell Perl where to find our modules
use lib '/var/www/mod_perl';
# use some common modules
use Apache::Registry();
use Apache::Constants();
use CGI ':standard';
use DBI;
# add other modules here..
# the file needs to end with 1;
1;

This is added to the end of the httpd.conf file:

PerlRequire conf/startup.pl
PerlFreshRestart On

When I do a /etc/init.d/httpd graceful I get the following error:

[error] Can't locate Apache/Registry.pm in @INC (@INC contains: /var/www/mod_perl /usr/lib/perl5/5.8.0/i386-linux-thread-multi /usr/lib/perl5/5.8.0 /usr/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.0 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.0 /usr/lib/perl5/vendor_perl .) at /etc/httpd/conf/startup.pl line 7. 
BEGIN failed--compilation aborted at /etc/httpd/conf/startup.pl line 7. Compilation failed in require at (eval 1) line 1. When I do perl -we 'use Apache::Registry();' I get this error: Can't locate Apache/Registry.pm in @INC (@INC contains: /usr/lib/perl5/5.8.0/i386-linux-thread-multi /usr/lib/perl5/5.8.0 /usr/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.0 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.0 /usr/lib/perl5/vendor_perl .) at -e line 1. BEGIN failed--compilation aborted at -e line 1. I am running Red Hat 8 and as far as I can tell, I have everything installed. A 'locate Registry.pm' results in this:

/root/.cpan/build/mod_perl-1.28/lib/Apache/Registry.pm
/root/.cpan/build/mod_perl-1.28/blib/lib/Apache/Registry.pm
/usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi/ModPerl/Registry.pm

Any help would be appreciated on this. Thanks, Eric __________________________________________________ Eric Kahklen System Administrator Capitol Hill Housing Improvement Program 1406 10th Ave, Suite 101 Seattle, WA 98122 206-329-7303 From pdarley at kinesis-cem.com Mon Aug 4 14:13:52 2003 From: pdarley at kinesis-cem.com (Peter Darley) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: mod_perl Newbie Help In-Reply-To: <000f01c35aac$21a8ef20$5d00000a@chhip2.local> Message-ID: Eric, I think that RH8 ships with a newer version of Apache and ModPerl 2.0, which isn't set up the same way as the older versions of ModPerl your book is probably using for reference. You could try using ModPerl::Registry, rather than Apache::Registry. Other than that, you might try looking for a document that shows the differences between the old and new versions of mod_perl on the web. My two-second search came up with nothing, but I'm sure there's something out there. 
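[Archive note: Peter's suggestion maps to roughly the following mod_perl 2 configuration. This is a hedged sketch based on the mod_perl 2 documentation of the era; the module path, startup.pl location, and the /perl/ alias are assumptions for illustration, not details from the thread.]

```apache
# httpd.conf sketch for mod_perl 2 (paths are assumed -- adjust locally)
LoadModule perl_module modules/mod_perl.so
PerlRequire /etc/httpd/conf/startup.pl

# Run legacy CGI scripts under ModPerl::Registry
# (the mod_perl 2 replacement for Apache::Registry)
Alias /perl/ /var/www/perl/
<Location /perl/>
    SetHandler perl-script
    PerlResponseHandler ModPerl::Registry
    PerlOptions +ParseHeaders
    Options +ExecCGI
</Location>
```

The matching change in startup.pl would be to preload ModPerl::Registry (e.g. "use ModPerl::Registry ();") instead of Apache::Registry, which only exists in mod_perl 1.x.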
Thanks, Peter Darley -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org]On Behalf Of Eric Kahklen Sent: Monday, August 04, 2003 10:16 AM To: Perl-List (E-mail) Subject: SPUG: mod_perl Newbie Help This was originally posted on a Linux list but was suggested that I look here for further help on the issue. I've also included some suggestions for others and the resulting errors. Sorry for the crude descriptions, but I am still quite new to Perl in general. I am trying to learn about mod_perl and seem to be having problems trying to get it to work correctly. I've been struggling with this for a few days and am a beginner so please keep that in mind. I am reading a book on Open Source Web Development and am following the directions to configure mod_perl. The suggested perl script fails and says it cannot find the Registry.pm. From my understanding, the script is called from the httpd.conf file to load modules needed for mod_perl. In addition, the settings are also allowing the CGI scripts from the previous chapter to be run without being re-written. Here is what is being suggested by the author: #!/usr/bin/perl #startup.pl # Tell Perl where to find our modules use lib '/var/www/mod_perl'; # use some common modules use Apache::Registry(); use Apache::Constants(); use CGI ':standard'; use DBI; # add other modules here.. # the file needs to end with 1; 1; This is added to the end of the httpd.conf file PerlRequire conf/startup.pl PerlFreshRestart On When I do a /etc/init.d/httpd graceful I get the following error: [error] Can't locate Apache/Registry.pm in @INC (@INC contains: /var/www/mod_perl /usr/lib/perl5/5.8.0/i386-linux-thread-multi /usr/lib/perl5/5.8.0 /usr/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.0 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.0 /usr/lib/perl5/vendor_perl .) at /etc/httpd/conf/startup.pl line 7. 
BEGIN failed--compilation aborted at /etc/httpd/conf/startup.pl line 7. Compilation failed in require at (eval 1) line 1. When I do perl -we 'use Apache::Registry();' I get this error: Can't locate Apache/Registry.pm in @INC (@INC contains: /usr/lib/perl5/5.8.0/i386-linux-thread-multi /usr/lib/perl5/5.8.0 /usr/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.0 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.0 /usr/lib/perl5/vendor_perl .) at -e line 1. BEGIN failed--compilation aborted at -e line 1. I am running Red Hat 8 and as far as I can tell, I have everything installed. A 'locate Registry.pm' results in this: /root/.cpan/build/mod_perl-1.28/lib/Apache/Registry.pm /root/.cpan/build/mod_perl-1.28/blib/lib/Apache/Registry.pm /usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi/ModPerl/Registry.pm Any help would be appreciated on this. Thanks, Eric __________________________________________________ Eric Kahklen System Administrator Capitol Hill Housing Improvement Program 1406 10th Ave, Suite 101 Seattle, WA 98122 206-329-7303 From sthoenna at efn.org Sun Aug 3 15:14:17 2003 From: sthoenna at efn.org (Yitzchak Scott-Thoennes) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: RE / Split Question References: <0307310002110E.00809@bng2406iy20tf> Message-ID: On Thu, 31 Jul 2003 00:02:11 -0700, krahnj@acm.org wrote: >$ perl -le' >$glob = "425 501 sttlwa01t 425 712 sttlwa01t tacwa02t 425 337 tacwa02t "; > >@array = $glob =~ /( \b\d+ \s+ \d+ (?:\s+ \D\w*)+ )/xg; > >print for @array; >' >425 501 sttlwa01t >425 712 sttlwa01t tacwa02t >425 337 tacwa02t The problem with this kind of 
approach is that it silently ignores bad data (or good data if you make a mistake in your regex). I like to do this kind of spliting with something like: @array = $glob =~ /\G ( \b\d+ \s+ \d+ (?:\s+ \D\w*)+ ) \s+ /xgc; print "error!" if (pos($glob)||0) != length($glob) This always starts each match where the preceeding one left off and then verifies that the entire string was consumed. From pdarley at kinesis-cem.com Mon Aug 4 14:16:29 2003 From: pdarley at kinesis-cem.com (Peter Darley) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: mod_perl Newbie Help In-Reply-To: <000f01c35aac$21a8ef20$5d00000a@chhip2.local> Message-ID: Eric, Here's a page that explains the differences between 2.0 and older mod_perls: http://perl.apache.org/docs/2.0/user/porting/porting.html Thanks, Peter Darley -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org]On Behalf Of Eric Kahklen Sent: Monday, August 04, 2003 10:16 AM To: Perl-List (E-mail) Subject: SPUG: mod_perl Newbie Help This was originally posted on a Linux list but was suggested that I look here for further help on the issue. I've also included some suggestions for others and the resulting errors. Sorry for the crude descriptions, but I am still quite new to Perl in general. I am trying to learn about mod_perl and seem to be having problems trying to get it to work correctly. I've been struggling with this for a few days and am a beginner so please keep that in mind. I am reading a book on Open Source Web Development and am following the directions to configure mod_perl. The suggested perl script fails and says it cannot find the Registry.pm. From my understanding, the script is called from the httpd.conf file to load modules needed for mod_perl. In addition, the settings are also allowing the CGI scripts from the previous chapter to be run without being re-written. 
Here is what is being suggested by the author:

#!/usr/bin/perl
#startup.pl

# Tell Perl where to find our modules
use lib '/var/www/mod_perl';

# use some common modules
use Apache::Registry();
use Apache::Constants();
use CGI ':standard';
use DBI;

# add other modules here..

# the file needs to end with 1;
1;

This is added to the end of the httpd.conf file:

PerlRequire conf/startup.pl
PerlFreshRestart On

When I do a /etc/init.d/httpd graceful I get the following error:

[error] Can't locate Apache/Registry.pm in @INC (@INC contains: /var/www/mod_perl /usr/lib/perl5/5.8.0/i386-linux-thread-multi /usr/lib/perl5/5.8.0 /usr/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.0 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.0 /usr/lib/perl5/vendor_perl .) at /etc/httpd/conf/startup.pl line 7.
BEGIN failed--compilation aborted at /etc/httpd/conf/startup.pl line 7.
Compilation failed in require at (eval 1) line 1.

When I do perl -we 'use Apache::Registry();' I get this error:

Can't locate Apache/Registry.pm in @INC (@INC contains: /usr/lib/perl5/5.8.0/i386-linux-thread-multi /usr/lib/perl5/5.8.0 /usr/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.0 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.0 /usr/lib/perl5/vendor_perl .) at -e line 1.
BEGIN failed--compilation aborted at -e line 1.

I am running Red Hat 8 and as far as I can tell, I have everything installed. A 'locate Registry.pm' results in this:

/root/.cpan/build/mod_perl-1.28/lib/Apache/Registry.pm
/root/.cpan/build/mod_perl-1.28/blib/lib/Apache/Registry.pm
/usr/lib/perl5/vendor_perl/5.8.0/i386-linux-thread-multi/ModPerl/Registry.pm

Any help would be appreciated on this.
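The 'locate' output above already hints at the likely cause: the only installed Registry.pm lives under ModPerl/, not Apache/. Red Hat 8 ships Apache 2 with the newer mod_perl, in which the old Apache::Registry was renamed ModPerl::Registry (the porting guide Peter Darley points to in this thread covers the renames). A hypothetical adjusted startup.pl, assuming that newer mod_perl, might look like this; it is a sketch, not a tested fix for the book's setup:

```perl
#!/usr/bin/perl
# startup.pl, adjusted for mod_perl 1.99/2.x (a sketch only):
# the old Apache::Registry module was renamed ModPerl::Registry,
# which matches the path that 'locate' found above.
use lib '/var/www/mod_perl';
use ModPerl::Registry ();   # was: use Apache::Registry ();
use CGI ':standard';
use DBI;
# other Apache:: modules (e.g. Apache::Constants) need similar
# renames under mod_perl 2; see the porting guide for the mapping.
1;
```

The matching httpd.conf handler lines also change between mod_perl generations, so the book's PerlRequire/PerlFreshRestart fragment needs the same porting-guide treatment.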
Thanks, Eric __________________________________________________ Eric Kahklen System Administrator Capitol Hill Housing Improvement Program 1406 10th Ave, Suite 101 Seattle, WA 98122 206-329-7303 _____________________________________________________________ Seattle Perl Users Group Mailing List POST TO: spug-list@mail.pm.org Wiki: spugwiki.perlocity.org ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list MEETINGS: 3rd Tuesdays, U-District, Seattle WA WEB PAGE: www.seattleperl.org From spug at ifokr.org Mon Aug 4 16:18:05 2003 From: spug at ifokr.org (Brian Hatch) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: New GSLUG-ANNOUNCE list Message-ID: <20030804211805.GS22683@ifokr.org> Sending this here for those SPUG folks also interested in the Linux goings-on in Seattle. We have created a new announcement list for GSLUG, the Greater Seattle Linux User Group. Here's the official blurb: " This list is used to communicate issues relating to GSLUG, the Greater Seattle Linux Users Group. It is a read-only list (only GSLUG organizers can post to it) which will keep you informed about GSLUG meetings, events, and other happenings. It will be a very low usage list. " If you want to sign up, visit http://lists.gslug.org/mailman/listinfo/gslug-announce It is managed by MailMan. Archives will also be available from that page. -- Brian Hatch Bri: "Do you want to go Systems and get a steamed milk?" Security Engineer Reegen: "No. Caffene." http://www.ifokr.org/bri/ Every message PGP signed -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030804/a2aafc73/attachment.bin From tim at consultix-inc.com Tue Aug 5 12:57:33 2003 From: tim at consultix-inc.com (SPUG-list-owner) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Open Sauce Lunch, Thu. 
8/7 Message-ID: <20030805105733.A21888@timji.consultix-inc.com> This week's Ballard Open Sauce lunch will be at the Golden City restaurant, on Thursday, 8/7 at 12:30. Tell the lovely "Ping" (love that name; that's why this venue is PERFECT for UNIX/Linux geeks!) that you're looking for Tim's table. See spugwiki.perlocity.org, under "Social Events", for more details, and to RSVP. Hope to see you there! -Tim ======================================================= | Tim Maher, Ph.D. tim(AT)timmaher.org | | SPUG Founder & Leader spug(AT)seattleperl.com | | Seattle Perl Users Group www.seattleperl.com | ======================================================= From beckyls at u.washington.edu Wed Aug 6 16:58:18 2003 From: beckyls at u.washington.edu (Rebecca L. Schmidt) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: UW Certificate Programs Message-ID: Dear SPUG members, Want to improve your Perl skills, or apply your Perl skills to UNIX or Linux system administration? Want to learn how to protect data and networks? Then check out these UW Certificate Programs: Certificate Program in Perl Programming http://www.outreach.washington.edu/extinfo/certprog/per/per_main.asp Certificate Program for UNIX/Linux Administration http://www.outreach.washington.edu/extinfo/certprog/uad/uad_main.asp Certificate Program in Data and Internet Security http://www.outreach.washington.edu/extinfo/certprog/dis/dis_main.asp UW Extension is accepting applications now for programs that begin this fall. Classes take place in the evenings in Seattle or Bellevue, and are taught by industry experts. Attend a free information meeting to learn more about the programs. UNIX/Linux and Data and Internet Security Information Meetings: ** Tuesday, August 19, 2003, 6-7 p.m., UW Educational Outreach, 2445 140th Ave. NE, Suite B-100, Bellevue. ** Thursday, September 4, 2003, 6-7 p.m., UW Educational Outreach, 2445 140th Ave. NE, Suite B-100, Bellevue. 
Perl Information Meeting: ** Wednesday, August 27, 2003, 6-7 p.m., UW Extension Downtown, 1325 Fourth Ave., Suite 400, Seattle Please see the URLs above for detailed program information, and feel free to contact me if you have specific questions. Best regards, Rebecca Schmidt Associate Program Manager University of Washington Extension rschmidt@ese.washington.edu; beckyls@u.washington.edu 4311 11th Ave NE, Office #343G Seattle, WA 98105-4608 (206) 221-6243 From stuart_poulin at yahoo.com Wed Aug 6 20:48:04 2003 From: stuart_poulin at yahoo.com (Stuart Poulin) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Module Dist In-Reply-To: <1059947091.785.308.camel@karma.localnet> Message-ID: <20030807014804.79526.qmail@web80606.mail.yahoo.com> Too Cool! The true power of perl - automating the unrealizable. I wish I had one of these to play with. Any chance this works with other Linksys routers? "C.J. Collier" wrote: Hey all, I'm working on a cute little hack for what's becoming Seattle Wireless' favorite new router, the WRT54G. This little box runs Linux 2.4.5-mips, and we've found a way to get a shell on it, among other things. I'm writing a perl module and a set of scripts that use it to automate a bunch of the things required to get said shell. I'm thinking of releasing the module on the CPAN if anyone thinks it's a useful tool. Could I get some of you to review it? I release new versions when I think they're ready, so I won't give you an exact version ;) Check this directory and get the one with the biggest number: http://cj.colliertech.org/swn/ Thanks much! C.J. _____________________________________________________________ Seattle Perl Users Group Mailing List POST TO: spug-list@mail.pm.org Wiki: spugwiki.perlocity.org ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list MEETINGS: 3rd Tuesdays, U-District, Seattle WA WEB PAGE: www.seattleperl.org --------------------------------- Do you Yahoo!? SBC Yahoo! DSL - Now only $29.95 per month! 
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://mail.pm.org/pipermail/spug-list/attachments/20030806/a2c92368/attachment.htm From tim at consultix-inc.com Thu Aug 7 02:35:06 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Last Call, Open-Sauce Chinese Thu-12:30 Message-ID: <20030807003506.A28972@timji.consultix-inc.com> This week's Ballard Open Sauce lunch will be at the Golden City restaurant, on Thursday, 8/7 at 12:30. Tell the lovely "Ping" (love that name; that's why this venue is PERFECT for UNIX/Linux geeks!) that you're looking for Tim's table. See spugwiki.perlocity.org, under "Social Events", for more details. No need to RSVP at this point, just show up before 1pm and you'll find somebody still there to dine with. Hope to see you there! -Tim ======================================================= | Tim Maher, Ph.D. tim(AT)timmaher.org | | SPUG Founder & Leader spug(AT)seattleperl.com | | Seattle Perl Users Group www.seattleperl.com | | SPUGwiki Site spugwiki.perlocity.org | | Perl Certification Site perlcert.perlocity.org | ======================================================= From bob at hiltners.com Thu Aug 7 09:40:56 2003 From: bob at hiltners.com (Bob Hiltner) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Re: Last Call, Open-Sauce Chinese Thu-12:30 References: <20030807003506.A28972@timji.consultix-inc.com> Message-ID: <001501c35cf1$e94ce3f0$0200a8c0@computer> Apropos, for those who might not be familiar with this classic gem, a must read review on Amazon... http://www.amazon.com/exec/obidos/tg/detail/-/0140502416/103-4621031-7833469?v=glance Read the first review before lunch. It'll give you a great perspective on Ping. 
:-) ----- Original Message ----- From: "Tim Maher" To: Sent: Thursday, August 07, 2003 12:35 AM Subject: SPUG: Last Call, Open-Sauce Chinese Thu-12:30 > This week's Ballard Open Sauce lunch will be at the Golden City > restaurant, on Thursday, 8/7 at 12:30. Tell the lovely "Ping" > (love that name; that's why this venue is PERFECT for UNIX/Linux > geeks!) that you're looking for Tim's table. > > See spugwiki.perlocity.org, under "Social Events", for more > details. No need to RSVP at this point, just show up before 1pm > and you'll find somebody still there to dine with. > > Hope to see you there! > > -Tim From cjcollier at colliertech.org Thu Aug 7 12:02:59 2003 From: cjcollier at colliertech.org (C.J. Collier) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Module Dist In-Reply-To: <20030807014804.79526.qmail@web80606.mail.yahoo.com> References: <20030807014804.79526.qmail@web80606.mail.yahoo.com> Message-ID: <1060275460.25630.23.camel@karma.localnet> On Wed, 2003-08-06 at 18:48, Stuart Poulin wrote: > Too Cool! The true power of perl - automating the unrealizable. I > wish I had one of these to play with. Any chance this works with > other Linksys routers? Not yet, but there is a new one that's recently come out that does 802.11a, b & g and runs Linux. The SWN folks are working on getting into the box, and if they are able to do so, then I'll rename and expand the module ;) Cheers, C.J. 
From tim at consultix-inc.com Thu Aug 7 22:49:03 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Juggling of Meetings Message-ID: <20030807204903.A32518@timji.consultix-inc.com> SPUGsters, Brian "Ingy" Ingerson won't be available after all for this month's meeting, so the new lineup is now: 8/19: "Unit Testing", by Jonathan Gardner > Check SPUGtopics at spugwiki.perlocity.org for the rough draft of Jonathan's talk A short presentation by UW personnel, about their Perl Certificate program 9/16: "Kwiki", by Ingy 10/21: OPEN TO PROPOSALS 11/18: "Perl Security", by Brian Hatch 12/16: OPEN TO PROPOSALS The web site has been updated accordingly. Those of you who haven't done so yet, please visit the Wiki and edit the MemberPages page to make a link to a page for yourself, and tell us about your software offerings under SPUGsterSoftware. The URL is: spugwiki.perlocity.org -Tim ======================================================= | Tim Maher, Ph.D. tim(AT)teachmeperl.com | | SPUG Founder & Leader spug(AT)seattleperl.com | | Seattle Perl Users Group www.seattleperl.com | | SPUG Wiki Site spugwiki.perlocity.org | | Perl Certification Site perlcert.perlocity.org | ======================================================= From douglas at slugstone.net Fri Aug 8 12:32:37 2003 From: douglas at slugstone.net (Douglas Kirkland) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Module::Signature install problem Message-ID: <200308081032.38120.douglas@slugstone.net> It seems that CPAN has changed lately, somewhere around version 1.75. Whenever I install a module through CPAN, I keep getting this message. CPAN: Module::Signature security checks disabled because Module::Signature not installed. Please consider installing the Module::Signature module. So I try to install Module::Signature. Many modules later I run into this problem. CPAN.pm: Going to build I/IL/ILYAZ/modules/Math-Pari-2.010500.tar.gz Did not find GP/PARI build directory around.
Do you want to me to fetch GP/PARI automatically? (If you do not, you will need to fetch it manually, and/or direct me to the directory with GP/PARI source via the command-line option paridir=/dir) Make sure you have a large scrollback buffer to see the messages. Fetch? (y/n, press Enter) y Getting GP/PARI from ftp://megrez.math.u-bordeaux.fr/pub/pari/unix/ Cannot list (): Bad file descriptor at utils/Math/PariBuild.pm line 186, <> chunk 1. Can't fetch file with Net::FTP, now trying with LWP::UserAgent... Well, the install never gets what it wants. I tried visiting the site and I do not get connected either. Just wondering if I picked the wrong time of day, or did the site ever work? Also, does anybody have a workaround, or does it just not matter and live with the message? Thanks, Douglas From Marc.M.Adkins at Doorways.org Sat Aug 9 04:59:46 2003 From: Marc.M.Adkins at Doorways.org (Marc M. Adkins) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: <20030807204903.A32518@timji.consultix-inc.com> Message-ID: Seems like I knew the answer to this at one time... I'm running an HTTP::Daemon on Linux. It responds to /quit by quitting, which it does by exiting its main loop and closing (and undef'ing) the HTTP::Daemon object (which is derived from IO::Socket::INET). So far as I know I'm doing all the housekeeping properly. After it quits, I restart it. For almost a minute the port is still held and the HTTP::Daemon object can't be allocated. I've timed it and it seems steady at about 55 seconds. If I [Ctrl-C] to KILL the program it doesn't seem to hold the port. It doesn't hold the port running on Windows. Is there a magic cookie for this? mma From m3047 at inwa.net Fri Aug 8 22:31:40 2003 From: m3047 at inwa.net (Fred Morris) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port Message-ID: Marc wrote: >Seems like I knew the answer to this at one time... I've noticed similar behaviors at times...
>I'm running an HTTP::Daemon on Linux. It responds to /quit by quitting, >which it does by exiting its main loop and closing (and undef'ing) the >HTTP::Daemon object (which is derived from IO::Socket::INET). So far as I >know I'm doing all the housekeeping properly. Are you closing the listener on the socket? Because in order to listen for connections, you have a listener on it. (how can you test/tell? that's the thing that's always stumped me.. well sure, netstat. doesn't seem to be "fast enough" though. I don't trust it.) Are you checking the exit status before undeffing? Is there somebody trying to make a connect on it when you close/undef? >After it quits, I restart it. For almost a minute the port is still held >and the HTTP::Daemon object can't be allocated. I've timed it and it seems >steady at about 55 seconds. > >If I to KILL the program it doesn't seem to hold the port. That I don't quite understand... does the job otherwise stick around as defunct? >It >doesn't hold the port running on Windows. Hey, it's Windows! The only thing it's holding onto is its own... -- Fred Morris m3047@inwa.net From Marc.M.Adkins at Doorways.org Sat Aug 9 07:16:59 2003 From: Marc.M.Adkins at Doorways.org (Marc M. Adkins) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: > Are you closing the listener on the socket? Because in order to listen for > connections, you have a listener on it. (how can you test/tell? that's the > thing that's always stumped me.. well sure, netstat. doesn't seem to be > "fast enough" though. I don't trust it.) Are you checking the exit status > before undeffing? Is there somebody trying to make a connect on > it when you > close/undef? I'm closing the HTTP::Daemon object which is an IO::Socket::INET and (presumably) the actual listener. I call close() on it then undef it. I would hope that would be the necessary and sufficient condition for closing the beast.
I would add that I always close the connection objects created during normal processing, as per the documentation of HTTP::Daemon. As to someone attempting to make a connection, I suppose that's possible, though unlikely. Frankly, I don't want that to be an issue, because it's not controllable from within the server script. > >If I to KILL the program it doesn't seem to hold the port. > > That I don't quite understand... does the job otherwise stick > around as defunct? I haven't been able to find any. I've been looking. mma From scott+spug at mail.dsab.rresearch.com Sat Aug 9 16:19:20 2003 From: scott+spug at mail.dsab.rresearch.com (Scott Blachowicz) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Playing with Ingy's kwiki package Message-ID: <20030809212035.110621DD7@sabami.seaslug.org> Hi- This is probably really an apache question, but I was wondering if any of you knew what this was about... I've got a home Debian Linux system ("woody", but with a bunch of "testing" stuff installed) that I was thinking of playing around with a wiki on for tracking various family lists, etc. So, I installed the kwiki apt package (v0.17-1) and I've gone to my /usr/lib/cgi-bin dir, created a my-kwiki subdir, cd'd to that and ran kwiki-install. Now, when I point my Netscape (4.7x) at http://localhost/cgi-bin/my-kwiki/index.cgi, I get this 403 error: Forbidden You don't have permission to access /cgi-bin/my-kwiki/css/Display.css on this server. Apache/1.3.27 Server at voyager.rresearch.com Port 80 and checking my apache error.log file, I see this: [Sat Aug 9 14:08:55 2003] [error] [client 127.0.0.1] file permissions deny server execution: /usr/lib/cgi-bin/my-kwiki/css/Display.css Why would it try to execute that file? Grepping through the sources in there, I see Display.css here: ./template/display_header.html:5: ./template/blog_header.html:5: I'm assuming that I'm just missing some bit of Apache config here, but I don't know what. Any ideas?
Scott PS: I'll be on vacation for about 2 weeks starting tomorrow (Sun 8/10), so if you don't get an immediate reply, I've either left or am busy packing...:) From jmates at sial.org Sat Aug 9 17:42:57 2003 From: jmates at sial.org (Jeremy Mates) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Re: Playing with Ingy's kwiki package In-Reply-To: <20030809212035.110621DD7@sabami.seaslug.org> References: <20030809212035.110621DD7@sabami.seaslug.org> Message-ID: <20030809224257.GM13157@darkness.sial.org> * Scott Blachowicz > [Sat Aug 9 14:08:55 2003] [error] [client 127.0.0.1] file permissions deny server execution: /usr/lib/cgi-bin/my-kwiki/css/Display.css > > Why would it try to execute that file? The cgi-bin directory is likely marked in httpd.conf with ScriptAlias[1], which passes the contents off to the mod_cgi handler[2], including the non-executable CSS style definitions. The above error is probably due to various +x bits being unset on the file in question, which would not solve the Apache-wants-to-run-the-style-file problem. Solutions would be to locate my-kwiki outside of the cgi-bin directory and enable .cgi handling for the Directory in question: AddHandler cgi-script .cgi Or to rework my-kwiki to look for the CSS file outside of cgi-bin. 
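Jeremy's second alternative (moving the kwiki out of the ScriptAlias'd directory) might look like the following httpd.conf fragment. The alias name and paths are illustrative, not taken from the thread:

```apache
# Hypothetical layout: my-kwiki lives outside /usr/lib/cgi-bin, so
# css/Display.css is served as a plain static file instead of being
# handed to mod_cgi.  Only files ending in .cgi are executed.
Alias /my-kwiki/ /var/www/my-kwiki/
<Directory "/var/www/my-kwiki">
    Options +ExecCGI
    AddHandler cgi-script .cgi
    DirectoryIndex index.cgi
</Directory>
```

The point of the design is that ScriptAlias marks an entire directory as executable content, while Alias plus a selective AddHandler lets executable scripts and static assets coexist under one URL prefix.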
[1] - ScriptAlias: http://httpd.apache.org/docs/mod/mod_alias.html#scriptalias [2] - mod_cgi: http://httpd.apache.org/docs/mod/mod_cgi.html From scott+spug at mail.dsab.rresearch.com Sat Aug 9 19:02:33 2003 From: scott+spug at mail.dsab.rresearch.com (Scott Blachowicz) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Re: Playing with Ingy's kwiki package In-Reply-To: <20030809224257.GM13157@darkness.sial.org> References: <20030809212035.110621DD7@sabami.seaslug.org> <20030809224257.GM13157@darkness.sial.org> Message-ID: <20030810000348.661B31DD7@sabami.seaslug.org> Jeremy Mates wrote: > * Scott Blachowicz > > [Sat Aug 9 14:08:55 2003] [error] [client 127.0.0.1] file permissions deny server execution: /usr/lib/cgi-bin/my-kwiki/css/Display.css > > > > Why would it try to execute that file? > > The cgi-bin directory is likely marked in httpd.conf with ScriptAlias[1], > which passes the contents off to the mod_cgi handler[2], including the > non-executable CSS style definitions. The above error is probably due to > various +x bits being unset on the file in question, which would not > solve the Apache-wants-to-run-the-style-file problem. > > Solutions would be to locate my-kwiki outside of the cgi-bin directory > and enable .cgi handling for the Directory in question: > > AddHandler cgi-script .cgi > > Or to rework my-kwiki to look for the CSS file outside of cgi-bin. I also needed 'Options ExecCGI' for the directory I put it in. Turns out this is documented in the pages that come with kwiki, but I hadn't gotten that far and for some reason didn't think to go looking through the files I had. Thanx for the info! Scott From cjcollier at colliertech.org Sat Aug 9 21:37:12 2003 From: cjcollier at colliertech.org (C.J. Collier) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: References: Message-ID: <1060482787.1544.7.camel@karma.localnet> What's netstat --inet say about the state of the socket? C.J. On Sat, 2003-08-09 at 02:59, Marc M. 
Adkins wrote: > Seems like I knew the answer to this at one time... > > I'm running an HTTP::Daemon on Linux. It responds to /quit by quitting, > which it does by exiting its main loop and closing (and undef'ing) the > HTTP::Daemon object (which is derived from IO::Socket::INET). So far as I > know I'm doing all the housekeeping properly. > > After it quits, I restart it. For almost a minute the port is still held > and the HTTP::Daemon object can't be allocated. I've timed it and it seems > steady at about 55 seconds. > > If I to KILL the program it doesn't seem to hold the port. It > doesn't hold the port running on Windows. > > Is there a magic cookie for this? > > mma > > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org Wiki: spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: www.seattleperl.org > From dleonard at dleonard.net Sun Aug 10 13:48:59 2003 From: dleonard at dleonard.net (dleonard@dleonard.net) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: Frankly I'm surprised that it is only 55 seconds. The Linux kernel should take 2 minutes to properly recycle a port such that it is available for usage after closing if done in kernel space. -- On Sat, 9 Aug 2003, Marc M. Adkins wrote: > Seems like I knew the answer to this at one time... > > I'm running an HTTP::Daemon on Linux. It responds to /quit by quitting, > which it does by exiting its main loop and closing (and undef'ing) the > HTTP::Daemon object (which is derived from IO::Socket::INET). So far as I > know I'm doing all the housekeeping properly. > > After it quits, I restart it. For almost a minute the port is still held > and the HTTP::Daemon object can't be allocated. I've timed it and it seems > steady at about 55 seconds.
> > If I to KILL the program it doesn't seem to hold the port. It > doesn't hold the port running on Windows. > > Is there a magic cookie for this? > > mma > > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org Wiki: spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: www.seattleperl.org > From Marc.M.Adkins at Doorways.org Sun Aug 10 21:42:07 2003 From: Marc.M.Adkins at Doorways.org (Marc M. Adkins) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: > Frankly I'm suprised that it is only 55 seconds. The linux kernel should > take 2 minutes to properly recycle a port such that it is available for > usage after closing if done in kernel space. It's steady at between 55 and 60 seconds. Mandrake 9.1. May be the other side of the connection (see my reply re: netstat --inet). mma From Marc.M.Adkins at Doorways.org Sun Aug 10 21:42:09 2003 From: Marc.M.Adkins at Doorways.org (Marc M. Adkins) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: <1060482787.1544.7.camel@karma.localnet> Message-ID: > What's netstat --inet say about the state of the socket? It says TIME_WAIT from my daemon's port to another port on the same machine. So...background...this is all occurring when a new server comes up. It calls to the existing server (if any) with the /quit URL in order to kill the old server gracefully. It does this via LWP::Simple::get(). Then the new server loops until the port becomes available and it comes online. This recycles the server periodically (via cron). So...I'm now wondering if LWP::Simple isn't holding onto the connection. I've done everything I know of on the HTTP::Daemon side, maybe it's the consumer side. I'll look at that when I get a chance (dinner calls just now). 
mma From andrew at sweger.net Sun Aug 10 22:26:16 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: On Sun, 10 Aug 2003, Marc M. Adkins wrote: > > What's netstat --inet say about the state of the socket? > > It says TIME_WAIT from my daemon's port to another port on the same machine. (I'm just talking out my /dev/random here.) But that's just the closing side of a tcp socket I thought. This shouldn't prevent a new listener from binding to that port. If you don't have anything in LISTEN state on the daemon port, shrug. The netstat --inet option doesn't report ports that are listening by default. Run as 'netstat -nl --inet' (that's dash-en-ell) to see who's still listening. To see what programs (PIDs) are associated, use 'netstat -nlp --inet'. Make it 'netstat -nap --inet' to see sockets in all states (not just listeners). If the daemon process has really gone away (got zombies?), what could be holding on to that port (this is where dleonard will start telling us about all kinds of horrors from deep inside the kernel)? If the proc is still there with sockets hanging around, then there's something wrong with the cleanup code. If so, can you provide a chunk of code? -- Andrew B. Sweger -- The great thing about multitasking is that several things can go wrong at once. From ben at reser.org Mon Aug 11 01:30:54 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: References: Message-ID: <20030811063053.GB11173@titanium.brain.org> On Sun, Aug 10, 2003 at 07:42:07PM -0700, Marc M. Adkins wrote: > It's steady at between 55 and 60 seconds. Mandrake 9.1. May be the other > side of the connection (see my reply re: netstat --inet). Can you post your code so we can play with it? Or at least a minimum working example of something that will cause your issue. 
If I can replicate it I might be able to figure out how to fix the issue... -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From mathin at mathin.com Mon Aug 11 11:18:26 2003 From: mathin at mathin.com (Dan Ebert) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Seattle Perl Gig Message-ID: <1060618706.21714.26.camel@algernon.lan.enic.cc> Hey all, A friend of mine sent this to me ... I thought I'd pass it along to the group. Dan. --------------------- I am hoping you or someone you know may be interested in this contract in Seattle. I am looking for an expert in Perl Scripting, understands aspects of Unix/HP-UX system security and OS. Access DBA would be a huge plus. This is a 1-2 month contract in Seattle If you know of someone who fits this skill set, please feel free to pass along my contact information below Thank you very much and have a wonderful day! Elizabeth Harvey Executive Recruiter The TRIAD Group 425-454-0282 800-514-9155 elizabeth.harvey@triadgroup.com From andrew at sweger.net Mon Aug 11 15:56:37 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Seattle Perl Gig In-Reply-To: <1060618706.21714.26.camel@algernon.lan.enic.cc> Message-ID: Although most folks on the SPUG list appreciate hearing about opportunities to work using their favorite language, we do ask that some additional information be provided. This helps target the job to the right person (and keeps everyone from having to call and ask a lot of questions). Please see the information on the SPUG website at http://seattleperl.org/ under the heading "Job Offers for SPUG Members". It gives a list of the basic information requested and an email address for posting (messages are screened before forwarding on to the list). Your friend is also welcome to do this directly (there's no need to be subscribed to the list to post job offers through this address). Thank you. 
On Mon, 11 Aug 2003, Dan Ebert wrote: > > Hey all, > > A friend of mine sent this to me ... I thought I'd pass it along to the > group. > > Dan. > --------------------- > I am hoping you or someone you know may be interested in this contract > in Seattle. I am looking for an expert in Perl Scripting, understands > aspects of Unix/HP-UX system security and OS. Access DBA would be a > huge plus. > > This is a 1-2 month contract in Seattle > > If you know of someone who fits this skill set, please feel free to pass > along my contact information below > > Thank you very much and have a wonderful day! > > Elizabeth Harvey > Executive Recruiter > The TRIAD Group > 425-454-0282 > 800-514-9155 > elizabeth.harvey@triadgroup.com -- Andrew B. Sweger -- The great thing about multitasking is that several things can go wrong at once. From ashok_palihill at hotmail.com Mon Aug 11 18:11:59 2003 From: ashok_palihill at hotmail.com (Ashok Misra) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement firms Message-ID: As several people on this thread mentioned, there are indeed several companies hiring right now, and finding an open position involves connecting with the right people. I was wondering if anyone had recommendations for working with a placement firm. One firm that I have met, http://www.allenandassociates.com/, has quite convincing marketing. Best regards -a >From: Chris Turan >To: spug-list@mail.pm.org >Subject: SPUG: Giving up on computer jobs >Date: Tue, 22 Jul 2003 19:04:29 -0700 (PDT) > >Hi All, > >I've been searching for jobs for several months now. I've given up on the >computer market and began looking for retail and restaurant jobs. I've >found that many of those have been taken as well and the employers can >cherry pick who to hire. Those jobs that aren't taken don't seem to want >to hire me because I'm over-qualified and am afraid I'll just leave when >a good job turns up.
> >Has anyone had similar problems? Any recommendations? I'm keeping my >shoulder to the grindstone and am looking. I was hoping someone might >have something to suggest. I live in Bellevue and am looking mostly on >the Eastside. I've been looking mostly in Bellevue so that if the >economic situation doesn't change, I can sell my car and still be able to >walk to work. > >Perhaps someone has some ideas I haven't considered yet. > >Thanks, >-Chris > > > > > > >_____________________________________________________________ >Seattle Perl Users Group Mailing List >POST TO: spug-list@mail.pm.org Wiki: spugwiki.perlocity.org >ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list >MEETINGS: 3rd Tuesdays, U-District, Seattle WA >WEB PAGE: www.seattleperl.org > From bri at ifokr.org Mon Aug 11 22:42:49 2003 From: bri at ifokr.org (Brian Hatch) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Reminder: Open Sauce Lunch this Wednesday Message-ID: <20030812034249.GW16892@ifokr.org> The next Open Sauce Lunch is this Wednesday, Aug 13th at 12:30 at California Pizza Kitchen at Northgate Mall, just off of I-5. (Yes, this is an attempt by one of the Ballardites to have an Open Sauce Lunch more accessible to non-Ballardites...) For more details and maps, go to http://spugwiki.perlocity.org/index.cgi?WednesdayAug13atNorthgate If you are coming, either add your name to the wiki entry above, or send me an email so I can reserve an appropriate table. Look for a guy with a goatee and purple camouflage hat; that will be this lunch's Convener, Brian Hatch. -- Brian Hatch "Touch passion when it comes Systems and your way. It's rare enough Security Engineer as it is. Don't walk away http://www.ifokr.org/bri/ when it calls you by name."
Every message PGP signed -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030811/08f33be8/attachment.bin From tim at teachmeperl.com Mon Aug 11 12:05:42 2003 From: tim at teachmeperl.com (Tim Maher/CONSULTIX) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Seattle Perl Gig In-Reply-To: <1060618706.21714.26.camel@algernon.lan.enic.cc> References: <1060618706.21714.26.camel@algernon.lan.enic.cc> Message-ID: <20030811100542.A13670@timji.consultix-inc.com> On Mon, Aug 11, 2003 at 09:18:26AM -0700, Dan Ebert wrote: > > Hey all, > > A friend of mine sent this to me ... I thought I'd pass it along to the > group. Dan (and others), In the future, please forward job leads to spug@seattleperl.org, so that I can obtain additional details and post the announcement in a more complete and usable form. TIA, ======================================================= | Tim Maher, Ph.D. tim(AT)teachmeperl.com | | SPUG Founder & Leader spug(AT)seattleperl.com | | Seattle Perl Users Group www.seattleperl.com | | SPUG Wiki Site spugwiki.perlocity.org | | Perl Certification Site perlcert.perlocity.org | ======================================================= From Marc.M.Adkins at Doorways.org Tue Aug 12 00:46:15 2003 From: Marc.M.Adkins at Doorways.org (Marc M. Adkins) Date: Mon Aug 2 21:37:05 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: <20030811063053.GB11173@titanium.brain.org> Message-ID: > Can you post your code so we can play with it? Or at least a minimum > working example of something that will cause your issue. If I can > replicate it I might be able to figure out how to fix the issue... OK, here's a stripped-down version. It demonstrates the behavior on Mandrake 9.1. Create two terminals. Run testsvr.pl in one. Wait until it says "Server Created." Run testsvr.pl in the other. 
It should shut down the first one and run in the second one. Repeat as necessary. Fun, huh? On Windows this happens immediately (more or less). On Linux it takes 55-60 seconds. There's a counter in the code to keep you company. Since my last post I opened up LWP::Simple and swiped the 'trivial' function for testing purposes. Didn't change the behavior. While it was in I did notice that there was a 60 second timeout on the connection to the server for the /quit command. I changed this to 30 seconds and it didn't affect anything. So I removed that code and went back to LWP::Simple. I previously said that the behavior didn't happen if I used ^C to interrupt the original server. That is not in fact the case (or at least I can't duplicate that feature now). If I ^C the first server and _then_ start the second it _still_ takes almost a minute. The sleep 1 at the top of the wait loop is necessary on Windows but not on Linux. It doesn't affect the behavior on Linux, it just isn't necessary. mma -------------- next part -------------- A non-text attachment was scrubbed... Name: testsvr.pl Type: application/octet-stream Size: 4669 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030811/c0570cd6/testsvr.obj From andrew at sweger.net Tue Aug 12 02:06:01 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: I was able to repeat the behavior you describe on a Debian 3.0 (woody) system. Very nice code, easy to read and follow (even with braces where I don't like them). Thanks. First I'd like to say that this is a strange way to recycle a server. This is what I would call "in-band" control where the control is exerted over the same channel that the daemon normally operates through. Most programs will use an out-of-band control method either through signals or Unix sockets (e.g., postfix, named).
Just a very brief skim over the code and playing, I came across this odd bit of code:

while (my $page = get("http://localhost:$port/quit"))
{
    logInfo(" Server Killed");
    sleep 1;    # otherwise we occasionally try to start too fast
}

The get() is being called by the second instance of the script which does indeed kill the first. But note that get() returns the requested document (which may be "true" or "false" depending on what was returned). get() returns undef in the event of a failure. What struck me is that should get() return anything true, it will report "Server Killed", sleep, and then call get() *again*. But I don't see the message "Server Killed" reported by the second instance. Why would you keep calling get() if it got a true result back? On the network side of things, what I am seeing is a socket created by the second instance (by the get() call I presume) that hangs around for a minute in TIME_WAIT state. It's on the daemon's side of the connection (the first instance). So, I'm going to say that it is your daemon that's not cleaning up the socket after handling incoming requests. I can prove this by calling for arbitrary URIs on this port (repeatedly) and see half-dead sockets piling up on the daemon's side. E.g., "wget http://localhost:9876/foobar". The TIME_WAIT sockets are not associated with any process. If I remembered more about the TIME_WAIT state, I'd know what this means. But I'm guessing that the daemon is releasing the socket before completely shutting it down after handling the request. I'll try to take a closer look at this portion of the code later. In the meantime, consider a more traditional out-of-band control system. This is useful in the event that your daemon is having trouble with its http interface (and keeps rogues from shutting down your server). On Mon, 11 Aug 2003, Marc M. Adkins wrote: > OK, here's a stripped-down version. It demonstrates the behavior on > Mandrake 9.1. -- Andrew B.
Sweger -- The great thing about multitasking is that several things can go wrong at once. From andrew at sweger.net Tue Aug 12 03:42:55 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: On the ^C thing: if you kill the first instance of the daemon _before_ any client connections have been made (thus creating our TIME_WAIT zombie sockets), then the server completely shuts down and releases all network resources immediately. On Tue, 12 Aug 2003, Andrew Sweger wrote: > On the network side of things, what I am seeing is a socket created by the > second instance (by the get() call I presume) that hangs around for a > minute in TIME_WAIT state. It's on the daemon's side of the connection > (the first instance). So, I'm going to say that it is your daemon that's > not cleaning up the socket after handling incoming requests. I can prove > this by calling for arbitrary URIs on this port (repeatedly) and see > half-dead sockets piling up on the daemon's side. E.g., "wget > http://localhost:9876/foobar". The TIME_WAIT sockets are not associated > with any process. If I remembered more about the TIME_WAIT state, I'd know > what this means. But I'm guessing that the daemon is releasing the socket > before completely shutting it down after handling the request. It looks like TIME_WAIT is a part of life with TCP/IP. But I know I deal with servers that don't exhibit this behavior (some written by me back where my memory has faded). I have read that the TIME_WAIT state occurs _after_ the socket has been closed, to allow handling of data that may still be on the network. The TIME_WAIT state can last up to four minutes depending on certain attributes of the socket. I have attempted to influence the $connect socket by calling flush and shutdown on it prior to the close, all to no avail (as well as the $server instance).
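One socket option the thread circles around without naming is SO_REUSEADDR: with it set on a new listening socket, bind() is allowed to succeed even while old sockets on that port still linger in TIME_WAIT. In the thread's Perl, that would be passing ReuseAddr => 1 to the HTTP::Daemon constructor (HTTP::Daemon inherits from IO::Socket::INET, which accepts that option), if it isn't already being set. A Python sketch of the effect, with a helper name invented for the illustration:

```python
import socket
import time

def rebind_after_time_wait(reuse):
    """Bind a listener, accept and close one connection (parking a socket in
    TIME_WAIT on the server side), then try to bind the same port again, as
    a restarted daemon would. Returns True if the rebind succeeds."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", 0))            # let the kernel pick a free port
    port = srv.getsockname()[1]
    srv.listen(1)

    cli = socket.create_connection(("127.0.0.1", port))
    conn, _ = srv.accept()
    conn.close()                          # server closes first -> TIME_WAIT is ours
    cli.close()
    srv.close()
    time.sleep(0.2)                       # let the FIN exchange complete

    srv2 = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    if reuse:
        srv2.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    try:
        srv2.bind(("127.0.0.1", port))    # the "second daemon instance"
        return True
    except OSError:
        return False                      # EADDRINUSE until TIME_WAIT expires
    finally:
        srv2.close()
```

On Linux, without the option the rebind typically fails with EADDRINUSE until the TIME_WAIT socket ages out, which is consistent with the 55-60 second delay Marc reports; with it set, the rebind succeeds immediately.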
It's been a long time since I've tinkered with socket stuff lower down than this (and my hands just start shaking when I even think of taking the debugger back down there). I'm throwing in the towel. :) Good luck. -- Andrew B. Sweger -- The great thing about multitasking is that several things can go wrong at once. From jgardner at jonathangardner.net Tue Aug 12 08:48:31 2003 From: jgardner at jonathangardner.net (Jonathan Gardner) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement firms In-Reply-To: References: Message-ID: <200308120648.32731.jgardner@jonathangardner.net> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On Monday 11 August 2003 16:11, Ashok Misra wrote: > as several people on this thread mentioned there are indeed several > companies hiring right right now and finding an open postion involves > connecting with the right people. > i was wondering if anyone had recomendations for working with a placement > firm. > one firm that i have met i.e. http://www.allenandassociates.com/ has quite > convincing marketing. best best regards First, when we write English, we use the SHIFT key to capitalize letters. That's because we are writing email and not chatting on IRC. This may be a reason you aren't finding a job. You look like a script kiddie in your email. Hopefully, you don't sound like one on the phone. I personally doubt a placement firm can succeed where SPUG has failed. But if you are convinced that they can do it for you, then go ahead and give it a try. If there's one thing about us perl guys, we are not cookie-cutter replicates of each other. Each of us wants different responsibilities in different kinds of companies. Each of us will have to find our own way to get to where we want to be. This may be the way you are going to have to take to get to where you want to be. - -- Jonathan Gardner Live Free, Use Linux! 
-----BEGIN PGP SIGNATURE----- Version: GnuPG v1.2.1 (GNU/Linux) iD8DBQE/OPAvWgwF3QvpWNwRAoDXAKDXyUs7H4K8vw3dl12dLgCJnEGKnwCfcCjW G+7zklJ14WP35/rU8x+aLpI= =fEFq -----END PGP SIGNATURE----- From bri at ifokr.org Tue Aug 12 20:50:10 2003 From: bri at ifokr.org (Brian Hatch) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Open Sauce Lunch tomorrow at Northgate Message-ID: <20030813015010.GA24069@ifokr.org> The next Open Sauce Lunch is this Wednesday, Aug 13th at 12:30 at California Pizza Kitchen at Northgate Mall, just off of I-5. (Yes, this is an attempt by one of the Ballardites to have an Open Sauce Lunch more accessible to non Ballardites...) For more details and maps, go to http://spugwiki.perlocity.org/index.cgi?WednesdayAug13atNorthgate If you are coming, either add your name to the wiki entry above, or send me an email so I can reserve an appropriate table. If you have a PGP/GPG key, bring a copy of your fingerprint/bits/etc and ID with you, so we can exchange and sign keys as well. Look for a guy with a goatee and purple camouflage hat; that will be this lunch's Convener, Brian Hatch. -- Brian Hatch "Wearing of this garment Systems and does not enable you to fly" Security Engineer - warning on a kid's http://www.ifokr.org/bri/ superman costume Every message PGP signed -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030812/5ce7cfc3/attachment.bin From dan at concolor.org Tue Aug 12 21:21:57 2003 From: dan at concolor.org (Dan Sabath) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Open Sauce Lunch Friday Message-ID: Just your friendly reminder. Love to see more of you there. 
Sign up at: http://spugwiki.perlocity.org/index.cgi?FriAug15InInternationalDistrict -dan Open Sauce Lunch Chinese (Dim Sum) food in International District Friday, 8/15, 12:00pm House of Hong 409 8th Ave S Seattle, Washington 98104 (206) 622-7997 MAP Website Look for a guy wearing a hawaiian shirt and beard; that will be this lunch's Convener, Dan Sabath. From andrew at sweger.net Tue Aug 12 23:17:49 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement firms In-Reply-To: <200308120648.32731.jgardner@jonathangardner.net> Message-ID: On Tue, 12 Aug 2003, Jonathan Gardner wrote: > First, when we write English, we use the SHIFT key to capitalize letters. > That's because we are writing email and not chatting on IRC. This may be a > reason you aren't finding a job. You look like a script kiddie in your email. > Hopefully, you don't sound like one on the phone. Dud, what about thos popl that don't hav a shift ky? Or the lttr '' (dammit)? Seriously now. The Seattle Perl Users Group (the mail list specifically) has grown well beyond the boundaries of the city of Seattle and Puget Sound. I think this extends beyond geography and language to include "people of alternate writing styles". We welcome all comers here. If we don't understand what they're saying, we'll ask for clarification or, if necessary, ignore them. Your point is well taken with respect to finding job. But that's no reason to cut a person down. In Ashok's case, look over his postings to this list. Aside from the conspicuous lack of capital letters and occasional double double words, the writing is very readable. I'll bet more people have trouble reading the convoluted crap I write than Ashok's writing. So let's cut each other some slack. (Okay, I realize I'm being two-faced. I'm the jerk who flames people for using fancy words. But that was supposed to be friendly jabbing.) -- Andrew B. 
Sweger -- The great thing about multitasking is that several things can go wrong at once. From Marc.M.Adkins at Doorways.org Wed Aug 13 01:23:41 2003 From: Marc.M.Adkins at Doorways.org (Marc M. Adkins) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: > First I'd like to say that this is a strange way to recycle a server. This > is what I would call "in-band" control where the control is exerted over > the same channel that the daemon normally operates through. Most programs > will use an out-of-band control method either through signals or Unix > sockets (e.g., postfix, named). It was just simpler. This is used on my home net, behind a firewall, so I felt safe exposing the /quit URL. That way I didn't have to complicate things more than they already were. If I had started on UNIX it might have been different, but there are limitations on Windows and things that seem simple on one side of the Great Divide are not so simple on the other. Nevertheless, point taken, and in a production environment I would have felt compelled to do as you say. > Just a very brief skim over the code and playing, I came across this odd > bit of code:
>
> while (my $page = get("http://localhost:$port/quit"))
> {
>     logInfo(" Server Killed");
>     sleep 1; # otherwise we occasionally try to start too fast
> }
>
> The get() is being called by the second instance of the script which does > indeed kill the first. But note that get() returns the requested document > (which may be "true" or "false" depending on what was returned). get() > returns undef in the event of a failure. Y'know, it occurred to me after I logged off last night that maybe the loop was trapping me. But I just went and tested it. I replaced the while with an if and so it only gets hit once. Same behavior. [Moreover, I don't think that just now I was seeing the "Server Killed". Which seems normal on the test server, as I'm returning an error condition.
On the real server it returns a good page and so it does in fact loop.] As to why I used a while loop, it was something like killing a cockroach in one's kitchen. Hit it until it's dead and one more time for luck. I was just coding for the off chance that there were _two_ servers, one just waiting for the other to end, but of course with the delay in the system that can't happen anyway. In the full daemon there is actually code that reads the page that is returned. There is state that is thus preserved across instantiations. This is another reason why the /quit works for me, it kills two cockroaches with one shoe. Using an out-of-band connection would have required a pipe or socket to pass the state information, not just a system-wide semaphore or whatever. > On the network side of things, what I am seeing is a socket created by the > second instance (by the get() call I presume) that hangs around for a > minute in TIME_WAIT state. It's on the daemon's side of the connection > (the first instance). So, I'm going to say that it is your daemon that's > not cleaning up the socket after handling incoming requests. I can prove > this by calling for arbitrary URIs on this port (repeatedly) and see > half-dead sockets piling up on the daemon's side. E.g., "wget > http://localhost:9876/foobar". The TIME_WAIT sockets are not associated > with any process. If I remembered more about the TIME_WAIT state, I'd know > what this means. But I'm guessing that the daemon is releasing the socket > before completely shutting it down after handling the request. So...when I close the HTTP::Daemon object it isn't closing the listen port? I haven't taken the time to crawl through that code yet... mma From andrew at sweger.net Wed Aug 13 04:10:19 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: Message-ID: On Tue, 12 Aug 2003, Marc M.
Adkins wrote: > So...when I close the HTTP::Daemon object it isn't closing the listen port? > I haven't taken the time to crawl through that code yet... The listener is definitely closing cleanly. It's the connection handling the "client" request that's not getting cleaned up. To see this in action, fire up the first instance of the server, then run this in another terminal:

for i in $(seq 1 10); do
    wget http://localhost:9876/foobar > /dev/null 2>&1
done
netstat -nape --inet    # Thanks, Adam!

The "/foobar" URI is just to exercise the server and not make it really do anything other than return an error page. You should see ten sockets in TIME_WAIT state with the local address being on port 9876. The "foreign" addresses will all be on various ports. You should also see that there is no process associated with these sockets. The listener should still be there on port 9876 and associated with the daemon's PID. I believe this is the same thing that happens when giving the /quit command. The listener (essentially a "half" socket) is nowhere to be found after giving the /quit command. -- Andrew B. Sweger -- The great thing about multitasking is that several things can go wrong at once. From jgardner at jonathangardner.net Wed Aug 13 09:19:00 2003 From: jgardner at jonathangardner.net (Jonathan Gardner) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement firms In-Reply-To: References: Message-ID: <200308130719.01708.jgardner@jonathangardner.net> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On Tuesday 12 August 2003 21:17, Andrew Sweger wrote: > On Tue, 12 Aug 2003, Jonathan Gardner wrote: > > First, when we write English, we use the SHIFT key to capitalize > > letters. That's because we are writing email and not chatting on > > IRC. This may be a reason you aren't finding a job. You look like a > > script kiddie in your email. Hopefully, you don't sound like one on > > the phone. > > Seriously now.
The Seattle Perl Users Group (the mail list > specifically) has grown well beyond the boundaries of the city of > Seattle and Puget Sound. I think this extends beyond geography and > language to include "people of alternate writing styles". We welcome > all comers here. If we don't understand what they're saying, we'll > ask for clarification or, if necessary, ignore them. > We welcome people from all writing styles. That doesn't mean businesses will. A poor writer sounds dumb. Whether or not they are really dumb doesn't make a difference. If you sound dumb, then the recruiter and the hiring manager are going to think you are dumb. If the recruiter and the hiring manager think you are dumb, you are never going to see the developers on your potential team. You are never going to demonstrate your perl skills. You are never going to be considered for the jobs you want to get. - -- Jonathan Gardner Live Free, Use Linux! -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.2.1 (GNU/Linux) iD8DBQE/OkjUWgwF3QvpWNwRAqyQAJ97FWVnDs/7S1AGX6dhsb2Uo/JfigCfcS0b mDevxl64a+5JaZPhTHe8mHI= =ZUzn -----END PGP SIGNATURE----- From andrew at sweger.net Wed Aug 13 09:34:19 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement firms In-Reply-To: <200308130719.01708.jgardner@jonathangardner.net> Message-ID: On Wed, 13 Aug 2003, Jonathan Gardner wrote: > We welcome people from all writing styles. That doesn't mean businesses > will. > > A poor writer sounds dumb. Whether or not they are really dumb doesn't > make a difference. If you sound dumb, then the recruiter and the hiring > manager are going to think you are dumb. As I said, your point on the matter of job hunting was well taken. But there is no call in personally attacking a member of this list (which is the point I am attempting to get across to you). 
Personally, I tend to not hire folks that are slow learners, rather than those that don't have the right look/sound/writing/etc. Of course, impressing these ideals on HR is an ongoing challenge at most companies. -- Andrew B. Sweger -- The great thing about multitasking is that several things can go wrong at once. From doer at microen.com Wed Aug 13 10:02:04 2003 From: doer at microen.com (Rodney Doe) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement firms Message-ID: <581D864582E4D311853D009027DE223A01AE1102@harpo.microen.com> Would it be useful to discuss why those of us who provide input into the hiring process have rejected candidates in the past? What fatal errors have we seen on résumés? What went wrong with interviews? I recall a particular perusal of a stack of applications, when a company president and I encountered an application with a cover letter signed with the pseudonym 'Boromir the Weird'. The president saw the signature, chuckled, and promptly dropped the application into the recycle basket without even a glance at Boromir's esteemed qualifications. While this is an extreme case, perhaps the generation of a list of fatal flaws and positive points might add some value. I love to see: - A one page résumé. This indicates to me that the person can summarize details and think abstractly. I hate to see: - Spelling errors. - Grammatical errors. - Five page résumés, crammed with acronyms and insignificant details. Rodney B. Doe, P.E. Senior Software Engineer Micro Encoder Inc.
www.microen.com From legrady at earthlink.net Wed Aug 13 10:28:45 2003 From: legrady at earthlink.net (Tom Legrady) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement firms In-Reply-To: <581D864582E4D311853D009027DE223A01AE1102@harpo.microen.com> References: <581D864582E4D311853D009027DE223A01AE1102@harpo.microen.com> Message-ID: <1060788517.28360.19.camel@localhost.localdomain> On Wed, 2003-08-13 at 11:02, Rodney Doe wrote: > I love to see: > - A one page résumé. This indicates to me that the person can summarize > details and think abstractly. I was always told a resume should be two pages. Recently someone complained my two-page resume was too short. Tom From wnorth at state.mt.us Wed Aug 13 10:35:05 2003 From: wnorth at state.mt.us (North, Walter) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: RE: Giving up on computer jobs & usefulness of placement firms Message-ID: I think Rodney has a pretty good idea, one that could be useful to all of us. So, here I go. As a person who is often asked for input on whether to hire someone I too like a short and to the point resume. One page is preferable, two is fine. Bad spelling is unacceptable as everyone has a spell checker, and if they can't find time to run it on their resume then I would be suspect of how they would trouble shoot problems on the job. After all a resume is supposed to be one of the most carefully crafted documents of your life and if you can't even run the spell checker on it....... Bad grammer is a fact of life in a one page resume, just as in email. As long as it isn't too bad. (how that for bad grammer). Personally, I would have at least interviewed Boromir for having the hair to use something like that. As a potential employee I have always tried to keep my resume as short as possible. In fact I have 2 of them, one is one page or maybe slipping over into a 2nd page that I usually send out and a longer one if the situation requires it.
Some employers seem to like to see your entire history of employment from high school, or before, others only care about your last job or two. Some only seem to care if you are currently employeed. As for acronyms, the computer biz is one vast acronym and I do not see how you can get out of using at least a few, especially if you have been around for a while. > -----Original Message----- > From: Rodney Doe [mailto:doer@microen.com] > Sent: Wednesday, August 13, 2003 9:02 AM > To: spug-list@mail.pm.org > Subject: Re: SPUG: Giving up on computer jobs & usefulness of > placement > firms > > > Would it be useful to discuss those of us who provide input > into the hiring > process have rejected candidates in the past? What fatal > errors have we > seen on résumés? What went wrong with interviews? > > I recall a particular perusal of a stack of applications, > when a company > president and I encountered an application with a cover > letter signed with > the pseudonym 'Boromir the Weird'. The president saw the signature, > chuckled, and promptly dropped the application into the recycle basket > without even a glance at Boromir's esteemed qualifications. > While this is > an extreme case, perhaps the generation of a list of fatal flaws and > positive points might add some value. > > I love to see: > - A one page résumé. This indicates to me that the person > can summarize > details and think abstractly. > > I hate to see: > - Spelling errors. > - Grammatical errors. > - Five page résumés, crammed with acronyms and insignificant details. > Rodney B. Doe, P.E. > Senior Software Engineer > Micro Encoder Inc.
www.microen.com > > ----------------------------------------------------- Walter North 406-444-2914 Operating Systems Programmer State of Montana wnorth@state.mt.us ----------------------------------------------------- From joneil at cobaltgroup.com Wed Aug 13 10:45:12 2003 From: joneil at cobaltgroup.com (O'neil, Jerome) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: RE: Giving up on computer jobs & usefulness of placement firms Message-ID: <25160AB2660F8B449892B0EB9C29C1650C3F28@ex-sea-is2.cobaltgroup.com> > As for acronyms, the computer biz is one vast acronym and I do not > see how you can get out of using at least a few, especially if you > have been around for a while. Acronyms can be a great indicator. If you put an acronym on a resume, you should at least be prepared to discuss it. I can't count the number of candidates that had a list of acronyms on the sheet, but couldn't tell you when they used them, and in a couple of cases, what they even meant. It's pretty obvious when someone is using the acronym shotgun to get noticed. -Jerome From pdarley at kinesis-cem.com Wed Aug 13 11:08:05 2003 From: pdarley at kinesis-cem.com (Peter Darley) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms Message-ID: Folks, I have often been in a position to hire tech folks, and the number one thing that impresses me when looking at a resume is personal interests. A programmer who writes programs for themselves in their free time is likely to rise to the top, a network administrator who doesn't have a network at their home isn't likely to perform well on the job. The thing that is the biggest red flag to me is an applicant who focuses a lot on their industry certifications (especially MCSEs).
Having them isn't a mark against them, but if it's what they put forward as proof that they know what they're doing that tells me that they probably don't have much real world experience, whether it's at work, school, noodling around at home, etc. Thanks, Peter Darley From schieb at centurytel.net Wed Aug 13 11:59:26 2003 From: schieb at centurytel.net (Islandman) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms References: Message-ID: <3F3A6E6E.7D30C185@centurytel.net> Peter Darley wrote: > > Folks, > > I have often been in a position to hire tech folks, and the number one > thing that impresses me when looking at a resume is personal interests. A > programmer who writes programs for themselves in their free time is likely > to rise to the top, a network administrator who doesn't have a network at > their home isn't likely to perform well on the job. Interesting. I've heard it's better to list the things you do OUTSIDE of computer work lest you seem too narrow. -Brian Vashon, WA > > The thing that is the biggest red flag to me is an applicant who focuses a > lot on their industry certifications (especially MCSEs). Having them isn't > a mark against them, but if it's what they put forward as proof that they > know what they're doing that tells me that they probably don't have much > real world experience, whether it's at work, school, noodling around at > home, etc.
> > Thanks, > Peter Darley > > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org From mathin at mathin.com Wed Aug 13 12:17:59 2003 From: mathin at mathin.com (Dan Ebert) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms In-Reply-To: <3F3A6E6E.7D30C185@centurytel.net> References: <3F3A6E6E.7D30C185@centurytel.net> Message-ID: <1060795079.25442.40.camel@algernon.lan.enic.cc> That's what I was going to say ... I've gotten other jobs BECAUSE I had interests/experiences outside of the job related stuff. Dan. On Wed, 2003-08-13 at 09:59, Islandman wrote: > Peter Darley wrote: > > > > Folks, > > > > I have often been in a position to hire tech folks, and the number one > > thing that impresses me when looking at a resume is personal interests. A > > programmer who writes programs for themselves in their free time is likely > > to rise to the top, a network administrator who doesn't have a network at > > their home isn't likely to perform well on the job. > > Interesting. I've heard it's better to list the things you do OUTSIDE of > computer work lest you seem too narrow. > > -Brian > Vashon, WA > > > > > > The thing that is the biggest red flag to me is an applicant who focuses a > > lot on their industry certifications (especially MCSEs). Having them isn't > > a mark against them, but if it's what they put forward as proof that they > > know what they're doing that tells me that they probably don't have much > > real world experience, weather it's at work, school, noodling around at > > home, etc. 
> > > > Thanks, > > Peter Darley > > > > _____________________________________________________________ > > Seattle Perl Users Group Mailing List > > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > > WEB PAGE: http://www.seattleperl.org > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > From cansubaykan at hotmail.com Wed Aug 13 12:20:32 2003 From: cansubaykan at hotmail.com (John Subaykan) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: RE: Giving up on computer jobs & usefulness of placement firms Message-ID: I hate to be a spelling fascist but... >From: "North, Walter" >To: "'spug-list@mail.pm.org'" > > >Bad spelling is unacceptable.. ... >Bad grammer is a fact of life in a one page resume, just as in >email. As long as it isn't too bad. (how that for bad grammer). s/grammer/grammar/g; # sorry :) I agree, spelling is very important in resumes since this is your first introduction to the person reviewing resumes. I have dictionary.com on my list of bookmarks, I don't trust spellcheckers. Bad grammar on resumes is a given, since using complete sentences is not good resume style. But even within the restrictions of incomplete sentences, there are some grammar mistakes. You wouldn't say "I was designing database applications"; you might say "Designed database applications." You would not want to say "Did design database applications". You especially wouldn't want to say "Done did database applications." 
In email conversations between techies, I don't see the reason to be enraged if people don't use publishing-quality English - except for things like 'u' and '4' instead of 'you' and 'for'; but I've never seen this on SPUG. _________________________________________________________________ STOP MORE SPAM with the new MSN 8 and get 2 months FREE* http://join.msn.com/?page=features/junkmail From umar at drizzle.com Wed Aug 13 12:26:08 2003 From: umar at drizzle.com (Umar Cheema) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement fi rms In-Reply-To: <581D864582E4D311853D009027DE223A01AE1102@harpo.microen.com> Message-ID: Over the years I have received constructive criticism and fair appraisal on the content of my resume. Here're a few facts/tips that I have gathered based on my experience: - The notion of first impression carries a pretty heavy weight in this area, especially when managers/others receive massive amounts of resumes. Generally they would glance at your first page, registering some familiar-looking words, etc. into their memory, then they'll turn the page over and glance at the second page to get an idea of how long it is, and then flip back to the first page. So the trick is to capture their attention in the first glance and at the same time make sure your resume fits in two pages at most. From what I have heard and seen, an ideal resume would be about a page and a half. This gives them the notion that it's not too long and cramped with every little thing you have ever done, and at the same time you have done enough that you had to go over to the second page. - Saying too much in too little is an art. Regardless of the format, style and length of your resume, when it comes to the hiring manager leaning back in his/her chair and going through each line, content is all that matters.
Of course it's a combination of factors, but even if your first impression, style and length take you to the next step, it's your content that'll further your chances. Is it better to try to fit all the relevant and important information in one carefully worded paragraph as opposed to two slightly wordier paragraphs? Absolutely. - Command of the language does give you an advantage over other candidates. I remember a couple of years ago I was going through a resume of a friend and I enjoyed reading it. I read every sentence and appreciated the simplicity and an excellent choice of words in almost every sentence. The trick is to look at the paragraph you just wrote at least three times and try to restructure it in a better way every time. And when you think it's looking great, restructure it one more time. - It's important to tailor your resume to the taste of your prospective employer. If I am applying for a Perl job I would make sure that they notice words like Perl, Unix, XML, OOP, HTML, Linux, Apache in their first glance. That would ensure that after they have glanced over your second page, they would come back to see what you have actually done in those areas. Similarly, if I am applying for a QA job, I would totally rearrange my skills just so they notice words like White Box, Black Box, Regression Analysis, Development Life Cycles, etc. before they flip to the second page. - Personal skills and extra-curricular activities can become a deciding factor when two different candidates successfully pass the initial screening. Considering the economy and the unemployment rate it's pretty safe to assume that there will be many candidates who possess technical skills very similar to yours. Then it's the job of your resume to convince the hiring managers that even though all these five candidates have a very similar set of skills and experience, you're the one that will best fit within their culture and environment.
Personal skills usually include standard stuff such as works well under pressure, ability to handle and manage multiple tasks, leadership skills, etc. A good set of personal skills starts with a very specific and appropriate skill/talent that is extremely relevant to the job you are applying for. This can be accommodated by doing some research on the job you are very serious about applying to. Try to get a feel for exactly what they're looking for and then look under your hat of skills and try to make a solid bridge between the two. If you don't have any relevant personal skill, then pick one from the standard/universal ones that will make you stand out. Then simply push(@skills, $thisSkill). Hope this can be useful to some along with other useful tips/suggestions from others on the list. Umar On Wed, 13 Aug 2003, Rodney Doe wrote: > Would it be useful to discuss those of us who provide input into the hiring > process have rejected candidates in the past? What fatal errors have we > seen on résumés? What went wrong with interviews? > > I recall a particular perusal of a stack of applications, when a company > president and I encountered an application with a cover letter signed with > the pseudonym 'Boromir the Weird'. The president saw the signature, > chuckled, and promptly dropped the application into the recycle basket > without even a glance at Boromir's esteemed qualifications. While this is > an extreme case, perhaps the generation of a list of fatal flaws and > positive points might add some value. > > I love to see: > - A one page résumé. This indicates to me that the person can summarize > details and think abstractly. > > I hate to see: > - Spelling errors. > - Grammatical errors. > - Five page résumés, crammed with acronyms and insignificant details. > > > > Rodney B. Doe, P.E. > Senior Software Engineer > Micro Encoder Inc.
> www.microen.com > > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > From jay at scherrer.com Wed Aug 13 12:35:41 2003 From: jay at scherrer.com (Jay Scherrer) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement fi rms In-Reply-To: References: Message-ID: <200308131035.41489.jay@scherrer.com> On Wednesday 13 August 2003 09:08 am, Peter Darley wrote: > The thing that is the biggest red flag to me is an applicant who focuses a > lot on their industry certifications (especially MCSEs). Having them isn't > a mark against them, but if it's what they put forward as proof that they > know what they're doing that tells me that they probably don't have much > real world experience, weather it's at work, school, noodling around at > home, etc. I have been wrestling with the idea of getting certified in either Perl or Linux. I didn't know this would be a red flag :-( Would it be better to show work from SourceForge or similar? Just curious; once my house is done I'll be trying to enter the job market. Any advice on preparing would be helpful. Jay From pdarley at kinesis-cem.com Wed Aug 13 12:50:46 2003 From: pdarley at kinesis-cem.com (Peter Darley) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placementfirms In-Reply-To: <1060795079.25442.40.camel@algernon.lan.enic.cc> Message-ID: Folks, I may not be a very typical resume reviewer/hirer. I have always been in a position where the resumes were sent directly to me, and I was the only one who reviewed them and made the final decision.
In a more typical hiring environment, where resumes first go through an HR person who has artificial constraints on candidate qualifications, such as 'diversity', it's better to present yourself as well-rounded. For me, geek that I am, I get excited when an applicant is involved in Seattle Wireless, or has contributed code to the GNU/Linux (to be PC) codebase. I probably represent the hiring criteria of a small business better than larger ones. Thanks, Peter Darley -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org]On Behalf Of Dan Ebert Sent: Wednesday, August 13, 2003 10:18 AM To: Islandman Cc: spug-list@mail.pm.org Subject: Re: SPUG: FW: Giving up on computer jobs & usefulness of placementfirms That's what I was going to say ... I've gotten other jobs BECAUSE I had interests/experiences outside of the job related stuff. Dan. On Wed, 2003-08-13 at 09:59, Islandman wrote: > Peter Darley wrote: > > > > Folks, > > > > I have often been in a position to hire tech folks, and the number one > > thing that impresses me when looking at a resume is personal interests. A > > programmer who writes programs for themselves in their free time is likely > > to rise to the top, a network administrator who doesn't have a network at > > their home isn't likely to perform well on the job. > > Interesting. I've heard it's better to list the things you do OUTSIDE of > computer work lest you seem too narrow. > > -Brian > Vashon, WA > > > > > > The thing that is the biggest red flag to me is an applicant who focuses a > > lot on their industry certifications (especially MCSEs). Having them isn't > > a mark against them, but if it's what they put forward as proof that they > > know what they're doing that tells me that they probably don't have much > > real world experience, weather it's at work, school, noodling around at > > home, etc.
> > > > Thanks, > > Peter Darley > > > > _____________________________________________________________ > > Seattle Perl Users Group Mailing List > > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > > WEB PAGE: http://www.seattleperl.org > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > _____________________________________________________________ Seattle Perl Users Group Mailing List POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list MEETINGS: 3rd Tuesdays, U-District, Seattle WA WEB PAGE: http://www.seattleperl.org From aaron at activox.com Wed Aug 13 13:04:47 2003 From: aaron at activox.com (Aaron Salo) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement fi rms In-Reply-To: References: <581D864582E4D311853D009027DE223A01AE1102@harpo.microen.com> Message-ID: <3.0.5.32.20030813110447.01a61e00@mail.activox.com> At 10:26 AM 8/13/2003 -0700, you wrote: >- It's important to tailor your resume to the taste of your prospective >employer. If I am applying for a Perl job I would make sure that they >notice words like Perl, Unix, XML, OOP, HTML, Linux, Apache in their first >glance. That would ensure that after they have glanced over your second >page, they would come back to see what you have actually done in those >areas. 
Most importantly in regard to this, in these days of automated HR screening systems, electronic resume submissions, and hundreds of applicants for every position, if you're applying for a position at a mid- to large-size company you won't even get a chance to land on the first-tier screener's desk unless the HR software hits the right keywords in your resume and scores you above the cut line. You can be the most remarkable candidate for a position and get /dev/null'ed by the software if your resume is not crafted to score high enough to make the first cut. And more often these days the first hurdle is getting past the software. When they define a job opening, AFAIK they tell the software to screen applicants and score them based on pretty dumb algorithms that do string matching. So if the job calls for XML and Oracle and ERP and Siebel, you'd better make sure you mention those items in your resume if you want to survive the software screen and have a chance for the first-tier grunt to flip your pages over and admire your external activities. Getting through the software screener is similar to the arcane discipline of search engine placement. There is very little concrete info on how to improve your scoring, but a safe bet is that if the posting has specific requirements for discipline-based experience, make sure you state them all, exactly as set forth in the job announcement. FWIW, following that deductive chain, it seems logical to even mention disciplines you don't have experience in so they'll give you a software hit and get you under the nose of a human, e.g., XML - familiar with XML schema, DTD, XSLT, and other aspects although no recent experience MCSE - experienced in Win32 systems, networking, SQLServer administration, and other aspects although have not yet achieved MCSE certification and the like. Good luck to everyone looking for work.
~!a From pdarley at kinesis-cem.com Wed Aug 13 13:05:35 2003 From: pdarley at kinesis-cem.com (Peter Darley) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms In-Reply-To: <200308131035.41489.jay@scherrer.com> Message-ID: Jay, I didn't mean to say that the certifications were a red flag. The red flag is the reliance on the certifications in the absence of other support. If someone has a certification, plus professional or non-professional experience, it's certainly not a mark against them. With me it's also not a mark for them either, as my opinion of technical certifications is, as they say, piss poor. Thanks, Peter Darley -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org]On Behalf Of Jay Scherrer Sent: Wednesday, August 13, 2003 10:36 AM To: Peter Darley Cc: spug-list@pm.org Subject: Re: SPUG: FW: Giving up on computer jobs & usefulness of placement firms On Wednesday 13 August 2003 09:08 am, Peter Darley wrote: > The thing that is the biggest red flag to me is an applicant who focuses a > lot on their industry certifications (especially MCSEs). Having them isn't > a mark against them, but if it's what they put forward as proof that they > know what they're doing that tells me that they probably don't have much > real world experience, weather it's at work, school, noodling around at > home, etc. I have been wrestling with the idea of getting certified in either Perl or Linux. I didn't know this would be a red flag :-( Would it be better to show work from SourceForge or similar? Just curious; once my house is done I'll be trying to enter the job market. Any advice on preparing would be helpful.
Jay _____________________________________________________________ Seattle Perl Users Group Mailing List POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list MEETINGS: 3rd Tuesdays, U-District, Seattle WA WEB PAGE: http://www.seattleperl.org From cwilkes-spug at ladro.com Wed Aug 13 13:11:31 2003 From: cwilkes-spug at ladro.com (Chris Wilkes) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms In-Reply-To: <200308131035.41489.jay@scherrer.com> References: <200308131035.41489.jay@scherrer.com> Message-ID: <20030813181131.GH70478@www.ladro.com> On Wed, Aug 13, 2003 at 10:35:41AM -0700, Jay Scherrer wrote: > On Wednesday 13 August 2003 09:08 am, Peter Darley wrote: > > > The thing that is the biggest red flag to me is an applicant who focuses a > > lot on their industry certifications (especially MCSEs). Having them isn't > > a mark against them, but if it's what they put forward as proof that they > > know what they're doing that tells me that they probably don't have much > > real world experience, weather it's at work, school, noodling around at > > home, etc. > > I have been wrestling the idea of getting certified with either Perl or Linux. > I didn't know this would be a red flag :-( > Would it be better to show from SourceForge or similar? > Just curious, once my house is done I'll be trying enter the job market. Any > help for preparing would be helpful. 
What do you think an employer would like to see: 1) a piece of paper that has a gold star next to your name or 2) a major work of yours for which you've had to: a) sync up with dozens of people over the internet b) fully document your procedures so that others can work on it c) steer a project in a direction of greater usability d) answer user questions on mailing lists Granted all those bullet points are up to you to produce, but I would think that they show what employers are really looking for in programmers, which is: a) creativity b) responsiveness to end users / other devs c) a self-starter attitude as you're not getting paid for this d) an ability to play well with others e) being able to see the whole project, not just some nifty Perl one-liner What does the piece of paper show the employer? That you can answer questions on a test about the third switch to a funky command. Granted that's an extreme example of a poor test, but I think you get my point. So if you're going to enter the job market now (yikes!) I would seriously look to see if there's a SourceForge project or some volunteer work you can do to show some programming experience versus taking a test. Chris From jgardner at jonathangardner.net Wed Aug 13 14:22:51 2003 From: jgardner at jonathangardner.net (Jonathan Gardner) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement fi rms In-Reply-To: <3.0.5.32.20030813110447.01a61e00@mail.activox.com> References: <581D864582E4D311853D009027DE223A01AE1102@harpo.microen.com> <3.0.5.32.20030813110447.01a61e00@mail.activox.com> Message-ID: <200308131222.53117.jgardner@jonathangardner.net> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On Wednesday 13 August 2003 11:04, Aaron Salo wrote: > > Getting through the software screener is similar to the arcane > discipline of search engine placement.
There is very little concrete > info on how to improve your scoring, but a safe bet is if the posting > has specific requirements for discipline based experience, make sure > you state them all, and exactly as set forth in the job announcement. > I've found that if you use the exact same words as the recruiter uses when they posted the job, your chances of getting a phone call go up significantly. One job I was looking for had the word "ecommerce" in it three times. I put that word in my resume twice, along with "object-oriented perl", "apache", and "postgresql", which were all mentioned in exactly those phrases, and I scored an interview. - -- Jonathan Gardner Live Free, Use Linux! -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.2.1 (GNU/Linux) iD8DBQE/OpALWgwF3QvpWNwRAikYAKDDOE4J6OcwuvtGA5sbzXWXLsdjcACgkQFu DKKrs/lqa8UUidWj7oNNgR4= =j1xh -----END PGP SIGNATURE----- From jgardner at jonathangardner.net Wed Aug 13 14:26:19 2003 From: jgardner at jonathangardner.net (Jonathan Gardner) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placementfirms In-Reply-To: References: Message-ID: <200308131226.20692.jgardner@jonathangardner.net> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On Wednesday 13 August 2003 10:50, Peter Darley wrote: > ... it's better to present yourself as well > rounded. For me, geek that I am, I get excited when an applicant is > involved in Seattle Wireless, or has contributed code to the > GNU/Linux (to be PC) codebase. I probably represent the hiring > criteria of a small business better than larger ones. That's interesting that well-rounded to you means involvement in software outside of work. Well-rounded to me and a lot of my peers seemed to have meant "does things other than programming in spare time". I do enjoy talking about the various Free Software / Open Source projects I contribute to, even if they don't relate directly to the job at hand. I've found that some interviewers like this a lot. 
Others are too focused to care beyond your abilities. - -- Jonathan Gardner Live Free, Use Linux! -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.2.1 (GNU/Linux) iD8DBQE/OpDbWgwF3QvpWNwRArg+AJ9ncT7T7Ts8y0cPdKhIRpigDRhf4wCeIl1B pjibodvW52htfJysBut5Ih0= =hci/ -----END PGP SIGNATURE----- From tim at consultix-inc.com Wed Aug 13 15:05:49 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Call for Manuscript Reviewers Message-ID: <20030813130549.A21729@timji.consultix-inc.com> SPUGsters, As you all know by now, at least through the subliminal impact of my .sig, I'm writing a book on Perl. And on the Unix Shell. And on other Unix utilities too. It's got everything! Even detailed coverage of the only loop The Larry didn't want you to have, *select*. Anyway, before I submit the first batch of chapters to a flock of hostile reviewers who are going to tear it to shreds 8-}, I thought I'd ask if some of the folks on this list might be interested in reading a chapter or two, and giving me the benefit of their feedback. There's no remuneration available, but I promise that each volunteer reviewer will have his or her /foot duly noted/ in the book. 8-} So drop me a line if you'd like to help me, and the future readers of this book, with this project. 
-Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From dleonard at dleonard.net Wed Aug 13 16:42:00 2003 From: dleonard at dleonard.net (dleonard@dleonard.net) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement fi rms In-Reply-To: <581D864582E4D311853D009027DE223A01AE1102@harpo.microen.com> Message-ID: On Wed, 13 Aug 2003, Rodney Doe wrote: > Would it be useful to discuss those of us who provide input into the hiring > process have rejected candidates in the past? What fatal errors have we > seen on résumés? What went wrong with interviews? Good suggestion... Here's what I look for in a candidate: I like one and a half to two page resumes. A one-pager doesn't give me enough information on what they actually know. A three-pager is too long for me to easily peruse and tells me the candidate doesn't know how to summarize and prioritize. Certifications don't mean squat unless you can actually use the knowledge. I like to ask real-world problems relating to the certification and see if they actually know anything about it. Most of the time they don't, so the certification works as a strike against them. When I'm interviewing someone I like to find out what their career plans are and how the job they are interviewing for is going to help those plans. If they are looking to learn the technology or have a specific goal in mind then that is a big plus in their favor. If they are just like "it's another coding job doing the same thing I've been doing for the last decade" then they just lost a point. Drive towards personal goals and desire to learn are useful.
Someone who is just putting in their time to cash a check is less than desirable. I like to ask them what technology or job they've enjoyed the most and why. What they like working on and why gives me a great insight into their character. When adding an engineer, getting people who can fit into the environment is a big consideration. I've rejected technically brilliant programmers because no one in the company, and I mean no one, would have been able to work with them. Hygiene is also important. Nobody is going to hire you if you smell like you just finished off a 2 week bender and didn't take a shower for that entire time. -- From tim at consultix-inc.com Wed Aug 13 19:52:23 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Automagic non-matching in -B files Message-ID: <20030813175223.A22915@timji.consultix-inc.com> I've just discovered that perl -wlne 'print "Processing $. in $ARGV\n"; /./ and print' $HOME indicates that Perl automagically disallows any processing of records from the (binary) file, without warning or die'ing (instead of trashing the screen with spurious binary match-data, like commercial *grep*). Does anybody know where this is documented? I can't find it under perldoc -tf open, which seems like the right place to look. 
-Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From david.dyck at fluke.com Wed Aug 13 20:02:25 2003 From: david.dyck at fluke.com (David Dyck) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Automagic non-matching in -B files In-Reply-To: <20030813175223.A22915@timji.consultix-inc.com> References: <20030813175223.A22915@timji.consultix-inc.com> Message-ID: On Wed, 13 Aug 2003 at 17:52 -0700, Tim Maher wrote: > I've just discovered that > > perl -wlne 'print "Processing $. in $ARGV\n"; /./ and print' $HOME > > indicates that Perl automagically disallows any processing of records > from the (binary) file, without warning or die'ing (instead of trashing > the screen with spurious binary match-data, like commercial *grep*). I don't get any warnings or errors from the above file (directory, right?) and perl -e 'print pack "c*", 0 .. 255' > binfile perl -le 'print "bin" if -B "binfile"' perl -wlne 'print "Processing $. in $ARGV\n"; /./ and print' binfile prints out 2 records. From marc.gibian at earthlink.net Wed Aug 13 20:07:51 2003 From: marc.gibian at earthlink.net (marc.gibian@earthlink.net) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement fi rms Message-ID: <21758676.1060823271871.JavaMail.root@cookiemonster.psp.pas.earthlink.net> I have to strongly disagree with Peter on this one. I want to hire well-rounded individuals who have lives outside of the technology field. I think there is such a thing as being too close to things and thus never seeing the large picture.
I also find the technology-addicted tend to not know when they are too tired to continue (and thus make lots of stupid errors they then waste even more time finding and fixing). Give me someone who is emotionally invested in their work yet has a life outside of the workplace. I should note that in my 25+ year career I have been involved in many hiring decisions. I have never regretted one hire made with my agreement, nor regretted one not hired with me joining in the thumbs down. -Marc -----Original Message----- From: Peter Darley Sent: Aug 13, 2003 9:08 AM To: SPUG Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement fi rms Folks, I have often been in a position to hire tech folks, and the number one thing that impresses me when looking at a resume is personal interests. A programmer who writes programs for themselves in their free time is likely to rise to the top; a network administrator who doesn't have a network at their home isn't likely to perform well on the job. The thing that is the biggest red flag to me is an applicant who focuses a lot on their industry certifications (especially MCSEs). Having them isn't a mark against them, but if it's what they put forward as proof that they know what they're doing, that tells me that they probably don't have much real world experience, whether it's at work, school, noodling around at home, etc.
Thanks, Peter Darley _____________________________________________________________ Seattle Perl Users Group Mailing List POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list MEETINGS: 3rd Tuesdays, U-District, Seattle WA WEB PAGE: http://www.seattleperl.org From ben at reser.org Wed Aug 13 20:23:29 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Automagic non-matching in -B files In-Reply-To: <20030813175223.A22915@timji.consultix-inc.com> References: <20030813175223.A22915@timji.consultix-inc.com> Message-ID: <20030814012328.GN11173@titanium.brain.org> On Wed, Aug 13, 2003 at 05:52:23PM -0700, Tim Maher wrote: > I've just discovered that > > perl -wlne 'print "Processing $. in $ARGV\n"; /./ and print' $HOME > > indicates that Perl automagically disallows any processing of records > from the (binary) file, without warning or die'ing (instead of trashing > the screen with spurious binary match-data, like commercial *grep*). > > Does anybody know where this is documented? I can't find it under > perldoc -tf open, which seems like the right place to look. prints binary stuff out for me. When I do: perl -wlne 'print "Processing $. in $ARGV\n"; /./ and print' /bin/ls > out I get ELF output as I would expect. Tried it with perl 5.8.0 and 5.8.1 RC4. -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From marc.gibian at earthlink.net Wed Aug 13 21:27:33 2003 From: marc.gibian at earthlink.net (marc.gibian@earthlink.net) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Giving up on computer jobs & usefulness of placement fi rms Message-ID: <8178121.1060828054009.JavaMail.root@cookiemonster.psp.pas.earthlink.net> On Wed, 13 Aug 2003, Rodney Doe wrote: > Would it be useful to discuss those of us who provide input into the hiring > process have rejected candidates in the past?
What fatal errors have we > seen on résumés? What went wrong with interviews? A couple of more general thoughts, from both the job seeker and hiring perspectives: 1 - In the current software job market, it does not matter how perfect your resume is... there are SO many resumes being submitted for each job that even the most perfect resume that fits the job description in every regard may never be seen. Thus, make sure you have a solid resume, but don't think that a lack of response is due to a poor resume. It may just be that your resume is only one of a pile of 1000 and they only looked at 50 to 100 randomly grabbed out of that pile. 2 - Network Network Network ... it seems trite. Everyone says it, but it's not really clear what it is or how it helps. But the way you make sure that your resume is one of those 50 to 100 that are actually looked at is to network. It REALLY does work, and you never know which contact you made through what meeting or activity is the one that will pay off. It's one of those investments in time that really does work. It does not even have to be in a job-related setting. If you are off hiking a trail and start talking to someone you meet along the way, make sure they know you are job hunting and give them your "elevator pitch"... they may just be that magic connection. 3 - Finally, while I have been on the resume reading side and understand the desire for one or two page resumes, I don't know how you reconcile that with attempting to represent a 25+ year career in a resume? There are certain things we expect on every resume... a history of prior employment including at least some minimal indication of what was done, who the company was, and the timeline (want to see continuity of employment, a progression of positions, and longevity). For someone with 25 years, this history in a minimal form can fill two pages.
From ben at reser.org Wed Aug 13 22:54:09 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: References: <20030811063053.GB11173@titanium.brain.org> Message-ID: <20030814035408.GP11173@titanium.brain.org> On Mon, Aug 11, 2003 at 10:46:15PM -0700, Marc M. Adkins wrote: > OK, here's a stripped-down version. It demonstrates the behavior on > Mandrake 9.1. > > Create two terminals. Run testsvr.pl in one. Wait until it says "Server > Created." Run testsvr.pl in the other. It should shut down the first one > and run in the second one. Repeat as necessary. Fun, huh? > > On Windows this happens immediately (more or less). On Linux it takes 55-60 > seconds. There's a counter in the code to keep you company. > > Since my last post I opened up LWP::Simple and swiped the 'trivial' function > for testing purposes. Didn't change the behavior. While it was in I did > notice that there was a 60 second timeout on the connection to the server > for the /quit command. I changed this to 30 seconds and it didn't affect > anything. So I removed that code and went back to LWP::Simple. > > I previously said that the behavior didn't happen if I used to > interrupt the original server. That is not in fact the case (or at least I > can't duplicate that feature now). If I the first server and > _then_ start the second it _still_ takes almost a minute. > > The sleep 1 at the top of the wait loop is necessary on Windows but not on > Linux. It doesn't affect the behavor on Linux, it just isn't necessary. Add ReuseAddr => 1 to your call to new for HTTP::Daemon. Found by googling for: linux HTTP::Daemon TIME_WAIT http://cookbook.soaplite.com/#reusing%20sockets%20on%20restart The document mentions Reuse which according to the IO::Socket::INET pod is deprecated in favor of ReuseAddr. Windows apparently allows you to reuse sockets immediately.
Linux (and UNIX implementations in general) places the port in the TIME_WAIT state on the idea that there might still be data floating around out there on the network from the existing connections. If a server were allowed to bind to the same address and port, it might receive partial data with the same proto, local addr, local port, remote addr, and remote port, which would "corrupt" an existing connection. You can find an explanation of the possible error conditions that TIME_WAIT is intended to avoid here: http://www.unixguide.net/network/socketfaq/2.7.shtml ReuseAddr sets SO_REUSEADDR on your bind, which tells the socket implementation that you realize you may get bizarre results but don't want to wait. Even though the error conditions mentioned above are possible, they are highly improbable; an explanation of why can be found here: http://www.unixguide.net/network/socketfaq/4.5.shtml HTHs -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From ben at reser.org Wed Aug 13 22:57:58 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: <20030814035408.GP11173@titanium.brain.org> References: <20030811063053.GB11173@titanium.brain.org> <20030814035408.GP11173@titanium.brain.org> Message-ID: <20030814035758.GQ11173@titanium.brain.org> On Wed, Aug 13, 2003 at 08:54:08PM -0700, Ben Reser wrote: > Add ReuseAddr => 1 to your call to new for HTTP::Daemon. Forgot to mention that I verified this on Mandrake 9.1/PPC; it works fine. :) -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From Marc.M.Adkins at Doorways.org Wed Aug 13 23:59:03 2003 From: Marc.M.Adkins at Doorways.org (Marc M. 
Adkins) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Closing HTTP::Daemon port In-Reply-To: <20030814035408.GP11173@titanium.brain.org> Message-ID: > Quoth Ben Reser: > > Add ReuseAddr => 1 to your call to new for HTTP::Daemon. Hah! I just (and I mean JUST) found that myself whilst trolling through the IO::Socket::INET documentation. I was just returning to my email (still sadly on my Windows box) to write up the answer... Thanks for looking. Really. If I had been just a little less motivated, I'd have been forever in your debt. ;) BTW, an interesting side note: I had an 'old' server running without the flag. Then I started a 'new' server _with_ the flag. It waited for 60 seconds. Then I started it again and it went right away. So the flag has to be set on the existing server as well as the server coming along behind it. Well, that's what I think, anyway. I admit I didn't do exhaustive testing on this, being basically lazy and just durn glad to have a solution to the problem. Happy, happy, joy, joy. I KNEW there was a magic cookie. mma ps thanks to Andrew Sweger as well, 'preciate the time From mako at debian.org Thu Aug 14 00:31:46 2003 From: mako at debian.org (Benj. Mako Hill) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Open Sauce Lunch Friday In-Reply-To: References: Message-ID: <20030814053146.GT988@nozomi> On Tue, Aug 12, 2003 at 07:21:57PM -0700, Dan Sabath wrote: > Just your friendly reminder. Love to see more of you there. Sign up at: > http://spugwiki.perlocity.org/index.cgi?FriAug15InInternationalDistrict I'll be there and would love to do a PGP/GPG keysigning with anyone else there who also does the whole PGP/GPG thing. If you're interested, please bring a printed copy of your key's fingerprint and a piece of government-issued photo ID. Regards, Benjamin Mako Hill -- Benj. Mako Hill mako@debian.org http://mako.yukidoke.org/ -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030813/4ef57be5/attachment.bin From mako at debian.org Thu Aug 14 00:40:50 2003 From: mako at debian.org (Benj. Mako Hill) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Debian 10th Anniversary Message-ID: <20030814054049.GU988@nozomi> This isn't purely Perl, but there seems to be a good deal of overlap with other Linux/Open Source communities. And hey, since perl is an essential package in Debian, the Perl community should be an essential piece of our 10th birthday celebration. :) The Debian 10th is at Alki Beach Park this Saturday at 1pm. We've got two tables reserved and will be bringing a couple of small grills. It's going to be something of a potluck/BBQ, so please bring food, frisbees, fingerprints, plus whatever else. Also feel free to bring guests, spouses, children, and the like. The source for more information is: http://mako.yukidoke.org/debian_10th/ Everyone is welcome. Pass this around if you feel the desire. I hope to see some of you there. Regards, Benjamin Mako Hill P.S. If you want to stay up on this stuff in the future, join the Seattle Debian community's email list at: http://lists.yukidoke.org/mailman/listinfo/debian-seattle-soc -- Benj. Mako Hill mako@debian.org http://mako.yukidoke.org/ -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030813/8423a213/attachment.bin From pdarley at kinesis-cem.com Thu Aug 14 10:35:34 2003 From: pdarley at kinesis-cem.com (Peter Darley) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms In-Reply-To: <21758676.1060823271871.JavaMail.root@cookiemonster.psp.pas.earthlink.net> Message-ID: Marc, I guess I assume that everyone has a life outside of work, whether they include it on their resume or not. Not everyone has the kind of interest in the technical side of things that leads them to explore it outside of work, however, so I look for that in a resume. Someone who really doesn't have a life outside of work is going to be so broken that they'll be screened out of the hiring process anyway. :) Thanks, Peter Darley -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org]On Behalf Of marc.gibian@earthlink.net Sent: Wednesday, August 13, 2003 6:08 PM To: Peter Darley; SPUG Subject: Re: SPUG: FW: Giving up on computer jobs & usefulness of placement firms I have to strongly disagree with Peter on this one. I want to hire well-rounded individuals who have lives outside of the technology field. I think there is such a thing as being too close to things and thus never seeing the big picture. I also find that the technology-addicted tend not to know when they are too tired to continue (and thus make lots of stupid errors that they then waste even more time finding and fixing). Give me someone who is emotionally invested in their work yet has a life outside of the workplace. I should note that in my 25+ year career I have been involved in many hiring decisions. I have never regretted one hire made with my agreement, nor regretted one not hired with me joining in the thumbs down. 
-Marc -----Original Message----- From: Peter Darley Sent: Aug 13, 2003 9:08 AM To: SPUG Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms Folks, I have often been in a position to hire tech folks, and the number one thing that impresses me when looking at a resume is personal interests. A programmer who writes programs for themselves in their free time is likely to rise to the top; a network administrator who doesn't have a network at home isn't likely to perform well on the job. The thing that is the biggest red flag to me is an applicant who focuses a lot on their industry certifications (especially MCSEs). Having them isn't a mark against them, but if it's what they put forward as proof that they know what they're doing, that tells me that they probably don't have much real-world experience, whether it's at work, school, noodling around at home, etc. Thanks, Peter Darley _____________________________________________________________ Seattle Perl Users Group Mailing List POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list MEETINGS: 3rd Tuesdays, U-District, Seattle WA WEB PAGE: http://www.seattleperl.org From karl.b.hartman at boeing.com Thu Aug 14 11:19:02 2003 From: karl.b.hartman at boeing.com (Hartman, Karl B) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms Message-ID: <628E489B972CBC46999B4E71C609FA8F01F72368@xch-nw-06.nw.nos.boeing.com> I personally hold suspect any certifications obtained via the company that makes the product(s). I find the Microsoft certification the most troubling. 
True, I have never taken it (nor will I ever), and I haven't seen the practice questions in some years, but when I was assisting some of the Boeing NT admins in their attempts to obtain their MCSE, I had to tell them two answers: the Industry answer (so that when you talk to people from a non-Microsoft shop they won't look at you strangely) and the Microsoft answer (so you can pass the test). After taking classes for years for both HP and SUN, I found the biggest problem to be redundancy. Now, I have to admit, I have not yet taken any RedHat classes, but I will be remedying that soon. If I were hiring someone, I would be more interested in educational history and in work and non-work activities in the field for which I am hiring. Certifications from institutions of higher education would hold a lot more sway than any commercial certification. I would use the latter only as a tie breaker. If you have any questions or comments, don't hesitate to call. Thanks, Karl Hartman SSG Client/Server Operations - Computing Admin Process Mgmt 425-294-8172 (office) Business Sense "Failure to embrace an idea just because it doesn't make sense or just plain doesn't work does not constitute resistance to change" Common Sense - from "Really important stuff my kids taught me" "While you're standing there deciding whether or not to get your net, the butterfly is flying away." -----Original Message----- From: Peter Darley [mailto:pdarley@kinesis-cem.com] Sent: Wednesday, August 13, 2003 11:06 AM To: Jay@scherrer.com Cc: spug-list@pm.org Subject: RE: SPUG: FW: Giving up on computer jobs & usefulness of placement firms Jay, I didn't mean to say that the certifications were a red flag. The red flag is reliance on the certifications in the absence of other support. If someone has a certification, plus professional or non-professional experience, it's certainly not a mark against them. 
With me it's also not a mark for them, either, as my opinion of technical certifications is, as they say, piss poor. Thanks, Peter Darley -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org]On Behalf Of Jay Scherrer Sent: Wednesday, August 13, 2003 10:36 AM To: Peter Darley Cc: spug-list@pm.org Subject: Re: SPUG: FW: Giving up on computer jobs & usefulness of placement firms On Wednesday 13 August 2003 09:08 am, Peter Darley wrote: > The thing that is the biggest red flag to me is an applicant who > focuses a > lot on their industry certifications (especially MCSEs). Having them isn't > a mark against them, but if it's what they put forward as proof that > they know what they're doing that tells me that they probably don't > have much real world experience, weather it's at work, school, > noodling around at home, etc. I have been wrestling with the idea of getting certified in either Perl or Linux. I didn't know this would be a red flag :-( Would it be better to show work from SourceForge or similar? Just curious; once my house is done I'll be trying to enter the job market. Any help in preparing would be appreciated. 
Jay From karl.b.hartman at boeing.com Thu Aug 14 11:52:28 2003 From: karl.b.hartman at boeing.com (Hartman, Karl B) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: FW: Giving up on computer jobs & usefulness of placement firms Message-ID: <628E489B972CBC46999B4E71C609FA8F01F72369@xch-nw-06.nw.nos.boeing.com> However, I forgot to add: the few times I went out job hunting, the first interviews were with HR, and they all asked the same thing, "Do you have any certifications?" This is because almost all HR folks in large corporations are there to screen applicants and have little or no knowledge of what is really needed to perform most jobs in the company. As noted, with electronic resumes, they have already been filtered for the appropriate words/phrases. Karl -----Original Message----- From: Hartman, Karl B Sent: Thursday, August 14, 2003 9:19 AM To: Peter Darley; Jay@scherrer.com Cc: spug-list@pm.org Subject: RE: SPUG: FW: Giving up on computer jobs & usefulness of placement firms I personally hold any certifications obtained via the company who makes the product(s) as suspect. I find the Microsoft certification the most troubling. 
True, I have never taken it (nor will I ever), and I haven't seen the practice questions in some years, but when I was assisting some of the Boeing NT admins in their attempts to obtain their MCSE, I had to tell them two answers, Industry answer(so when you talked to people from a non-Microsoft shop they won't look at you strangely) and Microsoft answer (so you can pass the test). After taking classes for years for both HP and SUN, I found the biggest problem being redundancy. Now, I have to admit, I have not yet taken any RedHat classes, but I will be remedying that soon. If I was hiring someone, I would be more interested in educational history and work and non-work activities in the field of which I am hiring for. Certifications from institutions of higher education would hold a lot more sway than any commercial certification. I would use the latter only in the case of a tie breaker. If you have any questions or comments, don't hesitate to call. Thanks, Karl Hartman SSG Client/Server Operations - Computing Admin Process Mgmt 425-294-8172 (office) Business Sense "Failure to embrace an idea just because it doesn't make sense or just plain doesn't work does not constitute resistance to change" Common Sense - from "Really important stuff my kids taught me" "While you're standing there deciding whether or not to get your net, the butterfly is flying away." -----Original Message----- From: Peter Darley [mailto:pdarley@kinesis-cem.com] Sent: Wednesday, August 13, 2003 11:06 AM To: Jay@scherrer.com Cc: spug-list@pm.org Subject: RE: SPUG: FW: Giving up on computer jobs & usefulness of placementfirms Jay, I didn't mean say that the certifications were a red flag. The red flag is the reliance on the certifications in the absence of other support. If someone has a certification, plus professional or non-professional experience, it's certainly not a mark against them. 
With me it's also not a mark for them either, as my opinion of technical certifications is, as they say, piss poor. Thanks, Peter Darley -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org]On Behalf Of Jay Scherrer Sent: Wednesday, August 13, 2003 10:36 AM To: Peter Darley Cc: spug-list@pm.org Subject: Re: SPUG: FW: Giving up on computer jobs & usefulness of placement firms On Wednesday 13 August 2003 09:08 am, Peter Darley wrote: > The thing that is the biggest red flag to me is an applicant who > focuses a > lot on their industry certifications (especially MCSEs). Having them isn't > a mark against them, but if it's what they put forward as proof that > they know what they're doing that tells me that they probably don't > have much real world experience, weather it's at work, school, > noodling around at home, etc. I have been wrestling the idea of getting certified with either Perl or Linux. I didn't know this would be a red flag :-( Would it be better to show from SourceForge or similar? Just curious, once my house is done I'll be trying enter the job market. Any help for preparing would be helpful. 
Jay From tim at consultix-inc.com Thu Aug 14 18:38:12 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Regex/matching-op Problem Message-ID: <20030814163812.A26300@timji.consultix-inc.com> HELP! My brain has fallen down, and it can't get back up! I can't figure out why this program won't match two sentences in the same line, as with the "rejoicing" one. See another source of confusion in final comments way below. Any ideas? -Tim (aka "Yumpy the Certifier") perl -wln -e 'BEGIN { $/="" ; $,="\n\n"; } print /(?:And|Or|But)\b[^.?!]+?[.?!]/g ;' < References: <20030814163812.A26300@timji.consultix-inc.com> Message-ID: <20030814234759.GW8566@ifokr.org> > My brain has fallen down, and it can't get back up! > > I can't figure out why this program won't match two sentences in the same > line, as with the "rejoicing" one. See another source of confusion in final > comments way below. I don't understand what you want to do; can you tell us what you want to see as output? 
With no changes, I get this output: " And so it came to pass that The Larry blessed a Ponie, and appointed brave knights, armed with the Sticks of the Riddle, to guide her. And there was great rejoicing among JAPHs everywhere. And much grog was quenched. But there was no quenching by The Damian, who graciously bequeathed his share to Yumpy the Certifier. " -- Brian Hatch "Gods by the bushel! Systems and Gods by the pound! Security Engineer Gods for all occasions!" http://www.ifokr.org/bri/ Every message PGP signed -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030814/85e9b4e3/attachment.bin From jsl at blarg.net Thu Aug 14 19:42:05 2003 From: jsl at blarg.net (Jim Ludwig) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? / Message-ID: Hey there: Before posting any possible problems, I thought I'd ask to see if anyone on this list has ever used Spreadsheet::WriteExcel (0.41) in a non-Windows environment? For the most part it's great, but I'm currently knocking my head against the monitor... Thanks, jim From spug-list at l.ifokr.org Thu Aug 14 20:10:20 2003 From: spug-list at l.ifokr.org (Brian Hatch) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: Open Sauce Lunch Friday Message-ID: <20030815011020.GA2872@ifokr.org> Sending reminder on Dan's behalf: Open Sauce Lunch Tomorrow (Friday) Chinese (Dim Sum) food in International District Friday, 8/15, 12:00pm * House of Hong * 409 8th Ave S * Seattle, Washington 98104 * (206) 622-7997 * MAP * Website Look for a guy wearing a hawaiian shirt and beard; that will be this lunch's Convener, Dan Sabath. If you have a PGP or GPG key and are interested in doing a keysigning, please bring a printed copy of your key's fingerprint and a piece of government issued photo ID. 
There will be people to exchange this information with. More details and RSVP online at http://spugwiki.perlocity.org/index.cgi?FriAug15InInternationalDistrict -- Brian Hatch M$ Where do you want to go today? Systems and Linux: Where do you want to go tomorrow? Security Engineer FreeBSD: Are you guys coming or what? http://www.ifokr.org/bri/ Every message PGP signed -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030814/ae0f3c0b/attachment.bin From ced at carios2.ca.boeing.com Thu Aug 14 20:43:00 2003 From: ced at carios2.ca.boeing.com (ced@carios2.ca.boeing.com) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? / Message-ID: <200308150143.SAA28520@carios2.ca.boeing.com> > Before posting any possible problems, I thought > I'd ask to see if anyone on this list has ever > used Spreadsheet::WriteExcel (0.41) in a > non-Windows environment? Yes, details are fuzzy now, but I used Spreadsheet::WriteExcel sometime last year on Solaris. Worked well for my simple app. -- Charles DeRykus

my $workbook  = Spreadsheet::WriteExcel->new("EDS-$current_month.xls");
my $worksheet = $workbook->addworksheet();
$worksheet->write( 0, 1, 'Bellevue');
$worksheet->write( 0, 2, 'St. Louis');
$worksheet->write( 0, 3, 'Seal Beach');
$worksheet->write( 0, 4, 'Total');
my ( $row, $col, $region );
$row = 1;
my $monthly_total;
MONTH:
foreach my $mon ( sort {$mon{$a} <=> $mon{$b}} keys %mon ) {
    $col = $monthly_total = 0;
    if ( $numeric_mon{ $mon } > $numeric_mon{ $current_month } ) {
        $worksheet->write( $row++, $col, "$mon-$current_year" );
        next MONTH;
    }
    $worksheet->write( $row, $col, "$mon-$current_year" );
    foreach $region ( qw/Bellevue St.Louis SealBeach/ ) {
        $worksheet->write( $row, ++$col, $totals{ $region }{ $mon } );
        $monthly_total += $totals{ $region }{ $mon };
    }
    $worksheet->write( $row++, $col, $totals{ $region }{ $mon } );
}
$workbook->close();
__END__

From mathin at mathin.com Thu Aug 14 22:40:56 2003 From: mathin at mathin.com (Dan Ebert) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? / In-Reply-To: Message-ID: I use it fairly regularly ... can't remember what version off the top of my head. Fire away! Dan. ---------------------------------------------------------- Immigration is the sincerest form of flattery. - Unknown ---------------------------------------------------------- On Thu, 14 Aug 2003, Jim Ludwig wrote: > Hey there: > > Before posting any possible problems, I thought > I'd ask to see if anyone on this list has ever > used Spreadsheet::WriteExcel (0.41) in a > non-Windows environment? > > For the most part it's great, but I'm currently > knocking my head against the monitor... > > Thanks, > jim > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > > From jsl at blarg.net Fri Aug 15 12:29:14 2003 From: jsl at blarg.net (Jim Ludwig) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? 
/ In-Reply-To: (message from Dan Ebert on Thu, 14 Aug 2003 20:40:56 -0700 (PDT)) Message-ID: Hey again: I've gotten a few responses encouraging me to post my Spreadsheet::WriteExcel problem, so here it is. -------------- What I'm doing -------------- I'm reading an Excel spreadsheet with hundreds of rows (let's just say 1000 rows) with Spreadsheet::ParseExcel. In each of those rows are 6 columns of data which I extract. I feed the 6 items of data into a black box, and out pops an additional 18 items of data which are related to the original 6. At this point I write out to a different spreadsheet with Spreadsheet::WriteExcel -- first I write back out the original 6 columns of data, and then I write out the other 18 columns of data in the same row. I do this for each row encountered in the original spreadsheet from which I'm reading. FYI, every cell I write out has fewer than 128 characters of data (far less than the 255 limit). --------------------------------------- First problem (Spreadsheet::WriteExcel) --------------------------------------- This is where it gets weird, as I'm unable to determine where Spreadsheet::WriteExcel is going wrong. gnumeric: After one spreadsheet was done being processed, I tried to open it up using gnumeric, and this was the error message I got: Inconsistent block allocation table Excel: I tried opening the same spreadsheet with Excel, in a Windows environment, and this was the error message I got: 'foo.xls' cannot be accessed. The file may be read-only, or you may be trying to access a read-only location. Or, the server the document is stored on may not be responding --------------------------------------------- Second problem (Spreadsheet::WriteExcel::Big) --------------------------------------------- Given the problems I was having with Spreadsheet::WriteExcel, I thought it couldn't hurt to try Spreadsheet::WriteExcel::Big. I re-ran my script, and this time the error messages were a little different. 
gnumeric: When I opened my resulting file on the command line (gnumeric foo.xls), I got this (line wraps are mine): foo.xls Excel 95 (gnumeric:24066): gnumeric:read-WARNING **: XL: Xf index 0xF00 is not in the range[0..0x1E) ** (gnumeric:24066): CRITICAL **: file ../../src/sheet-style.c: line 901 (cell_tile_apply_pos): assertion `col < SHEET_MAX_COLS' failed ** (gnumeric:24066): CRITICAL **: file ../../src/sheet.c: line 2407 (sheet_cell_new): assertion `col < SHEET_MAX_COLS' failed ** (gnumeric:24066): CRITICAL **: file ../../src/cell.c: line 273 (cell_set_value): assertion `cell != NULL' failed ** (gnumeric:24066): CRITICAL **: file ../../../plugins/excel/ms-biff.c: line 316 (ms_biff_query_next): assertion `q->length < 20000' failed Error, hit end without EOF ** (gnumeric:24066): CRITICAL **: file ../../../plugins/excel/ms-biff.c: line 316 (ms_biff_query_next): assertion `q->length < 20000' failed Interestingly, it only displayed the first 129 rows of output, and on the 129th row, all columns of data were missing from the 8th column on. Rows 130 through 1000 were not apparent at all (although when viewed through 'hexdump -C' or 'strings', all the rest of the data could be seen). Then the interesting part. When I quit gnumeric through the file menu, I saw these additional output messages: ** (gnumeric:24066): WARNING **: Leaked 1 nodes from value string pool. Leaking string [WOTNOH88] with ref_count=1. ** (gnumeric:24066): WARNING **: Leaked 1 nodes from string pool. The string which was leaked ("WOTNOH88") was the one which was in the 8th column of the 129th row. !!?? What's going on there? Excel: When I tried to open the same file using Excel, I got a different error message from the one I'd encountered previously: EXCEL.exe has generated errors and will be closed by Windows. You will need to restart the program. An error log is being created. ----------- What gives? 
----------- What I'm trying to figure out is why/how Spreadsheet::WriteExcel is writing a cell which later gets "leaked". I do not know anything about the internals of Excel to know where to start, and trudging through the source of Spreadsheet::WriteExcel and all the modules it depends upon doesn't sound that appealing. I've tried this with different spreadsheets, and it always seems to happen in various columns on the 129th (1-based) row (when it happens at all). Even when I don't write to the 129th row at all, skipping on ahead straight to row 130, the same phenomenon occurs. So, more accurately, this happens on the first row after the 128th (1-based) row (when it happens at all). The only 2 methods I'm using to write to cells are write_blank() and write_string(). Has anyone encountered this before? If so, did you come across a solution? jim From mathin at mathin.com Fri Aug 15 13:10:32 2003 From: mathin at mathin.com (Dan Ebert) Date: Mon Aug 2 21:37:06 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? / In-Reply-To: References: Message-ID: <1060971031.29709.45.camel@algernon.lan.enic.cc> I have successfully written excel files with 50,000 rows (and 20+ columns). I have never been able to get gnumeric to read the Spreadsheet::WriteExcel::Big files, but the 'normal' ones seem to work fine. I haven't seen the errors you are getting. Could it be something quirky with the data you are writing to the cells? (unprintable characters or something) Dan. On Fri, 2003-08-15 at 10:29, Jim Ludwig wrote: > Hey again: > > I've gotten a few responses encouraging me to post > my Spreadsheet::WriteExcel problem, so here it is. > > -------------- > What I'm doing > -------------- > > I'm reading an Excel spreadsheet with hundreds of > rows (let's just say 1000 rows) with > Spreadsheet::ParseExcel. In each of those rows > are 6 columns of data which I extract. 
> > I feed the 6 items of data into a black box, and > out pops an additional 18 items of data which are > related to the original 6. > > At this point I write out to a different > spreadsheet with Spreadsheet::WriteExcel -- first > I write back out the original 6 columns of data, > and then I write out the other 18 columns of data > in the same row. > > I do this for each row encountered in the original > spreadsheet from which I'm reading. > > FYI, every cell I write out has fewer than 128 > characters of data (far less than the 255 limit). > > --------------------------------------- > First problem (Spreadsheet::WriteExcel) > --------------------------------------- > > This is where it gets weird, as I'm unable to > determine where Spreadsheet::WriteExcel is going > wrong. > > gnumeric: > > After one spreadsheet was done being processed, I > tried to open it up using gnumeric, and this was > the error message I got: > > Inconsistent block allocation table > > Excel: > > I tried opening the same spreadsheet with Excel, > in a Windows environment, and this was the error > message I got: > > 'foo.xls' cannot be accessed. The file may be > read-only, or you may be trying to access a > read-only location. Or, the server the > document is stored on may not be responding > > --------------------------------------------- > Second problem (Spreadsheet::WriteExcel::Big) > --------------------------------------------- > > Given the problems I was having with > Spreadsheet::WriteExcel, I thought it couldn't > hurt to try Spreadsheet::WriteExcel::Big. > > I re-ran my script, and this time the error > messages were a little different. 
> > gnumeric: > > When I opened my resulting file on the command > line (gnumeric foo.xls), I got this (line wraps > are mine): > > foo.xls > Excel 95 > > (gnumeric:24066): gnumeric:read-WARNING **: > XL: Xf index 0xF00 is not in the > range[0..0x1E) > > ** (gnumeric:24066): CRITICAL **: file > ../../src/sheet-style.c: line 901 > (cell_tile_apply_pos): assertion `col < > SHEET_MAX_COLS' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../src/sheet.c: line 2407 (sheet_cell_new): > assertion `col < SHEET_MAX_COLS' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../src/cell.c: line 273 (cell_set_value): > assertion `cell != NULL' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > Error, hit end without EOF > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > > Interestingly, it only displayed the first 129 > rows of output, and on the 129th row, all columns > of data were missing from the 8th column on. Rows > 130 through 1000 were not apparent at all > (although when viewed through 'hexdump -C' or > 'strings', all the rest of the data could be > seen). > > Then the interesting part. When I quit gnumeric > through the file menu, I saw these additional > output messages: > > ** (gnumeric:24066): WARNING **: Leaked 1 > nodes from value string pool. > Leaking string [WOTNOH88] with ref_count=1. > > ** (gnumeric:24066): WARNING **: Leaked 1 > nodes from string pool. > > The string which was leaked ("WOTNOH88") was the > one which was in the 8th column of the 129th row. > > !!?? > > What's going on there? > > Excel: > > When I tried to open the same file using Excel, I > got a different error message from the one I'd > encountered previously: > > EXCEL.exe has generated errors and will be > closed by Windows. You will need to restart > the program. 
An error log is being created. > > ----------- > What gives? > ----------- > > What I'm trying to figure out is why/how > Spreadsheet::WriteExcel is writing a cell which > later gets "leaked". I do not know anything about > the internals of Excel to know where to start, and > trudging through the source of > Spreadsheet::WriteExcel and all the modules it > depends upon doesn't sound that appealing. > > I've tried this with different spreadsheets, and > it always seems to happen in various columns on > the 129th (1-based) row (when it happens at all). > > Even when I don't write to the 129th row at all, > skipping on ahead straight to row 130, the same > phenomenon occurs. > > So, more accurately, this happens on the first row > after the 128th (1-based) row (when it happens at > all). > > The only 2 methods I'm using to write to cells are > write_blank() and write_string(). > > Has anyone encountered this before? If so, did > you come across a solution? > > jim > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > From tnight at pobox.com Fri Aug 15 16:25:59 2003 From: tnight at pobox.com (Terry Nightingale) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? / References: Message-ID: <007201c36373$d4658e00$f604380a@TRI6644> I don't know whether this is an option for you, but if so it may save you some head-bashing. What I've done with success in the past is to generate a CSV (comma-separated-value) text file, and give the generated file a ".xls" extension. When Excel opens the file, it "automagically" converts it to Excel format. I don't know whether this trick works with GNUmeric. 
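The CSV-with-a-.xls-extension trick Terry describes can be sketched in a few lines of Perl. The file name and row data below are made up for illustration, and this naive join assumes no values contain embedded commas or quotes; real data would want a module such as Text::CSV for proper escaping.

```perl
#!/usr/bin/perl
# Write plain comma-separated values to a file with a .xls
# extension, so Excel converts it on open. Sample data is
# hypothetical; values with embedded commas or quotes should
# go through a CSV module rather than a bare join().
use strict;
use warnings;

my @rows = (
    [ 'Name',   'Qty', 'Price' ],
    [ 'Widget', 3,     1.50    ],
);

open my $fh, '>', 'report.xls' or die "Cannot open report.xls: $!";
print {$fh} join(',', @$_), "\n" for @rows;
close $fh or die "Cannot close report.xls: $!";
```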
----- Original Message ----- From: "Jim Ludwig" To: "Seattle Perl Users Group" Sent: Friday, August 15, 2003 10:29 AM Subject: Re: SPUG: / Spreadsheet::WriteExcel / any experience? / > Hey again: > > I've gotten a few responses encouraging me to post > my Spreadsheet::WriteExcel problem, so here it is. > > -------------- > What I'm doing > -------------- > > I'm reading an Excel spreadsheet with hundreds of > rows (let's just say 1000 rows) with > Spreadsheet::ParseExcel. In each of those rows > are 6 columns of data which I extract. > > I feed the 6 items of data into a black box, and > out pops an additional 18 items of data which are > related to the original 6. > > At this point I write out to a different > spreadsheet with Spreadsheet::WriteExcel -- first > I write back out the original 6 columns of data, > and then I write out the other 18 columns of data > in the same row. > > I do this for each row encountered in the original > spreadsheet from which I'm reading. > > FYI, every cell I write out has fewer than 128 > characters of data (far less than the 255 limit). > > --------------------------------------- > First problem (Spreadsheet::WriteExcel) > --------------------------------------- > > This is where it gets weird, as I'm unable to > determine where Spreadsheet::WriteExcel is going > wrong. > > gnumeric: > > After one spreadsheet was done being processed, I > tried to open it up using gnumeric, and this was > the error message I got: > > Inconsistent block allocation table > > Excel: > > I tried opening the same spreadsheet with Excel, > in a Windows environment, and this was the error > message I got: > > 'foo.xls' cannot be accessed. The file may be > read-only, or you may be trying to access a > read-only location. 
Or, the server the > document is stored on may not be responding > > --------------------------------------------- > Second problem (Spreadsheet::WriteExcel::Big) > --------------------------------------------- > > Given the problems I was having with > Spreadsheet::WriteExcel, I thought it couldn't > hurt to try Spreadsheet::WriteExcel::Big. > > I re-ran my script, and this time the error > messages were a little different. > > gnumeric: > > When I opened my resulting file on the command > line (gnumeric foo.xls), I got this (line wraps > are mine): > > foo.xls > Excel 95 > > (gnumeric:24066): gnumeric:read-WARNING **: > XL: Xf index 0xF00 is not in the > range[0..0x1E) > > ** (gnumeric:24066): CRITICAL **: file > ../../src/sheet-style.c: line 901 > (cell_tile_apply_pos): assertion `col < > SHEET_MAX_COLS' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../src/sheet.c: line 2407 (sheet_cell_new): > assertion `col < SHEET_MAX_COLS' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../src/cell.c: line 273 (cell_set_value): > assertion `cell != NULL' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > Error, hit end without EOF > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > > Interestingly, it only displayed the first 129 > rows of output, and on the 129th row, all columns > of data were missing from the 8th column on. Rows > 130 through 1000 were not apparent at all > (although when viewed through 'hexdump -C' or > 'strings', all the rest of the data could be > seen). > > Then the interesting part. When I quit gnumeric > through the file menu, I saw these additional > output messages: > > ** (gnumeric:24066): WARNING **: Leaked 1 > nodes from value string pool. > Leaking string [WOTNOH88] with ref_count=1. 
> > ** (gnumeric:24066): WARNING **: Leaked 1 > nodes from string pool. > > The string which was leaked ("WOTNOH88") was the > one which was in the 8th column of the 129th row. > > !!?? > > What's going on there? > > Excel: > > When I tried to open the same file using Excel, I > got a different error message from the one I'd > encountered previously: > > EXCEL.exe has generated errors and will be > closed by Windows. You will need to restart > the program. An error log is being created. > > ----------- > What gives? > ----------- > > What I'm trying to figure out is why/how > Spreadsheet::WriteExcel is writing a cell which > later gets "leaked". I do not know anything about > the internals of Excel to know where to start, and > trudging through the source of > Spreadsheet::WriteExcel and all the modules it > depends upon doesn't sound that appealing. > > I've tried this with different spreadsheets, and > it always seems to happen in various columns on > the 129th (1-based) row (when it happens at all). > > Even when I don't write to the 129th row at all, > skipping on ahead straight to row 130, the same > phenomenon occurs. > > So, more accurately, this happens on the first row > after the 128th (1-based) row (when it happens at > all). > > The only 2 methods I'm using to write to cells are > write_blank() and write_string(). > > Has anyone encountered this before? If so, did > you come across a solution? > > jim > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > > From jsl at blarg.net Fri Aug 15 17:41:52 2003 From: jsl at blarg.net (Jim Ludwig) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? 
/ In-Reply-To: <007201c36373$d4658e00$f604380a@TRI6644> (tnight@pobox.com) Message-ID: Thanks to all of you who suggested writing out to a file which contains comma- or tab-separated values. Naturally, barring any real workarounds, this is what I will have to resort to doing. Sadly, doing so would disallow me from having all the beautiful auto-formatting I had anticipated. Incidentally, I did not use any formatting in the test cases I reported on below, nor were there any freaky characters. THE REAL THING I was kind of getting at was, what in the world is going on after row 128? I think I will write to the WriteExcel developer. In the mean time, I will, :(, go with CSV. jim ----- Original Message ----- From: "Jim Ludwig" To: "Seattle Perl Users Group" Sent: Friday, August 15, 2003 10:29 AM Subject: Re: SPUG: / Spreadsheet::WriteExcel / any experience? / > Hey again: > > I've gotten a few responses encouraging me to post > my Spreadsheet::WriteExcel problem, so here it is. > > -------------- > What I'm doing > -------------- > > I'm reading an Excel spreadsheet with hundreds of > rows (let's just say 1000 rows) with > Spreadsheet::ParseExcel. In each of those rows > are 6 columns of data which I extract. > > I feed the 6 items of data into a black box, and > out pops an additional 18 items of data which are > related to the original 6. > > At this point I write out to a different > spreadsheet with Spreadsheet::WriteExcel -- first > I write back out the original 6 columns of data, > and then I write out the other 18 columns of data > in the same row. > > I do this for each row encountered in the original > spreadsheet from which I'm reading. > > FYI, every cell I write out has fewer than 128 > characters of data (far less than the 255 limit). 
> > --------------------------------------- > First problem (Spreadsheet::WriteExcel) > --------------------------------------- > > This is where it gets weird, as I'm unable to > determine where Spreadsheet::WriteExcel is going > wrong. > > gnumeric: > > After one spreadsheet was done being processed, I > tried to open it up using gnumeric, and this was > the error message I got: > > Inconsistent block allocation table > > Excel: > > I tried opening the same spreadsheet with Excel, > in a Windows environment, and this was the error > message I got: > > 'foo.xls' cannot be accessed. The file may be > read-only, or you may be trying to access a > read-only location. Or, the server the > document is stored on may not be responding > > --------------------------------------------- > Second problem (Spreadsheet::WriteExcel::Big) > --------------------------------------------- > > Given the problems I was having with > Spreadsheet::WriteExcel, I thought it couldn't > hurt to try Spreadsheet::WriteExcel::Big. > > I re-ran my script, and this time the error > messages were a little different. 
> > gnumeric: > > When I opened my resulting file on the command > line (gnumeric foo.xls), I got this (line wraps > are mine): > > foo.xls > Excel 95 > > (gnumeric:24066): gnumeric:read-WARNING **: > XL: Xf index 0xF00 is not in the > range[0..0x1E) > > ** (gnumeric:24066): CRITICAL **: file > ../../src/sheet-style.c: line 901 > (cell_tile_apply_pos): assertion `col < > SHEET_MAX_COLS' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../src/sheet.c: line 2407 (sheet_cell_new): > assertion `col < SHEET_MAX_COLS' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../src/cell.c: line 273 (cell_set_value): > assertion `cell != NULL' failed > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > Error, hit end without EOF > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > > Interestingly, it only displayed the first 129 > rows of output, and on the 129th row, all columns > of data were missing from the 8th column on. Rows > 130 through 1000 were not apparent at all > (although when viewed through 'hexdump -C' or > 'strings', all the rest of the data could be > seen). > > Then the interesting part. When I quit gnumeric > through the file menu, I saw these additional > output messages: > > ** (gnumeric:24066): WARNING **: Leaked 1 > nodes from value string pool. > Leaking string [WOTNOH88] with ref_count=1. > > ** (gnumeric:24066): WARNING **: Leaked 1 > nodes from string pool. > > The string which was leaked ("WOTNOH88") was the > one which was in the 8th column of the 129th row. > > !!?? > > What's going on there? > > Excel: > > When I tried to open the same file using Excel, I > got a different error message from the one I'd > encountered previously: > > EXCEL.exe has generated errors and will be > closed by Windows. You will need to restart > the program. 
An error log is being created. > > ----------- > What gives? > ----------- > > What I'm trying to figure out is why/how > Spreadsheet::WriteExcel is writing a cell which > later gets "leaked". I do not know anything about > the internals of Excel to know where to start, and > trudging through the source of > Spreadsheet::WriteExcel and all the modules it > depends upon doesn't sound that appealing. > > I've tried this with different spreadsheets, and > it always seems to happen in various columns on > the 129th (1-based) row (when it happens at all). > > Even when I don't write to the 129th row at all, > skipping on ahead straight to row 130, the same > phenomenon occurs. > > So, more accurately, this happens on the first row > after the 128th (1-based) row (when it happens at > all). > > The only 2 methods I'm using to write to cells are > write_blank() and write_string(). > > Has anyone encountered this before? If so, did > you come across a solution? > > jim From mako at debian.org Fri Aug 15 22:49:12 2003 From: mako at debian.org (Benj. Mako Hill) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Debian 10th Anniversary BBQ/Potluck Tomorrow (Saturday 8/16) @ Alki Beach Park Message-ID: <20030816034911.GU988@nozomi> This is the final reminder that tomorrow is the first day of the 10th year of Debian's existence and that, as a result, the Seattle Debian Community is doing its part to celebrate with a picnic/potluck at Alki Beach Park! The time to arrive is 1pm and it will continue until everyone wanders off. Information on getting there with maps and such is available here: http://mako.yukidoke.org/debian_10th/ David Smead, Andrew Sweger and Brian Nelson have volunteered to bring grills of varying sizes so feel free to bring grillable food. Things you might want to bring: * Food. It's a potluck sort of system so feel free to bring whatever. I'm bringing salmon and sauces. * Games/etc. It's a beach and a park. Bring whatever you feel is necessary. 
I'll probably wear a swimsuit and bring a frisbee. * PGP/GPG printed copies of fingerprints and photo ID if you are interested in signing keys. No Debian meeting would be complete without this. :) If people get lost, need help, or have questions I've got an (out of state) cell phone. Call 413.441.6627. I hope to see a number of you tomorrow! Regards, Benjamin Mako Hill -- Benj. Mako Hill mako@debian.org http://mako.yukidoke.org/ -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030815/6e8be628/attachment.bin From jay at scherrer.com Fri Aug 15 23:37:30 2003 From: jay at scherrer.com (Jay Scherrer) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: / Spreadsheet::WriteExcel / any experience? / In-Reply-To: References: Message-ID: <200308152137.30619.jay@scherrer.com> Off the top of my head, it looks like your rows/columns are getting messed up. On Friday 15 August 2003 10:29 am, Jim Ludwig wrote: > Hey again: > I feed the 6 items of data into a black box, and > out pops an additional 18 items of data which are > related to the original 6. > > At this point I write out to a different > spreadsheet with Spreadsheet::WriteExcel -- first > I write back out the original 6 columns of data, > and then I write out the other 18 columns of data > in the same row. > > I do this for each row encountered in the original > spreadsheet from which I'm reading. > > --------------------------------------- > First problem (Spreadsheet::WriteExcel) > --------------------------------------- > > This is where it gets weird, as I'm unable to > determine where Spreadsheet::WriteExcel is going > wrong. 
> > gnumeric: > > After one spreadsheet was done being processed, I > tried to open it up using gnumeric, and this was > the error message I got: > > Inconsistent block allocation table > And you're getting an error because the rows/columns are not aligned. Remember Excel is only a csv file system. Much like your key/value in a hash. When converting into an Excel data form, why not just use DBD::CSV? > Excel: > > I tried opening the same spreadsheet with Excel, > in a Windows environment, and this was the error > message I got: > > 'foo.xls' cannot be accessed. The file may be > read-only, or you may be trying to access a > read-only location. Or, the server the > document is stored on may not be responding > Is the file locked? Or are you setting permissions accurately? > --------------------------------------------- > Second problem (Spreadsheet::WriteExcel::Big) > --------------------------------------------- > > Given the problems I was having with > Spreadsheet::WriteExcel, I thought it couldn't > hurt to try Spreadsheet::WriteExcel::Big. > > I re-ran my script, and this time the error > messages were a little different. > > gnumeric: > > When I opened my resulting file on the command > line (gnumeric foo.xls), I got this (line wraps > are mine): > > foo.xls > Excel 95 > > (gnumeric:24066): gnumeric:read-WARNING **: > XL: Xf index 0xF00 is not in the > range[0..0x1E) > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > Error, hit end without EOF > > ** (gnumeric:24066): CRITICAL **: file > ../../../plugins/excel/ms-biff.c: line 316 > (ms_biff_query_next): assertion `q->length < > 20000' failed > Is your DTD or data dictionary for the given inputs matching the cell? 
Good luck, Jay > jim > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org -- Personalized e-mail and domain names: Hey there: Once again I'd like to thank everyone who responded to my query. I'd like to report back that the problem has been solved. I heard back from the Spreadsheet::WriteExcel developer, and it sounds to me like it sounded to him that he'd encountered this one before. He asked if the "black box" into which I was feeding my data contained an XML parser, suggesting that, if it was, the data I was writing out would be in UTF8 format. Even if the data is in the ASCII range it may coerce other WriteExcel internal strings to UTF8. This will cause bytes with the high bit set (such as the 129 row number) to be expanded to two bytes. This will in turn corrupt the file and lead to an Excel error such as the one you are seeing. He then suggested I read the "WORKING WITH XML" section of the man page, which contained the needed fix: my $non_utf8_value = pack( 'C*', unpack( 'U*', $value )); Heh, this is an issue I won't soon forget :). Thanks again, everybody. By the way, you can make some really pretty spreadsheets in non-CSV format ;). jim > From: Jim Ludwig > To: Seattle Perl Users Group > Date: Fri, 15 Aug 2003 15:41:52 -0700 > Subject: Re: SPUG: / Spreadsheet::WriteExcel / any experience? / > > Thanks to all of you who suggested writing out to > a file which contains comma- or tab-separated > values. > > Naturally, barring any real workarounds, this is > what I will have to resort to doing. > > Sadly, doing so would disallow me from having all > the beautiful auto-formatting I had anticipated. 
> > Incidentally, I did not use any formatting in the > test cases I reported on below, nor were there any > freaky characters. > > THE REAL THING I was kind of getting at was, what > in the world is going on after row 128? I think I > will write to the WriteExcel developer. In the > mean time, I will, :(, go with CSV. > > jim > > ----- Original Message ----- > From: "Jim Ludwig" > To: "Seattle Perl Users Group" > Sent: Friday, August 15, 2003 10:29 AM > Subject: Re: SPUG: / Spreadsheet::WriteExcel / any experience? / > > > > Hey again: > > > > I've gotten a few responses encouraging me to post > > my Spreadsheet::WriteExcel problem, so here it is. > > > > -------------- > > What I'm doing > > -------------- > > > > I'm reading an Excel spreadsheet with hundreds of > > rows (let's just say 1000 rows) with > > Spreadsheet::ParseExcel. In each of those rows > > are 6 columns of data which I extract. > > > > I feed the 6 items of data into a black box, and > > out pops an additional 18 items of data which are > > related to the original 6. > > > > At this point I write out to a different > > spreadsheet with Spreadsheet::WriteExcel -- first > > I write back out the original 6 columns of data, > > and then I write out the other 18 columns of data > > in the same row. > > > > I do this for each row encountered in the original > > spreadsheet from which I'm reading. > > > > FYI, every cell I write out has fewer than 128 > > characters of data (far less than the 255 limit). > > > > --------------------------------------- > > First problem (Spreadsheet::WriteExcel) > > --------------------------------------- > > > > This is where it gets weird, as I'm unable to > > determine where Spreadsheet::WriteExcel is going > > wrong. 
> > > > gnumeric: > > > > After one spreadsheet was done being processed, I > > tried to open it up using gnumeric, and this was > > the error message I got: > > > > Inconsistent block allocation table > > > > Excel: > > > > I tried opening the same spreadsheet with Excel, > > in a Windows environment, and this was the error > > message I got: > > > > 'foo.xls' cannot be accessed. The file may be > > read-only, or you may be trying to access a > > read-only location. Or, the server the > > document is stored on may not be responding > > > > --------------------------------------------- > > Second problem (Spreadsheet::WriteExcel::Big) > > --------------------------------------------- > > > > Given the problems I was having with > > Spreadsheet::WriteExcel, I thought it couldn't > > hurt to try Spreadsheet::WriteExcel::Big. > > > > I re-ran my script, and this time the error > > messages were a little different. > > > > gnumeric: > > > > When I opened my resulting file on the command > > line (gnumeric foo.xls), I got this (line wraps > > are mine): > > > > foo.xls > > Excel 95 > > > > (gnumeric:24066): gnumeric:read-WARNING **: > > XL: Xf index 0xF00 is not in the > > range[0..0x1E) > > > > ** (gnumeric:24066): CRITICAL **: file > > ../../src/sheet-style.c: line 901 > > (cell_tile_apply_pos): assertion `col < > > SHEET_MAX_COLS' failed > > > > ** (gnumeric:24066): CRITICAL **: file > > ../../src/sheet.c: line 2407 (sheet_cell_new): > > assertion `col < SHEET_MAX_COLS' failed > > > > ** (gnumeric:24066): CRITICAL **: file > > ../../src/cell.c: line 273 (cell_set_value): > > assertion `cell != NULL' failed > > > > ** (gnumeric:24066): CRITICAL **: file > > ../../../plugins/excel/ms-biff.c: line 316 > > (ms_biff_query_next): assertion `q->length < > > 20000' failed > > Error, hit end without EOF > > > > ** (gnumeric:24066): CRITICAL **: file > > ../../../plugins/excel/ms-biff.c: line 316 > > (ms_biff_query_next): assertion `q->length < > > 20000' failed > > > > 
Interestingly, it only displayed the first 129 > > rows of output, and on the 129th row, all columns > > of data were missing from the 8th column on. Rows > > 130 through 1000 were not apparent at all > > (although when viewed through 'hexdump -C' or > > 'strings', all the rest of the data could be > > seen). > > > > Then the interesting part. When I quit gnumeric > > through the file menu, I saw these additional > > output messages: > > > > ** (gnumeric:24066): WARNING **: Leaked 1 > > nodes from value string pool. > > Leaking string [WOTNOH88] with ref_count=1. > > > > ** (gnumeric:24066): WARNING **: Leaked 1 > > nodes from string pool. > > > > The string which was leaked ("WOTNOH88") was the > > one which was in the 8th column of the 129th row. > > > > !!?? > > > > What's going on there? > > > > Excel: > > > > When I tried to open the same file using Excel, I > > got a different error message from the one I'd > > encountered previously: > > > > EXCEL.exe has generated errors and will be > > closed by Windows. You will need to restart > > the program. An error log is being created. > > > > ----------- > > What gives? > > ----------- > > > > What I'm trying to figure out is why/how > > Spreadsheet::WriteExcel is writing a cell which > > later gets "leaked". I do not know anything about > > the internals of Excel to know where to start, and > > trudging through the source of > > Spreadsheet::WriteExcel and all the modules it > > depends upon doesn't sound that appealing. > > > > I've tried this with different spreadsheets, and > > it always seems to happen in various columns on > > the 129th (1-based) row (when it happens at all). > > > > Even when I don't write to the 129th row at all, > > skipping on ahead straight to row 130, the same > > phenomenon occurs. > > > > So, more accurately, this happens on the first row > > after the 128th (1-based) row (when it happens at > > all). 
> > > > The only 2 methods I'm using to write to cells are > > write_blank() and write_string(). > > > > Has anyone encountered this before? If so, did > > you come across a solution? > > > > jim From mako at debian.org Mon Aug 18 11:07:06 2003 From: mako at debian.org (Benj. Mako Hill) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Food before Meeting Message-ID: <20030818160705.GQ988@nozomi> I heard this weekend that people often get together for food before SPUG meetings. Who should I contact about reservations for food before the meeting tomorrow? Regards, Mako -- Benj. Mako Hill mako@debian.org http://mako.yukidoke.org/ -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030818/6d690ffb/attachment.bin From tim at consultix-inc.com Mon Aug 18 13:19:51 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: 8/19 Meeting: Software/Programmer Testing/Certification Message-ID: <20030818111951.A7488@timji.consultix-inc.com> August 19, 2003 Seattle Perl Users Group Meeting -------------------------------------------------------- THEME: Testing and Certifying Software -- And Programmers! Title: "News Updates" Speaker: Tim Maher Tim will talk briefly about some recent developments of interest to SPUGsters, including the OSCON conference's vote in favor of certifying Perl programmers (see perlcert.perlocity.org), and SPUG's new Kwiki site (spugwiki.perlocity.org) Title: "Perl Certificate Program" Representatives from the Univ. of Washington will briefly describe their Perl program, with Q/A at end Title: "Unit Testing" Speaker: Jonathan Gardner Jonathan will talk about the benefits of Unit Testing, and how to go about doing it. 
A rough draft of his talk is available at http://spugwiki.perlocity.org/index.cgi?UnitTesting --------------------------------------------------------- Meeting Time: Tuesday, August 19, 2003 7-9pm Location: SAFECO bldg, Brooklyn St. and NE 45th St. Cost: Admission is free and open to the general public. Info: http://seattleperl.org/ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Pre- and Post- Meeting Activities --------------------------------- The pre-meeting dinner will be at the Cedars restaurant, at 50th St. and Brooklyn, in the University District, near the Safeco building where the meeting will take place. The phone number is 527-5247. If you're planning to be there, please enter your name on the Kwiki RSVP page by 2pm on the meeting day. (NOTE: Arrival by 5:45pm is recommended for those ordering food). Those who comply with the RSVP policy, and are therefore counted in the seating reservation, will have top priority for seating at the speaker's table. ====================================================== | Tim Maher, Ph.D. tim@timmaher.org | | SPUG Founder & Leader spug@seattleperl.org | | Seattle Perl Users Group www.seattleperl.org | ====================================================== From ssarapat at enc.org Mon Aug 18 13:24:44 2003 From: ssarapat at enc.org (Steve Sarapata) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: 8/19 Meeting: Software/Programmer Testing/Certification Message-ID: Re Unit Testing. I'd recommend the first 5 chapters of "Testing Computer Software" by Kaner, ISBN 0-471-35846-0 -----Original Message----- From: Tim Maher [mailto:tim@consultix-inc.com] Sent: Monday, August 18, 2003 2:20 PM To: spug-list@Pm.org Subject: SPUG: 8/19 Meeting: Software/Programmer Testing/Certification August 19, 2003 Seattle Perl Users Group Meeting -------------------------------------------------------- THEME: Testing and Certifying Software -- And Programmers! 
Title: "News Updates" Speaker: Tim Maher Tim will talk briefly about some recent developments of interest to SPUGsters, including the OSCON conference's vote in favor of certifying Perl programmers (see perlcert.perlocity.org) , and SPUG's new Kwiki site (spugwiki.perlocity.org) Title: "Perl Certificate Program" Representatives from the Univ. of Washington will briefly describe their Perl program, with Q/A at end Title: "Unit Testing" Speaker: Jonathan Gardner Jonathan will talk about the benefits of Unit Testing, and how to go about doing it. A rough draft of his talk is available at http://spugwiki.perlocity.org/index.cgi?UnitTesting --------------------------------------------------------- Meeting Time: Tuesday, August 19, 2003 7-9pm Location: SAFECO bldg, Brooklyn St. and NE 45th St. Cost: Admission is free and open to the general public. Info: http://seattleperl.org/ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Pre- and Post- Meeting Activities --------------------------------- The pre-meeting dinner will be at the Cedars restaurant, at 50th St. and Brooklyn, in the University District, near the Safeco building where the meeting will take place. The phone number is 527-5247. If you're planning to be there, please enter your name on the Kwiki RSVP page by 2pm on the meeting day. (NOTE: Arrival by 5:45pm is recommended for those ordering food). Those who comply with the RSVP policy, and are therefore counted in the seating reservation, will have top priority for seating at the speaker's table. ====================================================== | Tim Maher, Ph.D. 
tim@timmaher.org | | SPUG Founder & Leader spug@seattleperl.org | | Seattle Perl Users Group www.seattleperl.org | ====================================================== _____________________________________________________________ Seattle Perl Users Group Mailing List POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list MEETINGS: 3rd Tuesdays, U-District, Seattle WA WEB PAGE: http://www.seattleperl.org From schieb at centurytel.net Mon Aug 18 13:42:14 2003 From: schieb at centurytel.net (Islandman) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: 8/19 Meeting: Software/Programmer Testing/Certification References: <20030818111951.A7488@timji.consultix-inc.com> Message-ID: <3F411E06.9EB4EFDE@centurytel.net> Dang! These look like great topics but I can't make it. If someone could post notes on the talks they'd be held in the highest of esteem. Thanks, -Brian Schieber Vashon, WA Tim Maher wrote: > > August 19, 2003 Seattle Perl Users Group Meeting > -------------------------------------------------------- > THEME: Testing and Certifying Software -- And Programmers! > > Title: "News Updates" > Speaker: Tim Maher > Tim will talk briefly about some recent developments of > interest to SPUGsters, including the OSCON conference's > vote in favor of certifying Perl programmers (see > perlcert.perlocity.org) , and SPUG's new Kwiki site > (spugwiki.perlocity.org) > > Title: "Perl Certificate Program" > Representatives from the Univ. of Washington will briefly > describe their Perl program, with Q/A at end > > Title: "Unit Testing" > Speaker: Jonathan Gardner > Jonathan will talk about the benefits of Unit Testing, > and how to go about doing it. A rough draft of his > talk is available at > http://spugwiki.perlocity.org/index.cgi?UnitTesting > --------------------------------------------------------- > > Meeting Time: Tuesday, August 19, 2003 7-9pm > Location: SAFECO bldg, Brooklyn St. and NE 45th St. 
> > Cost: Admission is free and open to the general public. > Info: http://seattleperl.org/ > > * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * > > Pre- and Post- Meeting Activities > --------------------------------- > The pre-meeting dinner will be at the Cedars restaurant, at 50th > St. and Brooklyn, in the University District, near the Safeco > building where the meeting will take place. The phone number is > 527-5247. If you're planning to be there, please enter your name > on the Kwiki RSVP page by 2pm on the meeting day. (NOTE: Arrival > by 5:45pm is recommended for those ordering food). > > Those who comply with the RSVP policy, and are therefore > counted in the seating reservation, will have top priority for > seating at the speaker's table. > > ====================================================== > | Tim Maher, Ph.D. tim@timmaher.org | > | SPUG Founder & Leader spug@seattleperl.org | > | Seattle Perl Users Group www.seattleperl.org | > ====================================================== > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org From jgardner at jonathangardner.net Mon Aug 18 14:43:03 2003 From: jgardner at jonathangardner.net (Jonathan Gardner) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: 8/19 Meeting: Software/Programmer Testing/Certification In-Reply-To: References: Message-ID: <200308181243.07983.jgardner@jonathangardner.net> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On Monday 18 August 2003 11:24, Steve Sarapata wrote: > Re Unit Testing. I'd recommend the first 5 chapters of "Testing Computer > Software" by Kaner, ISBN 0-471-35846-0 > One of my favorite books on testing. 
I'll see if I can't squeeze in that info into that talk for those who aren't familiar with it. - -- Jonathan Gardner jgardner@jonathangardner.net Live Free, Use Linux! -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.2.1 (GNU/Linux) iD8DBQE/QSxKWgwF3QvpWNwRAvCEAKDjZqQejRXX3gQNcBluFZLY4eoqogCbB3lj xiaYtk8wpzbYLGd6ULl/VG4= =a8fF -----END PGP SIGNATURE----- From sthoenna at efn.org Mon Aug 18 21:16:57 2003 From: sthoenna at efn.org (Yitzchak Scott-Thoennes) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: new pod files in 5.8.1 Message-ID: <20030819021657.GA3620@efn.org> There are a couple of useful new pod files coming in 5.8.1 that you all might like to see a preview of: http://public.activestate.com/cgi-bin/perlbrowse?file=pod/perlreref.pod http://public.activestate.com/cgi-bin/perlbrowse?file=pod/perlcheat.pod From chris.nord at attws.com Tue Aug 19 16:54:56 2003 From: chris.nord at attws.com (Nord, Chris) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Net::Telnet issues..... Message-ID: <1C5FD409C0E6D44BAEDEFD8A04AD027D30FD25@WA-MSG05-BTH.wireless.attws.com> Gurus, I am having some trouble capturing data while using the Net::Telnet module. Line: my @data = $t->cmd('find Routing_Label'); should save the results of the command 'find Routing_Label', but it does not. However if you view the Input_Log correct results of 'find Routing_Label' are captured. I have tried various concoctions using $t->waitfor() and $t->print() as per the module documents but with the same results... Any ideas? Chris Nord ###################################################### #! 
/bin/perl -w

use Net::Telnet;
use strict;

my $cli_user = 'user';
my $cli_pw   = 'password';
my $node     = 'lab';
my $port     = '8123';
my $prompt   = '/\[N\/S\]\> /';

my $t = new Net::Telnet (
    Timeout    => 5,
    Prompt     => "$prompt",
    Errmode    => 'return',
    Dump_log   => 'dump.log',
    Input_log  => 'in.log',
    Output_Log => 'out.log',
);

print "\nConnecting: $node $port\n";
unless ( $t->open (Host => $node, Port => $port) ){
    print "ERROR: not able to connect, node:$node port:$port\n";
    exit 1;
}
unless ( $t->login($cli_user,$cli_pw) ){
    print "ERROR: not able to login, user:$cli_user pw:$cli_pw\n";
    exit 1;
}

$t->cmd('select switch awspsx1');
$prompt = '/\[awspsx1\]\> /';

print "print data here!!\n";

my @data = $t->cmd('find Routing_Label');
print @data;
# @data remains empty!

$t->cmd('exit');

##### in.log #########################################
Login : user
Password: ........
Sonus Insight V04.01.00R003P10 CLI Shell (Non-Tcl)
Date: Tue Aug 19 14:41:49 PDT 2003
[N/S]> select switch awspsx1
Result: Ok
[awspsx1]> find Routing_Label
Routing_Label_Id
------------------------
CRT_B_MOBILE_RL
CRT_GSX1_GSX2_RL_TDM
CRT_GSX1_GSX2_RL_TDM2te
CRT_GSX2_RL_TDM
RH_ORIG_GW
To_ERIC5000_1
To_ERIC5000_2
To_LOAD_BOX_1
To_LOAD_BOX_2
To_LUANYPATH1
To_LUANYPATH2
To_LUANYPATH3
To_LUCENT2G_1
To_LUCENT2G_2
To_NORTEL2G_1
To_NORTEL2G_2
To_NORTELGSM_1
To_NORTELGSM_2
To_OCTELVM_1
To_SPATIALRTC2
To_SPATIALRTC2_1
To_SPATIALRTC2_2
To_SPATIALWTC3
To_SPATIALWTC3_1
To_SPATIALWTC3_2
To_TSUNAMI_1
To_TSUNAMI_2
Result: Ok
[awspsx1]> exit
Goodbye!

From shawnw at speakeasy.org Wed Aug 20 01:00:29 2003 From: shawnw at speakeasy.org (Shawn Wagner) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Net::Telnet issues.....
In-Reply-To: <1C5FD409C0E6D44BAEDEFD8A04AD027D30FD25@WA-MSG05-BTH.wireless.attws.com> References: <1C5FD409C0E6D44BAEDEFD8A04AD027D30FD25@WA-MSG05-BTH.wireless.attws.com> Message-ID: <20030820060029.GA32566@speakeasy.org> On Tue, Aug 19, 2003 at 02:54:56PM -0700, Nord, Chris wrote: > Gurus, > > I am having some trouble capturing data while using the Net::Telnet > module. Line: my @data = $t->cmd('find Routing_Label'); should save the > results of the command 'find Routing_Label', but it does not. However > if you view the Input_Log correct results of 'find Routing_Label' are > captured. I have tried various concoctions using $t->waitfor() and > $t->print() as per the module documents but with the same results... > > Any ideas? > snippage > $prompt = '/\[awspsx1\]\> /'; You're changing the $prompt variable, but you're not letting $t know that it's now supposed to use a different prompt. Try $t->prompt('/\[awspsx1\]> /'); instead and see if that works. > > print "print data here!!\n"; > > my @data = $t->cmd('find Routing_Label'); > print @data; > > # @data remains empty! -- Shawn Wagner shawnw@speakeasy.org From jgardner at jonathangardner.net Wed Aug 20 11:39:52 2003 From: jgardner at jonathangardner.net (Jonathan Gardner) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Last night's Unit Testing Talk Message-ID: <200308200939.54186.jgardner@jonathangardner.net> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 I only got about 90% of the way through the talk. Unfortunately, we didn't have time for the group discussion about Testing, and some additional comments about what to do if you happen to be working on a project without tests in place. The talk is posted at http://jonathangardner.net/talks/unit_testing/2003-08-19/ Enjoy! - -- Jonathan Gardner jgardner@jonathangardner.net Live Free, Use Linux! 
-----BEGIN PGP SIGNATURE----- Version: GnuPG v1.2.1 (GNU/Linux) iD8DBQE/Q6RYWgwF3QvpWNwRArDxAKCEFZbxIUuI05eYUceeXn5CyjwNtgCggwws vuYyJZcN78gt4fKj9PxUbl4= =KUy0 -----END PGP SIGNATURE----- From tim at consultix-inc.com Wed Aug 20 15:55:09 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: "last" behaving oddly in -n loop Message-ID: <20030820135509.A17801@timji.consultix-inc.com> SPUGsters, I'm confused about how "last" is being handled with a -n (implicit) loop, which is very different from the way "next" is being handled. I'd appreciate any insights! Here's the problem in a nutshell: $ cat last2 print "$.: $ARGV" and last LINE ; END{print "File is $ARGV\n"} $ perl -wln last2 motd1 motd2 1: motd1 File is motd1 $ Why didn't it process the second file, motd2, as happens with "next"? It's acting like last is magically allowed to break out of the outer (invisible) foreach loop, while next only affects the inner (invisible) while loop. Deparsings and more details are shown below. -Tim ============================================================== | Tim Maher, Ph.D. 
tim(AT)teachmeperl.com | | SPUG Founder & Leader spug(AT)seattleperl.com | | Seattle Perl Users Group http://www.seattleperl.com | | SPUG Wiki Site http://spugwiki.perlocity.org | | Perl Certification Site http://perlcert.perlocity.org | ==============================================================

tim@timji:/tmp> cat next2
print "$.: $ARGV" and next LINE ; END{print "File is $ARGV\n"}
tim@timji:/tmp> perl -wln next2 motd1 motd2
1: motd1
2: motd1
3: motd1
4: motd1
5: motd2
6: motd2
7: motd2
8: motd2
File is motd2
tim@timji:/tmp> perl -MO=Deparse -wln next2 motd1 motd2
BEGIN { $^W = 1; }
BEGIN { $/ = "\n"; $\ = "\n"; }
LINE: while (defined($_ = <ARGV>)) {
    chomp $_;
    next LINE if print "$.: $ARGV";
    sub END { print "File is $ARGV\n"; }
    ;
}

The outer-loop stuff ( presumably, @ARGV or @ARGV='-'; foreach $ARGV (@ARGV) {} ) is not shown, but because it works correctly, I don't mind 8-} But after seeing the following, I mind!

tim@timji:/tmp> cat last2
print "$.: $ARGV" and last LINE ; END{print "File is $ARGV\n"}
tim@timji:/tmp> perl -MO=Deparse last2 motd1 motd2
BEGIN { $^W = 1; }
BEGIN { $/ = "\n"; $\ = "\n"; }
LINE: while (defined($_ = <ARGV>)) {
    chomp $_;
    last LINE if print "$.: $ARGV";
    sub END { print "File is $ARGV\n"; }
    ;
}
tim@timji:/tmp> perl -wln last2 motd1 motd2
1: motd1
File is motd1
$

Why didn't it go on to the next file, after last-ing out of the loop for the first one? I.e., if "next LINE" triggers the next iteration (which would have happened anyway), and allows the filename args to get processed in sequence, why doesn't "last LINE" just terminate the "inner" loop, and allow the (invisible) outer loop to open the next file? "close ARGV" *does* work that way, but it also resets $., which I don't want, and in any case I'd rather have "last LINE" do what I think it should! Any ideas?
-Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From andrew at sweger.net Wed Aug 20 17:28:50 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: "last" behaving oddly in -n loop In-Reply-To: <20030820135509.A17801@timji.consultix-inc.com> Message-ID: It's behaving the way I would expect of "last". I thought the file arguments were all run together in the <> of the while(). If you want to detect the end of individual files (while using normal IO functions), I thought you need to manually check eof(). On Wed, 20 Aug 2003, Tim Maher wrote: > I'm confused about how "last" is being handled with a -n > (implicit) loop, which is very different from the way "next" > is being handled. I'd appreciate any insights! -- Andrew B. Sweger -- The great thing about multitasking is that several things can go wrong at once. From krahnj at acm.org Wed Aug 20 18:02:41 2003 From: krahnj at acm.org (John W. Krahn) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: "last" behaving oddly in -n loop In-Reply-To: <20030820135509.A17801@timji.consultix-inc.com> References: <20030820135509.A17801@timji.consultix-inc.com> Message-ID: <03082016024102.04209@bng2406iy20tf> On Wednesday 20 August 2003 13:55, Tim Maher wrote: > > I'm confused about how "last" is being handled with a -n > (implicit) loop, which is very different from the way "next" > is being handled. I'd appreciate any insights!
> > Here's the problem in a nutshell: > > $ cat last2 > print "$.: $ARGV" and last LINE ; END{print "File is $ARGV\n"} > > $ perl -wln last2 motd1 motd2 > 1: motd1 > File is motd1 > $ > > Why didn't it process the second file, motd2, as happens with "next"? > It's acting like last is magically allowed to break out of the outer > (invisible) foreach loop, while next only affects the inner > (invisible) while loop. next and last are working as designed. There is no (invisible) foreach loop. <> processes the files in @ARGV as a continuous list of lines. If you want to move to the next file in @ARGV you have to explicitly close the ARGV filehandle. print "$.: $ARGV" and close ARGV; END{print "File is $ARGV\n"} John -- use Perl; program fulfillment From krahnj at acm.org Wed Aug 20 18:10:51 2003 From: krahnj at acm.org (John W. Krahn) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: "last" behaving oddly in -n loop In-Reply-To: References: Message-ID: <03082016105103.04209@bng2406iy20tf> On Wednesday 20 August 2003 15:28, Andrew Sweger wrote: > > On Wed, 20 Aug 2003, Tim Maher wrote: > > I'm confused about how "last" is being handled with a -n > > (implicit) loop, which is very different from the way "next" > > is being handled. I'd appreciate any insights! > > It's behaving the way I would expect of "last". I thought the file > arguments were all run together in the <> of the while(). If you > want to detect the end of individual files (while using normal IO > functions), I thought you need to manually check eof(). Actually, you need to check eof not eof().
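[The eof-vs-eof() distinction raised here can be sketched in a few lines; this example is not from the thread, and the file names and messages are illustrative. Per perldoc -f eof, plain eof is true at the end of each file read through the magic ARGV handle, while eof() with empty parentheses is true only at the end of the very last file.]

```perl
#!/usr/bin/perl -w
# Sketch: eof (no parens) fires at the end of EACH input file;
# eof() fires only once, at the end of ALL the input.
use strict;

while (<>) {
    print "$.: $ARGV\n";
    print "-- end of $ARGV --\n"     if eof;    # per-file end
    print "-- end of all input --\n" if eof();  # end of the whole <> stream
}
```

[Run as e.g. perl eofdemo motd1 motd2: the per-file marker prints once per file, the overall marker only after the last one. This is also why "close ARGV if eof;" is the usual idiom for resetting $. between files.]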
:-) perldoc -f eof John -- use Perl; program fulfillment From ben at reser.org Wed Aug 20 18:58:02 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: "last" behaving oddly in -n loop In-Reply-To: <20030820135509.A17801@timji.consultix-inc.com> References: <20030820135509.A17801@timji.consultix-inc.com> Message-ID: <20030820235802.GD1509@titanium.brain.org> On Wed, Aug 20, 2003 at 01:55:09PM -0700, Tim Maher wrote: > SPUGsters, > > I'm confused about how "last" is being handled with a -n > (implicit) loop, which is very different from the way "next" > is being handled. I'd appreciate any insights! > > Here's the problem in a nutshell: > > $ cat last2 > print "$.: $ARGV" and last LINE ; END{print "File is $ARGV\n"} > > $ perl -wln last2 motd1 motd2 > 1: motd1 > File is motd1 > $ > > Why didn't it process the second file, motd2, as happens with "next"? > It's acting like last is magically allowed to break out of the outer > (invisible) foreach loop, while next only affects the inner (invisible) > while loop. You're assuming that it is expanding to something roughly like this:

BEGIN { $^W = 1; }
BEGIN { $/ = "\n"; $\ = "\n"; }
END { print "File is $ARGV\n"; }
unshift (@ARGV, '-') unless @ARGV;
while ($#ARGV >= 0) {
    $ARGV = shift;
    open (ARGV, $ARGV);
    LINE: while (defined($_ = <ARGV>)) {
        chomp $_;
        print "$.: $ARGV" and last LINE;
    }
}

This is probably a pretty common assumption, since the documentation for the null filehandle gives a code example that roughly fits that. However, it is simply pseudocode, and you can't rely on perl to actually implement things that particular way (in fact it doesn't). A careful reading of the documentation hints at that, but perhaps the documentation could be more explicit about it. Rather, perl implements things just as Deparse shows, and <> is magical. There is an implicit loop, but it is not implemented as a perl expansion; it is treated entirely specially.
From your program's view, <> provides it the contents of the file currently opened as ARGV, and then the contents of each following file named in @ARGV, or STDIN if no files are listed in @ARGV. A good way to see how <> is completely special is to try something like this:

perl -e 'open ARGV, "motd2"; while (<>) { print; }' motd1 motd2

If you run this you'll see that you get the output for motd2, then motd1, then motd2 again. This is slightly different than how the documentation says it behaves. If it behaved exactly as the documentation was written, then my expansion of your code above would not have worked at all. I'm not sure if it has always behaved this way, or if this was a change made at some point along the way to allow such code to actually work, despite the magical properties of <>. To add to the confusion, perl actually does assume a loop around your program when using -n, and in that case that loop really exists and you can even control its flow with the normal flow-control keywords. While the documentation for -n says it does assume that loop, the documentation for the null filehandle only says it provides you something like such a loop. Andrew Sweger and John W. Krahn already explained how to get it to behave the way you want, so I won't repeat that. HTHs -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From krahnj at acm.org Wed Aug 20 20:47:51 2003 From: krahnj at acm.org (John W. Krahn) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: "last" behaving oddly in -n loop In-Reply-To: <20030820173600.A18964@timji.consultix-inc.com> References: <20030820135509.A17801@timji.consultix-inc.com> <03082016024102.04209@bng2406iy20tf> <20030820173600.A18964@timji.consultix-inc.com> Message-ID: <03082018475104.04209@bng2406iy20tf> On Wednesday 20 August 2003 17:36, Tim Maher/CONSULTIX wrote: > On Wed, Aug 20, 2003 at 04:02:41PM -0700, John W.
Krahn wrote: > > > $ perl -wln last2 motd1 motd2 > > > 1: motd1 > > > File is motd1 > > > $ > > > > > > Why didn't it process the second file, motd2, as happens with > > > "next"? It's acting like last is magically allowed to break out > > > of the outer (invisible) foreach loop, while next only affects > > > the inner (invisible) while loop. > > > > next and last are working as designed. There is no (invisible) > > foreach loop. <> processes the files in @ARGV as a continuous list > > of lines. > > I know the diamond operator works that way, but according to the > deparsing, it's *not* <> being used, but instead <ARGV>, which I > was assuming to be populated by the result of an open (ARGV, > shift), or something similar, in a surrounding (implicit) loop. That may be a problem with B::Deparse. :-) <ARGV> and <> do exactly the same thing, and if you use <> you will see that ARGV and $ARGV are in use. B::Deparse outputs "LINE: while (defined($_ = <ARGV>)) {" but as anyone who has used perl for a while knows, the "defined($_ =" part is redundant and is rarely (if ever) used in real Perl5 code, so we can probably assume that B::Deparse is just being verbose for some reason. John -- use Perl; program fulfillment From tim at consultix-inc.com Thu Aug 21 01:37:39 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: search.cpan.org Message-ID: <20030820233739.D2914@timji.consultix-inc.com> Can anybody please explain to me the search-language syntax being used over at search.cpan.org? One would think that once you've selected "search in MODULES ONLY" and typed in Select::POSIX::Shell, that it would be smart enough not to split the words on the ::'s, and show (irrelevant) matches for each component! The FAQ is totally mute on the subject of constructing queries, but it appears to me to be an extremely lame system. Am I missing something, or is this as feeble as it looks?
-Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From asim at pair.com Thu Aug 21 09:09:18 2003 From: asim at pair.com (Asim Jalis) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: search.cpan.org In-Reply-To: <20030820233739.D2914@timji.consultix-inc.com> References: <20030820233739.D2914@timji.consultix-inc.com> Message-ID: <20030821140918.GA54298@wokkil.pair.com> On Wed, Aug 20, 2003 at 11:37:39PM -0700, Tim Maher wrote: > Can anybody please explain to me the search-language syntax being > used over at search.cpan.org? One would think that once you've > selected "search in MODULES ONLY" and typed in > Select::POSIX::Shell, that it would be smart enough not to split > the words on the ::'s, and show (irrelevant) matches for each > component! The FAQ is totally mute on the subject of constructing > queries, but it appears to me to be an extremely lame system. Am > I missing something, or is this as feeble as it looks? I've noticed the same lameness myself. However, the following search query at Google works well: site:search.cpan.org Select POSIX Shell I had to remove :: from the module name because Google uses ":" for magic purposes. Asim From sthoenna at efn.org Thu Aug 21 11:17:03 2003 From: sthoenna at efn.org (Yitzchak Scott-Thoennes) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: "last" behaving oddly in -n loop In-Reply-To: <20030820135509.A17801@timji.consultix-inc.com> References: <20030820135509.A17801@timji.consultix-inc.com> Message-ID: <20030821161703.GA3444@efn.org> I think you are under a misconception. 
<> aka <ARGV> aka readline(ARGV) doesn't wrap an extra loop around your code (whether you use -n or not), so your last LINE falls off the end of your program. For instance, in the code:

#!/usr/bin/perl -wpl
# merge with next line if line ends with \
chop, chomp($_ .= <>) while substr($_, -1) eq '\\';

both the implicit <> because of -p and the explicit <> will do the magic-ARGV thing of looping through the files in @ARGV. In the explicit <>, there is clearly no way for perl to magically insert a loop around part of your code. Instead, the magic of using each file in @ARGV in turn occurs in the guts of the readline builtin. (Fixing this so that anything--not just readline--using the ARGV handle loops through all the files is on the list in perltodo.pod.) To do what you want, change last2 from:

print "$.: $ARGV" and last LINE; END{print "File is $ARGV\n"}

to:

print "$.: $ARGV" and close ARGV; END{print "File is $ARGV\n"}

(Obviously you would want to also have "next LINE" after the close if there were other code following.) From tim at consultix-inc.com Thu Aug 21 22:06:00 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: 3 Jobs with Cray, one Perlish Message-ID: <20030821200600.A7702@timji.consultix-inc.com> FYI. Check out the 3 local jobs with Seattle Cray Inc. http://www.cray.com/jobs/sweng-1131.html Position Title: Software Engineer ====================== Job Number: 1131 Location: Seattle, WA Date Posted: 06/03 This position will be responsible for working with the OS Development team to put together a Linux Clustering solution for Cray Customers. Job Description: As part of ongoing development efforts, Cray is creating a set of technologies for operating systems, system administration and clustering. The successful candidate will be required to work cooperatively with the groups developing these technologies to select the ones that are applicable for early deployment.
The employee will then need to work with Cray's integration team to create a Linux Cluster platform for early versions of the selected Cray software to be deployed at customers' sites. The employee will need to work with these early-adopters to fix bugs, resolve technical issues, collect feedback, and follow up with the rest of the development organization to make sure customer feedback is applied to the final product. Responsibilities: Develop versions for Cray software suitable to deployment at early-adopter sites. Develop a Linux Cluster solution to host initial versions of Cray software. Develop procedures to install the Linux Cluster and deploy the software. Collect customer feedback and fix bugs. Relay the feedback to the main development effort for incorporation into the final product. Qualifications: Minimum of 3 years development experience in C, Perl and Python Minimum 2 years working in Linux Minimum of 1 year Linux or Unix System Administration Understanding of operating system concepts and how they are implemented in the Linux kernel Experience in downloading, patching, compiling and deploying the Linux kernel would be a plus Familiarity with Linux clustering toolkit would be a plus Interaction and coordination with extended design team and other teams as needed Attention to detail while being flexible to the ever changing conditions of a true fast paced environment Good communication skills BS in Computer Science US citizen or permanent resident Beth Kester Technical Recruiter bkester@cray.com 206-701-2040 (o) Cray Inc. 411 First Avenue South, Suite 600 Seattle WA 98104.2860 http://www.cray.com/jobs/current.html Position Title: Test Engineer =================== Job Number: 1132 Location: Seattle, WA Date Posted: 06/03 Job Description/Responsibilities: As a Software Test Engineer in the Systems Integration Group, you will be working with a small group of software engineers testing cutting edge Linux-based systems. 
Responsibilities include: System software test planning and test specification. Porting and executing existing test software. Developing and executing new test software. Test results tracking. Reporting and reproducing system software failures. Regression testing updated system software. Assisting with resource allocation planning. Assisting with development and test process refinement. Contributing to overall software development and release goals. Prioritizing and strategizing for optimal testing effectiveness. Qualifications: Minimum of 4 years experience with testing complicated software systems. System-level knowledge of Unix/Linux. Substantial experience with C programming and shell scripting. Knowledge of networks and distributed systems preferred. Proven ability to produce software that is both portable and reusable. Experience writing and executing test plans and test cases. Proven planning and scheduling experience. Experience with entire software life cycle; from inception to release, support, and maintenance. Ability to work in a fast-paced, schedule-driven environment. Ability to acquire and use knowledge of the entire system and its components to test effectively. Independent self-starter who can cooperate effectively with other team members. Strong analytical and problem solving skills. Good written and verbal communication skills. U.S. Citizenship is required. Education: BS, BE or BA in a related field or equivalent experience. Beth Kester Technical Recruiter bkester@cray.com 206-701-2040 (o) Cray Inc. 411 First Avenue South, Suite 600 Seattle WA 98104.2860 http://www.cray.com/jobs/current.htm Position Title: I/O Developer =================== Job Number: 1127 Location: Seattle, WA Date Posted: 05/03 As a Software Engineer in the I/O Group, you will be working as a member of a small group of engineers developing a parallel computer system targeted for high-end production oriented data centers. 
The position involves designing, developing, and maintaining networking and I/O software. This is an opportunity to work with cutting edge hardware and software technology to provide high performance I/O solutions. Job Description/Responsibilities: Responsibilities include designing, developing, and maintaining networking solutions to enable high bandwidth TCP/IP connectivity; designing and implementing services to provide distributed parallel I/O and clustered file systems; designing and implementing solutions to provide fault tolerant I/O services; evaluating and selecting leading edge I/O hardware and software to meet high performance requirements; performance analysis and tuning of distributed I/O subsystems; analyzing and fixing problems to improve the overall system quality; in general developing a system to support high performance I/O services in a scalable, fault tolerant, parallel environment. Qualifications: Strong analytical and problem solving skills. Good written and verbal communication skills. Ability to work well in a cooperative development environment. Ability to work independently to solve difficult problems and implement solutions. Ability to work in a fast-paced, schedule-driven environment. Flexibility and willingness to adapt easily to changes. Education/Experience: Experience installing, configuring, maintaining and tuning of UNIX or Linux TCP/IP networking utilities. Experience developing, configuring, debugging and tuning of file systems. Experience with performance analysis and tuning of I/O subsystems. At least 3 years experience developing kernel internal I/O subsystems of Unix or Linux operating systems, such as TCP/IP, file systems or drivers. Experience with cluster, distributed or SAN file systems is a plus. US Citizen Beth Kester Technical Recruiter bkester@cray.com 206-701-2040 (o) Cray Inc. 
411 First Avenue South, Suite 600 Seattle WA 98104.2860 http://www.cray.com/jobs/current.html -Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From m3047 at inwa.net Fri Aug 22 01:18:08 2003 From: m3047 at inwa.net (Fred Morris) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: 3 Jobs with Cray, one Perlish Message-ID: >FYI. Check out the 3 local jobs with Seattle Cray Inc. Multiply by 3.8 (+- 1 stdev).... and don't hold your breath. -- Fred Morris m3047@inwa.net From m3047 at inwa.net Fri Aug 22 01:26:58 2003 From: m3047 at inwa.net (Fred Morris) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Ingy! Calling Brian Ingerson! Message-ID: Imagine if there was an Inline which blurted to a Cray Numerical Engine. How cool is that? -- FWM m3047@inwa.net From wildwood_players at yahoo.com Fri Aug 22 11:02:06 2003 From: wildwood_players at yahoo.com (Richard Wood) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: pack and unpack of binary raster graphic files Message-ID: <20030822160206.3712.qmail@web11508.mail.yahoo.com> I hope this email is readable; I wrote it in Notepad then pasted it in. I am trying to process (read and interpret) TIFF (raster graphics) files, which are binary. These files can be big-endian or little-endian, based on a header record. The files I am reading are big-endian.
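[For the pack/unpack question that follows, here is a minimal sketch of decoding one 12-byte big-endian IFD entry of the form described below. The function name and the ASCII/LONG handling are illustrative assumptions, not a tested TIFF reader; the key point is that unpack's big-endian templates ('n' = 16-bit unsigned, 'N' = 32-bit unsigned) do the byte-order conversion directly, with no detour through "B16"/"B32" bit strings.]

```perl
use strict;
use warnings;

# Sketch (assumptions flagged above): decode one 12-byte big-endian
# IFD entry: | 2-byte tag | 2-byte type | 4-byte count | 4-byte value/offset |
sub decode_entry {
    my ($buffer) = @_;                       # exactly 12 bytes
    my ($tag, $type, $count) = unpack 'n n N', $buffer;
    my $value;
    if ($type == 2 && $count <= 4) {         # short ASCII stored inline
        ($value) = unpack 'x8 a4', $buffer;  # raw bytes after the first 8
        $value =~ s/\0.*//s;                 # trim at the NUL terminator
    }
    else {                                   # a LONG value, or a file offset
        ($value) = unpack 'x8 N', $buffer;
    }
    return ($tag, $type, $count, $value);
}
```

[For little-endian files the equivalent templates would be 'v' and 'V' instead of 'n' and 'N'.]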
The format of the records that I am trying to read is:

12 - 8-bit bytes
|--------|--------|--------|--------|--------|--------|--------|--------|--------|--------|--------|--------|
|    2-byte tag   |   2-byte type   |           4-byte count            |      4-byte value or offset       |

The tag is a number that corresponds to a tag name, things like:

256 = ImageWidth
257 = ImageLength
259 = Compression
306 = DateTime
270 = ImageDescription

The type is a number that describes what is contained in the value field:

1 = BYTE 8-bit unsigned integer
2 = ASCII 8-bit byte that contains a 7-bit ASCII code; the last byte is NUL (binary zero)
3 = SHORT 16-bit unsigned integer
4 = LONG 32-bit unsigned integer
5 = RATIONAL Two LONGs: first represents numerator, second denominator
6 = SBYTE 8-bit signed integer
7 = UNDEFINED 8-bit byte that may contain anything depending upon the tag
8 = SSHORT 16-bit signed integer
9 = SLONG 32-bit signed integer
10 = SRATIONAL Two SLONGs: first represents numerator, second denominator
11 = FLOAT Single precision (4-byte) IEEE format
12 = DOUBLE Double precision (8-byte) IEEE format

I read in 12 bytes, check the "type", then try to convert it to something I can use, like ascii characters or numbers. I am not having much luck with the ascii. I have never used pack or unpack before and I clearly need help. If the "type" is ASCII and the "count" is less than 4, the last 4 bytes should be ascii characters; otherwise it is an offset into the file. I read the 12 bytes into $buffer like this:

seek(TIF, $offset, 0) or die "Seek $i: $!\n";
read(TIF, $buffer, 12);

Now, can someone give me a couple of examples on how to get meaningful data out of the value field? I am able to get the tag, type, and count values using an unpack to separate the bytes, then another unpack to convert to decimal, but I suspect that is not the most efficient method. I am not sure how to get meaningful data out of the ascii.
($tagB, $typeB, $countB, $val1B, $val2B, $val3B, $val4B) =
    unpack "B16B16B32B8B8B8B8", $buffer;
$tagC   = bin2dec($tagB);
$typeC  = bin2dec($typeB);
$countC = bin2dec($countB);

sub bin2dec {
    return unpack("N", pack("B32", substr("0" x 32 . shift, -32)));
}

Regards, Rich Wood ===== Richard O. Wood Wildwood IT Consultants, Inc. wildwood_players@yahoo.com 425.281.1914 mobile 206.544.9885 desk __________________________________ Do you Yahoo!? Yahoo! SiteBuilder - Free, easy-to-use web site design software http://sitebuilder.yahoo.com

From m3047 at inwa.net Fri Aug 22 11:33:11 2003 From: m3047 at inwa.net (Fred Morris) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: not quite random thoughts on the current crop of user groups Message-ID: I've been in the IT racket for 20 years. Early on in my career I joined the Seattle chapter of DECUS (SeaLUG). After that went away I looked sporadically at UGs but for some reason they never really clicked for me. For the past year I've been attending both GSLUG and SPUG fairly regularly. I'm coming to some conclusions about the current crop of user groups. Among those conclusions: I don't see enough people having fun; their workings are not transparent; and they suffer from the ills which typically accrue with charismatic leadership. SeaLUG never charged me money.. or wanted any. By the time it finally faded away (some time after DEC was subsumed by Compaq), it still had money in the bank (literally). The national DECUS convention cost money (and the only time I went was when my employer paid my way), and when we had regionals we charged for those: renting facilities and equipment, getting insurance, etc. costs money. Aside from charging for seminars and whatnot, how did SeaLUG get money? I think the national DECUS gave us a little initially. How did we manage to have money?
We spent it carefully, we always thought free was good, and we had a national organization which perhaps didn't underwrite costs but which served as a non-interfering umbrella organization and was able to pool some risks; but in the end I suppose it was an accident... which would of course never have happened without our creating the correct opportunity. (By the way, DECUS had a fairly extensive library of free software.) A lot of people in SeaLUG were pretty darned boring. But that's a personal taste as much as anything. However, there are other reasons for networking besides getting a job; one of those is to get to know other people. Not everybody was boring. I've still got a couple of friends from that era. It seems like with GSLUG and SPUG you don't really get to know many people... unless they're spreading some glamour, and it's hard to say how accurate or useful that really is in the longer term sense of building relationships. SeaLUG's primary mission in life in my opinion was a place to get together and discuss DEC (it was not "owned" by Digital); SeaLUG did a pretty good job of that, as did DECUS as a whole. SeaLUG also took on occasional projects, notably the regionals and the wiring of Coe School. That's not a lot in a span of over a decade and maybe there are others which escape me. Nonetheless, why weren't there more? I think the reasons for that will seem familiar: interest of the membership and organization. Yet when there was enough interest things got done, there was no question of a single point of failure or single-vote veto. The workings of SeaLUG were quite transparent. There were rules, there were elected officers, and of course there was a clear purpose. In the case of GSLUG and SPUG the first two simply do not apply. 
"Clear purpose" is an interesting one, but I somehow think that if you'd polled the membership you would have gotten some agreement; I don't know if you polled the membership of GSLUG or SPUG whether you'd get that sort of agreement. People announced job openings at SeaLUG, yes. I don't see this same situation with GSLUG, but I have some questions about the job openings announced at SPUG: jobs.perl.org had an announcement for CarDomain on August 4th. That one never made it to the list. We seem to hear frequently (for some value of "frequently") about Amazon. We hear about them frequently enough that I have some questions: What are they paying? Are these new positions or is their turnover that high? What are their engineering practices like? How does any of this relate to their less than stellar employment practices as reported elsewhere, if at all? I am glad to see Cray's positions announced here (finally). I tried to bring them up in July and was ignored. Oh well. It does seem that all jobs have to go through Tim. What are the real reasons for that? I'm sure there are some, and "protecting" the membership is not one of them. I humbly suggest that if I have these sorts of nagging thoughts, perhaps people who might otherwise be inclined to announce opportunities have them as well. How are presentations chosen? None of the three organizations had written guidelines. I know it takes some work to find people who are willing to make presentations. SeaLUG was a democratic institution; GSLUG has started polling the membership; what about SPUG? A fair number of SeaLUG presentations were fairly boring. With both GSLUG and (especially) SPUG the entertainment factor seems much higher. Is that good or bad? Is it symptomatic of something? In both GSLUG and SPUG I see occasional displays where eccentricity crosses a line that I refer to as the Second Shwartzian Transform... 
you know that one whose final formulation as a Lagrange Polynomial is sometimes referred to as "felony stupidity". Where does critical thinking go in these cases? A recent example from GSLUG, and I suppose I should have found it entertaining (silly me), was the "security consultant" giving a presentation about using SSH to bust firewalls. That's fine that's good, but saying "this could get you fired" just ain't the same as giving the nitty gritty details of traffic analysis and other means to identify this sort of behavior. Telling people how proxy servers are great to tunnel through ain't the same as telling them maybe they want to be really careful about running one lest spammers and "security consultants" find out about it (somebody from the audience shouldn't have to bring this up). There's definitely some serious attention-getting behavior at work here. Maybe Perl security will be even funnier. I think Tim's fetish with Perl certification is going down the same road. Obviously if I hear the name Shwartz anywhere near "certification" I'm going to spit; but besides that... The University of Washington's program costs a lot of money. The cost should be made very clear to people 1) so that they know how much and 2) so that they know it's an advertisement! While we're at it, who's it open to, and how does a 40+ year old with nearly 20 years experience and no degree get it for free or $100? Maybe somebody from the UW could come and give a talk about this; I'd definitely show up! I think you can see where this is going: most of the blather about certification seems to me like "marketecture". Where does the money go? How do we get certified for free? Most of the certs that people want (or don't want, there are camps on both sides of the aisle) are underwritten (that's an understatement) by companies which make a lot of money selling the infrastructure the certs are targeted for; they also spend a lot of money marketing said infrastructure. W(h)ither town crier? 
I'm sure there are other examples. For me there's no escaping the fact that SPUG's agenda is Tim's agenda; maybe if it was clearer what that was, I could make up my mind about it. The fundamental problem with charismatic leadership as an organizational model is that whether or not the leader promotes it as a cult of personality, it encourages imitation within the membership (and if the leadership doesn't actively encourage this, it is usually blind to it). That imitation is counter to the dynamics needed by professional organizations of peers; it's also counter to easy and open fraternity, which is needed for fun, fellowship and playing around. It also leads to a situation where the people who get involved are often seeking attention... and when they don't get it they lose interest. I'd like to see more critical thought about these things in the broader organizational context, but maybe that's just me. I also don't see people having enough of what I would consider "fun"; somehow the tag in somebody's .sig that animals learn by playing therefore comes across to me as a desperate plea for help. Maybe this will help. Maybe the fundamental flaw with this critique is that neither SPUG nor GSLUG is intended to be a professional organization of peers (I suspect the folks at GSLUG would say "you are correct!", but I'm fairly certain they do want to have fun, fellowship and play.). Where are you supposed to discuss things like this in a charismatic organization? I guess you aren't; but, fait accompli. I therefore close with Grace Hopper's quip that 'tis better to beg forgiveness than to ask permission and a reminder that consistency is the hobgoblin of small minds. I just can't seem to raise my level of interest in either GSLUG or SPUG much beyond entertainment value; I've basically given up on anything else with SPUG, although I still have some inclinations with regard to GSLUG. 
If anybody has a suggestion as to an organization of professional peers with a local presence which is low or no cost and is run without charisma as the glue, let me know. I'd say "Or hey, let's make one", but I'm not sure what the organizational mission would be: figuring out how we can make money? Well, that's a thought. But I think it's going to attract the wrong kind of crowd, because there are just too many scared cowboys out there right now. Maybe I'll start checking out Seattle Wireless... -- Fred Morris m3047@inwa.net

From james at banshee.com Fri Aug 22 12:33:29 2003 From: james at banshee.com (James Moore) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: not quite random thoughts on the current crop of user groups In-Reply-To: Message-ID: <000a01c368d3$813c80c0$797ba8c0@gealach> I suspect your dissatisfaction may mean that your goals for a user group aren't necessarily shared by other people. For example, it never would have occurred to me to think that a user group should be "fun." I don't object to that, but "interesting" and "useful" are far more important to me than "fun" (at least in this context). As for transparency - it's never occurred to me that it was lacking. It's not like SPUG is a multi-million-dollar operation shrouded in secrecy that's sucking money out of my pockets. If a small group of people want to take on the responsibility of organizing something like SPUG, I say YES! Please! Do all the scutwork for something that's useful for me! Thank you! Do that again! So what if that means the projects close to what the organizers are working on get more attention? It's not like they're trying to shut out something else. I'll bet you dollars to donuts that if you offer to organize the speakers for a meeting on something near and dear to your heart you'd be welcomed with open arms. Chances of me organizing something like the Damian Conway talk last month: close to zero.
Chances of it, or something like it, happening again with the current system: likely. That pretty much tells me everything I need to know. - James From james at banshee.com Fri Aug 22 12:37:23 2003 From: james at banshee.com (James Moore) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: pack and unpack of binary raster graphic files In-Reply-To: <20030822160206.3712.qmail@web11508.mail.yahoo.com> Message-ID: <000b01c368d4$0bfb2630$797ba8c0@gealach> Not really an answer to your question, but when I muck around with graphics files I usually check to see if ImageMagick can help solve my problem. - James From wildwood_players at yahoo.com Fri Aug 22 13:17:20 2003 From: wildwood_players at yahoo.com (Richard Wood) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: pack and unpack of binary raster graphic files In-Reply-To: <000b01c368d4$0bfb2630$797ba8c0@gealach> Message-ID: <20030822181720.44495.qmail@web11504.mail.yahoo.com> James, I have a pretty specialized need. I am not aware of any tools in ImageMagick that would do what I need to do. I have to process through the scanlines looking for repeating patterns then determine where these patterns occur. Once I have information that defines some bounding rectangles, I have to either remove external content or internal content depending upon the situation. Identifying the bounding rectangles is the trick! At that time, when I know the coordinates, I might end up using ImageMagick. The number of graphics that I am working with is in the 100's of thousands, and they are not necessarily very consistent. Thanks for the thought, it did cause me to review the functionality available in ImageMagick. Regards, Rich Wood --- James Moore wrote: > Not really an answer to your question, but when I > muck around with graphics > files I usually check to see if ImageMagick can help > solve my problem. 
> > - James > > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org > http://spugwiki.perlocity.org > ACCOUNT CONFIG: > http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > ===== Richard O. Wood Wildwood IT Consultants, Inc. wildwood_players@yahoo.com 425.281.1914 mobile 206.544.9885 desk __________________________________ Do you Yahoo!? Yahoo! SiteBuilder - Free, easy-to-use web site design software http://sitebuilder.yahoo.com

From davidinnes at chicagoscience.com Fri Aug 22 15:03:18 2003 From: davidinnes at chicagoscience.com (David Innes) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: pack and unpack of binary raster graphic files In-Reply-To: <20030822181720.44495.qmail@web11504.mail.yahoo.com> Message-ID: <000801c368e8$6df9e1a0$fe7e5e40@converger.net> Hmm. I thought maybe you could subvert primitives from Perl-based Optical Character Recognition modules but I couldn't really find anything along those lines on CPAN. Starting very long ago, NASA and DoD have done a whole bunch of research on machine recognition of shapes in images. You'd think some of it would be available if not on CPAN then at least on Google. I think I might not be using the right terminology. For what it's worth, CPAN appears to offer very little in the way of shape recognition beyond something called No::OCRData. The first line in the description says "This documentation is written in Norwegian, for others, suffice to say that it does not really have much to do with Optical Character Recognition." (The module is evidently intended to operate on data scanned from standard Norwegian bank forms.) Surely someone out there has already solved some of these problems and published solutions. I called a friend who got his start in graphics doing portraits on etch-a-sketches and went from there.
He suggested that if you can define something that's unique about the objects you're trying to detect you might be able to use the Gimp's magic wand or intelligent scissors tools to find the boundaries for you. CPAN does have a bunch of modules for the Gimp. But I should also mention that he said "wow, this is a classic case where 'example a' and 'example b' would make the problem a lot easier to solve." Just for fun, and to make SPUG more interesting, I think there's supposed to be an anti-porn filter for enterprise firewalls that can filter incoming image streams for anything showing too large a proportion of uninterrupted flesh tones. It seems sort of silly to me but the key thing is that if firewall-level apps can distinguish the shapes of naked people in real time (very irregular in both the corporate and graphical sense) then it shouldn't be that hard to detect more regular shapes in your images. But for better or worse, "example a" and "example b" for that problem probably wouldn't make it past our enterprise firewalls. :-) -- David Innes -----Original Message----- From: spug-list-bounces@mail.pm.org [mailto:spug-list-bounces@mail.pm.org] On Behalf Of Richard Wood Sent: Friday, August 22, 2003 11:17 AM To: 'Seattle Perl Users Group' Subject: RE: SPUG: pack and unpack of binary raster graphic files James, I have a pretty specialized need. I am not aware of any tools in ImageMagick that would do what I need to do. I have to process through the scanlines looking for repeating patterns then determine where these patterns occur. Once I have information that defines some bounding rectangles, I have to either remove external content or internal content depending upon the situation. Identifying the bounding rectangles is the trick! At that time, when I know the coordinates, I might end up using ImageMagick. The number of graphics that I am working with is in the 100's of thousands, and they are not necessarily very consistent. 
Thanks for the thought, it did cause me to review the functionality available in ImageMagick. Regards, Rich Wood From m3047 at inwa.net Fri Aug 22 23:31:27 2003 From: m3047 at inwa.net (Fred Morris) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: not quite random thoughts on the current crop of user groups Message-ID: Jobs: Jobs and projects and all of that: my point wasn't about posting to the list, my specific example was something that I knew about a month ago and nobody was interested enough to hear it. That happens. But part of the reason that it happens is that Tim Runs The Show. That's all, I wasn't implying anything nefarious. Last time I checked, the SPUG page pretty clearly says recruiters shouldn't show up at meetings and announce their openings. In this case I happen to know that Beth is an internal recruiter and I was simply trying to bring it to people's attention... if they cared. Again, I'm not implying anything nefarious, just that Tim wasn't interested... so it goes. I do start to have some broader social questions about the frequent flyers, and that seems not to be (openly) addressed at all. Tim Maher writes: >On Fri, Aug 22, 2003 at 09:33:11AM -0700, Fred Morris wrote: >> How are presentations chosen? None of the three organizations >> had written guidelines. I know it takes some work to find >> people who are willing to make presentations. SeaLUG was a >> democratic institution; GSLUG has started polling the >> membership; what about SPUG? > >"Choosing" rarely comes into the picture; there's rarely any >competition for time slots, so just about anybody who wants to >talk about anything Perlish is given the time they ask for (the >only exceptions are those with bad track records, of showing up >unprepared or incomprehensible on two separate occasions). I also >do a bit of arm-twisting sometimes to encourage certain >individuals to speak. And that's all there is to the "system"! I know there's more planning to it than that. 
;-) >> A fair number of SeaLUG presentations were fairly boring. With >> both GSLUG and (especially) SPUG the entertainment factor seems >> much higher. Is that good or bad? Is it symptomatic of >> something? > >I'd give my usual response to this, which is that people who >gravitate to Perl are intrinsically more creative and expressive >in the first place, so it's no wonder they are more entertaining >communicators. And that's why, I think, it's fun! >> In both GSLUG and SPUG I see occasional displays where >> eccentricity crosses a line that I refer to as the Second >> Shwartzian Transform... you know that one whose final >> formulation as a Lagrange Polynomial is sometimes referred to >> as "felony stupidity". Where does critical thinking go in >> these cases? > >Alas, there's a fine line sometimes between creativity and >eccentricity. That's the way it goes. . . There's another fine line between brilliance and autocracy and... the Second Shwartzian Transform. It's a certain blind spot, sort of like how you can't divide by zero but you can make a value arbitrarily small (or large) and pretend... I don't know if that's clearer or not. >> I think Tim's fetish with Perl certification is going down the >> same road. > >I *know* it's not a fetish, because those are supposed to be fun! >8-} I've never spent so much time and incurred so much flack for >anything I've ever worked on before, so I know it can't be a >fetish. 7-{ Don't be so sure!! :-p >> Obviously if I hear the name Shwartz anywhere near >> "certification" I'm going to spit; but besides that... The >> University of Washington's program costs a lot of money. The >> cost should be made very clear to people 1) so that they know >> how much and 2) so that they know it's an advertisement! > >If memory serves, the UW program is priced somewhere around >$1,500. I would have thought that everybody knew that college >classes cost money, but if not, sorry for not making that clear. 
>And by the way, lest you think I was bribed to allow that >"infomercial", that program actually competes with my own >training offerings, so it's hardly in my personal interest to >promote it, but I invited them to make their pitch to round out >the coverage of the Certification topic. > >By the way, the UW offering is primarily a training program, and >that's why it's not just $100 like a (bogus) on-line >certification test. > >> While we're at it, who's it open to, and how does a 40+ year >> old with nearly 20 years experience and no degree get it for >> free or $100? > >You don't, but you don't need it either. My view of >certification (expounded to the extreme at >perlcert.perlocity.org) Aggh. Ok, I'll have to go read it now. >is that its main purpose is to show the >business community that we've got credentials if they want them, >and secondarily to help those who don't have other ways of doing >so to establish their knowledge . With 20 years of experience, >your "portfolio" would surely attest to your abilities (but a >CPAN module or two wouldn't hurt either!). Google for SqlHtmlRpt or PerlJacket; again, why do I need CPAN? Are JAPHs hiring? I thought most JAPHs weren't in management... at least in managed environments. >[...] >> How do we get certified for free? Most of the certs that people >> want (or don't want, there are camps on both sides of the >> aisle) are underwritten (that's an understatement) by companies >> which make a lot of money selling the infrastructure the certs >> are targeted for; they also spend a lot of money marketing said >> infrastructure. > >How do you get any service that involves people doing hard work >and incurring expenses for free? Either somebody subsidizes it >for their own reasons, or it doesn't happen. And if certification >helps you get a job, or helps Perl programmers look more >professional, it should be worth a few hours' pay to you! And if it perpetuates myths about who is qualified and who is not, then what? 
I did go and read the perlocity stuff (this is a late addition) and the blind spot in this seems to be that the debate is focused on the impact on those not susceptible to the myth! (Think I should be posting this on the Wiki? Probably.) Of *course* Larry should be certified.. or not certified.. it's the only way to bust the myth: LARRY WALL IS NOT CERTIFIED TO WORK WITH PERL -- it's true, this very minute. There's this whole dynamic going on about people not making money -- being recognized -- for actually writing software, but by writing books or teaching: a "you go out there and work" mentality. No, you don't have to tell me how hard it is to prepare to teach a class... been there, done that (taught short courses on VAX/VMS at none other than the UW, FWIW). It's not sustainable unless the planet stays in the "make more geeks" mode, and I don't know if it is, or if would be healthy in any case. It goes back to the contempt which plays into the Second Shwartzian Transform: you shouldn't have to know Perl to make toast, and because you don't have to know Perl to make toast doesn't make it a contemptible or lesser activity. Come on Tim! Money is just a concept in Academia! They give away scholarships, they pay grad students next to nothing.. all to achieve some Higher Purpose. If that Higher Purpose was to ensure that anybody over 40 and/or with more than 10 years experience in a field was qualified, and even encouraged, to attend their institution, it would simply be that way. I'm not asking them to pay me: I'm saying let me come and take your classes for nominal cost... the way things were at the UW (officially or unofficially) before Reagan criminalized the whole 'net in the first place! >> For me there's no escaping the fact that SPUG's agenda is Tim's >> agenda; maybe if it was clearer what that was, I could make up >> my mind about it. > >Huh? AFAIK, SPUG has no "agenda", outside of helping its members >learn more about Perl. 
And my agenda is obvious - I like Perl a >lot, and am looking for ways to help people learn more about it, >and improve its position in the marketplace. Well, OK. And you do an excellent job of showcasing the eccentric, eclectic and interdisciplinary aspects of what I'll wave my hands and call "the Perl community". But I'm not sure I can see any measurable impact on its position in the marketplace. >> The fundamental problem with charismatic leadership as an >> organizational model is that whether or not the leader promotes >> it as a cult of personality, it encourages imitation within the >> membership (and if the leadership doesn't actively encourage >> this, it is usually blind to it). That imitation is counter to >> the dynamics needed by professional organizations of peers; >> it's also counter to easy and open fraternity, which is needed >> for fun, fellowship and playing around. It also leads to a >> situation where the people who get involved are often seeking >> attention... and when they don't get it they lose interest. > >I don't agree with any of that. Are you saying that I'm too >likable to be a good enough leader? If so, you don't know me well >enough 8-}. Do you see SPUGsters imitating my fashion choices, or >hair style, or charity work for Perl? Do you really think we're >not having fun at the OpenSauce lunches? What "model" would you >rather see in place of the loosely defined one we have now? Nice straw man. I'm sure myself and many others do our charity work, too... and would with or without Perl, SPUG, etc. As for the OpenSauce lunches, am I mistaken or haven't most of them taken place on weekdays? See, I have this job... and I can't argue with your assertion that you're having fun, I just wonder about what I sense is a certain desperation in it. 
And then there's the dynamic of all of the copycat lunches, and that's the charismatic model that I was speaking to: that Tim does it, and then a whole bunch of people do it, and it's not really coordinated. A lot of that effort is destined to fail, and is wasted... for my definition of "wasted", of course. (PS, King Street Cafe is much better, foodwise than HoH. But none of it holds a candle to a real Dim Sum Palace in SF.) >> I'd like to see more critical thought about these things in the >> broader organizational context, but maybe that's just me. I >> also don't see people having enough of what I would consider >> "fun"; somehow the tag in somebody's .sig that animals learn by >> playing therefore comes across to me as a desperate plea for >> help. Maybe this will help. > >Apart from the recent OpenSauce lunches, and the OpenSauce Happy >Hour that will debut next week, "Fun" was also had at the Perl >Quiz of the Week parties that Michael Wolf organized for a while. What happened to those? >So Fred, to have more fun, you might want to participate in some >of these activities. I guess maybe it's me. Yeah, it's me. But I think again you hit it on the head, Tim does something, so other people do something, and without a broader organizational structure a lot of these things don't end up having much staying power. The failure of the charismatic model is that there is a very limited amount of Tim to go around. Maybe people are hoping technology will come to the rescue in the form of Wikis and whatnot, and I don't know how well that will fare. I was corresponding with Michael when he was setting those up.. and a bunch of other things, too. It's too much noise and has too little focus for me, personally, and I'm not really interested in fondling interesting baubles; I'm more interested in the social and business aspects... or in pulling several thousand feet of wire. (BTW, I took a road trip with Michael to the Portland Perl UG. See, now that was fun. 
Unfortunately I don't think either of us has reliable enough (4 wheel) transpo to make it a regular thing. That may change for me soon here.) [...] >Thanks for your comments Fred, and all the time you took to write >them down. > >If you can give specific recommendations about how we might >change SPUG to make it better in your view, please let them be >heard. I'm not asking you to change, I'm not sure it would be "good" if you did. In some ill-formed manner I'm just trying to point out some limitations of the charismatic model. I've seen it, and the bureaucratic model, and many in between, in real workplace environments numerous times over the last 20 years. They all have their good and their bad points. It's just that the current crop of viable and "attractive" (for me) geek groups seem to be strongly charismatic at this point in time. I guess more than anything else I'm simply trying to point that out. Given that that's the state of the world, I probably need to look elsewhere for the kind of fellowship that I crave... doesn't mean I don't like Perl, or appreciate the work you do, or the highly entertaining shows you manage to put together. Charisma works very well in a "boutique" setting, it may even be necessary. When people start thinking about expanding it beyond the boutique, they need to start thinking about their organizational model. -- Fred Morris m3047@inwa.net From spug-list at l.ifokr.org Fri Aug 22 23:57:48 2003 From: spug-list at l.ifokr.org (Brian Hatch) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: not quite random thoughts on the current crop of user groups In-Reply-To: References: Message-ID: <20030823045748.GI1340@ifokr.org> > As for the OpenSauce lunches, am > I mistaken or haven't most of them taken place on weekdays? Schedule one and they will come. 
I've arranged two (one coming up Monday at Rosita's in Ballard) and they were during the week because on the weekend I am exclusively a daddy, and my 3 year old isn't interested in a bunch of geeks. If you want to have one on a weekend, announce it and it exists! That's all there is to it! > and I can't argue with your assertion that you're having fun, I > just wonder about what I sense is a certain desperation in it. I go because, as a telecommuter, I seldom see anyone outside the GSLUG meeting and my daughter's day care. That probably qualifies as 'desperation for human contact.' Not sure what you're insinuating though. Doesn't seem like anyone else has any desperation aspects to it. Just hanging out with geeks and having food. > And then > there's the dynamic of all of the copycat lunches, and that's the > charismatic model that I was speaking to: that Tim does it, and then a > whole bunch of people do it, and it's not really coordinated. I don't see any problem with that. The first was when Tim and I were going to get together anyway and opened it up to anyone interested. Guess that was pretty evil of us. > A lot of that > effort is destined to fail, and is wasted... for my definition of "wasted", > of course. > (PS, King Street Cafe is much better, foodwise than HoH. But > none of it holds a candle to a real Dim Sum Palace in SF.) Then schedule one. You'll probably do better getting SPUGgers if you have it north of California. > I guess maybe it's me. Yeah, it's me. But I think again you hit it on the > head, Tim does something, so other people do something, and without a > broader organizational structure a lot of these things don't end up having > much staying power. The failure of the charismatic model is that there is a > very limited amount of Tim to go around. Maybe people are hoping technology > will come to the rescue in the form of Wikis and whatnot, and I don't know > how well that will fare. So what's your suggestion? 
No one should do anything because if someone does something then it's destined to fail. Got it. -- Brian Hatch Bri: I need a nap. Systems and Josh: I understand. Security Engineer Do you need http://www.ifokr.org/bri/ a bottle? Every message PGP signed -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030822/18521370/attachment.bin From spug-list at l.ifokr.org Mon Aug 25 09:45:41 2003 From: spug-list at l.ifokr.org (Brian Hatch) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Open Sauce Lunch today in Ballard Message-ID: <20030825144541.GD5286@ifokr.org> The next Open Sauce Lunch is Today, Aug 25th at 12:30 at Rosita's Mexican Restaurant in Ballard, just across the street from the QFC on Holman Road NW. For more details and maps, go to http://spugwiki.perlocity.org/index.cgi?MondayAug25inBallard If you are coming, either add your name to the wiki entry above, or send me an email so I can reserve an appropriate table. If you have a PGP/GPG key, bring a copy of your fingerprint/bits/etc and ID with you, so we can exchange and sign keys as well. Look for a guy with a goatee and purple camouflage hat; that will be this lunch's Convener, Brian Hatch. -- Brian Hatch "I'll stay away longer. Systems and I love being missed." Security Engineer http://www.ifokr.org/bri/ Every message PGP signed -------------- next part -------------- A non-text attachment was scrubbed...
Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030825/d732169e/attachment.bin From karl.b.hartman at boeing.com Mon Aug 25 10:55:47 2003 From: karl.b.hartman at boeing.com (Hartman, Karl B) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: not quite random thoughts on the current crop of user groups Message-ID: <628E489B972CBC46999B4E71C609FA8F01F7238D@xch-nw-06.nw.nos.boeing.com> Personally, I prefer them on weekdays. But maybe not every week. Dan, Dim Sum location: Sun-Ya Restaurant, 605 7th S, Seattle, WA 98104-2907. No Dim Sum, not fancy, but a good restaurant, and my wife Ling's and my favorite restaurant in Seattle: Hing Loon Restaurant, 628 S Weller, Seattle, WA 98104. If you have any questions or comments, don't hesitate to call. Thanks, Karl Hartman >SSG Client/Server Operations - Computing Admin Process Mgmt 425-294-8172 (office) Business Sense "Failure to embrace an idea just because it doesn't make sense or just plain doesn't work does not constitute resistance to change" Common Sense - from "Really important stuff my kids taught me" "If you want to zoom down the expert slope tomorrow, you have to fall down the bunny slope today." -----Original Message----- From: Brian Hatch [mailto:spug-list@l.ifokr.org] Sent: Friday, August 22, 2003 9:58 PM To: Fred Morris Cc: spug-list@mail.pm.org; Tim Maher/CONSULTIX Subject: Re: SPUG: not quite random thoughts on the current crop of user groups > As for the OpenSauce lunches, am > I mistaken or haven't most of them taken place on weekdays? Schedule one and they will come. I've arranged two (one coming up Monday at Rosita's in Ballard) and they were during the week because on the weekend I am exclusively a daddy, and my 3 year old isn't interested in a bunch of geeks. If you want to have one on a weekend, announce it and it exists! That's all there is to it!
> and I can't argue with your assertion that you're having fun, I just > wonder about what I sense is a certain desperation in it. I go because, as a telecommuter, I seldom see anyone outside the GSLUG meeting and my daughter's day care. That probably qualifies as 'desperation for human contact.' Not sure what you're insinuating though. Doesn't seem like anyone else has any desperation aspects to it. Just hanging out with geeks and having food. > And then > there's the dynamic of all of the copycat lunches, and that's the > charismatic model that I was speaking to: that Tim does it, and then a > whole bunch of people do it, and it's not really coordinated. I don't see any problem with that. The first was when Tim and I were going to get together anyway and opened it up to anyone interested. Guess that was pretty evil of us. > A lot of that > effort is destined to fail, and is wasted... for my definition of > "wasted", of course. > (PS, King Street Cafe is much better, foodwise than HoH. But none of > it holds a candle to a real Dim Sum Palace in SF.) Then schedule one. You'll probably do better getting SPUGgers if you have it north of California. > I guess maybe it's me. Yeah, it's me. But I think again you hit it on > the head, Tim does something, so other people do something, and > without a broader organizational structure a lot of these things don't > end up having much staying power. The failure of the charismatic model > is that there is a very limited amount of Tim to go around. Maybe > people are hoping technology will come to the rescue in the form of > Wikis and whatnot, and I don't know how well that will fare. So what's your suggestion? No one should do anything because if someone does something then it's destined to fail. Got it. -- Brian Hatch Bri: I need a nap. Systems and Josh: I understand. Security Engineer Do you need http://www.ifokr.org/bri/ a bottle? 
Every message PGP signed From dan at concolor.org Mon Aug 25 11:25:32 2003 From: dan at concolor.org (Daniel Sabath) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: not quite random thoughts on the current crop of user groups In-Reply-To: <628E489B972CBC46999B4E71C609FA8F01F7238D@xch-nw-06.nw.nos.boeing.com> References: <628E489B972CBC46999B4E71C609FA8F01F7238D@xch-nw-06.nw.nos.boeing.com> Message-ID: <4833.128.95.154.177.1061828732.squirrel@www.concolor.org> Karl, thanks for the suggestion. I have them scheduled every two weeks in the downtown area, but if that is too often just let me know. I did that scheduling based on feedback during the first one. The next one is Friday 8/29. Fred, I have no idea what you mean by copycat lunches. I started it downtown because I thought it was a good idea and can't make the Ballard ones. One of the things about open source is the community. This is community building. While I hear your point of view and even agree with some of it, in this case I think you are barking up the wrong tree. We would, however, love to see you show up this Friday and help build the community. -dan > Personally, I prefer them on weekdays. But maybe not every week. > > Dan, > Dim Sum location: > Sun-Ya Restaurant > 605 7th S SEATTLE, WA 98104-2907 > > No Dim Sum, not fancy, but good restaurant. My wife Ling and my > favorite restaurant in Seattle. Hing Loon Restaurant > 628 S Weller Seattle, WA 98104 > > If you have any questions or comments, don't hesitate to call. > > Thanks, > > Karl Hartman >>SSG Client/Server Operations - Computing Admin Process Mgmt > 425-294-8172 (office) > > Business Sense > "Failure to embrace an idea just because it doesn't make sense or > just plain doesn't work does not constitute resistance to change" > > Common Sense - from "Really important stuff my kids taught me" > "If you want to zoom down the expert slope tomorrow, you have > to fall down the bunny slope today."
> > > > -----Original Message----- > From: Brian Hatch [mailto:spug-list@l.ifokr.org] > Sent: Friday, August 22, 2003 9:58 PM > To: Fred Morris > Cc: spug-list@mail.pm.org; Tim Maher/CONSULTIX > Subject: Re: SPUG: not quite random thoughts on the current crop of user > groups > > > > >> As for the OpenSauce lunches, am >> I mistaken or haven't most of them taken place on weekdays? > > Schedule one and they will come. > > I've arranged two (one coming up Monday at Rosita's in Ballard) and they > were during the week because on the weekend I am exclusively a daddy, > and my 3 year old isn't interested in a bunch of geeks. > > If you want to have one on a weekend, announce it and it exists! That's > all there is to it! > >> and I can't argue with your assertion that you're having fun, I just >> wonder about what I sense is a certain desperation in it. > > I go because, as a telecommuter, I seldom see anyone outside the GSLUG > meeting and my daughter's day care. That probably qualifies as > 'desperation for human contact.' Not sure what you're insinuating > though. Doesn't seem like anyone else has any desperation aspects to > it. Just hanging out with geeks and having food. > >> And then >> there's the dynamic of all of the copycat lunches, and that's the >> charismatic model that I was speaking to: that Tim does it, and then a >> whole bunch of people do it, and it's not really coordinated. > > I don't see any problem with that. The first was when Tim and I were > going to get together anyway and opened it up to anyone interested. > Guess that was pretty evil of us. > >> A lot of that >> effort is destined to fail, and is wasted... for my definition of >> "wasted", of course. > >> (PS, King Street Cafe is much better, foodwise than HoH. But none of >> it holds a candle to a real Dim Sum Palace in SF.) > > Then schedule one. You'll probably do better getting SPUGgers if you > have it north of California. > >> I guess maybe it's me. Yeah, it's me. 
But I think again you hit it on >> the head, Tim does something, so other people do something, and >> without a broader organizational structure a lot of these things don't >> end up having much staying power. The failure of the charismatic >> model is that there is a very limited amount of Tim to go around. >> Maybe people are hoping technology will come to the rescue in the >> form of Wikis and whatnot, and I don't know how well that will fare. > > So what's your suggestion? No one should do anything because if someone > does something then it's destined to fail. Got it. > > > > -- > Brian Hatch Bri: I need a nap. > Systems and Josh: I understand. > Security Engineer Do you need > http://www.ifokr.org/bri/ a bottle? > > Every message PGP signed > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org http://spugwiki.perlocity.org > ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org From tim at consultix-inc.com Mon Aug 25 14:05:59 2003 From: tim at consultix-inc.com (Tim Maher) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Docs on "-l" wrong? Message-ID: <20030825120559.A22166@timji.consultix-inc.com> SPUGsters, Does anybody think the -l documentation from "perldoc perlrun" is any good? This is a confusing topic in the first place, IMHO, and this man page makes it more so. See my comments below. And I'd welcome yours too. -Tim -l[octnum] enables automatic line-ending processing. It has two separate effects. First, it automatically chomps $/ (the input record separator) when used with -n or -p. Second, it assigns "$\" (the output record separator) to have the value of octnum so that any print statements will have that separator added back on. If octnum is omitted, sets "$\" to the current value of $/.
For instance, to trim lines to 80 columns: perl -lpe 'substr($_, 80) = ""' Tim's comment: The whole point of this example, which unfortunately goes unstated, is that when you truncate lines, you lose their attached newlines, so without -l adding the "\n" back in automagically, the (long) lines would not be separated by anything. I'm left wondering, is this passage meant to be an aid to developing the reader's knowledge, or a test of it? 8-} This is fine documentation written for the people who don't need it, but not the others. Note that the assignment "$\ = $/" is done when the switch is processed, so the input record separator can be different than the output record separator if the -l switch is followed by a -0 switch: gnufind / -print0 | perl -ln0e 'print "found $_" if -p' This sets "$\" to newline and then sets $/ to the null character. Tim says: Instead of leaving it to the reader to disentangle the confusing prose and wonder whether 0 is an argument to -n or something else, I'd prefer an explicit statement that -l is donating the terminal newlines to the solution (via $\), and -0 is setting $/ to the null character. -Tim *------------------------------------------------------------* | Tim Maher (206) 781-UNIX (866) DOC-PERL (866) DOC-UNIX | | tim(AT)Consultix-Inc.Com TeachMeUnix.Com TeachMePerl.Com | *+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-* | Watch for my Book: "Minimal Perl for Shell Programmers" | *------------------------------------------------------------* From sthoenna at efn.org Mon Aug 25 14:58:11 2003 From: sthoenna at efn.org (Yitzchak Scott-Thoennes) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Docs on "-l" wrong? In-Reply-To: <20030825120559.A22166@timji.consultix-inc.com> References: <20030825120559.A22166@timji.consultix-inc.com> Message-ID: <20030825195811.GA4084@efn.org> On Mon, Aug 25, 2003 at 12:05:59PM -0700, Tim Maher wrote: > SPUGsters, > > Does anybody think the -l documentation from "perldoc perlrun" > is any good?
This is a confusing topic in the first place, IMHO, and > this man page makes it moreso. See my comments below. And I'd welcome > yours too. > > -Tim > -l[octnum] > enables automatic line-ending processing. It has two > separate effects. First, it automatically chomps $/ > (the input record separator) when used with -n or -p. > Second, it assigns "$\" (the output record separator) > to have the value of octnum so that any print statements will have that separator added back on. If > octnum is omitted, sets "$\" to the current value of > $/. For instance, to trim lines to 80 columns: > > perl -lpe 'substr($_, 80) = ""' It's somewhat confusing to say it chomps $/ when it really chomps (implied: $/ off the end of) $_. > > Tim's comment: > The whole point of this example, which unfortunately goes unstated, > is that when you truncate lines, you lose their attached newlines, > so without -l adding the "\n" back in automagically, the (long) lines > would not be separated by anything. I'm left wondering, is this passage > meant to be an aid to developing the readers knowledge, or a test of it? > 8-} This is fine documentation written for the people who don't need it, > but not the others. Would an extra sentence after the perl -lpe line that explains how it works work for you? > Note that the assignment "$\ = $/" is done when the > switch is processed, so the input record separator > can be different than the output record separator if > the -l switch is followed by a -0 switch: > > gnufind / -print0 | perl -ln0e 'print "found $_" if -p' > > This sets "$\" to newline and then sets $/ to the > null character. > > Tim says: > Instead of leaving it to the reader to disentangle the confusing > prose and wonder whether 0 is an argument to -n or something > else, I'd prefer an explicit statement that -l is donating the > terminal newlines to the solution (via $\), and -0 is setting $/ > to the null character. Would it work just to rewrite the switches as -ln -0 -e?
Or do you think further explanation is needed? If you can come up with a better example not using find and -print0, that would be good. Perhaps something that prints the first sentence of each paragraph of input? From m3047 at inwa.net Tue Aug 26 02:12:27 2003 From: m3047 at inwa.net (Fred Morris) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Docs on "-l" wrong? Message-ID: Tim writes: >SPUGsters, > >Does anybody think the -l documentation from "perldoc perlrun" >is any good? This is a confusing topic in the first place, IMHO, and >this man page makes it moreso. See my comments below. And I'd welcome >yours too. > >-Tim > -l[octnum] > enables automatic line-ending processing. It has two > separate effects. First, it automatically chomps $/ > (the input record separator) when used with -n or -p. > Second, it assigns "$\" (the output record separator) > to have the value of octnum so that any print state- > ments will have that separator added back on. If > octnum is omitted, sets "$\" to the current value of > $/. For instance, to trim lines to 80 columns: > > perl -lpe 'substr($_, 80) = ""' > >Tim's comment: >The whole point of this example, which unfortunately goes unstated, >is that when you truncate lines, you lose their attached newlines, >so without -l adding the "\n" back in automagically, the (long) lines >would not be separated by anything. Seems to me that any line < 80 characters in the above example is not truncated... as opposed to a literalist reading of "the first 80 characters of each line": the former assumes that very few (if any) lines would ever be longer than 80 characters (VT-100? VT-52?), while the latter assumes that the intent is really to do something to each line. Having been around long enough, that seems to me to be the lever which moves the world. From that it follows that lines < 80 characters would be treated differently than others.. or would they? 
My first impression of what that says is "-l is going to add a newline back on to any unfortunate line which is munged". In fact it only hit me that -l necessarily has to add a record terminator to *every* line on further reflection. >I'm left wondering, is this passage >meant to be an aid to developing the readers knowledge, or a test of it? >8-} This is fine documentation written for the people who don't need it, >but not the others. I'm guessing neither, but that it is an example which is getting somewhat dated. (I've used the -l flag before and it's worked as I expected it to.. I'd never even thought about this before!) (Maybe it's good I didn't read the documentation?) > Note that the assignment "$\ = $/" is done when the > switch is processed, so the input record separator > can be different than the output record separator if > the -l switch is followed by a -0 switch: > > gnufind / -print0 | perl -ln0e 'print "found $_" if -p' > > This sets "$\" to newline and then sets $/ to the > null character. > >Tim says: >Instead of leaving it to the reader to disentangle the confusing >prose and wonder whether 0 is an argument to -n or something >else, I'd prefer an explicit statement that -l is donating the >terminal newlines to the solution (via $\), and -0 is setting $/ >to the null character. I confess to never having used -0 or thought about it until now! It seems clear it's not an argument to -n. It seems that what they're saying is that the input and output separators are decoupled, and that I could for instance programmatically change one independently of the other (which would better suit my style and needs: I never do any paragraph-oriented processing, all lines or slurp... how do I use -0 to set it to undef?)... I assume that's true anyway, both from some zen intuition as well as from that snippet of documentation. However it's hard to tell without context whether the intention of that snippet is to explain the decoupling or to explain -0. 
-- Fred Morris m3047@inwa.net From sthoenna at efn.org Tue Aug 26 12:53:10 2003 From: sthoenna at efn.org (Yitzchak Scott-Thoennes) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Docs on "-l" wrong? In-Reply-To: References: Message-ID: <20030826175310.GA3208@efn.org> On Tue, Aug 26, 2003 at 12:12:27AM -0700, Fred Morris wrote: > I confess to never having used -0 or thought about it until now! It seems > clear it's not an argument to -n. It seems that what they're saying is that > the input and output separators are decoupled, and that I could for > instance programmatically change one independently of the other (which > would better suit my style and needs: I never do any paragraph-oriented > processing, all lines or slurp... how do I use -0 to set it to undef?)... I > assume that's true anyway, both from some zen intuition as well as from > that snippet of documentation. However it's hard to tell without context > whether the intention of that snippet is to explain the decoupling or to > explain -0. Since it is in the -l section, it's to explain the decoupling. A similar example with no -l is given in the -0 section.

-0       : sets $/ to "\0"
-0octal  : if 0, sets $/ to "" (paragraph mode)
           if 1-0377, sets $/ to that char
           if >0377, sets $/ to undef (only -0777 is documented)

Starting in 5.8.1:

-0xfee   : sets $/ to "\x{fee}"
-0xfie   : for better backward compatibility, parsed as -0 -xfie, not -0xf -ie

From christopher.w.cantrall at boeing.com Tue Aug 26 13:13:51 2003 From: christopher.w.cantrall at boeing.com (Cantrall, Christopher W) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Docs on "-l" wrong? Message-ID: I would think that the documentation of the -l switch would start with "Automagically takes care of line endings, so you don't have to think about chomp or \n - high DWIMage factor." and then continue with a detailed explanation. Well, I don't know if the pumpking would approve of "DWIMage".
__________________________________________ Christopher Cantrall Structural Engineer, 767 Fuselage Chairman, 767 Airframe Peer Council http://fuselage767.web.boeing.com/PeerCouncil/ phone: 425-342-4131 fax: 425-717-3174 Christopher.W.Cantrall@Boeing.com > -----Original Message----- > From: Tim Maher [mailto:tim@consultix-inc.com] > Sent: Monday, August 25, 2003 12:06 PM > To: spug-list@pm.org > Subject: SPUG: Docs on "-l" wrong? > > > SPUGsters, > > Does anybody think the -l documentation from "perldoc perlrun" > is any good? This is a confusing topic in the first place, IMHO, and > this man page makes it moreso. See my comments below. And I'd welcome > yours too. [snip] From sthoenna at efn.org Tue Aug 26 16:13:24 2003 From: sthoenna at efn.org (Yitzchak Scott-Thoennes) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Docs on "-l" wrong? In-Reply-To: References: Message-ID: <20030826211324.GA3276@efn.org> On Tue, Aug 26, 2003 at 11:13:51AM -0700, "Cantrall, Christopher W" wrote: > I would think that the documentation of the -l switch would start with "Automagically takes care of line endings, so you don't have to think about chomp or \n - high DWIMage factor." and then continue with a detailed explanation. > > Well, I don't know if the pumpking would approve of "DWIMage". See: http://www.xray.mpe.mpg.de/mailing-lists/perl5-porters/2002-06/msg00016.html From m3047 at inwa.net Tue Aug 26 20:22:24 2003 From: m3047 at inwa.net (Fred Morris) Date: Mon Aug 2 21:37:07 2004 Subject: SPUG: Docs on "-l" wrong? Message-ID: Yitzchak Scott-Thoennes wrote: >On Tue, Aug 26, 2003 at 11:13:51AM -0700, "Cantrall, Christopher W" wrote: >> Well, I don't know if the pumpking would approve of "DWIMage". > >See: >http://www.xray.mpe.mpg.de/mailing-lists/perl5-porters/2002-06/msg00016.html They used "thingie", too. 
-- Fred Morris m3047@inwa.net From perl at pryan.org Tue Aug 26 23:24:21 2003 From: perl at pryan.org (Patrick Ryan) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: SPUG Resources Message-ID: <20030827042421.GH16725@stingray.pryan.org> Hello, I just stumbled upon two SPUG IRC channels and thought it would be a good idea to list those and other things at the SPUG kwiki. Andrew has been populating those channels. irc://irc.perl.org/spug irc://irc.freenode.net/spug If there are any other SPUG resources, feel free to populate the kwiki entry: http://spugwiki.perlocity.org/index.cgi?Resources - Patrick -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.pm.org/pipermail/spug-list/attachments/20030826/34a2fe2a/attachment.bin From ashok_palihill at hotmail.com Wed Aug 27 09:51:57 2003 From: ashok_palihill at hotmail.com (Ashok Misra) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: request help with consultation rate Message-ID: Greetings spugsters! A company in the Bay area has approached me to work as a consultant at their site to build a processing system for posting transactions to certain types of Overseas Bank Accounts. All work has to be done on site due to security reasons. I estimate my portion of the development to take about 12 weeks. Local lodging & dining expenses, and weekly to & fro air fare to fly to the Bay area from Seattle, are borne by the client. I am having some difficulty in quoting a competitive rate since I have never worked as a consultant before. If anyone has worked as a consultant in the Bay Area or has an idea on how to work out a competitive rate I would really appreciate your help. I would request you to email me a contact phone number so I could discuss the details with you. If I learn any generic information on consultation rates in this process I will be glad to share that with anyone interested.
Thanks in anticipation Best Regards Ashok Ashok Misra 206 236 0170 (H) 206 427 5232 (M) _________________________________________________________________ STOP MORE SPAM with the new MSN 8 and get 2 months FREE* http://join.msn.com/?page=features/junkmail From tim at teachmeperl.com Wed Aug 27 10:19:59 2003 From: tim at teachmeperl.com (Tim Maher/CONSULTIX) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: Help with consultation rate for SFO Message-ID: <20030827081959.A29990@timji.consultix-inc.com> NOTE: Posted by Tim, for Ashok. -Tim Greetings spugsters! A company in the Bay area has approached me to work as a consultant at their site to build a processing system for posting transactions to certain types of Overseas Bank Accounts. All work has to be done on site due to security reasons. I estimate my portion of the development to take about 12 weeks. Local lodging & dining expenses , weekly to & fro air fare to fly to the Bay area from Seattle is borne by the client. I am having some difficulty in quoting a competitive rate since I have never worked as a consultant before. If anyone has worked as a consultant in the Bay Area or has an idea on how to work out a competitive rate I would really appreciate your help. I would request you to email me a contact phone number so I could discuss the details with you. If I learn any generic information on consultation rates in this process I will be glad to share that with anyone interested. Thanks in anticipation Best Regards Ashok Ashok Misra 206 236 0170 (H) 206 427 5232 (M) From wildwood_players at yahoo.com Wed Aug 27 13:47:39 2003 From: wildwood_players at yahoo.com (Richard Wood) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: pack and unpack Message-ID: <20030827184739.12454.qmail@web11503.mail.yahoo.com> I thought I would try this again without the reference to graphics which I hope is why only one person responded. I am familiar with and use both ImageMagick and GIMP. That is not what I need. 
Surely there are many pack/unpack experts out there in SPUG-land. I am trying to process (read and interpret) binary files. These files can be big-endian or little-endian, based on a header record. The files I am reading are big-endian. The format of the records that I am trying to read is 12 8-bit bytes (line feeds introduced for clarity):

|--------|--------|
|   2-byte tag    |
|--------|--------|
|   2-byte type   |
|--------|--------|--------|--------|
|           4-byte count            |
|--------|--------|--------|--------|
|      4-byte value or offset       |

I read in 12 bytes, check the "type", then try to convert the "value" to something I can use like ASCII characters or numbers. I am not having much luck with the ASCII. I have never used pack or unpack before and I clearly need help. If the "type" is ASCII and the "count" is less than 4, the last 4 bytes should be ASCII characters, otherwise it is an offset into the file. I read the 12 bytes into $buffer like this:

seek(TIF, $offset, 0) or die "Seek $i: $!\n";
read(TIF, $buffer, 12);

Now that I have 12 bytes of binary data in my buffer, can someone give me a couple of examples on how to get meaningful data out of the value field? I am able to get the tag, type, and count values using an unpack to separate the bytes, then another unpack to convert to decimal, but I suspect that is not the most efficient method. I am not sure how to get meaningful data out of the ASCII.

($tagB, $typeB, $countB, $val1B, $val2B, $val3B, $val4B) =
    unpack "B16B16B32B8B8B8B8", $buffer;
$tagC   = bin2dec($tagB);
$typeC  = bin2dec($typeB);
$countC = bin2dec($countB);

sub bin2dec {
    return unpack("N", pack("B32", substr("0" x 32 .
shift, -32)));
}

Details of the expected data: The tag is a number that corresponds to a tag name, things like:

256 = ImageWidth
257 = ImageLength
259 = Compression
306 = DateTime
270 = ImageDescription

The type is a number that describes what is contained in the value field:

 1 = BYTE       8-bit unsigned integer
 2 = ASCII      8-bit byte that contains a 7-bit ASCII code; the last byte is NUL (binary zero)
 3 = SHORT      16-bit unsigned integer
 4 = LONG       32-bit unsigned integer
 5 = RATIONAL   Two LONGs: first represents numerator, second denominator
 6 = SBYTE      8-bit signed integer
 7 = UNDEFINED  8-bit byte that may contain anything depending upon the tag
 8 = SSHORT     16-bit signed integer
 9 = SLONG      32-bit signed integer
10 = SRATIONAL  Two SLONGs: first represents numerator, second denominator
11 = FLOAT      Single precision (4-byte) IEEE format
12 = DOUBLE     Double precision (8-byte) IEEE format

Regards, Rich Wood ===== Richard O. Wood Wildwood IT Consultants, Inc. wildwood_players@yahoo.com 425.281.1914 mobile 206.544.9885 desk __________________________________ Do you Yahoo!? Yahoo! SiteBuilder - Free, easy-to-use web site design software http://sitebuilder.yahoo.com From sthoenna at efn.org Wed Aug 27 14:22:14 2003 From: sthoenna at efn.org (Yitzchak Scott-Thoennes) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: pack and unpack In-Reply-To: <20030827184739.12454.qmail@web11503.mail.yahoo.com> References: <20030827184739.12454.qmail@web11503.mail.yahoo.com> Message-ID: <20030827192213.GA380@efn.org> On Wed, Aug 27, 2003 at 11:47:39AM -0700, Richard Wood wrote: > Surely there are many pack/unpack experts out there in > SPUG-land. Sadly, there aren't that many people familiar with pack/unpack, including me. I know that after many complaints about the documentation in perlfunc, a special tutorial, perlpacktut, was created. Does it help you? If not, consider adding to it if and when you resolve your problem.
From ben at reser.org Wed Aug 27 14:22:34 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: pack and unpack In-Reply-To: <20030827184739.12454.qmail@web11503.mail.yahoo.com> References: <20030827184739.12454.qmail@web11503.mail.yahoo.com> Message-ID: <20030827192233.GO5494@titanium.brain.org> On Wed, Aug 27, 2003 at 11:47:39AM -0700, Richard Wood wrote:

> ($tagB, $typeB, $countB, $val1B, $val2B, $val3B, $val4B) =
>     unpack "B16B16B32B8B8B8B8", $buffer;
> $tagC = bin2dec($tagB);
> $typeC = bin2dec($typeB);
> $countC = bin2dec($countB);
>
> sub bin2dec {
>     return unpack("N", pack("B32", substr("0" x 32 . shift, -32)));
> }

Why not do this:

seek(TIF, $offset, 0) or die "Seek $i: $!\n";
read(TIF, $buffer, 12);
($tag, $type, $count, $value) = unpack "nnNB32", $buffer;
if ($type == 1) {
    $decoded_value = unpack "C", $value;
} elsif ($type == 2) {
    $decoded_value = unpack "Z$count", $value;
} elsif ($type == 3) {
    $decoded_value = unpack "n", $value;
} elsif ($type == 4) {
    $decoded_value = unpack "N", $value;
...

-- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From andrew at sweger.net Wed Aug 27 15:35:05 2003 From: andrew at sweger.net (Andrew Sweger) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: pack and unpack In-Reply-To: <20030827184739.12454.qmail@web11503.mail.yahoo.com> Message-ID: On Wed, 27 Aug 2003, Richard Wood wrote: > Surely there are many pack/unpack experts out there in SPUG-land. There are, but they are few and far between. pack/unpack are a pair of the more inscrutable functions built into perl. It's a down-and-dirty, hands-on-the-metal kind of interface that just does not feel very Perl-ish; a necessary evil, if you will.
I typically surround code containing these functions with a barricade of
cinder block-like comments (lines of #'s):

######################################################################
######################################################################
#####     DANGER DANGER - Avert your eyes - DANGER DANGER       ######
######################################################################
######################################################################

(Wait, an evil idea has just ignited in my brain. Must find extinguisher.)

--
Andrew B. Sweger -- The great thing about multitasking is that several
things can go wrong at once.

From sthoenna at efn.org Wed Aug 27 16:37:30 2003
From: sthoenna at efn.org (Yitzchak Scott-Thoennes)
Date: Mon Aug 2 21:37:08 2004
Subject: SPUG: pack and unpack
In-Reply-To:
References: <20030827184739.12454.qmail@web11503.mail.yahoo.com>
Message-ID: <20030827213730.GA3780@efn.org>

On Wed, Aug 27, 2003 at 01:35:05PM -0700, Andrew Sweger wrote:
> On Wed, 27 Aug 2003, Richard Wood wrote:
> > Surely there are many pack/unpack experts out there in SPUG-land.
>
> There are, but they are few and far between. pack/unpack are a pair of the
> more inscrutable functions built into perl. It's a down-and-dirty,
> hands-on-the-metal kind of interface that just does not feel very
> Perl-ish; a necessary evil, if you will.

I wouldn't say they are inscrutable, just that the templates are a
mini-language of their own. Formats and regular expressions are also
mini-languages -- just knowing perl syntax won't help you in these
areas.

From wildwood_players at yahoo.com Wed Aug 27 17:04:31 2003
From: wildwood_players at yahoo.com (Richard Wood)
Date: Mon Aug 2 21:37:08 2004
Subject: SPUG: pack and unpack
In-Reply-To: <20030827192233.GO5494@titanium.brain.org>
Message-ID: <20030827220431.49622.qmail@web11501.mail.yahoo.com>

Ben, Dan, Andy,

Thanks for your responses.

Ben, your code is certainly cleaner and seems to be heading me in the
right direction.
I don't seem to be getting good values out, but that just requires more
thought and experimentation. (By the way, did you mean to use an "A"
instead of a "Z" for ASCII? Z is not a meta character in pack.)

> } elsif ($type == 2) {
>     $decoded_value = unpack "Z$count", $value;

Dan, I appreciate your pointing toward Image::Info. I am interested in
the meta data, but that is just the groundwork. The real problem lies
in interpreting the scan lines and making decisions based on bit
patterns. I will consider incorporating the Image::Info methods into
the program.

Andy, I like your style! I guess I am on the right foot; sounds like
trial and error will be the order of the day until I get it totally
correct.

Thanks for your ideas.

Rich Wood

--- Ben Reser wrote:
> On Wed, Aug 27, 2003 at 11:47:39AM -0700, Richard Wood wrote:
> > ($tagB, $typeB, $countB, $val1B, $val2B, $val3B,
> > $val4B) = unpack "B16B16B32B8B8B8B8", $buffer;
> > $tagC = bin2dec($tagB);
> > $typeC = bin2dec($typeB);
> > $countC = bin2dec($countB);
> >
> > sub bin2dec {
> >     return unpack("N", pack("B32", substr("0" x 32 . shift, -32)));
> > }
>
> Why not do this:
>
> seek(TIF, $offset, 0) or die "Seek $i: $!\n";
> read(TIF, $buffer, 12);
> ($tag, $type, $count, $value) = unpack "nnNB32", $buffer;
> if ($type == 1) {
>     $decoded_value = unpack "C", $value;
> } elsif ($type == 2) {
>     $decoded_value = unpack "Z$count", $value;
> } elsif ($type == 3) {
>     $decoded_value = unpack "n", $value;
> } elsif ($type == 4) {
>     $decoded_value = unpack "N", $value;
> ...
>
> --
> Ben Reser
> http://ben.reser.org
>
> "What upsets me is not that you lied to me, but that from now on I can
> no longer believe you."
-- Nietzsche
> _____________________________________________________________
> Seattle Perl Users Group Mailing List
> POST TO: spug-list@mail.pm.org  http://spugwiki.perlocity.org
> ACCOUNT CONFIG: http://mail.pm.org/mailman/listinfo/spug-list
> MEETINGS: 3rd Tuesdays, U-District, Seattle WA
> WEB PAGE: http://www.seattleperl.org

=====
Richard O. Wood
Wildwood IT Consultants, Inc.
wildwood_players@yahoo.com
425.281.1914 mobile
206.544.9885 desk

From ben at reser.org Wed Aug 27 17:45:24 2003
From: ben at reser.org (Ben Reser)
Date: Mon Aug 2 21:37:08 2004
Subject: SPUG: pack and unpack
In-Reply-To: <20030827220431.49622.qmail@web11501.mail.yahoo.com>
References: <20030827192233.GO5494@titanium.brain.org> <20030827220431.49622.qmail@web11501.mail.yahoo.com>
Message-ID: <20030827224524.GR5494@titanium.brain.org>

On Wed, Aug 27, 2003 at 03:04:31PM -0700, Richard Wood wrote:
> Ben, your code is certainly cleaner and seems to be heading me in the
> right direction. I don't seem to be getting good values out but that
> just requires more thought and experimentation. (by the way did you
> mean to use an "A" instead of a "Z" for ASCII? Z is not a meta
> character in pack.

According to perldoc -f pack it is:

    Z   A null terminated (ASCIZ) string, will be null padded.

It's been a while since I used (un)?pack so it might be one of the ones
that are only usable in pack and not unpack...

I didn't test the code I sent so I can't say for sure it worked. If I
had a sample file of what I was parsing I might get further...

--
Ben Reser
http://ben.reser.org

"What upsets me is not that you lied to me, but that from now on I can
no longer believe you."
-- Nietzsche

From wildwood_players at yahoo.com Wed Aug 27 18:03:16 2003
From: wildwood_players at yahoo.com (Richard Wood)
Date: Mon Aug 2 21:37:08 2004
Subject: SPUG: pack and unpack
In-Reply-To: <20030827224524.GR5494@titanium.brain.org>
Message-ID: <20030827230316.81051.qmail@web11503.mail.yahoo.com>

Right,

Got the "Z" working on a machine with a newer version of perl (doesn't
work on 5.004_04, does work on 5.005_03). I only have the second
edition of the Camel. I will take a look at the third edition tonight.

All of my LONGs are coming out 808464432 and all of my SHORTs are
coming out 12336. I will figure this out sooner or later.

Sorry I can't send a file. They belong to a local airplane company and
I am not at liberty to transmit them.

elsif ($type == 3) {   # SHORT
    $decoded_value = unpack "n", $value;
} elsif ($type == 4) { # LONG
    $decoded_value = unpack "N", $value;
}

LONG  00000000000000000000000000000000 808464432
LONG  00000000000000001100100000010111 808464432
LONG  00000000000000000011000000010111 808464432
LONG  00000000000000000011000000010111 808464432
LONG  00000000100000000011100000110011 808464432
LONG  00000000000000000000000000000000 808464432
SHORT 00000000010000000000000000000000 12336
SHORT 00000000100000000000000000000000 12336
SHORT 00000000001000000000000000000000 12336
SHORT 00000000000000000000000000000000 12336
SHORT 01000000000000000000000000000000 12337
SHORT 00000000100000000000000000000000 12336
SHORT 00000000100000000000000000000000 12336

Rich Wood

--- Ben Reser wrote:
> On Wed, Aug 27, 2003 at 03:04:31PM -0700, Richard Wood wrote:
> > Ben, your code is certainly cleaner and seems to be heading me in
> > the right direction. I don't seem to be getting good values out but
> > that just requires more thought and experimentation. (by the way
> > did you mean to use an "A" instead of a "Z" for ASCII? Z is not a
> > meta character in pack.
> > According to perldoc -f pack it is: > Z A null terminated (ASCIZ) string, will be null > padded. > > It's been a while since I used (un)?pack so it might > be one of the ones > that are only usable in pack and not unpack... > > I didn't test the code I sent so I can't say for > sure it worked. If I > had a sample file of what I was parsing I might get > further... > > -- > Ben Reser > http://ben.reser.org > > "What upsets me is not that you lied to me, but that > from now on I can > no longer believe you." -- Nietzsche > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org > http://spugwiki.perlocity.org > ACCOUNT CONFIG: > http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > ===== Richard O. Wood Wildwood IT Consultants, Inc. wildwood_players@yahoo.com 425.281.1914 mobile 206.544.9885 desk __________________________________ Do you Yahoo!? Yahoo! SiteBuilder - Free, easy-to-use web site design software http://sitebuilder.yahoo.com From ben at reser.org Wed Aug 27 19:27:00 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: pack and unpack In-Reply-To: <20030827230316.81051.qmail@web11503.mail.yahoo.com> References: <20030827224524.GR5494@titanium.brain.org> <20030827230316.81051.qmail@web11503.mail.yahoo.com> Message-ID: <20030828002700.GU5494@titanium.brain.org> On Wed, Aug 27, 2003 at 04:03:16PM -0700, Richard Wood wrote: > Right, > > Got the "Z" working on a machine with a newer version > of perl (doesn't work on was 5.004_04, does work on > 5.005_03). I only have second edition of Camel. I > will take a look at third edition tonight. > > All of my LONGs are coming out 808464432 and all of my > SHORTS are coming out 12336. I will figure this out > sooner or later. > > Sorry I can't send a file. 
> They belong to a local airplane company and I am not at liberty to
> transmit them.
>
> elsif ($type == 3) {   # SHORT
>     $decoded_value = unpack "n", $value;
> } elsif ($type == 4) { # LONG
>     $decoded_value = unpack "N", $value;
> }

I don't know what I was thinking when I wrote that sample code. This
will work better (I hope):

seek(TIF, $offset, 0) or die "Seek $i: $!\n";
read(TIF, $buffer, 8);
($tag, $type, $count) = unpack "nnN", $buffer;
read(TIF, $buffer, 4);
if ($type == 1) {
    $value = unpack "C", $buffer;
} elsif ($type == 2) {
    $value = unpack "Z$count", $buffer;
} elsif ($type == 3) {
    $value = unpack "n", $buffer;
} elsif ($type == 4) {
    $value = unpack "N", $buffer;
...

Can't imagine who that local airplane company might be... :)

--
Ben Reser
http://ben.reser.org

"What upsets me is not that you lied to me, but that from now on I can
no longer believe you." -- Nietzsche

From wildwood_players at yahoo.com Thu Aug 28 09:23:13 2003
From: wildwood_players at yahoo.com (Richard Wood)
Date: Mon Aug 2 21:37:08 2004
Subject: SPUG: pack and unpack
In-Reply-To: <20030828002700.GU5494@titanium.brain.org>
Message-ID: <20030828142313.57696.qmail@web11505.mail.yahoo.com>

David, Ben, Gary,

Gary - picked up the Third Edition of the Camel; you are correct, lots
more examples. Thanks!

David - yep, those numbers of mine looked very suspicious. Not sure why
it was happening. The data came from an actual TIFF file.

Ben - Thanks for the update on your example. Mine is now working well
with essentially what you provided. I believe I am now on my way to
success.

Thanks very much to all.
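For the record, those suspicious constants from the earlier run are themselves the diagnostic: 808464432 is 0x30303030 and 12336 is 0x3030, the bytes of the ASCII text "0000" and "00". That is exactly what unpack "N" or "n" returns when fed the '0'/'1' *text* produced by a B32 template instead of raw value bytes — consistent with the original "nnNB32" template, and avoided by reading the value field as raw bytes. A small demonstration:

```perl
use strict;
use warnings;

# A B32 template yields a text string of '0' and '1' characters.
# Unpacking that text as if it were raw bytes reads the characters'
# own ASCII codes (0x30/0x31), not the original data:
my $bits  = unpack "B32", pack("N", 0);  # "000...0" (32 chars of text)
my $long  = unpack "N", $bits;           # 0x30303030
my $short = unpack "n", $bits;           # 0x3030
print "$long $short\n";                  # 808464432 12336
```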
Rich Wood

seek(TIF, $offset, 0) or die "Seek $i: $!\n";
read(TIF, $buffer, 8);
($tag, $type, $count) = unpack "nnN", $buffer;
read(TIF, $buffer, 8);
if ($type == 1) {
    $value = unpack "C", $buffer;
} elsif ($type == 2) {
    $value = unpack "Z$count", $buffer;
} elsif ($type == 3) {
    $value = unpack "n", $buffer;
} elsif ($type == 4) {
    $value = unpack "N", $buffer;
} elsif ($type == 5) {
    ($dv1, $dv2) = unpack "NN", $buffer;
    $value = "$dv1:$dv2";
}
}

#
# now go get that 61 char ASCII string at address 288
#
seek(TIF, 288, 0) or die "Seek $i: $!\n";
read(TIF, $buffer, 61);
$decoded_value = unpack "Z61", $buffer;
printf("%61s\n", $decoded_value);

RESULTS:

LONG     1  0
LONG     1  5096
LONG     1  3304
SHORT    1  1
SHORT    1  4
SHORT    1  0
ASCII    61 288
SHORT    1  512
SHORT    1  1
SHORT    1  1
LONG     1  3304
LONG     1  72908
RATIONAL 1  256:18546693
RATIONAL 1  264:19202052
LONG     1  0
SHORT    1  2
ME_P63154 CD0002 21 24 01 V 101 00

--- Ben Reser wrote:
> On Wed, Aug 27, 2003 at 04:03:16PM -0700, Richard Wood wrote:
> > Right,
> >
> > Got the "Z" working on a machine with a newer version of perl
> > (doesn't work on 5.004_04, does work on 5.005_03). I only have
> > second edition of Camel. I will take a look at third edition
> > tonight.
> >
> > All of my LONGs are coming out 808464432 and all of my SHORTs are
> > coming out 12336. I will figure this out sooner or later.
> >
> > Sorry I can't send a file.
> > > > elsif ($type == 3) { # SHORT > > $decoded_value =unpack "n", $value; > > } elsif ($type == 4) { # LONG > > $decoded_value = unpack "N", $value; > > } > > I don't know what I was thinking when I wrote that > sample code, > This will work better (I hope): > > seek(TIF, $offset, 0) or die "Seek $i: $!\n"; > > read(TIF, $buffer, 8); > > ($tag, $type, $count, $value) = unpack "nnN", > $buffer; > read(TIF, $buffer, 4); > if ($type == 1) { > > $value = unpack "C", $buffer; > > } elsif ($type == 2) { > > $value = unpack "Z$count", $buffer; > > } elsif ($type == 3) { > > $value =unpack "n", $buffer; > > } elsif ($type == 4) { > > $value = unpack "N", $buffer; > > ... > > > Can't imagine who that local airplane company might > be... :) > > -- > Ben Reser > http://ben.reser.org > > "What upsets me is not that you lied to me, but that > from now on I can > no longer believe you." -- Nietzsche > _____________________________________________________________ > Seattle Perl Users Group Mailing List > POST TO: spug-list@mail.pm.org > http://spugwiki.perlocity.org > ACCOUNT CONFIG: > http://mail.pm.org/mailman/listinfo/spug-list > MEETINGS: 3rd Tuesdays, U-District, Seattle WA > WEB PAGE: http://www.seattleperl.org > ===== Richard O. Wood Wildwood IT Consultants, Inc. wildwood_players@yahoo.com 425.281.1914 mobile 206.544.9885 desk __________________________________ Do you Yahoo!? Yahoo! SiteBuilder - Free, easy-to-use web site design software http://sitebuilder.yahoo.com From dan at concolor.org Thu Aug 28 10:59:35 2003 From: dan at concolor.org (Dan Sabath) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: Open Sauce Lunch in International Dist. Tomorrow. Message-ID: <9E9F0804-D970-11D7-8F8C-000393A6CEB6@concolor.org> Open Sauce Lunch In International District Friday, 8/29, 12:00pm Any Suggestions as to where? If i don't have any suggestions it will default to House of Hong. (possibly Hing Loon. Any takers? 
Change the wiki) Look for a guy wearing a hawaiian shirt and beard; that will be this lunch's Convener, Dan Sabath. If you have a PGP or GPG key and are interested in doing a keysigning, please bring a printed copy of your key's fingerprint and a piece of government issued photo ID. There will be people to exchange this information with. -dan From dleonard at dleonard.net Thu Aug 28 19:20:32 2003 From: dleonard at dleonard.net (dleonard@dleonard.net) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: pack and unpack In-Reply-To: Message-ID: pack and unpack are totally perl-ish. They fit so well in the TMTOWTDI philosophy. They are so great they pretty much document themselves. I like using them almost as much as splice. Not many other languages can do so much with so little code in such an elegant fashion. I'll take pack and unpack any day over a complex regex with backreferences and the like. -- On Wed, 27 Aug 2003, Andrew Sweger wrote: > On Wed, 27 Aug 2003, Richard Wood wrote: > > > Surely there are many pack/unpack experts out there in SPUG-land. > > There are, but they are few and far between. pack/unpack are a pair of the > more inscrutable functions built into perl. It's a down-and-dirty, > hands-on-the-metal kind of interface that just does not feel very > Perl-ish; a necessary evil, if you will. I typically surround code > containing these functions with a barricade of cinder block-like comments > (lines of #'s): > > ###################################################################### > ###################################################################### > ##### DANGER DANGER - Avert your eyes - DANGER DANGER ###### > ###################################################################### > ###################################################################### > > (Wait, an evil idea has just ignited in my brain. Must find extinguisher.) > > -- > Andrew B. Sweger -- The great thing about multitasking is that several > things can go wrong at once. 
From kahn at cpan.org Thu Aug 28 20:03:20 2003
From: kahn at cpan.org (Jeremy G Kahn)
Date: Mon Aug 2 21:37:08 2004
Subject: SPUG: assigning to $0; different Linuces
Message-ID: <3F4EA658.3090801@cpan.org>

Has anybody else discovered that the ability to assign to $0 varies
across even Linux versions?

I wrote a very simple script to test this:

#!/usr/bin/perl
print "getting started\n";
sleep 5;
$0 = "foo";
sleep 5;
print "done!\n";

When I run it on Debian (kernel 2.4, perl 5.8.0
i386-linux-thread-multi), I can check the ps results thusly:

jeremy@mystique:~$ ./test.pl& for f in 1 2 3; do ps -ef | grep foo; ps -ef | grep test; sleep 3; done
[1] 5803
getting started
jeremy    5805  5726  0 17:48 pts/4    00:00:00 grep foo
jeremy    5803  5726  0 17:48 pts/4    00:00:00 /usr/bin/perl ./test.pl
jeremy    5807  5726  0 17:48 pts/4    00:00:00 grep test
jeremy    5810  5726  0 17:48 pts/4    00:00:00 grep foo
jeremy    5803  5726  0 17:48 pts/4    00:00:00 /usr/bin/perl ./test.pl
jeremy    5812  5726  0 17:48 pts/4    00:00:00 grep test
jeremy    5803  5726  0 17:48 pts/4    00:00:00 foo
jeremy    5817  5726  0 17:48 pts/4    00:00:00 grep test
jeremy@mystique:~$ done!

See how 5803 successfully changes its name ($0) to "foo" instead of
test.pl? Clever, eh? I'm using this as a status monitor for some
locking tools.
But when I run the same thing on redhat 7.3 (kernel 2.4, perl version
5.8.0 for i686-linux) I get these results:

[1] 14104
getting started
jgk      14106 11162  0 17:52 pts/5    00:00:00 grep foo
jgk      14104 11162  0 17:52 pts/5    00:00:00 /usr/nikola/bin/perl ./test.pl
jgk      14108 11162  0 17:52 pts/5    00:00:00 grep test
jgk      14111 11162  0 17:52 pts/5    00:00:00 grep foo
jgk      14104 11162  0 17:52 pts/5    00:00:00 /usr/nikola/bin/perl ./test.pl
jgk      14113 11162  0 17:52 pts/5    00:00:00 grep test
jgk      14116 11162  0 17:52 pts/5    00:00:00 grep foo
jgk      14104 11162  0 17:52 pts/5    00:00:00 /usr/nikola/bin/perl ./test.pl
jgk      14118 11162  0 17:52 pts/5    00:00:00 grep test
bash-2.05a$ all done!

Notice that the name of process 14104 does *not* change.

This is really irritating, since assigning to $0 is a nice feature when
you're doing lots of fork-ing and exec-ing. Does anybody know how one
would check that a system can do this from within a script? Or a
configuration variable in Linux that somebody might be able to tune?

--jeremy

From ben at reser.org Thu Aug 28 20:21:49 2003
From: ben at reser.org (Ben Reser)
Date: Mon Aug 2 21:37:08 2004
Subject: SPUG: assigning to $0; different Linuces
In-Reply-To: <3F4EA658.3090801@cpan.org>
References: <3F4EA658.3090801@cpan.org>
Message-ID: <20030829012148.GX5494@titanium.brain.org>

On Thu, Aug 28, 2003 at 06:03:20PM -0700, Jeremy G Kahn wrote:
> Has anybody else discovered that the ability to assign to $0 varies
> across even Linux versions?
[snip]
> This is really irritating, since assigning to $0 is a nice feature
> when you're doing lots of fork-ing and exec-ing. Does anybody know
> how one would check that a system can do this from within a script?
> Or a configuration variable in Linux that somebody might be able to
> tune?

IIRC there are two different procps distributions. Likely the
difference is that Debian is using a different one than RedHat.
You might also compare the content of the various "files" in /proc/$$/ Where $$ = your pid -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From mwallend at fastmail.fm Thu Aug 28 20:44:34 2003 From: mwallend at fastmail.fm (Michael Wallendahl) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: Job Posting: www.FastMail.fm Message-ID: <000c01c36dcf$1b46f9c0$3201a8c0@lestat> The e-mail provider that I use, FastMail, is looking to hire a Perl developer. Below are some links to the job opening. I wish I knew more Perl because although they are based out of Australia (I think), they are willing to hire another telecommuter. http://jobs.perl.org/job/894 http://www.emailaddresses.com/forum/showthread.php?s=&threadid=14654&perpage=15&pagenumber=1 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mail.pm.org/pipermail/spug-list/attachments/20030828/d6b3285c/attachment.htm From david.dyck at fluke.com Thu Aug 28 21:10:39 2003 From: david.dyck at fluke.com (David Dyck) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: assigning to $0; different Linuces In-Reply-To: <20030829012148.GX5494@titanium.brain.org> References: <3F4EA658.3090801@cpan.org> <20030829012148.GX5494@titanium.brain.org> Message-ID: On Thu, 28 Aug 2003 at 18:21 -0700, Ben Reser wrote: > On Thu, Aug 28, 2003 at 06:03:20PM -0700, Jeremy G Kahn wrote: > > Has anybody else discovered that the ability to assign with $0 varies > > across even Linux versions? no :-) > [snip] > > IIRC there are two different procps distributions. Likely the > difference is that Debian is using a different one than RedHat. I agree with Ben. 
> You might also compare the content of the various "files" in /proc/$$/
> Where $$ = your pid

Here's a quick example that uses the linux /proc filesystem (try man 5
proc) that should work on both your linux systems:

$ cmdline aaa bbb
before:
cmdline=/usr/local/bin/perl\0-w\0/usr0/dcd/bin/cmdline\0aaa\0bbb\0
stat=cmdline
after:
cmdline=foo
stat=cmdline

$ cat /usr0/dcd/bin/cmdline
#!/usr/local/bin/perl -w

sub getfile($) {
    my $fname = shift;
    open FILE, "<$fname" or die "can't open $fname:$!\n";
    local $/;
    my $file = <FILE> || die "can't read $fname:$!\n";
    close FILE or die "can't close $fname:$!\n";
    # print "$fname=$file\n";
    return $file;
}

sub proc_cmdline($) {
    my $pid = shift;
    my $cmdline = getfile("/proc/$pid/cmdline");
    $cmdline =~ s/\0/\\0/g;
    print "cmdline=$cmdline\n";
}

sub proc_stat($) {
    my $pid = shift;
    my $stat = getfile("/proc/$pid/stat");
    if ($stat =~ m/\(([^)]*)\)/) {
        print "stat=$1\n";
    } else {
        print "stat: no match in $stat\n";
    }
}

sub show($) {
    print " ", shift, ":\n";
    proc_cmdline($$);
    proc_stat($$);
}

show "before";
$0 = "foo";
show "after";
__END__

man 5 proc:

cmdline  This holds the complete command line for the process, unless
         the whole process has been swapped out, or unless the process
         is a zombie. In either of these later cases, there is nothing
         in this file: i.e. a read on this file will return 0
         characters. The command line arguments appear in this file as
         a set of null-separated strings, with a further null byte
         after the last string.

stat     Status information about the process. This is used by ps(1).
         It is defined in /usr/src/linux/fs/proc/array.c. The fields,
         in order, with their proper scanf(3) format specifiers, are:

         pid %d   The process id.

         comm %s  The filename of the executable, in parentheses. This
                  is visible whether or not the executable is swapped
                  out.
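Jeremy's original question — how a script can tell whether assigning to $0 actually takes effect on a given system — can be answered empirically with the same /proc trick: set $0, read the name back from /proc/$$/cmdline (note that, as the output above shows, the comm field in /proc/$$/stat does *not* change). A Linux-only sketch; the function name is illustrative:

```perl
use strict;
use warnings;

# Returns 1 if assigning to $0 visibly renames this process,
# 0 otherwise (including on systems without a /proc cmdline file,
# where this probe simply can't tell).
sub can_rename_process {
    my $probe = "p0probe$$";
    my $saved = $0;
    $0 = $probe;
    my $ok = 0;
    if (open my $fh, '<', "/proc/$$/cmdline") {
        local $/;                 # slurp; fields are NUL-separated
        my $cmdline = <$fh>;
        close $fh;
        $ok = 1 if defined $cmdline && index($cmdline, $probe) >= 0;
    }
    $0 = $saved;                  # put the original name back
    return $ok;
}

print can_rename_process() ? "yes\n" : "no\n";
```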
From ben at reser.org Fri Aug 29 00:38:46 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: assigning to $0; different Linuces In-Reply-To: References: <3F4EA658.3090801@cpan.org> <20030829012148.GX5494@titanium.brain.org> Message-ID: <20030829053845.GZ5494@titanium.brain.org> On Thu, Aug 28, 2003 at 07:10:39PM -0700, David Dyck wrote: > I agree with Ben. Looks like I was wrong... > $ cmdline aaa bbb > before: > cmdline=/usr/local/bin/perl\0-w\0/usr0/dcd/bin/cmdline\0aaa\0bbb\0 > stat=cmdline > after: > cmdline=foo > stat=cmdline On Mandrake Linux 9.1/ppc. [breser@titanium breser]$ perl cmdline aaaa bbbb before: cmdline=perl\0cmdline\0aaaa\0bbbb\0 stat=perl after: cmdline=perl\0cmdline\0aaaa\0bbbb\0 stat=perl Perhaps, there is a kernel component to this.. -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From dan at concolor.org Fri Aug 29 01:49:19 2003 From: dan at concolor.org (Dan Sabath) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: Open Sauce Lunch Today. Message-ID: Chinese food in International District Friday, 8/29, 12:00pm Hing Loon 628 S Weller St Seattle, Washington 98104 (206) 682-2828 http://www.lostinseattle.com/LIS/restaurant/hingloonseafoodrestaura.html Look for a guy wearing a hawaiian shirt and beard; that will be this lunch's Convener, Dan Sabath. If you have a PGP or GPG key and are interested in doing a keysigning, please bring a printed copy of your key's fingerprint and a piece of government issued photo ID. There will be people to exchange this information with. Please sign up on the Wiki so we can reserve a large enough table. 
http://spugwiki.perlocity.org/index.cgi?FriAug29InInternationalDistrict -dan From ben at reser.org Fri Aug 29 11:17:03 2003 From: ben at reser.org (Ben Reser) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: assigning to $0; different Linuces In-Reply-To: <3F4EA658.3090801@cpan.org> References: <3F4EA658.3090801@cpan.org> Message-ID: <20030829161702.GF5494@titanium.brain.org> On Thu, Aug 28, 2003 at 06:03:20PM -0700, Jeremy G Kahn wrote: > But when I run the same thing on redhat 7.3 (kernel 2.4, perl version > 5.8.0 for i686-linux) I get these results: [snip] > Notice that the name of process 14104 does *not* change. > > This is really irritating, since the assign-to-$0 is a nice feature, > when you're doing lots of fork-ing and exec-ing. Does anybody know how > one would check that a system can do this from within a script? Or a > configuration variable in Linux that somebody might be able to tune? After some testing I can replicate this on 5.8.0 on Mandrake 9.1/ppc. My version of 5.8.1 RC4 on another machine Mandrake 9.2(cooker) works fine. So it looks like it is a perl bug in 5.8.0 that has been fixed in 5.8.1. -- Ben Reser http://ben.reser.org "What upsets me is not that you lied to me, but that from now on I can no longer believe you." -- Nietzsche From kahn at cpan.org Fri Aug 29 12:43:59 2003 From: kahn at cpan.org (Jeremy G Kahn) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: assigning to $0; different Linuces In-Reply-To: <20030829161702.GF5494@titanium.brain.org> References: <3F4EA658.3090801@cpan.org> <20030829161702.GF5494@titanium.brain.org> Message-ID: <3F4F90DF.7050708@cpan.org> Ben Reser wrote: >On Thu, Aug 28, 2003 at 06:03:20PM -0700, Jeremy G Kahn wrote: > > >>But when I run the same thing on redhat 7.3 (kernel 2.4, perl version >>5.8.0 for i686-linux) I get these results: >> >> >>Notice that the name of process 14104 does *not* change. 
>> >>This is really irritating, since the assign-to-$0 is a nice feature, >>when you're doing lots of fork-ing and exec-ing. Does anybody know how >>one would check that a system can do this from within a script? Or a >>configuration variable in Linux that somebody might be able to tune? >> >> > >After some testing I can replicate this on 5.8.0 on Mandrake 9.1/ppc. >My version of 5.8.1 RC4 on another machine Mandrake 9.2(cooker) works >fine. So it looks like it is a perl bug in 5.8.0 that has been fixed in >5.8.1. > > http://search.cpan.org/author/JHI/perl-5.8.1-RC4/pod/perldelta.pod#Platform_specific_fixes Turns out it indeed is a bug in 5.8.0. I don't know why it works on my Debian 5.8.0 installation; all I can assume is that it was patched in some incremental fix for 5.8.0-debian . Thanks for the advice! --jeremy From kahn at cpan.org Fri Aug 29 13:25:55 2003 From: kahn at cpan.org (Jeremy G Kahn) Date: Mon Aug 2 21:37:08 2004 Subject: SPUG: assigning to $0; different Linuces In-Reply-To: <3F4F90DF.7050708@cpan.org> References: <3F4EA658.3090801@cpan.org> <20030829161702.GF5494@titanium.brain.org> <3F4F90DF.7050708@cpan.org> Message-ID: <3F4F9AB3.8060008@cpan.org> one last followup, to remove any shred of mystery... Jeremy G Kahn wrote: > http://search.cpan.org/author/JHI/perl-5.8.1-RC4/pod/perldelta.pod#Platform_specific_fixes > > > Turns out it indeed is a bug in 5.8.0. I don't know why it works on > my Debian 5.8.0 installation; all I can assume is that it was patched > in some incremental fix for 5.8.0-debian . Okay, now I know why it works on my Debian system. The Debian maintainers (Bog bless 'em) have backported the 5.8.1 fix into the 'testing' release: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=178404 Just in case anybody else was dying to know... --jeremy