SPUG: Fw: Uniq in perl

Greg J. Badros greg.badros at infospace.com
Mon Mar 19 19:54:12 CST 2001


Dean Hudson <dean at ero.com> writes:

> this may make -w complain but:
> 
> @out = grep {!/^$last$/ and $last = $_} @in;
> 
> behaves like uniq; it only gets rid of adjacent duplicated lines.

You've gotta be careful to quote meta-characters in $last:

@out = grep {!/^\Q$last\E$/ and $last = $_} @in;  # untested

should work better, but the regex could just be replaced by `ne', too.
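[A hedged sketch of that `ne` variant, untested like the rest of the thread's snippets: comparing with `ne` avoids the metacharacter problem entirely, and checking `defined` also sidesteps the edge case where a line is "0" or empty, which would make the original `and $last = $_` assignment evaluate false and break the dedup:]

```perl
use strict;
use warnings;

my @in = ('a', 'a', '0', '0', 'b');   # sample input; '0' is a false value
my $last;                             # undef until the first element is seen
my @out = grep {
    my $keep = !defined($last) || $_ ne $last;   # string compare, no regex
    $last = $_;                                  # remember for the next element
    $keep;                                       # grep keeps truthy results
} @in;
print "@out\n";    # a 0 b
```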

Greg


> 
> dean.
> 
> On Mon, Mar 19, 2001 at 03:59:53PM -0800, Bill Alford wrote:
> > Ooops, that argument was an array reference, not the full array.  Just
> > change as appropriate below.
> > 
> > On Mon, 19 Mar 2001, Bill Alford wrote:
> > 
> > > All the solutions I've seen so far look good for treating the list as a
> > > set (I may have missed one that didn't).  But, from the uniq man page:
> > > 
> > >        Discard  all  but  one  of successive identical lines from
> > >        INPUT (or standard input), writing to OUTPUT (or  standard
> > >        output).
> > > 
> > > It looks like that's what's happening below.
> > > 
> > > Here's what I can come up with (not tested and a little verbose, but I
> > > like verbose :) :
> > > 
> > > sub uniq {
> > >   if (! @_) {
> > >     return @_;  # for the empty set case
> > >   }
> > >   my $last = shift;
> > >   my @result = ( $last );
> > >   my $cur;
> > > 
> > >   foreach $cur ( @_ ) {
> > >     # assuming string input
> > >     if ($cur ne $last) {
> > >       $last = $cur;
> > >       push(@result,$cur);
> > >     }
> > >   }
> > >   return @result;
> > > }
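[A small driver for Bill's routine, not from the thread — the sub is repeated here, lightly compacted, so the snippet runs standalone. It confirms the uniq(1)-like behavior: order is preserved and only adjacent duplicates are dropped:]

```perl
use strict;
use warnings;

# Bill's routine from above, lightly compacted
sub uniq {
    return @_ unless @_;          # empty-list case
    my $last   = shift;
    my @result = ($last);
    foreach my $cur (@_) {
        if ($cur ne $last) {      # assuming string input
            $last = $cur;
            push @result, $cur;
        }
    }
    return @result;
}

print join(' ', uniq(qw(a a b b a))), "\n";   # a b a -- only adjacent dups removed
```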
> > > 
> > > 
> > > 
> > > On Mon, 19 Mar 2001, jeff saenz wrote:
> > > 
> > > > What if you convert the array values to hash keys and let the hash eliminate the
> > > > duplicates? Something like:
> > > > 
> > > > @hash{@array} = ();
> > > > Or maybe a map, but I think maps are inefficient.  Something like that.
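[A hedged sketch of that hash-key idea, not from the thread: the slice assignment does collapse duplicates, but it treats the list as a set, so the original order — and the adjacency information that uniq(1) cares about — is lost:]

```perl
use strict;
use warnings;

my @array = qw(b a a c b);
my %hash;
@hash{@array} = ();              # hash slice: each element becomes a key
my @unique = sort keys %hash;    # sort only to make the order deterministic
print "@unique\n";               # a b c
```

[If order matters, the common order-preserving variant is `my %seen; grep { !$seen{$_}++ } @array`.]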
> > > > 
> > > > Richard Anderson wrote:
> > > > 
> > > > > Does anyone have comments for Mike?
> > > > >
> > > > > Richard Anderson, Ph.D.          www.unixscripts.com
> > > > > Perl / Oracle / Unix                Richard.Anderson at raycosoft.com
> > > > > Raycosoft, LLC                        Seattle, WA, USA
> > > > > ----- Original Message -----
> > > > > From: "Mike" <shivan at ici.net>
> > > > > To: "Richard Anderson" <Richard.Anderson at raycosoft.com>
> > > > > Sent: Friday, March 16, 2001 2:12 PM
> > > > > Subject: Uniq in perl
> > > > >
> > > > > > This is something I had sitting around. I didn't write it. It's a subroutine
> > > > > > that uniqs an array. I was just wondering if this is the best way to do it
> > > > > > or if anyone has anything better. Thanks
> > > > > >
> > > > > >
> > > > > > ## Usage: &uniq( \@ARRAY );
> > > > > > sub uniq {
> > > > > >    my (@uwork, @unew, $uname);
> > > > > >    @uwork = @{$_[0]};
> > > > > >    @uwork = sort( @uwork );
> > > > > >    @unew  = ( shift @uwork );
> > > > > >
> > > > > >    foreach $uname ( @uwork ) {
> > > > > >       @unew = ( @unew, $uname ) if ( $uname ne $unew[ -1 ] );
> > > > > >    }
> > > > > >
> > > > > >    @{$_[0]} = @unew;
> > > > > > }
> > > > > >
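[One way to answer Mike's question is just to exercise the routine. This sketch repeats the sub, reindented, so it runs standalone, and shows two properties worth noting: it sorts the array as a side effect, and an empty input array would leave a single undef element in the result:]

```perl
use strict;
use warnings;

## Usage: uniq( \@ARRAY );  -- Mike's routine, reindented
sub uniq {
    my (@uwork, @unew, $uname);
    @uwork = sort @{ $_[0] };
    @unew  = ( shift @uwork );      # note: for an empty array this is (undef)
    foreach $uname (@uwork) {
        push @unew, $uname if $uname ne $unew[-1];
    }
    @{ $_[0] } = @unew;
}

my @a = qw(b a a c b);
uniq(\@a);                          # modifies @a in place
print "@a\n";                       # a b c -- duplicates gone, but now sorted
```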
> > > > > >
> > > > > >
> > > > >
> > > > >  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> > > > >      POST TO: spug-list at pm.org       PROBLEMS: owner-spug-list at pm.org
> > > > >       Subscriptions; Email to majordomo at pm.org:  ACTION  LIST  EMAIL
> > > > >   Replace ACTION by subscribe or unsubscribe, EMAIL by your Email-address
> > > > >  For daily traffic, use spug-list for LIST ;  for weekly, spug-list-digest
> > > > >   Seattle Perl Users Group (SPUG) Home Page: http://www.halcyon.com/spug/
> > > > 
> > > > 
> > > > 
> > > > 
> > > 
> > > 
> > 
> > 
> > 
> > 
> 
> 
