[LA.pm] perl CGI querying of directory filenames most efficient method?

Peter Benjamin pete at peterbenjamin.com
Thu Sep 22 17:14:08 PDT 2005


At 03:10 PM 9/22/2005, Jeremy Leader wrote:
>Yes, I'd expect the directory blocks to be cached,

I keep the amount of RAM above the combined size of the web root
and the database tables, so everything ought to fit in RAM.
There is 26 MB of RAM free right now.  I like that.

>unless
>the directory was really huge or there was a whole lot of
>other disk access going on at the same time.  

Well, given the amount of RAM, I expect even huge directories
to have all blocks present in RAM, and never swapped.

Most of the disk access will be log writing, so there is
little to no contention between reads and writes on the hard drive.
My replacement colo will have multiple hard drives.  :-)

>What does
>"ls -sd $dir" give as the number of blocks in the directory?

> ls -sd images
296 images
> ls -ld images
drwxrwxr-x  8 pete  pete  294912 Sep 22 12:28 images


>If the web server is opening these files (to serve their
>contents), then the -e tests won't add much cost, I
>believe.

I concur.  I like to back up my "belief" with testing, but
I cannot see how to test this cheaply.  It would take two
computers: one remotely making HTTP requests to the web
CGI, which tests for the files and inserts them into the
dynamic page, and the web server then receiving the remote
computer's requests for the images themselves.  It's involved
and time-consuming to set up, so it is not inexpensive.
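
That said, the -e cost by itself can be timed locally, without the
second computer or any HTTP traffic.  A minimal sketch, assuming the
images directory shown above and nothing beyond Perl's core Benchmark
module; it only compares per-file -e tests against a single readdir
pass, not the full CGI-plus-HTTP round trip:

#!/usr/bin/perl
# Rough local timing: per-filename -e (stat) tests versus reading the
# directory once into a hash.  Measures filesystem cost only, not the
# web server or the network.
use strict;
use warnings;
use Benchmark qw(timethese);

my $dir = 'images';

# Gather the filenames once so both methods check the same list.
opendir my $dh, $dir or die "opendir $dir: $!";
my @names = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
closedir $dh;

timethese(-3, {
    # One -e (stat) call per filename.
    'dash_e' => sub { -e "$dir/$_" for @names },
    # Read the directory once into a hash, then do hash lookups.
    'one_readdir' => sub {
        opendir my $d, $dir or die "opendir $dir: $!";
        my %seen = map { $_ => 1 } readdir $d;
        closedir $d;
        exists $seen{$_} for @names;
    },
});

With the directory blocks cached, both methods should be quick; the
numbers at least show how much the per-file stat calls add.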

The next issue I have is that FTP uploading to the directory
means the customer's GUI downloads all the filenames, and that
takes about 30 seconds for 12,000 names.  When that doubles and
triples, it will become a headache.  I can disable ls, but ...
Are there any good solutions out there now for this issue?
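
For illustration only (an assumption on my part, not something
already set up here): the usual mitigation is to split the flat
directory into small subdirectory buckets, so that no single listing,
whether from FTP, ls, or readdir, returns tens of thousands of names.
A minimal sketch of a one-time reshuffle using a first-character
bucket scheme:

#!/usr/bin/perl
# Illustration only: split a flat images/ directory into one-character
# buckets (images/a/, images/b/, ...).  The bucket rule is assumed,
# not anything in place on this server.
use strict;
use warnings;
use File::Copy qw(move);
use File::Path qw(mkpath);

my $root = 'images';

opendir my $dh, $root or die "opendir $root: $!";
my @files = grep { -f "$root/$_" } readdir $dh;
closedir $dh;

for my $name (@files) {
    my ($bucket) = lc($name) =~ /^(\w)/;   # first word character
    $bucket = '_' unless defined $bucket;  # fallback for odd names
    mkpath("$root/$bucket");
    move("$root/$name", "$root/$bucket/$name")
        or warn "could not move $name: $!\n";
}

The CGI and the upload side would of course have to agree on the same
bucket rule, so this is only worth doing if the listing time really
does become a headache.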



