SPUG:DOS / BASIC Script

Chuck Langenberg SeattlePerlFJ92 at langenberg.com
Tue Jan 21 13:14:29 CST 2003


I'd suggest a backup system with NO faith in "auto pilot" processes.
Technology is wonderful -- when it works. But one day you're going on
vacation. And there's nothing your stand-in will ever do that's as
important as a backup. They may not see it that way though. But
you're the one that's going to be working till 3 AM if the backups
get hosed.
  
The problem with making a system so simple that even a simpleton can
run it is -- a simpleton eventually will run it. If the company finds
out how simple you've made it, they'll want to hire the cheapest
labor they can find to run it. If it's really simple, that person may
even train somebody else, who trains somebody else -- none of whom
would understand it. If the backup operator doesn't understand what's
going on, sooner or later they'll get flustered and launch the backup
process 5 or 10 more times. Just to be safe, you know. And your
generations of backups may float off into bit heaven. You may not
even know they screwed up until months later -- when it's too late.
But you do verify the content and readability of your backups every
day, don't you?
  
Here's a suggestion that would inject some human oversight.
Rather than get fancy with automated rotations, I'll assume that
your backup is put into the same directory name every day -- let's
call it bakdir.
1. Create a job (or bat file); let's call it "dailybakup".
2. The first thing the job does is check for the existence of the
   "bakdir" directory. And if it exists, the job terminates with a
   message -- telling them they forgot to rename the directory
   after the last backup.
3. If the job is able to create the bakdir directory, then it assumes
   everything is OK. Then the job copies the current data into bakdir.
4. Every day a human being must "manually" rename the bakdir directory.
   If they don't know how to rename a directory, they shouldn't be
   doing backups in the first place. A good naming convention would
   come in handy here -- such as bakdir.20030121 which means
   January 21, 2003. And if you have numerous bakdir.YYYYMMDD
   directories lying around, they'll sort nicely, ordered by date.
5. To free the space up, I think you should delete those directories
   manually too. I don't want some automated process renaming or
   zapping my backup directories.
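The five steps above could be sketched roughly as follows (shown here in Python as a stand-in for the eventual bat file; `dailybakup` and `bakdir` are the names from the scheme above, and `datadir` is a made-up name for wherever the current data lives):

```python
import os
import shutil

def dailybakup(datadir, bakdir="bakdir"):
    """Refuse to run if bakdir still exists from the last backup;
    otherwise create it and copy the current data into it."""
    if os.path.exists(bakdir):
        # A human forgot to rename bakdir (e.g. to bakdir.20030121)
        # after the last backup -- stop and tell them so.
        print("bakdir already exists -- rename it after each backup "
              "before running dailybakup again.")
        return False
    # Creating bakdir succeeded only if nobody skipped the rename step.
    shutil.copytree(datadir, bakdir)
    return True
```

The point of the sketch is the refusal branch: the script never renames or deletes anything itself, so a flustered operator who launches it five more times just gets five refusal messages instead of five clobbered backups.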
  
I vote for minimal automation in my backup process.
  
  
> One way to keep the backup files without "rotation" is to save the file
> with the day-of-the-month divided by 2 (for ~15 backups) or the
> day-of-the-week (for 7 backup files).  Then let the files be written
> over when its day comes around:
> 
>  backup-1, backup-2, backup-3,....
> 
> Then you do not have to move/rename files.
> 
> -Matthew.Bustad
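Matthew's fixed-name scheme needs nothing but a filename computed from the date; each weekday's file then silently overwrites last week's copy. A minimal sketch (Python rather than DOS/BASIC; `backup_name` and the `backup` prefix are made-up names):

```python
import datetime

def backup_name(d=None, prefix="backup"):
    """Name the backup after the ISO day of the week (Mon=1 .. Sun=7),
    giving seven rotating files with no move/rename step."""
    d = d or datetime.date.today()
    return "%s-%d" % (prefix, d.isoweekday())
```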
> 
> On Fri, 17 Jan 2003, September Nyang oro wrote:
  
  
> > SPUG-ers,
> > 
> > I got a job in DataStage that runs once per day. It basically gets raw 
> > data from DB2, transforms it (we do some calculations and other stuff 
> > here), and then dumps the data into an Oracle database. So that part 
> > works perfectly.
> > 
> > We need to be able to archive this raw data from the DB2 source for 14 
> > days, just in case something happens to our jobs, so we could go back 
> > and get that raw data from the archive and run it again. We're trying 
> > to archive this data in a flat file on Windows NT. Right now when we 
> > run the process we can only archive the raw data for 1 day. DataStage 
> > can only 'overwrite' or 'append' to an existing file. But we want 
> > separate files every day.
> > 
> > The challenge: I need to write a script in Basic / DOS that would put a 
> > timestamp in my jobs. Before it 'overwrites' yesterday's job in the 
> > system when it writes today's job, it should move 'yesterday's job into 
> > another directory/folder. And then keep that old job in the new folder 
> > for 14 days. After 14 days that job should be deleted. So a script that 
> > can perform those functions is something that would be useful to me.
> > 
> > So ideally, I'll need a script that runs first on the system, moves 
> > the old file into the new directory, checks whether that directory 
> > has any files older than 14 days, and deletes them if so. If there 
> > are no files older than 14 days, it just finishes.
> > 
> > Ideas are welcome.
> > 
> > Thanks.
> > 
> > ../seppy
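The move-then-purge script seppy describes could look something like this (again a Python sketch, not the requested Basic/DOS; `archive_and_purge`, `current_file`, and `archive_dir` are made-up names, and the timestamp comes from the file's own modification time):

```python
import os
import shutil
import time

def archive_and_purge(current_file, archive_dir, keep_days=14):
    """Move yesterday's flat file into archive_dir under a dated name,
    then delete anything in archive_dir older than keep_days days.
    Returns the list of purged filenames."""
    os.makedirs(archive_dir, exist_ok=True)
    if os.path.exists(current_file):
        # Stamp the archived copy with the date the file was written,
        # e.g. raw.dat -> archive/raw.dat.20030120
        stamp = time.strftime("%Y%m%d",
                              time.localtime(os.path.getmtime(current_file)))
        dest = os.path.join(archive_dir,
                            "%s.%s" % (os.path.basename(current_file), stamp))
        shutil.move(current_file, dest)
    # Purge anything in the archive older than the retention window.
    cutoff = time.time() - keep_days * 86400
    removed = []
    for name in os.listdir(archive_dir):
        path = os.path.join(archive_dir, name)
        if os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Run before DataStage writes the new file each day: the current file is whisked into the archive under a dated name, the day's job then writes a fresh file, and the 14-day purge keeps the archive from growing without bound.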
  





