[GRLUG] file backup script

Roberto Villarreal rvillarreal at mktec.com
Tue Jul 17 21:04:03 EDT 2007


On Tuesday 17 July 2007, Jeff DeFouw wrote:
> On Tue, Jul 17, 2007 at 01:58:56PM -0400, Nathan Drier wrote:
> > I've been battling with this all day, so I thought I'd ask the
> > list for any ideas:
> >
> > I have a text list of about 17,000 files that I need to
> > consolidate into one folder.  All these files are spread through
> > a folder structure on a mapped drive.  I need to write a script
> > that will read the filenames from the text file, then search
> > recursively through the mapped drive for them.  If a file exists
> > somewhere on the mapped drive, I need it to be copied to a local
> > folder where I can run them all through a converter.  Any ideas?
>
> I would use perl and read the list of files into a hash.  The
> File::Find module should get the job done for the main part of the
> search.  It calls a user function for each file found, so you could
> immediately copy or save the file to an array.
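
A rough sketch of the save-to-an-array route Jeff mentions, untested
and with placeholder paths, where the copying happens in a second
pass after the search finishes:

#!/usr/bin/perl

use strict;
use warnings;

use File::Basename;
use File::Copy;
use File::Find;

my $destdir = '/path/to/copy/to';
my $rootdir = '/path/to/search';

open( my $fh, '<', '/path/to/file/with/filenames' )
    or die "Cannot open filename list: $!";
chomp( my @wanted = <$fh> );
close( $fh );

my %want = map { $_ => 1 } @wanted;

# Pass one: record the full path of every matching file.
my @matches;
find( sub { push @matches, $File::Find::name if $want{$_} }, $rootdir );

# Pass two: copy everything that was found.
foreach my $path ( @matches ) {
    copy( $path, "$destdir/" . basename( $path ) )
        or warn "Copy failed for $path: $!";
}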

I just happened to have a script that did something similar; here it 
is, slightly modified to fit his scenario and untested (of course!):

#!/usr/bin/perl

use strict;
use warnings;

use File::Find;
use File::Copy;

my %hsh;

my $destdir = '/path/to/copy/to';
my $rootdir = '/path/to/search';


# Slurp the list of wanted filenames, one per line.
open( my $fh, '<', '/path/to/file/with/filenames' )
    or die "Cannot open filename list: $!";
my @lines = <$fh>;
close( $fh );

# Hash the names for constant-time lookup during the search.
chomp @lines;
$hsh{$_} = undef foreach @lines;

# File::Find sets $_ to the basename and $File::Find::name to the
# full path of each file it visits.
find( sub {
    if ( exists $hsh{$_} ) {
        copy( $File::Find::name, "$destdir/$_" )
            or warn "Copy failed for $File::Find::name: $!";
    }
}, $rootdir );
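
One caveat with copying by basename: if two matched files in
different directories share a name, the second copy will clobber the
first in $destdir. An untested way around that, with a made-up helper
that suffixes a counter on collision:

use strict;
use warnings;
use File::Copy;

# Hypothetical helper: find a destination path that doesn't exist yet
# by appending .1, .2, ... to the filename.
sub safe_dest {
    my ( $dir, $name ) = @_;
    my $dest = "$dir/$name";
    my $n = 0;
    $dest = "$dir/$name." . ++$n while -e $dest;
    return $dest;
}

# Then, inside the find() callback, instead of the plain copy:
#   copy( $File::Find::name, safe_dest( $destdir, $_ ) )
#       or warn "Copy failed: $!";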


Roberto

