[GRLUG] Imaging software improvements

Michael Mol mikemol at gmail.com
Mon Nov 13 09:34:42 EST 2006


I'm passively familiar with two system imaging packages: Symantec's
Ghost and Novell's imaging tool. (Don't remember the name.)

One thing I've noticed is that neither handles errors very well.
When I watch a batch of ten or fifteen computers being imaged, if
one machine fails, its transfer aborts, and a second attempt has to
be made for it after the rest of the systems are done transferring
data. The new attempt doesn't resume where the old one left off,
though; it starts from the beginning, which means another couple
hours of waiting.

I've thought of what I see as a better way to write imaging software.
Basically, you break the transmission into chunks and check each
chunk's md5sum against what the server says it should be. The chunks
could be slices of the raw disk image, or of the filesystem.
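As a rough illustration, the client-side check might look something
like this Python sketch (the 4 MiB chunk size and the function name
are placeholders I made up, not anything from a real package):

    import hashlib

    CHUNK_SIZE = 4 * 1024 * 1024  # hypothetical 4 MiB chunks

    def chunk_checksums(path):
        # Yield (chunk index, md5 hex digest) for each fixed-size
        # chunk of a file. The server would publish this list; each
        # client computes the same sums over what it received and
        # records the indices that don't match.
        with open(path, 'rb') as f:
            index = 0
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                yield index, hashlib.md5(chunk).hexdigest()
                index += 1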

Once the server is done multicasting the entire image, the client
computers would report back which chunks failed, and those chunks
would be retransmitted. A chunk that failed on many computers would
get higher retransmission priority than one that failed on only a
few.
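A minimal sketch of that scheduling, assuming each client reports
back the set of chunk indices that failed its md5 check (all the
names here are hypothetical):

    from collections import Counter

    def retransmit_order(failure_reports):
        # failure_reports maps client name -> set of failed chunk
        # indices. Return chunk indices ordered so the chunks that
        # failed on the most clients are multicast again first.
        counts = Counter()
        for failed in failure_reports.values():
            counts.update(failed)
        return [chunk for chunk, _ in counts.most_common()]

    # e.g. three clients report back after the first pass:
    reports = {'pc1': {7, 12}, 'pc2': {7}, 'pc3': {3, 7}}
    print(retransmit_order(reports))  # chunk 7 first: it failed on all three

The server would just loop: multicast the image, collect failure
reports, re-multicast the worst chunks, and repeat until every client
reports a clean image.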

Thoughts?

-- 
:wq

