[GRLUG] concurrently running scripts
Michael Mol
mikemol at gmail.com
Thu Jan 30 11:01:15 EST 2014
On Thu, Jan 30, 2014 at 10:48 AM, Eric Beversluis
<ebever at researchintegration.org> wrote:
> I presently have a script that backs up Zarafa mail stuff to a local
> directory, then does an rsync of it to a USB-mounted ioSafe, then does a
> mysqldump locally and then an rsync of the mysqldump to the external
> ioSafe.
>
> The whole process is close to taking more than the hour it has before
> the next run starts. We earlier explored using flock to keep one
> instance from starting before the earlier one finished. But I'm
> thinking it might be better just to separate out the mysqldump into a
> separate script that would run (once a day) concurrently with the other
> backups (that are running each hour).
>
> My question is whether that would create any problems as the two scripts
> try to write to the same HDD and then to the same USB drive. I'm
> assuming that the OS could handle that, but I want to be sure.
>
> Thanks.
Splitting your script's jobs out into separate scripts is a good
idea. That way, if one starts failing, it won't prevent the other from
running to completion.
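As a sketch only (the paths, script names, and times here are my
assumptions, not anything from your setup), each job would get its own
crontab entry, and the flock approach you already explored keeps any
single job from overlapping itself:

    # m h dom mon dow  command
    0 * * * *   flock -n /var/lock/zarafa-backup.lock /usr/local/bin/zarafa-backup.sh
    30 3 * * *  flock -n /var/lock/mysql-backup.lock  /usr/local/bin/mysql-backup.sh

With -n, flock exits immediately instead of queueing if the previous
run still holds the lock, so a slow hour can't pile runs up behind it.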
As for whether multiple writers to the same USB drive are a
problem...they're safe at the filesystem level; the worst case is that
they slow each other down. You're already I/O bound, so you just need
to make sure the two scripts write to different directories, so they
never clobber each other's files.
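Concretely (mount point and directory names are assumptions on my
part), something like:

    # hourly script: mail store to its own directory on the ioSafe
    rsync -a --delete /var/backups/zarafa/ /mnt/iosafe/zarafa/

    # daily script: dump locally, then sync to a different directory
    # (mysqldump auth options omitted)
    mysqldump --all-databases | gzip > /var/backups/mysql/all.sql.gz
    rsync -a /var/backups/mysql/ /mnt/iosafe/mysql/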
That said, I'd ditch your existing script and use a couple of
instances of rsnapshot: http://www.rsnapshot.org/
It will do exactly what you're looking for, and a tiny bit more--you
get the benefit of incremental backups (unchanged files are hard-linked
between snapshots, so each one costs little space). It's trivial to
have it keep just a few each of hourly, daily, weekly, monthly, and
yearly backups--or every-third-and-fourth-Thursday backups; you get the
idea. (It's scheduled via cron, and you tell it which period you're
running backups for.)
And you can run it per task, with a separate config file for each.
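A rough sketch of the per-task setup (file names and retention counts
are mine, and this is abridged--start from the stock rsnapshot.conf;
note that rsnapshot's config file requires tabs between fields):

    # /etc/rsnapshot-mail.conf (abridged; fields must be tab-separated)
    snapshot_root   /mnt/iosafe/snapshots/mail/
    retain  hourly  24
    retain  daily   7
    backup  /var/backups/zarafa/    localhost/

    # crontab: one line per retention period
    0 * * * *   /usr/bin/rsnapshot -c /etc/rsnapshot-mail.conf hourly
    30 3 * * *  /usr/bin/rsnapshot -c /etc/rsnapshot-mail.conf daily

A second config (say, /etc/rsnapshot-mysql.conf) would handle the
database dump on its own schedule, so neither task can hold up the
other.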
--
:wq