"nice" processes where applicable in your script:<br><br>$ nice -n 19 command<br><br>If rsync is available on both ends you can use rsync with a bandwidth limit over ssh:<br><br>$ rsync -av -e ssh --bwlimit=10 foo:src/bar/ /data/tmp<br>
<br><div class="gmail_quote">On Fri, Mar 21, 2008 at 11:52 PM, Michael Mol <<a href="mailto:mikemol@gmail.com">mikemol@gmail.com</a>> wrote:<br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
<div><div></div><div class="Wj3C7c">On Fri, Mar 21, 2008 at 11:07 PM, Tim Schmidt <<a href="mailto:timschmidt@gmail.com">timschmidt@gmail.com</a>> wrote:<br>
><br>
> On Fri, Mar 21, 2008 at 10:47 PM, Michael Mol <<a href="mailto:mikemol@gmail.com">mikemol@gmail.com</a>> wrote:<br>
> > I run a site called Rosetta Code. It's a fairly active wiki, with<br>
> > around 500 edits per week and around 10,000 page views per week,<br>
> > sometimes more. Currently, the database weighs in at about 130MiB.<br>
> ><br>
> > Combine the database size, my shared hosting account at Bluehost, and<br>
> > my desire to do backups, and you get a problem. See, this evening I<br>
> > wrote a script that would run mysqldump on the database and save it to<br>
> > my home machine. Problem is, running mysqldump on that database on<br>
> > that server* causes sites to start spitting HTTP 500 errors, and my<br>
> > account gets suspended for a minute until the CPU utilization sliding<br>
> > window passes.<br>
> ><br>
> > * Bluehost has the MySQL server and Apache server on the same box for<br>
> > any given account.<br>
> ><br>
> > So I can't exactly automate it without taking down my site (and<br>
> > possibly many others) each time. I need to find different hosting for<br>
> > Rosetta Code. Preferably something that won't choke when I pull<br>
> > 100+MB of data out of the database every morning over an SSH<br>
> > connection.<br>
><br>
> You could pipe the mysqldump output through something to throttle<br>
> it... bzip2 comes to mind, but that wouldn't lower your CPU<br>
> utilization. Sure seems like something &lt; 5 lines of perl could get<br>
> done though.<br>
<br>
</div></div>Well, it worked out to 10 lines, but here it is. I'll give it a try.<br>
<br>
#!/usr/bin/perl -w<br>
use strict;<br>
my $rate = 25*1024; # 25 KiB/s<br>
<br>
# A short read from a pipe doesn't mean EOF, so keep looping until<br>
# read() returns 0 (EOF) or undef (error), and sleep after every chunk.<br>
my $buf;<br>
while( my $count = read STDIN, $buf, $rate )<br>
{<br>
print $buf;<br>
sleep 1;<br>
}<br>
<font color="#888888"><br>
<br>
--<br>
:wq<br>
</font><div><div></div><div class="Wj3C7c">_______________________________________________<br>
grlug mailing list<br>
<a href="mailto:grlug@grlug.org">grlug@grlug.org</a><br>
<a href="http://shinobu.grlug.org/cgi-bin/mailman/listinfo/grlug" target="_blank">http://shinobu.grlug.org/cgi-bin/mailman/listinfo/grlug</a><br>
</div></div></blockquote></div><br>