[GRLUG] Hosting ideas
Michael Mol
mikemol at gmail.com
Sun Mar 23 01:12:26 EDT 2008
On Sat, Mar 22, 2008 at 6:08 PM, Douglas Peter Sculley
<dsculley at gmail.com> wrote:
> "nice" processes where applicable in your script:
>
> $ nice -n 19 command
>
> If rsync is available on both ends you can use rsync with a bandwidth limit
> over ssh:
>
> $ rsync -av -e ssh --bwlimit=10 foo:src/bar/ /data/tmp
Nice.
My local backup script was essentially:

ssh user@domain mysqldump -h "$DBHOST" -u "$DBUSER" \
    "--password=$DBPASS" "$DBNAME" | gzip > "$DAILYS/db/$(date +%Y%m%d).gz"
rsync -r user@domain:public_html/rosettacode.org "$RCDAILYS/site"
The rsync bandwidth limit is easy to add, but I'm still not sure about
the throttle script. I want to add it between ssh and gzip, but that
only slows the remote dump if both ssh and mysqldump have limits on
their internal buffers, so the backpressure propagates. I'd prefer not
to have any of the scripts server-side, as that reduces portability and
opens me to man-in-the-middle attacks if someone gets access to my home
directory on the server.
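The throttle idea is easy to sanity-check locally, with nothing server-side: cap a pipe at 1 KiB/s with a few lines of inline Perl and time it — 3 KiB should take roughly three seconds. This assumes perl and GNU head are available; the 1 KiB/s rate and 3 KiB size are arbitrary demo values, not the real 25 KB/s cap:

```shell
# Push 3 KiB through a 1 KiB/s throttle and time it; one sleep per
# 1 KiB chunk means about 3 seconds end to end.
start=$(date +%s)
head -c 3072 /dev/zero | perl -e '
    $| = 1;                    # unbuffered output
    my ($rate, $buf) = (1024); # 1 KiB per second for the demo
    while (read STDIN, $buf, $rate) {  # loop until EOF, not short read
        print $buf;
        sleep 1;
    }
' > /dev/null
end=$(date +%s)
echo "elapsed: $((end - start))s"
```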
I'm not certain running nice on the mysqldump process would be a good
idea. The server's load average is about 3, so I'm not sure the
mysqldump process would get much love under those conditions. Then
again, /proc/cpuinfo tells me there are lots of cores on the box. Does
the load average mean there are 3 processes waiting on each core, or
only 3 processes total in all the runqueues (averaged over time)?
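For what it's worth, on Linux the load average is a whole-machine number: an exponentially damped average of runnable (plus uninterruptible) tasks across all cores, not a per-core figure, so a load of 3 on a many-core box still leaves idle CPUs. A quick way to compare the two, assuming a Linux box with /proc:

```shell
# Read the 1-minute load average and the online core count; the load
# average counts tasks machine-wide, so divide by cores for a rough
# per-core figure.
cores=$(getconf _NPROCESSORS_ONLN)
load=$(cut -d' ' -f1 /proc/loadavg)
echo "1-min load: $load across $cores cores"
```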
>
> On Fri, Mar 21, 2008 at 11:52 PM, Michael Mol <mikemol at gmail.com> wrote:
> >
> > On Fri, Mar 21, 2008 at 11:07 PM, Tim Schmidt <timschmidt at gmail.com> wrote:
> > >
> > > On Fri, Mar 21, 2008 at 10:47 PM, Michael Mol <mikemol at gmail.com> wrote:
> > > > I run a site called Rosetta Code. It's a fairly active wiki, with
> > > > around 500 edits per week and around 10,000 page views per week,
> > > > sometimes more. Currently, the database weighs in at about 130MiB.
> > > >
> > > > Combine the database size, my shared hosting account at Bluehost, and
> > > > my desire to do backups, and you get a problem. See, this evening I
> > > > wrote a script that would run mysqldump on the database and save it to
> > > > my home machine. Problem is, running mysqldump on that database on
> > > > that server* causes sites to start spitting HTTP 500 errors, and my
> > > > account gets suspended for a minute until the CPU utilization sliding
> > > > window passes.
> > > >
> > > > * Bluehost has the MySQL server and Apache server on the same box for
> > > > any given account.
> > > >
> > > > So I can't exactly automate it without taking down my site (and
> > > > possibly many others) each time. I need to find different hosting for
> > > > Rosetta Code. Preferably something that won't choke when I pull
> > > > 100+MB of data out of the database every morning over an SSH
> > > > connection.
> > >
> > > You could pipe the mysqldump output through something to throttle
> > > it... bzip2 comes to mind, but that wouldn't lower your CPU
> > > utilization. Sure seems like something < 5 lines of perl could get it
> > > done, though.
> >
> > Well, it worked out to 10 lines, but here it is. I'll give it a try.
> >
> > #!/usr/bin/perl -w
> > use strict;
> > my $rate = 25*1024; # 25KB/s
> >
> > # Pass stdin through at roughly $rate bytes per second.  Loop until
> > # EOF (read returns 0): on a pipe, read can return fewer than $rate
> > # bytes mid-stream, so testing for a full chunk would drop data.
> > my $buf;
> > while (read STDIN, $buf, $rate) {
> >     print $buf;
> >     sleep 1;
> > }
> >
> > --
> > :wq
> >
> > _______________________________________________
> > grlug mailing list
> > grlug at grlug.org
> > http://shinobu.grlug.org/cgi-bin/mailman/listinfo/grlug
> >
--
:wq