Manual backups

Hello,

I am planning to back up my site from a regular shell. How should I download the backups to my local computer? Is there a good way to do it, or should I just use scp?

Thank you

13 Replies

Even better, rdiff-backup

Thank you both.

Does it affect your site while downloading the files?

Any backup method will use some CPU, but it will mostly be I/O bound.

I don't think you can directly limit bandwidth usage with rdiff-backup (it is possible, but impractical), but rsync gives you this:

     --bwlimit=KBPS          limit I/O bandwidth; KBytes per second
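As a rough sketch of how that might look when pulling the site files over SSH (the user, host, paths, and the 1000 KB/s cap are just placeholders):

     # Pull the site files over SSH, capped at ~1000 KB/s so the server's
     # uplink isn't saturated; adjust paths, user@host, and the limit to taste.
     rsync -az --bwlimit=1000 user@example.com:/var/www/mysite/ /backups/mysite/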

As for what effect the backup will have, I guess it will mostly depend on how many resources the sites you are referring to need during the backup.

To summarize, my guess is: Not a lot, but possibly, yes.

If you're just backing up a couple of small sites, Bacula might be overkill. For simple needs, I'd just set up a cron job on the server that dumps the database every day, and rsync/rdiff-backup/rsnapshot both the DB dump and the website files to a different location at regular intervals.
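One way that kind of setup might look, assuming MySQL, credentials in ~/.my.cnf, and rsync over SSH (the database name, paths, and host below are placeholders):

     #!/bin/sh
     # Daily backup sketch (e.g. dropped into /etc/cron.daily/); names and paths are placeholders.
     set -e
     # Dump the database; credentials are assumed to come from ~/.my.cnf
     mysqldump mydb | gzip > /var/backups/mydb.sql.gz
     # Push the dump plus the site files to another machine over SSH
     rsync -az /var/backups/mydb.sql.gz /var/www/mysite/ backupuser@backuphost:backups/mysite/

Swapping the rsync line for rdiff-backup or rsnapshot gives you incremental history instead of a plain mirror.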

I use Amazon Web Services S3 and duplicity. There are a few little quirks with that, but overall it works pretty well. Having said that, I don't have much data to back up, so outbound transfer isn't an issue for me. AWS gives you free storage for a year when you sign up (or they did), up to 5 GB I believe. After that it's still ridiculously cheap.

If you Google duplicity and Amazon Web Services you should be able to find a good example of the cron script to set up. If you can't find it, I can post mine. You can also encrypt the backups, so your private stuff stays private.
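A rough sketch of what such a cron script might look like; the bucket name, paths, and keys are placeholders, and the exact S3 URL syntax depends on your duplicity version, so treat this as illustrative only:

     #!/bin/sh
     # Sketch of an encrypted duplicity backup to S3 (bucket, paths, and keys are placeholders).
     export AWS_ACCESS_KEY_ID=yourkeyid
     export AWS_SECRET_ACCESS_KEY=yoursecretkey
     export PASSPHRASE=yourgpgpassphrase   # used by GPG to encrypt the backup volumes
     duplicity /var/www/mysite s3+http://my-backup-bucket/mysite
     # Drop backup chains older than two months
     duplicity remove-older-than 2M --force s3+http://my-backup-bucket/mysite
     unset PASSPHRASE AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY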

Thank you again.

Can't you use duplicity with a local PC? Do you need Amazon S3?

Daily mysqldump | gzip -9 --rsyncable, then rdiff-backup of almost everything (including the dbdump.gz) except stuff like logfiles. The disk scan/checksumming takes about an hour, and the real amount of data transferred usually ends up being dozens of megabytes at most, unless someone did some big changes on their site that day.
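A minimal sketch of that pipeline, with paths, database selection, and the backup host all as placeholders (and using the older rdiff-backup source/destination syntax):

     # Nightly dump, compressed so rsync/rdiff-backup can still match
     # unchanged blocks between runs (--rsyncable keeps gzip output stable).
     mysqldump --all-databases | gzip -9 --rsyncable > /var/backups/dbdump.gz
     # Mirror almost everything to the backup host, keeping increments,
     # and skipping things like logfiles.
     rdiff-backup --exclude /var/log /var backupuser@backuphost::/srv/backups/server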

Restore would take a few hours, as it's backed up to a machine on a DSL line with 512 kbps upload, but since I'm a noncommercial host, that's acceptable.
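For a sense of scale: 512 kbps is roughly 64 KB/s, or a bit over 200 MB per hour, so pulling a few gigabytes back over a link like that would indeed take several hours, assuming the DSL line is the bottleneck.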

If you need fast backups, well, look into Linode's offer; S3 can be a double-edged sword.

LATE EDIT: Accidental double negation fixed.

@rsk:

> gzip -9 --rsyncable

Whoa, thanks! Had never heard of the --rsyncable flag before.

@Vance:

> @rsk:
>
> gzip -9 --rsyncable
>
> Whoa, thanks! Had never heard of the --rsyncable flag before.

And it actually works quite well.

Note on database dumps: what I do is take a fresh dump on the first of the month. Each day after that, I take a fresh dump, diff it against the dump from the first of the month, and then delete the fresh dump. That way, until the beginning of the next month, I'm only downloading what has changed since the first, not the entire dump.
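A sketch of the server-side half of that scheme, assuming plain SQL dumps (the database name and paths are placeholders); downloading the baseline or the daily diff is then left to whatever transfer method you use:

     #!/bin/sh
     # Monthly baseline plus daily diffs; names and paths are placeholders.
     mysqldump mydb > /var/backups/today.sql
     if [ "$(date +%d)" = "01" ]; then
         # First of the month: the fresh dump becomes the new baseline.
         mv /var/backups/today.sql /var/backups/baseline.sql
     else
         # Any other day: keep only the changes relative to the baseline, then
         # drop the fresh dump (diff exits non-zero when files differ, hence || true).
         diff /var/backups/baseline.sql /var/backups/today.sql > "/var/backups/$(date +%Y%m%d).diff" || true
         rm /var/backups/today.sql
     fi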

I use rsync over ssh.
