How do you backup your production servers?
1 point by agjmills on Jan 13, 2015 | 3 comments
We're a small web development agency with around 30 dedicated servers, each hosting a varying number of websites.

All of our source code is in Git, which means that each developer, the servers the code is deployed onto, and the Git remote all have the full history of the code.

Our databases are backed up really poorly: a hacky script I wrote a couple of years ago dumps them to a file, which then gets committed into Git and pushed to the remote. Extracting a dump is very difficult and can take up to a day, depending on the size of the database.
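For concreteness, the kind of replacement I have in mind is a nightly dump of each database to its own compressed file outside Git. A minimal sketch, assuming MySQL with the mysqldump client on PATH and credentials in ~/.my.cnf; the database names and paths are illustrative:

    #!/usr/bin/env python3
    """Nightly dump of each MySQL database to its own gzipped file."""
    import datetime
    import gzip
    import subprocess
    from pathlib import Path

    BACKUP_DIR = Path("/var/backups/mysql")  # illustrative path
    DATABASES = ["site_one", "site_two"]     # illustrative names

    def dump_database(name):
        stamp = datetime.date.today().isoformat()
        target = BACKUP_DIR / f"{name}-{stamp}.sql.gz"
        BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        # --single-transaction gives a consistent snapshot of InnoDB
        # tables without locking them for the duration of the dump.
        proc = subprocess.run(
            ["mysqldump", "--single-transaction", name],
            check=True, capture_output=True,
        )
        with gzip.open(target, "wb") as fh:
            fh.write(proc.stdout)
        return target

    for db in DATABASES:
        print("wrote", dump_database(db))

Restoring one site would then be a matter of feeding a single file back through the mysql client, rather than digging a dump out of Git history.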

Files that a user uploads to a server are not backed up. Server configuration files (nginx, Apache, etc.) are not backed up. SSL keys, certificates, and other secrets are not backed up.

I want to back up on the following schedule: nightly for the last 7 days, and before that, weekly going back up to 6 months.
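For the pruning side of that schedule, a sketch assuming dump files carry an ISO date in their names (as in the dump sketch above) and, as an arbitrary choice, treating Monday dumps as the weekly keepers:

    #!/usr/bin/env python3
    """Prune backups: nightlies for 7 days, then weeklies up to ~6 months."""
    import datetime
    import re
    from pathlib import Path

    BACKUP_DIR = Path("/var/backups/mysql")  # illustrative path
    DATE_RE = re.compile(r"(\d{4}-\d{2}-\d{2})")

    def should_keep(date, today):
        age = (today - date).days
        if age <= 7:
            return True  # nightly window
        # Weekly window: Monday dumps, up to roughly six months.
        return age <= 183 and date.weekday() == 0

    today = datetime.date.today()
    for path in BACKUP_DIR.glob("*.sql.gz"):
        m = DATE_RE.search(path.name)
        if not m:
            continue  # leave unrecognised files alone
        if not should_keep(datetime.date.fromisoformat(m.group(1)), today):
            path.unlink()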

The total amount of disk space we use across all servers is around 2TB.

All of the backups need to go to the same place, ideally without a single point of failure. I was thinking of something like S3-compatible object storage?
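S3 itself would seem to fit: each object is stored redundantly across multiple devices and facilities, which covers the single-point-of-failure worry, and lifecycle rules can expire old objects. A sketch of the upload step, assuming Amazon S3 (or an S3-compatible service) and the boto3 library, with a made-up bucket name:

    #!/usr/bin/env python3
    """Push local backup files to an S3 bucket."""
    import socket
    from pathlib import Path

    import boto3  # assumes credentials in the environment or ~/.aws/

    BACKUP_DIR = Path("/var/backups/mysql")  # illustrative path
    BUCKET = "example-agency-backups"        # illustrative bucket name

    s3 = boto3.client("s3")
    host = socket.gethostname()
    for path in sorted(BACKUP_DIR.glob("*.sql.gz")):
        # Prefix keys with the hostname so all 30 servers can share
        # one bucket without collisions.
        key = f"{host}/{path.name}"
        s3.upload_file(str(path), BUCKET, key)
        print("uploaded", key)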



database: master-slave

uploads: rsync

there's a Ruby gem called backup (https://meskyanichi.github.io/backup/v4/) ;)
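For the rsync suggestion, a rough sketch of pulling uploads and config from each server to a central backup host, assuming SSH key access from that host; hostnames and paths are illustrative:

    #!/usr/bin/env python3
    """Pull user uploads and server config from each host with rsync."""
    import subprocess
    from pathlib import Path

    SERVERS = ["web01.example.com", "web02.example.com"]       # illustrative
    PATHS = ["/var/www/uploads/", "/etc/nginx/", "/etc/ssl/"]  # illustrative
    DEST = Path("/srv/backups")

    for server in SERVERS:
        for remote_path in PATHS:
            target = DEST / server / remote_path.strip("/")
            target.mkdir(parents=True, exist_ok=True)
            # -a preserves permissions and timestamps; --delete mirrors
            # removals so the copy matches the server exactly.
            subprocess.run(
                ["rsync", "-a", "--delete",
                 f"{server}:{remote_path}", str(target)],
                check=True,
            )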


With a master-slave DB configuration, surely you need 2n DB machines in order to replicate the database?


It's one server as the master (writes) and one as the slave (read-only). You still must take backups as well, but this approach is safer than backups alone.

Later, you can set it up so that when one server crashes (master or slave), a new server is provisioned automatically and the data is copied over.
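If you take the nightly dump from the slave (keeping the load off the master), it's worth checking the slave is actually caught up first. A sketch, assuming MySQL replication and the PyMySQL library; the host and credentials are illustrative:

    #!/usr/bin/env python3
    """Check a MySQL replica is caught up before dumping from it."""
    import pymysql

    conn = pymysql.connect(
        host="db-replica.example.com",  # illustrative host
        user="backup",                  # illustrative credentials
        password="secret",
    )
    try:
        with conn.cursor(pymysql.cursors.DictCursor) as cur:
            cur.execute("SHOW SLAVE STATUS")
            status = cur.fetchone()
    finally:
        conn.close()

    # None for Seconds_Behind_Master means the SQL thread is stopped;
    # no row at all means this server is not configured as a replica.
    lag = status["Seconds_Behind_Master"] if status else None
    if lag is None or lag > 60:
        raise SystemExit("replica unhealthy - do not dump from it")
    print(f"replica is {lag}s behind - safe to dump")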



