Dropbox Linux Command Line
On the web server where I run several sites (including this one), I wanted a relatively cheap backup option. My host offers a backup service, but it costs more than I wanted to pay, so I looked into a few other options.
I settled on a combination of two things. Webmin, which is freely available, does a nice job of backing up my domains, but if the entire computer tanked, I'd lose those backups too. So I added Dropbox Pro, which comes with 1TB of space for $99 a year. It copies the backups off to Dropbox, which has the added advantage of syncing them to my local computer at home too. It's worked nicely, and still works to this day.
The Dropbox command line client for Linux syncs everything by default. That's a problem for me: my server backups take less than 100GB of space, leaving 900GB I'd like to use for my own personal backups. I can't do that, because the client would sync all of it back down to the server. Given my server only has something like 250GB of disk, if I started using that 900GB, I'd quickly fill up my web server and cause problems. Now, the client can be told to include or exclude directories, and that does work. In normal operation I have it syncing just the one directory that contains the aforementioned server backups.
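For reference, the include/exclude control mentioned above is the `exclude` subcommand of the official `dropbox.py` CLI client. A typical session looks something like this (the `~/Dropbox/big` path is just an example, and the commands need the Dropbox daemon running):

```shell
dropbox exclude list                  # show directories currently excluded from sync
dropbox exclude add ~/Dropbox/big     # stop syncing an existing directory
dropbox exclude remove ~/Dropbox/big  # start syncing it again
```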
Here's the core issue, and the reason I'm posting. I want to use that extra 900GB of space, but whenever I add a new directory on my local computer, it syncs to my server. I can't exclude a directory from the Linux command line "selective sync" until it already exists. I've been through the help for the command line client several times already.
I don't know of a way to tell the Linux command line selective sync function to ignore *ALL* directories, including future/unknown ones, and sync JUST THIS ONE DIRECTORY no matter what. Basically, a whitelist feature.
If someone knows how to do this, please let me know; it would solve a lot of problems. Given I already pay for Dropbox annually, I'd rather not have to pay someone else (Google, Apple, Amazon) for cloud storage I can put whatever I want on.
UPDATE: After posting, I found this blog post, which appears to confirm that what I'm trying to do (a whitelist) doesn't exist. It offers a cron script option, which I might look into. I also had another idea of my own: create a directory on my local machine, get it listed in the "exclude" on the server, and then put anything else I want inside THAT subdirectory. Clumsy, but it could work.
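The cron option could be sketched roughly like this. It's only a sketch under assumptions: `exclude_all_but` is a hypothetical helper name, and it prints the `dropbox exclude add` commands rather than running them, so you can inspect the list before piping it to `sh` from a cron job:

```shell
#!/bin/sh
# Print the "dropbox exclude add" commands needed to ignore every
# top-level directory in the Dropbox folder except one whitelisted name.
# Hypothetical helper; pipe the output to sh once you trust it.
exclude_all_but() {
    dropbox_dir="$1"
    keep="$2"
    for dir in "$dropbox_dir"/*/; do
        name=$(basename "$dir")
        [ "$name" = "$keep" ] && continue
        echo dropbox exclude add "$dropbox_dir/$name"
    done
}

# Example cron usage (assumed paths):
#   exclude_all_but "$HOME/Dropbox" server-backups | sh
```

Run often enough (say, hourly), this would catch new directories shortly after they appear, which approximates the missing whitelist.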
Join the Conversation
Joe,
I am using Dropbox to back up the scsportscar.com site. It's a small WordPress site that doesn't take a whole lot of space. It is not my own site, and SCR-SCCA has its own Dropbox account (that's key).
I have a script that runs through the server every night and backs up my files into a subdirectory of the Dropbox directory called backups. Each night it creates a dated subdirectory, then removes backups older than x days so that I don't use more disk than I want it to use. Standard stuff. It gives us a few days of backups and some versioning.
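A minimal sketch of that kind of nightly rotation, assuming standard tools (`tar`, `find`); `backup_site` and the paths are invented for illustration, not the actual script:

```shell
#!/bin/sh
# Nightly backup: write a dated tarball of the site into the Dropbox
# backups directory, then prune dated directories older than keep_days.
backup_site() {
    site_dir="$1"       # e.g. /var/www/mysite
    backup_root="$2"    # e.g. $HOME/Dropbox/backups
    keep_days="$3"      # retention window in days

    today=$(date +%Y-%m-%d)
    mkdir -p "$backup_root/$today"
    tar -czf "$backup_root/$today/site.tar.gz" \
        -C "$(dirname "$site_dir")" "$(basename "$site_dir")"

    # Remove dated backup directories past the retention window
    find "$backup_root" -mindepth 1 -maxdepth 1 -type d \
        -mtime +"$keep_days" -exec rm -rf {} +
}
```

Called from cron once a night with your own paths filled in, this gives the dated subdirectories and the rolling retention window described above.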
What I have done is logged into that Dropbox account and shared off the backup subdirectory to my personal Dropbox account. Now from my personal Dropbox I can get into the backups and do whatever I might need to do with them, and still use all of my personal space for whatever.
Obviously this means two accounts, which may not work for you. One of them can be a freebie depending upon how much space you use.
I run a Raspberry Pi server (internal web server) and a NAS with a BTRFS storage pool. I think you'd be much happier with AWS. I know you don't want to commit to another service, but I thought I would mention it. I use a simple script to back up data to the Amazon web servers and only pay per GB stored and transferred. So, storing 100 GB for a year will only cost $27.60 (add $0.005 per 1,000 added files, a one-off fee).
I did a little post about the subject on my blog:
https://karlhunter.co.uk/content/?p=16
Hope that helps,
Karl.