
Last night, I discovered that one of my source code repos had been lost.
Today, I confirmed that I had no backup of that repo.
I feel terrible because I know it was my fault. I should have had a backup process in place to prevent this kind of loss.

So now I am wondering: what is your data backup process? How do you ensure your data is not lost in the worst-case scenario?

 

35 minutes ago, zeke said:

Which platform? Windows? Linux? OS X?

rsync is a pretty good way to go if you use the right strategy. It can be done on Windows too, using DeltaCopy, an rsync-based application that runs as a Windows service and automatically syncs a watched directory to the backup location.
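For reference, a bare-bones rsync run of the kind described above might look like this (the paths are placeholders, not anything from this thread):

    # mirror a working directory to a backup drive; -a preserves permissions and timestamps
    rsync -a -v ~/projects/ /mnt/backup/projects/
    # add --delete only if you also want deletions mirrored into the backup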


All three platforms.

I have been trying to use FreeFileSync, but it isn't automated.

I have bookmarked DeltaCopy because I had never heard of it before. Thanks!


As for my process: I avoid RAID arrays like the plague. RAID in home systems is an additional point of failure and a recipe for disaster. So I use USB 3.0 external drives, just bare drives, properly partitioned and formatted of course, and I hand-copy important files to them.

As for my code base: for my BeagleBone I have a dedicated Debian system that acts as an NFS and Samba file server. That means I can mount the NFS share on my BeagleBone, and through Samba I can mount the same file system on my Windows machine. From there I edit files directly off the share, in Windows, using my text editor of choice. The files stay on that Debian support system of mine and never go away until I delete them.
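To make that NFS/Samba arrangement concrete, here is a rough sketch; the directory, addresses, and share name are made up for illustration, not taken from the post above:

    # on the Debian box: export a project directory over NFS (/etc/exports)
    #   /srv/projects  192.168.1.0/24(rw,sync,no_subtree_check)
    # then reload the export table:
    sudo exportfs -ra

    # on the BeagleBone: mount that export
    sudo mount -t nfs 192.168.1.10:/srv/projects /mnt/projects

    # the same directory can be offered to Windows with a share stanza in /etc/samba/smb.conf:
    #   [projects]
    #      path = /srv/projects
    #      read only = no
    sudo systemctl restart smbd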

5 minutes ago, zeke said:

It'll be a steep learning curve figuring it out, but once learned it's really awesome. Actually, I've known about it for years, have used it a few times, and still don't know everything about it ;) There are guides online, though.

As for Linux, use a systemd timer that fires once a day (or whatever interval you like) and calls a script to run your rsync jobs. If you're not using systemd, use a cron job to fire off that same script once a day or so.
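As a sketch of that setup (the unit names, script path, and 02:00 schedule are just examples):

    # /etc/systemd/system/backup.service
    #   [Unit]
    #   Description=Nightly rsync backup
    #   [Service]
    #   Type=oneshot
    #   ExecStart=/usr/local/bin/backup.sh
    #
    # /etc/systemd/system/backup.timer
    #   [Unit]
    #   Description=Run backup.service once a day
    #   [Timer]
    #   OnCalendar=*-*-* 02:00:00
    #   Persistent=true
    #   [Install]
    #   WantedBy=timers.target

    # enable the timer
    sudo systemctl daemon-reload
    sudo systemctl enable --now backup.timer

    # or, without systemd, the cron equivalent (crontab -e):
    #   0 2 * * * /usr/local/bin/backup.sh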


LOL, sorry, I keep remembering things I'd long forgotten about until just now, thinking about all of this. I've heard tell of people setting up a local git repo and using that to back up files too. The cool thing about using git this way is that you get file versioning as well!
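A minimal version of that idea, assuming a second machine reachable over ssh (the host and repo names here are made up):

    # create a bare repository on the backup machine
    ssh backupbox 'git init --bare /srv/git/myproject.git'

    # point the working repo at it and push everything, history included
    git remote add backup backupbox:/srv/git/myproject.git
    git push backup --all
    git push backup --tags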


I had git set up on a local Linux box, and I was using it.

It suffered disk corruption when the power went out while I was away on Christmas vacation.

I am presently running e2fsck on it. It sure is taking a long time to finish.

1 minute ago, zeke said:

OK, so in that case, all you really need to do is rsync your git repo to another location. Maybe some place that is only mounted while backing up and unmounted the rest of the time; that way you might avoid corrupting your backup when the power goes out. I'd have to think on this some more to see if there are any holes in the idea, and then perhaps come up with a bulletproof strategy.
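A rough sketch of that mount-only-while-backing-up idea; the device node and paths are placeholders:

    #!/bin/sh
    set -e                                    # stop on the first failure
    mount /dev/sdb1 /mnt/backup               # attach the backup drive
    rsync -a --delete /srv/git/ /mnt/backup/git/
    umount /mnt/backup                        # detach it again between runs

If any step fails, set -e stops the script, leaving the drive mounted so you can see what went wrong.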

 


I was wondering if a private GitHub repo might make sense. I don't like going offsite with my stuff, so I'm not sure how I feel about that just yet.

2 hours ago, zeke said:

The company I work for uses a private GitHub account, and it seems to work great. The guy who manages the account says it only costs something like $20 a month, but that's for business; for personal use, I think it's a lot less. Couldn't hurt to check.

In either case, for a solid backup strategy it's always a good idea to have offsite backups too, in addition to local redundancy.
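One way to get both the local copy and the offsite copy with a single push is to give one git remote two push URLs; the host, path, and GitHub URL below are placeholders:

    # a single remote with two push destinations: the local box and a private GitHub repo
    git remote add backup backupbox:/srv/git/myproject.git
    # adding push URLs replaces the default, so list the local one explicitly as well
    git remote set-url --add --push backup backupbox:/srv/git/myproject.git
    git remote set-url --add --push backup git@github.com:yourname/myproject.git
    git push backup --all    # one push updates both copies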


Mac OS X: Time Machine and SuperDuper!.

Every few days I plug in the external USB drive, and Time Machine does its thing (backing up the files that changed). When that's done, I use SuperDuper! to make an image backup to a different partition on that same external drive. Once a month I plug in a different external drive (a WD My Passport Ultra) and clone the main drive to one of three partitions on it; it's round-robin, so I've got three monthly snapshots, plus Time Machine, plus the other backup. I also have an old FireWire drive that gets an occasional copy of the main drive.

I should store them in different locations, but they're all in the same room.

In your case, I think Time Machine could have helped, especially if, unlike my setup, you let it back up to a drive that's always connected; you would have had all the versions at your fingertips.

One more tip: always, always verify that the backups are good -- boot from them if they're supposed to be bootable. This is something that I'm lazy at, and I know one of these days it's going to bite me.
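On the Mac side, a couple of the built-in tmutil subcommands can help with that kind of spot-checking (run from Terminal; compare may need root or Full Disk Access on newer macOS):

    tmutil listbackups                             # what Time Machine thinks it has
    sudo tmutil compare "$(tmutil latestbackup)"   # diff the newest backup against the live disk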


I think it's important to always keep the OS and data separate. So maybe once, after setting up a fresh install, make a bootable image of the OS partition; after that, keep your data off the OS partition. With Linux, I usually mount /home or /home/username on a separate partition. In Windows, I create an OS partition (~500 GB or something) and partition the rest as a pure data drive.
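On the Linux side, that just means an extra line in /etc/fstab; the UUID below is a placeholder (blkid prints the real one for your partition):

    # /etc/fstab entry keeping /home on its own partition
    #   <device>                                 <mount>  <type>  <options>  <dump> <pass>
    UUID=1234abcd-5678-90ef-1234-567890abcdef    /home    ext4    defaults   0      2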


Ugh! It just gets worse the deeper I look into what data was lost.

I lost all of my Altium files for my marquee clock project.

All I have left are the gerbers I sent to the board fab house, because they have them stored on their server under my account.

Sigh.

 

 

 


Bummer is certainly an understatement.

As for my local backups, I run Macrium Reflect Pro (now v7 Home Edition). I have all of my code and EAGLE files on a 1 TB 10k rpm hard drive. Macrium does a full backup every day around 02:00 to an internal 5 TB drive, and then copies it to my server's 24 TB JBOD array. Finally, the source files get posted to my AWS account for "offsite" storage. My AWS account runs about $2.00/month, and the last upgrade for my five Macrium licenses was about $150. Money well spent.

All of the machines on my home network run daily backups. In general, I do a daily full system image of the boot drives, while the data files get incremental folder/file backups. I keep eight full backups of my system images; for the data files, there's a full backup done weekly, and I keep four full backups plus the incrementals for a month.

FWIW, Macrium's Windows PE rescue disk saved my butt last week when the system files became corrupted due to a problem with my aging RAM. I was able to recover the system and get completely back up and running after a day's effort (about 200 reboots while playing with BIOS HD settings...).
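The post doesn't say how the files reach AWS; one common route is the AWS CLI syncing a folder to an S3 bucket, roughly like this (the bucket name and path are placeholders):

    # mirror the local source tree into S3; --delete removes objects whose local files are gone
    aws s3 sync "D:/source" s3://example-backup-bucket/source --delete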

17 minutes ago, NurseBob said:

A kindred spirit!

Multi-level, multi-device backups, bravo!  :biggrin:

Btw, 100% agree on Macrium. It is a great product. It also makes drive upgrades, or switching from spinning drives to SSD, a breeze.

