
6 hours ago, zeke said:

I wish I'd had the option to recover the two months of work that I lost unknowingly. <sad trombone>

With this, I will at least have the option as I move forward.

This is where git could (possibly) come in handy. Let's say you need a specific file from a specific day, for whatever reason. Anyhow, I personally do not necessarily agree with the strategy you chose, but that is the point: we do not have to agree, because what you're getting is what *you* want. Or at minimum, what you think you want right now.
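
For example, something along these lines would pull back a file as it existed on a given day (the file name and date here are invented, just for illustration):

$ git rev-list -1 --before="2017-04-01" master -- main.c
$ git checkout <commit-from-above> -- main.c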

Still, I urge you to look at your process and think about it objectively, which I think is what Rick was also trying to do: finding an outside corner case where your backup strategy would fail you. But let's say it does not fail, and instead ends up copying 1000 slightly modified iterations of the same file . . . that may also be less than desirable, if the corner case just makes 1000 copies of the same file with a different timestamp, or whatever.

Really, what you need to think about is exactly what you want, and whether your strategy fulfills that want / need. For me, a file being saved when actual code differences have been made is a must. Meaning, if I change a single line, I may want that change to stick and be persisted. For you, that may not be appropriate?

So, for me personally, I think a local git repository may be exactly what I want, though having a redundant copy of that local git would also be necessary. For you . . . it may be different. Have I beaten this horse a bit too much?
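
P.S. In concrete terms, this is roughly the local-git-plus-redundant-copy setup I mean (all paths invented):

$ git init --bare /mnt/usb/projects.git
$ cd ~/projects && git init && git add -A && git commit -m "initial import"
$ git remote add backup /mnt/usb/projects.git
$ git push backup master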


Nope, the horse is alive and kicking still. :) 

I firmly believe that ideas need to battle other ideas to the death in order for the best idea to survive so it's all good with me.

I looked further into the versioning question because it is a valid and intriguing question.

I know that I did some rapid save/edit cycles while editing some graphics files yesterday, and this is what I found: Nextcloud has an interesting file structure.

The designers decided to separate the current version from the past versions by using directories.

The current version is in the "../nextcloud/data/zeke/files/" directory while the past versions are located in the "../nextcloud/data/zeke/files_versions/" directory.

The past versions are suffixed with a unique identifier. For example, the past versions of my graphics file are named:

  1. "../nextcloud/data/zeke/files_versions/logo.ai.v1492765859"
  2. "../nextcloud/data/zeke/files_versions/logo.ai.v1493766306"

The current version is just "../nextcloud/data/zeke/files/logo.ai".

This seems to suggest that there is some way to retrieve an older version of a file but I have not looked into that yet.
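
Those version suffixes look like Unix epoch timestamps; if that is what they are, date(1) will decode them:

$ date -u -d @1492765859
Fri Apr 21 09:10:59 UTC 2017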

So, to fully answer your question @Rickta59: if I did this:

$ for rev in $(seq 1000); do echo $rev > reallyimportant.c; sleep 1; done

Then I would expect the files_versions directory to start filling up with files named reallyimportant.c.v<nnnnnnnnnn>.
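
It should be easy enough to verify afterwards with something like:

$ ls ../nextcloud/data/zeke/files_versions/ | wc -l

though the count might stop short of 1000 if Nextcloud starts expiring old versions to save space.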

How does that sound?

 


I took a quick glance at the feature list and noticed that it is supposed to support that feature. I thought I read a further comment stating that at some point it starts throwing away old versions to make room. I didn't dig deeper. I thought you might have.

While it is all well and good to back up files as they change, for some scenarios it would make sense to create a backup only if the contents change. However, I can imagine other situations where it would be important to note file timestamp changes.
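
rsync draws exactly this distinction, for what it's worth: its default quick check compares file size and modification time, while --checksum compares actual contents, so something like

$ rsync -a --checksum ~/work/ /mnt/backup/work/

skips re-copying files whose contents are unchanged even if their timestamps have moved.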

I've noticed that simplistic approaches to backup schemes lead to data loss while giving the user a false sense of safety. They only find out about an unrecoverable data loss after the fact.

Write open source software, share it with all your friends. If you ever lose a file you just reach out for a little help from your friends.

 

9 hours ago, Rickta59 said:

I took a quick glance at the feature list and noticed that it is supposed to support that feature. I thought I read a further comment stating that at some point it starts throwing away old versions to make room. I didn't dig deeper. I thought you might have.

While it is all well and good to back up files as they change, for some scenarios it would make sense to create a backup only if the contents change. However, I can imagine other situations where it would be important to note file timestamp changes.

I've noticed that simplistic approaches to backup schemes lead to data loss while giving the user a false sense of safety. They only find out about an unrecoverable data loss after the fact.

Write open source software, share it with all your friends. If you ever lose a file you just reach out for a little help from your friends.

 

If someone loses data because their strategy is simplistic, then they're not thinking about how to do backups properly (not thinking the process through fully), or they're doing something silly. You know the reflow oven wulf and I fiddled about with 5 or so years ago? I still have that code safely tucked away on a removable USB hard drive. My backup strategy? About as simple as it gets: manually copy files directly to the USB hard drive. Is it foolproof? Not by a long shot, but it works for me.

Now all my BeagleBone embedded Linux code sits on a dedicated server that's actually an Asus Eee PC with a broken screen. But . . .

william@eee-pc:~$ uptime
 17:21:38 up 190 days, 22:13,  2 users,  load average: 0.00, 0.01, 0.05


I'd say it's doing a pretty good job of what it's doing. Do I have a redundant copy of all my work? In some cases, kind of, but not really. A lot of my important code is in private repositories on GitHub. Some of the other "important" stuff is not redundant, but the code I classify as "important" is mostly experimental code that I wrote while getting familiar with *something*, be it CAN bus, SMBus/I2C stuff, or whatever.

By the way, the reason that server only has 190 days of uptime is that I shut it down while I was on vacation for a month . . . unplugged it from power, and from the network while I was at it too (thunderstorms . . .).

9 hours ago, zeke said:

By the way, I just looked into the cost of a paid github account and the price is $7/month or $84/year.

That gives you unlimited private repos.

Alternatively, GitLab Community Edition is freely available if you want to self-host. I'm researching this option right now.

It sounds interesting. Having a web-based GUI could be useful, if that's what it is. Quite honestly though, and keep in mind I'm by far not a git guru, I prefer to use tools of this nature from the command line. I find them much quicker that way, and it keeps me in touch with how the tool is used, so if someday I wish to script something up, it should be trivial. Scripting is something else I've been doing a lot of, and it lends a lot of readability for anyone who knows their way around a Linux system. When working with others, like at work, I designed a complete monitoring system for production boards that anyone can read and understand fairly quickly, even the boss, who is actually an old-school embedded *NIX systems software developer.

Anyway, I'm not doing a local backup through git right now, but I have invested a considerable amount of time reading about it over the last year or so, so it's just a matter of time before I follow through. Then, like I mentioned in a previous post, I'll have a completely separate partition, and most likely a completely separate disk, that I mount only when making backups, probably using something like dd to make 1:1 copies, or perhaps just tar. So mount -> backup -> umount. For me, this will be very robust, and about as bulletproof as I would care to get. However, the system doing the backups would require a UPS, and may just end up being a laptop for the built-in battery power in case of power failures. Which is partly why I have not followed through with this yet: I want everything runnable during a power failure, and I do not have the required circuitry, and possibly code, designed yet. A friend of mine (not wulf), however, has been talking seriously about a design for a while now. Something he'll design in hardware, and something I'll design in software, assuming software will ever be needed.
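
The mount -> backup -> umount flow would be roughly this, though every device and path here is invented for the sake of the example:

$ sudo mount /dev/sdb1 /mnt/backup
$ sudo tar -czf /mnt/backup/home-$(date +%F).tar.gz -C /home william
$ sudo umount /mnt/backup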


Yeah, after watching this video on GitLab: https://www.youtube.com/watch?v=PoBaY_rqeKA

I've concluded that, for now, it's a bit more complex than I want. Granted . . . git itself is fairly complex too, but GitLab would be something else I'd have to read up on, and keep up with, when all I really want is to feel warm and fuzzy about my data being there when I need it. I may even think about making my backup strategy simpler than what I currently think it should be. I may even forgo git altogether and just use rsync in conjunction with dd or tar, for three layers of redundancy. But I do like the idea of versioning . . . a lot.
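
Something as blunt as this, pointed at two or three different disks, would cover those layers (paths invented):

$ rsync -a ~/projects/ /mnt/backup1/projects/
$ rsync -a ~/projects/ /mnt/backup2/projects/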


Hi,

For the projects I need to access from multiple places (and that need some privacy), I've configured a small git server and put gogit/gogs *) on top of it.

Because the core is git, I can use it from the command line, while gogit gives me the possibility to use it in a way similar to GitHub and (the reason I use it) allows me to define "virtual users": I can let other people access the private repositories without sharing my (main) credentials.
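
From the command line it behaves like any other remote (the server and repository names here are just placeholders):

$ git remote add origin git@git.example.com:liviu/project.git
$ git push -u origin master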

I am lucky to have good hosting that offers all the needed tools.

Cheers,

Liviu

*) I suppose any "interface" would do the same; I used gogs because I found a tutorial (in German) about installing it on my server.


Backup process? Manual: copying to several separate devices (NASes, temporarily mounted via sshfs). Currently all of them are in my apartment, so it will not help if there is a burglary or a fire. Also, it won't help if I forget to copy to the backup devices.
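
The routine is roughly this, with invented host and path names:

$ sshfs nas1:/backup /mnt/nas
$ rsync -a ~/work/ /mnt/nas/work/
$ fusermount -u /mnt/nas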


It has been a long time since I started this thread.

I am happy to report that Nextcloud is still working very well for me.

It is reassuring that I can make a change on one machine and then have that change sync over to another machine almost instantly.

Given that spinning hard drives have been getting less expensive and larger at the same time, I would highly recommend Nextcloud to anyone who needs to keep their data safe.


An additional follow-up.

The Altium data that I lost was my Marquee Clock schematics and PCB project. That really hurt.

I spent some time in February redoing the ring PCB and CPU PCB designs. I attempted to redo the power supply PCB, but I just couldn't figure out which TI power supply IC I had used, so I gave up on it.

So now I can move forward again on the clock. This time, I am going to add a BQ32000 RTC IC.

47 minutes ago, zeke said:

An additional follow-up.

The Altium data that I lost was my Marquee Clock schematics and PCB project. That really hurt.

I spent some time in February redoing the ring PCB and CPU PCB designs. I attempted to redo the power supply PCB, but I just couldn't figure out which TI power supply IC I had used, so I gave up on it.

So now I can move forward again on the clock. This time, I am going to add a BQ32000 RTC IC.

Clearly, you are having too much fun.  :(

Much of your commentary is reminiscent of my CPU/motherboard failure in December 2017. While I had decent backups of all my design files, I could not get Windows to agree to reinstate itself from backup, so it was a new install on a new CPU/MB, with a Windows.old folder. The only real loss was an encrypted, password-protected text file I use for a lot of "housekeeping" details. That was, for me, uncrackable. Luckily, I found an older version on another machine on my home network.

Back up and running with photo/video editing and PCB experiments. The fun never ends.


Misery loves company :wink:

Still, I believe there is a reason for everything that happens to me.

In this case, I now have a decent file synchronizing system in my office and I now have a much better code base for all my future projects.

 


> In this case, I now have a decent file synchronizing system in my office and I now have a much better code base for all my future projects.

Likewise, I have reorganized my entire file/folder layout to conform to what Windows "wants," and while it's not really the way I think, it has made my own maintenance simpler. Having 8 hard drives in the PC, plus another 16 in storage arrays, allows for things to easily get misplaced...

As to my current backup procedure, I'm still happy with Macrium Reflect for the machines in the house, and I use S3 Browser Pro to push files offsite to Amazon AWS S3 buckets for each machine. And, as I keep taking more photos, they all get uploaded there as well. Luckily, I don't have the complexity of having to sync files between various locations.
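
(For anyone curious, the CLI equivalent of that push would be something along the lines of the following, with an invented bucket name, but the GUI suits me fine:

$ aws s3 sync ~/Photos s3://my-backup-bucket/photos/

)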


I sense that you are managing a lot more data than I can imagine. Have you checked out /r/datahoarder yet?

As far as synchronizing the data goes, it happens automatically. I just have to make sure that each workstation is turned on and that the Nextcloud service is still running.

I should be watching the health of the hard disks as well, but I have to do that manually. Seagate drives make me shiver. I have a 3TB spinning disk in my workstation and I feel it's a ticking time bomb waiting to go off and obliterate my data. Note to self: replace the Seagate with a new SSD.
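
With smartmontools that manual check is a one-liner (device name assumed):

$ sudo smartctl -H /dev/sda

which prints something like "SMART overall-health self-assessment test result: PASSED".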

