Everything posted by zeke

  1. @admirlk I'm sorry you are having so many challenges with Code Composer Studio! It has happened to me in the past, so I know how you feel. If you don't achieve success tonight, might I suggest you check out CCS Cloud? Make sure to log in with your TI username and password. It will give you an online CCS instance to work in. To debug, install the TICloudAgentBridge in your browser along with its local helper program. I just did that today on my Ubuntu box. It works well. Best of luck!
  2. @makiyang614 I sure am hoping you meant to say LaunchPad rather than Arduino, since this site is dedicated to the TI LaunchPad in its various forms. That said, can you tell us whether you are using a TI LaunchPad or an Arduino, please?
  3. I didn't even know they existed until I read your post! Maybe I can use a freight forwarding service? Hmmmm....
  4. It even says ZEK on the CC3200! It must be calling me.
  5. Last night, I discovered that one of my source code repos had been lost. Today, I confirmed that I had no backup of that repo. I feel terrible because I know it was my fault: I should have had a backup process in place to prevent this kind of loss. So now I am wondering, what is your data backup process? How do you ensure your data is not lost in the worst-case scenario?
  6. By the way, I just looked into the cost of a paid GitHub account: the price is $7/month, or $84/year, and that gives you unlimited private repos. Alternatively, GitLab Community Edition is freely available if you want to self-host. I'm researching that option right now.
  7. Nope, the horse is still alive and kicking. I firmly believe that ideas need to battle other ideas to the death for the best idea to survive, so it's all good with me. I looked further into the versioning question because it is a valid and intriguing one. I know that I did some rapid save/edit cycles while editing some graphics files yesterday, and this is what I found. Nextcloud has an interesting file structure: the designers decided to separate the current version from past versions by directory. The current version lives in "../nextcloud/data/zeke/files/" while past versions live in "../nextcloud/data/zeke/files_versions/". Past versions are suffixed with a unique identifier. For example, the past versions of my graphics file are named:

   "../nextcloud/data/zeke/files_versions/logo.ai.v1492765859"
   "../nextcloud/data/zeke/files_versions/logo.ai.v1493766306"

   The current version is just "../nextcloud/data/zeke/files/logo.ai". This seems to suggest there is some way to retrieve an older version of a file, but I have not looked into that yet. So to fully answer your question @Rickta59, if I did this:

   $ for rev in $(seq 1000); do cat /dev/null > reallyimportant.c; echo $rev; sleep 1; done

   then I would expect the files_versions directory to start filling up with files named reallyimportant.c.v<nnnnnnnnnn>. How does that sound?
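   As an aside, the v<nnnnnnnnnn> suffixes on those version filenames look like Unix epoch timestamps. That's an assumption based on the values themselves, not something from the Nextcloud documentation, but it is easy to check with GNU date:

```shell
# Assumption: the version suffix is seconds since the Unix epoch.
# Decoding the suffix of the first logo.ai version above:
date -ud @1492765859 +%Y-%m-%d    # prints 2017-04-21
```

   If that date lines up with when the file was actually edited, then each suffix is simply the save time of that version.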
  8. I hear you. I understand. You're right. I appreciate your honesty. Thanks for keeping it real.
  9. I wish I had the option to recover the two months of work that I lost unknowingly. <sad trombone> With this, I will at least have the option as I move forward.
  10. So I have finally got my new real-time file backup system installed and operational.

   The file server details:
   OS: Ubuntu 16.04.2
   Storage: 8TB Western Digital Red
   Software: NextCloud server

   The client details:
   Client 1 OS: Win10
   Client 2 OS: macOS
   Client 3 OS: Ubuntu 16.04.2 Desktop
   Software: NextCloud client on each

   Usage:
   Using the NextCloud client, sign into the server
   Select the local directory you want to back up
   Add it to the server
   Stand back and it will copy everything over to the server automatically

   I have observed that if you create a new file in the local monitored directory, the NextCloud client will almost immediately copy it over to the server without your interaction. I like that. If desired, you can set up another client machine and have it replicate files from the server to itself locally. Multiple redundancy. So far, this system has transferred over 110GB to the server unattended. This configuration will back up the files that I generate on a regular basis. Now I want to set up git and subversion on this same server so that I can take care of files generated during software coding (git) or hardware design files generated by Altium (SVN). So far, I like NextCloud and it fits my work processes.
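   Since I mention git and SVN on the same box: the server side of the git half is tiny. A sketch only, and the repo path and name here are made up for illustration:

```shell
# Create a bare git repository on the server. "Bare" means it has no
# working tree, which is what a central repo you push/pull against
# should be.
REPO_ROOT=$(mktemp -d)                 # stand-in for e.g. /srv/repos
git init --bare "$REPO_ROOT/myproject.git"

# Clients would then clone over ssh with something like:
#   git clone ssh://user@server/srv/repos/myproject.git
# The Altium/SVN side would be `svnadmin create` plus svnserve or
# Apache mod_dav_svn, analogously.
ls "$REPO_ROOT/myproject.git"          # HEAD, config, objects, refs, ...
```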
  11. @yyrkoon Sometimes, when I feel like you do, I go and have a nap until they get off my lawn. ;-) Don't stop helping us understand stuff. You are awesome, dude!
  12. This is my schematic for a typical MSP430 programming setup. Notice the resistor and capacitor on the SBWTDIO line? Compare that to your schematic. What do you see?
  13. I think you need to supplement this post with some photos of this gear. I would like to see what you are working with. What do you think?
  14. If they can be set up as a point-to-point link, then there shouldn't be any problem establishing a data link between the end points. Carefully selected and aligned Yagi antennas ought to overcome foliage loss.
  15. I have heard of this but I haven't had a chance to play with it yet. What do you think your desired range, data rate, and power consumption numbers ought to be?
  16. I use the G2553 when the client (who could be me) needs only that small amount of resources to get the job done. Essentially, it's all about the cost of the chip in production.
  17. I am not a fan of RAID arrays either. I see them as pointless now that new drives are so big and fast compared to 15 years ago, when RAID actually made a difference; back then it was all about transfer speeds and creating large amounts of disk space. If I really, really, really cared about backing up stuff, I would go find an LTO tape drive and figure out how to use it. I have had both Western Digital and Seagate drives fail on me. The latest failure (this week) was a Seagate ST3000NC000 3TB. I believe the magnetic media is peeling off the platters inside. I have tried to rescue it numerous times over this past week, but nothing is helping. I tried getting it to map out all of the bad sectors, but there are just too many of them. I tried getting gparted (on Linux) to make a new GPT, but the disk refuses to co-operate. I tried getting warranty service, but it's over a year out of warranty, so I'm out of luck there. I may go harvest the magnets out of it. So, I checked out the Backblaze hard disk failure data, because they beat the snot out of consumer-grade hard drives, and I wanted to see which drive had the lowest failure rate for them. The Western Digital 8TB drives had zero failures, so that's what I went and purchased yesterday. I am still not certain which backup/syncing process I am going to employ, but I am leaning towards setting up a separate Linux box with the 8TB drive inside, installing NextCloud on it, and then syncing against it with all of my clients. I am aiming for the least amount of effort with the greatest amount of success that works for Windows, Mac, and Linux clients. The Lazy Way can be The Most Efficient Way ;-)
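   For anyone else attempting the map-out-the-bad-sectors route before giving up on a drive, this is roughly the dance I mean. A sketch only: the scan here runs against a throwaway image file so nothing real is at risk, whereas on an actual drive the target would be the (unmounted!) /dev/sdX device:

```shell
# Read-only bad-block scan with badblocks (from e2fsprogs).
# Target is a scratch image file here; on a real disk it would be
# the unmounted /dev/sdX device instead.
IMG=$(mktemp)
dd if=/dev/zero of="$IMG" bs=1M count=4 status=none

badblocks -sv "$IMG" > bad-sectors.txt

# On an ext2/3/4 filesystem you could then feed the list to e2fsck
# so those sectors are never reused:
#   e2fsck -l bad-sectors.txt /dev/sdX1
wc -l < bad-sectors.txt    # 0 lines on a healthy target
```

   When the list runs to thousands of sectors, as it did for me, the drive is past saving and the magnets are the only salvage.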
  18. Oh, I did find one more open source option worth mentioning, called Unison. It doesn't suit my use case, but it is a well-regarded possibility.
  19. @NurseBob I had been playing around with GitHub, but not every program I use can work with GitHub (e.g. Altium), so I may have to set up a subversion server as well. Also, I am trying not to depend on an external cloud storage option, so I went to a local supplier today and purchased a WD 8TB Red NAS drive. This will allow me to set up my own cloud storage and sync files against it. Since I want to set something up locally, I looked for the following:

   Self-hosted
   Linux client
   Windows client
   Mac client
   Open source

   These are the possible programs I found today:

   Nextcloud
   Owncloud
   Seafile
   Pydio
   Syncthing

   I've installed Nextcloud, but I haven't tried it out yet because I've run out of time today. The important thing will be the automated process of syncing/backing up all the files from the different sources.
  20. Wow! My head is spinning trying to understand your description of your backup processes. It's cool, though; it looks well thought out. My challenge is coming up with a process that will work on Windows, Mac, and Linux simultaneously. I am reading about AWS S3 services. It's bewildering because of all the options available. It would be cool if there were an application that could run on Windows, Mac, and Linux and orchestrate the backup process.
  21. Ugh! It just gets worse the deeper I look into what data was lost. I lost all of my Altium files for my marquee clock project. All I have left are the gerbers that I sent to the board fab house, because they keep them stored on their server under my account. Sigh.
  22. Whoops. I forgot that detail. I must be getting old. So this is I2C over CANBUS then. Sorta. I have never worked with CANBUS. Can it cover that distance at that speed?
  23. My gut instinct tells me that your cable will have to be 50-ohm coax to carry 100kHz over 300 meters. Only quite slow signals can travel long distances (e.g. RS-485). I would be inclined to use an MSP430 on your cape as the I2C bus master. The BBB could talk to it as if it were a slave serial device. That would isolate the BBB from the slow-speed pathway; the BBB could just poll the MSP430 for any new data.
  24. Just in case I didn't make it clear, the speed of the simulated I2C-over-1Wire ends up being about 15kHz. It's not fast but it does work over long lengths of cable and that is pretty darn cool.