What are you doing right now..?


I started using CMSIS when playing around with the STM32 discovery boards and it seemed OK. I wondered about using it for Stellaris programming, but the install flaked out and I never got any further. Has anyone got it working on Stellaris/Tiva?



Weirdly enough, I'm just starting to resurrect the BalanceBot project that I mentioned in that thread!


Oooh CMSIS...  I think everything that needs to be said has been said.   :smile:


As for what I'm doing right now, I'm doing software. Specifically, getting an OpenTag communication messaging API bridged to an iOS client. iOS via the "external accessory protocol" is a giant pain. I'm looking forward to the Android version.

  • 1 year later...

Preamp / Multimedia-center featuring:


Raspberry Pi running OSMC; will add an MSP430 for power control.


My preamp, modified for I2C control - I2C implemented in an MSP430G2553.


My RC5 remote, based on a MSP430G2312.


Tiva C as the main controller: 320x240 LCD panel, an old stepper motor as volume control & navigator (via QEI).


Keystone DAB (on a Monkeyboard) - awaiting delivery; I assume this will be the trickiest part to program for.






Previous design in background, new design in front:




Since this thread is back to life:


Right now, I am going to bed. Spent the night making a microscope camera from an old webcam, with optics from the defunct NTSC cam that came with the 'scope. Needed to set things up so the optics form an image, and mount the cam's sensor in the image plane while keeping it adjustable enough to align. Junk box to the rescue. Worked overnight because a) I started late afternoon, b) it is useful to be able to work in the dark, and c) there was a lot of fit and fiddle with the lathe and waiting for epoxy to set up.


The cam has such a small sensor that full frame is about 4mm at low zoom and about 400 microns at full zoom, versus 20+mm down to 2+mm via the eyepieces. This gives about 8 microns per pixel down to less than one. The resolving limit for the 'scope is about 2.5 microns, so, as I knew going in, this is far from a perfect solution. I really want a larger sensor and a better match for the optics, but you can't beat the price: about $3 for a PVC pipe part and three nylon screws I didn't have in the junk box. The rest was on hand.


The thing I can't figure out (and I may have to do some actual thinking and reading about things I haven't dealt with in close to 30 years) is why the focus is rock solid in the eyepieces when zooming, but changes at the cam port. That shouldn't happen, as the zoom stage is infinite-focus in and out and the cam-port optical paths are supposed to be length-matched. Unfortunately not my field, so, for the moment, I made my best compromise and will deal with it. The change is substantial: the focus in the eyepieces is with the objective roughly 100mm from the target, and the shift in the cam port is about 24mm over the zoom range. It should be maybe 1 or 2mm max.


Pics if I remember when I wake up.


Well, I finally woke back up, got real work done (one of the ways I make a living), and, for those that are interested, have pics of the $3-ish junk-pile microscope cam. I did finally get the focus in sync with the eyepieces; it's just a really tight focal plane. If anyone wants pics and a complete description, let me know. Otherwise, attached is a pic of the final product. And, yes, that is a handlebar clamp. The second time I pulled the camera off and had to re-align and find the focal plane, that went on to make it easy to find the location. I was going to make up a shim to go between the registration faces in the tube, but this is good enough, so I didn't bother.



Your DIY microscope? I don't understand.

A DIY camera for the 'scope. The 'scope itself is a Nikon stereo zoom I picked up for a gloatably low price, but it has an oddball camera port. Two ports, one left and one right, but the optics for them are non-standard, so there is no way I can afford commercial cam optics. The best price I have found used is still over $1000, without a camera. This is a trial run for a better quality and resolution setup.


While waiting for the Monkeyboard I have been busy porting the Keystone driver and writing a graphics library:



I have chosen to write my own graphics library that, shock horror, uses the heap for storing the widget descriptors.

It is fully event driven, currently supports Canvas, Frame, Button, List and Listelement widgets, and sits on top of RobG's LCD driver.


It integrates nicely with remote control and the stepper navigator, and I suspect it should work with a touch driver as well.


So far so good - I hope my code will work together with the Monkeyboard when it arrives...







@@bluehash The horizontal bar is simply an indicator for volume level, so no scrolling involved. The channel list is scrollable though, either via the navigator button or via remote control.


I am using CCS since I need a decent debugger, and also because I am familiar with the Eclipse IDE. I have a full CCS licence; the project compiles to over 70K of code - as I understand it, that is not possible with the free licence?


What about a post that starts off with a description of event-driven programming and how I have implemented it in my UI library? Maybe that can be helpful for those who are new to the concept.


My main() starts off by initializing some libraries before entering the event loop - after that I am at the mercy of the UI library and how I handle the events it delivers back:


while (1) {
    /* event-handling body truncated in the original post */
}

I have deliberately kept the UI library simple; it is fairly lightweight (about 900 lines of code) and it needs some kind of user input to drive it.

Currently that input is provided either by my remote control or the navigator button (a motor from an old disk drive); to be useful for others, I think it must be updated to respond to a touch screen as well. I have separated the remote and navigator code from the UI library itself, so that should be fairly straightforward.


The radio library is partly covered by an NDA (the protocol part), so its source cannot be released - however, a precompiled library & header files can be made available.



