Entry into Machine Vision

To paraphrase Helen Keller: the only thing worse than being a blind robot is to be a robot with sight, but no vision. 

Machine vision is the combination of methods for taking in video or still-image data and translating it into a meaningful interpretation of the world that a computer can work with. Some of the most popular applications include face recognition, glyph tracking, 2D barcodes, navigation, and, in recent years, artificial intelligence algorithms that can describe the objects in a picture. My thinking is that if I can understand the tools of machine vision, then I can extend those tools to robotics problems I’m working on, like object grasping, manipulation, and navigation. Machine vision algorithms are built into many sophisticated software packages like MATLAB or LabVIEW, but these are typically VERY expensive, making them completely inaccessible to me. Fortunately, the principles of machine vision are well documented outside of these expensive software packages, so at least there’s hope that, if I work for it, I can build up my own library of machine vision tools without spending a fortune.

Since I’m building up this library for myself, I want to avoid having to rewrite the programs for whatever hardware I might have connected to a robot, whether that’s a Windows or Linux machine or a system-on-a-chip computer like the Raspberry Pi, BeagleBone Black, or Intel Edison. My programming experience has been on Windows computers, so I realized the languages I’m familiar with wouldn’t be directly useful. I chose Python because it’s open source, free, well supported, popular, and available on all of the hardware platforms I’m concerned with.

I chose finding a color marker as my first venture into the murky waters of machine vision. The problem is pretty simple: use the webcam to find a color, send the coordinates to a robot arm, and move the robot arm. Au contraire, mon frère. That is not all.

This looks easy, doesn’t it?

The first hurdle to overcome is capturing the webcam data. Fortunately, the Python module Pygame is incredible for doing just that. It allowed me to capture and display a live feed of images from the webcam and superimpose drawing objects over the images so I could easily understand what the program was seeing. Most of the code I used came from Pygame’s tutorial programs in one form or another. In the picture above, you can see the webcam image with a superimposed green dot marking the center of the marker.

The second battle to face is lighting. When we look at an object, we are actually seeing the light it reflects. That means when you change the light level or the color content of your light source (for example, yellow instead of white), your robot suddenly gets lost because the color it “sees” is different from the color it expects. So now we have to add a step to calibrate the vision system against its marker every time we run the program, in case the light levels have changed between runs.
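A minimal sketch of that calibration step (the function names are my own, and the pixel format is an assumption): at startup, sample the pixels over the marker to learn its color under today’s lighting, then match pixels by color distance with a tolerance.

```python
# Calibrate against the marker under the current lighting, then
# match by squared Euclidean distance in RGB space.

def average_color(samples):
    """Mean RGB of the (r, g, b) pixels sampled over the marker."""
    n = len(samples)
    return tuple(sum(px[i] for px in samples) / n for i in range(3))

def matches(color, target, tol=40):
    """True if color is within tol of the calibrated target color."""
    return sum((a - b) ** 2 for a, b in zip(color, target)) <= tol ** 2
```

Re-running `average_color` at every program start is what absorbs the day-to-day lighting changes; `tol` trades off false positives against losing the marker in shadow.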

The next challenge comes in the form of coordinate systems. When we look at the image the webcam captures, we can get the position of the object in what I’ll call webcam coordinates: the count of pixels along the x- and y-axes from one of the corners. However, the robot arm doesn’t know webcam coordinates. It knows the space around it in what I’ll call robot coordinates, the distance around the base measured in inches. For the computer to give the arm something that makes sense, we have to translate webcam coordinates into robot coordinates. If your webcam and arm have parallel x- and y-axes, the conversion may be just scaling the pixel count linearly. If the axes aren’t parallel, a rotation is needed as well. I kept the axes parallel, laid a ruler on the surface to “measure” the distance the webcam sees, and divided by the number of pixels along that axis.
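With parallel axes the whole conversion collapses to a scale and an offset. This sketch uses example numbers, not measurements from my setup:

```python
# Webcam-to-robot conversion assuming parallel axes: one ruler
# measurement gives the scale, then it's just multiply and shift.

INCHES_SEEN = 12.0     # ruler length visible across the image (example)
PIXELS_ACROSS = 480    # pixels spanning that ruler (example)
SCALE = INCHES_SEEN / PIXELS_ACROSS   # inches per pixel

def webcam_to_robot(px, py, origin=(0, 0)):
    """Convert a pixel position to robot coordinates in inches,
    measured from the pixel sitting over the robot's origin."""
    return ((px - origin[0]) * SCALE, (py - origin[1]) * SCALE)
```

If the axes weren’t parallel, you would apply a 2D rotation to the result before handing it to the arm.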

The final roadblock is when you go to make the arm move to the location you’ve given it. The solution to this problem could be as simple as geometric equations or as complicated as inverse kinematic models. Since both methods relate to the movement of the arm, I’ll just call them both kinematics. Even though this will probably be the hardest challenge to overcome, you should probably take it on first, since you’ll be able to use the kinematic model of the arm to simplify many other programs you may have for the same arm.

Geometric solution to the kinematic equations


The idea behind kinematic modeling is that you want to write a group or system of equations that tell you what angles to move the joints to so the end of the arm is in a particular position and orientation. In general terms, if you want a robot that can move to any position in a plane (2D), it needs at least 2 degrees of freedom (meaning it has 2 moving joints); if you want to reach any position in a space (3D), it needs at least 3. My arm has 5 degrees of freedom (5 moving joints), which makes a purely mathematical inverse kinematic model particularly complicated: I can move to any position in the space around my robot, but the equations have multiple solutions. I chose to constrain the arm to 3 degrees of freedom by forcing the gripper to hold a particular orientation. Then it became easier to model geometrically.
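The geometric approach can be seen in the textbook two-link planar case, which is what the constrained arm reduces to in its vertical plane. This is a sketch of that standard law-of-cosines solution, not the exact equations for my 5-DOF arm:

```python
import math

def ik_2link(x, y, l1, l2):
    """Geometric inverse kinematics for a two-link planar arm with
    link lengths l1 and l2. Returns (shoulder, elbow) in radians,
    one of the two mirror-image solutions."""
    d2 = x * x + y * y
    if d2 > (l1 + l2) ** 2 or d2 < (l1 - l2) ** 2:
        raise ValueError("target out of reach")
    # Law of cosines gives the elbow angle directly.
    elbow = math.acos((d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2))
    # Shoulder angle: direction to the target, minus the offset
    # the bent elbow introduces.
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A quick sanity check is to run the result back through the forward kinematics, x = l1·cos(s) + l2·cos(s+e), and confirm you land on the target.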

The program I wrote works fairly well. It’s able to find objects of many colors and it’s pretty entertaining to watch it knock over the marker and chase it outside the field of view. I demonstrated it at the August meeting of the Robotics Society of Southern California which meets every month on the second Saturday.

RSSC Machine Vision Demo


If you have any suggestions on an application for color marker tracking or if you’d like to know more about this project, please leave a comment below.

That was my project day!

If you liked this project, check out some of my others:

My Thanks to Mr. Clavel

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

 


My Thanks to Mr. Clavel

I spent some time working on this backburner project over my Christmas vacation. It’s nowhere near finished, but it functions, so I thought I’d share what I have.

This robot arm configuration is called a Clavel delta arm, and where you might think of a typical robot arm as a series of links in a chain, this is a parallel design. Well, strictly speaking, my robot arm is missing the parallelogram lower links that are a key feature of Reymond Clavel’s design, but that’s a future improvement and I’ll get there.

My goal for this project is to develop the mechanics and controls to make this arm functional. I’ll be adding the parallelogram lower links and developing the kinematic equations so I can drive this arm by Cartesian (x, y, and z position) or cylindrical (r, theta, and z position) coordinates.
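Supporting both coordinate systems is mostly a matter of converting between them before the kinematics run. This sketch is just that standard conversion, not the delta arm’s kinematic model itself:

```python
import math

def cylindrical_to_cartesian(r, theta, z):
    """(r, theta, z) -> (x, y, z); theta in radians around the base."""
    return (r * math.cos(theta), r * math.sin(theta), z)

def cartesian_to_cylindrical(x, y, z):
    """(x, y, z) -> (r, theta, z)."""
    return (math.hypot(x, y), math.atan2(y, x), z)
```

With these in place, the inverse kinematics only ever need to be written once, in whichever coordinate system is more natural.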

That was my project day!


Hot Rod Red Robot Controller

I’ve always believed in working smarter, not harder. So when I do a task more than once, I take a minute to consider whether I could make life easier for myself with a tool, gadget, or program like this controller that will keep me moving forward instead of stuck in the doldrums.

When I’m working with my PICAXE, Arduino, Raspberry Pi, or even just PLCs, I keep finding myself building the same kinds of prototype circuits over and over again. Circuits like switches, buttons, and potentiometers as voltage dividers or current-limiting devices come up all the time. I’m sure if you’re electronically oriented, the same has happened to you. Instead of letting these parts clutter up my solderless breadboard, I decided to make a controller housing the devices that I could simply wire into my prototypes… and do it with style!

Since I’ve been building prototype circuits with these components for years, and they’re mostly straight connections, the electrical plan took no effort at all. The real challenge of this project was planning how the components I wanted would all fit on a single panel. On one extreme, I could make a big, obnoxious contraption with everything I could possibly ever need, but completely unwieldy; on the other end of the spectrum, something so small and specific that it isn’t useful. Aside from that trade-off, I already have more than enough prototyping components, so one self-imposed limitation was that I didn’t want to go nuts buying all new stuff. That meant I’d have to build the project around the two massive industrial joysticks I have. If space is at such a premium, then why two joysticks, you ask? “To control robots,” I would answer. In the end, the limiting factor for every dimension of the panel was the size of the joysticks. I managed to fit two switches, two potentiometer / rotary selector switch knobs, and five push buttons in the space between.

Enclosure Base Complete with Unused Hinges


The build started off as a box with feet and a hinged lid that the components would mount to. That was going to give me the flexibility to easily open the cover and make changes if needed. The hurdle with that design is that the lid, being made from very thin aluminum, would need reinforcement so it didn’t flex every time you touched it. There are also the pointy corners to consider. Every iteration of a supported lid I came up with was clunky, complicated, or both, so I decided a fixed lid was the way to go and I’d just have to deal with reaching through the controller to make changes. Between the easy-to-manage handy-panel siding and the square stock used for bracing and the legs, it took very little time to build the enclosed bottom of the controller. The hardest part was visualizing how I’d interact with the controls, planning where to put them, and allowing for the possibility of changes in the future. For this project, and any others you might have dealing with sheet metal and drilling holes, I recommend buying a set of step drills. Not only do they make much larger holes than you can practically make with general-purpose drill bits, they also deburr the hole after cutting it.

Finished in Shiny Red


After I had the enclosed base fitted with the aluminum plate, I test-fitted all of the parts to make absolutely sure everything fit, then painted the whole thing with red automotive paint. It’s not my best paint job, but it gets the job done. I may repaint the base or the plate in a different color to give it some personality.

Going forward, this will be great for prototyping. If I need a joystick in a permanent build, though, I think I’ll go for the mini joysticks available from Parallax or others instead of using a joystick almost as large as the robots I build.

That was my project day!


Give aging technology a chance

Think twice before throwing out your old tech. Perhaps that disused piece of junk can have a second chance in your next DIY project.

Let’s rewind the clock to 2001: NASA lands the first spacecraft on an asteroid, Gladiator wins the Academy Award for Best Picture, and Handspring releases the Visor Neo, the company’s affordable PDA competitor to the Palm Pilot. At the time, this baby was screaming (for hand-helds) with a 33MHz processor, 8MB of RAM, and a high-resolution 160×160-pixel gray-scale display. The following year, I was lucky enough to get this high-tech piece of portable technology in a barter deal for helping a neighbor learn to use their own PDA.

I kept my Visor in working order over the years, and it was still limping along when smartphones became affordable and I didn’t need it any more. Once it was clear that my old friend was no longer useful as a practical tool, I looked into selling it on eBay, but found more equivalent devices that went unsold than sold, so the choice was either throwing it out or finding something else to do with it. Since I’m not inclined to ‘e-waste’, I started digging around the internet for ways to repurpose it. Handspring made its niche in the PDA market with the Springboard expansion slot, so I was really interested in figuring out how to hack it (the device has a docking interface and an IR transceiver, too). Given its age, I had a LOT of broken links to sort through, but I eventually found NS Basic/Palm, PocketC for Palm, Palmphi, and the Palm OS Emulator (POSE). It took a while, but I was finally able to get POSE running with a ROM I found online and use it to explore the potential of those programming languages on the go. I found out that Handspring once had a development kit for the Springboard expansion slot, but was disappointed to find that it was no longer available from Palm.

While I was excited at the prospect of writing apps for a new platform, I realized there was no way I’d be carrying the Visor around with me daily, so aside from can-I-do-it curiosity, the Visor seemed doomed to be recycled. The turning point was discovering that the Handspring PDAs (and, I suspect, other Palm Pilots as well) used serial protocols to sync data with the PC. If you look at the circuit board inside the serial docking station, the connection runs straight from PDA to PC; I suspect the only addition in the USB docking station is an FTDI USB-to-serial converter. The most exciting discovery was that PocketC had a couple of functions that let the Visor establish the serial connection and push data independent of syncing. I suppose I should have figured this was possible, since the Targus Stowaway keyboard connected through the docking port. This was such a revelation: if serial comms were possible, then I could connect this little computer to anything with a serial port… The possibilities were endless.

A few wires and a Molex connector were all I needed to tap into the serial connections on the cradle

 

I decided that the best thing I could do was relieve my computer of the burden of driving my Lynxmotion robot arm. Since the development software I had didn’t include objects like buttons and sliders, I created my own. Using basic drawing features like boxes, lines, and text, I built a GUI with buttons, indicators, and a slider so I could open the serial port and individually control each servo in the robot. In the process of developing my program, I realized I needed a controller that could read data from sensors if I wanted the arm to do anything useful, so as it turns out the Visor wasn’t a good long-term fit for the robot arm, but the device had proven itself. Who knows what else I might be able to use it for in the future? Maybe it’s going to be my next universal remote control? I still haven’t figured out how to dig into the Springboard expansion slot, but I suspect it will unlock even bigger potential for my little friend. Perhaps the next step is to reverse engineer the GPS expansion I bought years ago but could never get to work.
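The serial side of driving each servo can be sketched briefly. I’m assuming an SSC-32-style servo controller on the arm here (a common pairing with Lynxmotion arms, though the post doesn’t say which controller was used); the commands are plain ASCII, which is why any device with a serial port can send them:

```python
def servo_command(channel, pulse_us, time_ms=None):
    """Build an SSC-32-style command: '#<ch>P<pulse>' positions one
    servo by pulse width in microseconds; optional 'T<ms>' sets the
    move duration. Terminated by a carriage return."""
    cmd = "#%dP%d" % (channel, pulse_us)
    if time_ms is not None:
        cmd += "T%d" % time_ms
    return cmd + "\r"

# On a PC this string would go out through pyserial, e.g.:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 9600)
#   port.write(servo_command(0, 1500, 1000).encode())
# On the Visor, PocketC's serial functions play the same role.
```

Because the command is just text, the GUI slider only has to map its position to a pulse width and hand the string to the port.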

My standalone robot arm with its new(ish) computer brain

 

The big lesson I learned here was that your next project doesn’t have to start with $100+ in new electronics; maybe it starts with that old gadget you have lying around with lots of hidden potential.

That was my Project Day, how was yours?