Tracking Movement Optically And Cheaply


Many machine sensors like accelerometers and gyroscopes have fallen in price because of the popularity of smartphones. Low-cost cameras and optical sensors have benefited similarly, since smartphones incorporate camera sensors for photography and video conferencing.

Almost a decade ago, my daughter built a number of robotic science-fair projects with a CMUcam, which was developed at Carnegie Mellon University. It used an RGB camera, and her code converted the RGB data into HSI (hue, saturation, intensity) to more readily track changes in the image. Back then, the technology delivered results measured in seconds per frame.

I helped turn her Java code into assembler and put it on the CMUcam, replacing CMU's code. The resolution was still low, but results arrived at tens of frames per second.

Fast forward to the Pixy CMUcam5, also developed at CMU along with Charmed Labs (Fig. 1). Its OmniVision OV9715 RGB camera is a supported automotive camera, so it should remain available for a while. The sensor can capture a 1280- by 800-pixel image, but the system processes at a lower resolution so it can handle a higher frame rate.

The board is built around a 204-MHz NXP LPC4330 microcontroller (see “New Platform Approaches Deliver Top Digital Designs In 2010”), which has a Cortex-M0 core and a Cortex-M4 core. In an interesting twist, the Cortex-M0 handles reading and preprocessing the camera data a frame at a time, converting the RGB information so the Cortex-M4 can perform the frame analysis. The micro has 1 Mbyte of flash and 256 kbytes of RAM, plus a quad-SPI (QSPI) interface that can handle high-speed serial flash memory.

The module exposes USB, serial, SPI, and I2C interfaces. A 10-pin header is designed to link the system to an Arduino-compatible host. The USB interface is handy for PCs and can also supply power. Two additional ports let the LPC4330 control positioning servos.

The project was launched on Kickstarter. The module alone is $59, and different versions are available, including one that comes with servos. The open-source software is designed to be augmented; eventually the system might include a Python interpreter, allowing even more image-processing chores to be offloaded.

Check out the Pixy Kickstarter page if you want to get one soon; the project already has enough backing to start building, and I've put in my bid for one. I would love to see an FPGA instead of a micro. It would be fast, but it would easily double the price and make the system a lot harder to use.

The software can capture the color of objects. It can also recognize color codes (CCs), where a CC is two or more adjacent blobs of color. CCs can be tracked automatically, and they could also be used to identify a particular type of object, such as a charging station or a goal.

Illuminating Fingers

Leap Motion’s Controller also handles object recognition but is designed specifically to recognize hand and finger gestures (Fig. 2). It usually sits in front of a keyboard or display. The technology could be incorporated directly into something like a Mac or Windows laptop or a desktop display.

The Controller uses infrared emitters and a pair of cameras to recognize a user’s fingers and hands. It handles gesture recognition and delivers the information via the USB connection. Applications can tie into it for 3D positioning and gestures like pinch/expand zooming, and scrolling becomes possible with a swipe.

The Controller is designed to be used with PC applications. The Pixy is designed as a low-cost sensor for robotics. Both should lead to some interesting applications.  

Discuss this Blog Entry

Comment posted on Sep 8, 2013:

I remember a study we did on motion detection in Matlab. We used simple background subtraction to track a tennis ball. The coordinates were quite inaccurate for the ball but very precise for the players. We tried to use a Kalman filter, but I didn't know what to put in the matrices of the predict/state equations. All of this was accomplished with a simple video camera.

One of my colleagues worked on a project similar to the illuminated-fingers approach. They applied it to typing without a keyboard, but in their case they used a webcam mounted above the hand.

William Wong

Bill Wong covers Digital, Embedded, Systems and Software topics at Electronic Design. He writes a number of columns, including Lab Bench and alt.embedded, plus Bill's Workbench hands-on column.