My recent travels have taken me to the Intel Developer Forum (IDF) in San Francisco and back to Boston for Design East. I recorded a number of video interviews at both. The Design East 2012 videos are up on Engineering TV right now. You can check the Engineering TV trade show page to see when the IDF videos are posted. One of the companies I saw at IDF was PointGrab, which has technology that turns a webcam into a gesture recognition system.
The other show I had a chance to visit was right down the hall from Design East: the Embedded Vision Alliance's Embedded Vision Summit. I had a chance to talk with Jeff Bier, founder and President of BDTI, from the Alliance about the latest trends in embedded vision. One of those trends is how applications are taking advantage of smart phones and tablets with built-in cameras. He demonstrated Philips' Vital Signs Camera iPad app (Fig. 1), which detects a person's heart and breathing rates.
This is unlike some other smart phone apps I have used for detecting pulse rate. Those typically have you place a finger over the lens of the camera. The flash illuminates the finger, and the application detects the changes in reflected light that indicate a person's pulse.
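For the curious, here is a minimal sketch of that finger-on-lens approach in Python with OpenCV and NumPy: average the red channel of each frame and look for the dominant frequency in the normal heart-rate band. The frame rate, window length, and band limits are assumptions for illustration; this is not how any particular app implements the measurement.

```python
# Minimal sketch of finger-on-lens pulse estimation: average the red channel of
# each frame (the flash-lit fingertip), then find the dominant frequency in the
# normal heart-rate band. Assumes OpenCV (cv2), NumPy, and camera device 0.
import cv2
import numpy as np

FPS = 30           # assumed capture rate; a real app would use actual timestamps
SECONDS = 10       # length of the measurement window

cap = cv2.VideoCapture(0)
brightness = []
for _ in range(FPS * SECONDS):
    ok, frame = cap.read()
    if not ok:
        break
    # Mean of the red channel (index 2 in OpenCV's BGR ordering) tracks the
    # blood-volume change in the fingertip pressed against the lens.
    brightness.append(frame[:, :, 2].mean())
cap.release()

trace = np.asarray(brightness) - np.mean(brightness)
spectrum = np.abs(np.fft.rfft(trace))
freqs = np.fft.rfftfreq(len(trace), d=1.0 / FPS)

# Only consider frequencies in a plausible heart-rate band (42-180 beats/min).
band = (freqs > 0.7) & (freqs < 3.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
print("Estimated pulse: %.0f beats per minute" % (peak_hz * 60))
```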
Philips' job was a bit more difficult. First, the application has to detect the face, because it bases its measurement on subtle changes in the face to determine heart and breathing rates. Second, it has to track those changes so it can count them to determine the rates. It helps to stay still while the process is going on, but that is mostly to simplify the algorithms enough for the processor to handle the analysis.
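The face-based approach works on the same principle but has to find its region of interest first. The sketch below, again an illustration rather than Philips' algorithm, uses OpenCV's stock Haar cascade to detect a face and averages the green channel inside the face box on each frame; the resulting trace can feed the same FFT-based rate estimator shown above.

```python
# Minimal sketch of extracting a remote pulse trace from a face region.
# Uses OpenCV's bundled Haar cascade for face detection; the per-frame mean
# of the green channel inside the face box becomes the raw signal, which can
# be fed to the FFT-based rate estimator from the previous sketch.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
trace = []
for _ in range(300):                      # roughly 10 s at 30 frames/s
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        continue                          # subject moved out of frame
    x, y, w, h = faces[0]                 # take the first detected face
    roi = frame[y:y + h, x:x + w]
    # The green channel carries the strongest blood-volume signal in skin video.
    trace.append(roi[:, :, 1].mean())
cap.release()

print("Collected %d samples for rate estimation" % len(trace))
```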
The latest version can actually identify and track the rates for two people at the same time. Of course, both need to be in front of the camera. The app is available for $0.99 for the iPad and iPhone 4S (and probably the iPhone 5 at this point).
PointGrab's software runs on Windows and targets Windows 8. It is delivered as a device driver and includes an API that lets developers take advantage of the gesture capability.
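PointGrab's actual API is not shown here, so the snippet below is purely hypothetical. It only sketches the common pattern such gesture middleware exposes: the driver recognizes gestures from the camera feed, and the application registers handlers for the events it cares about.

```python
# Hypothetical sketch only -- this is not PointGrab's API.  It illustrates the
# typical event-callback pattern: the application registers handlers keyed by
# gesture name, and the recognition layer dispatches events to them.
from typing import Callable, Dict

handlers: Dict[str, Callable[[dict], None]] = {}

def on(gesture: str):
    """Register a handler for a named gesture event."""
    def register(fn: Callable[[dict], None]):
        handlers[gesture] = fn
        return fn
    return register

def dispatch(event: dict):
    """Called by the recognition layer for every detected gesture."""
    handler = handlers.get(event["gesture"])
    if handler:
        handler(event)

@on("swipe_left")
def previous_page(event):
    print("previous page")

@on("grab")
def select_item(event):
    print("select item at", event["x"], event["y"])

# Stand-ins for events a driver would deliver from the webcam stream.
dispatch({"gesture": "swipe_left"})
dispatch({"gesture": "grab", "x": 0.4, "y": 0.6})
```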
Yoav Hoshen, Senior VP of Business Development, demonstrated PointGrab's technology for me at IDF. I recorded a video of the interview, but it has not been posted on Engineering TV yet. There is no smoke and mirrors, but there is plenty of hand waving.
The effect is similar to what Xbox 360 gamers are doing with Microsoft's Kinect. The technology used by the Kinect (see How Microsoft's PrimeSense-based Kinect Really Works) comes from PrimeSense. It uses a combination of an IR emitter and a camera. The PrimeSense system-on-chip (SoC) detects the change in position and size of the infrared pattern that the emitter projects. This is translated into a 3D map that in turn can be used to detect objects like a person, which can then be used to track things like arms and hands so gesture recognition can be performed.
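The depth calculation behind that kind of structured-light sensor is ordinary triangulation: the closer a surface is, the more the projected pattern appears to shift between the emitter's and the camera's viewpoints. Here is a minimal sketch of that relationship with made-up focal-length and baseline values, not PrimeSense's actual calibration.

```python
# Sketch of the triangulation at the heart of a structured-light depth sensor.
# The emitter projects a known IR pattern; the camera, offset by a baseline B,
# sees each pattern element shifted (disparity d) by an amount that depends on
# the distance to the surface: Z = f * B / d.  The constants are illustrative,
# not PrimeSense's actual parameters.
import numpy as np

FOCAL_LENGTH_PX = 580.0   # assumed camera focal length, in pixels
BASELINE_M = 0.075        # assumed emitter-to-camera offset, in meters

def depth_from_disparity(disparity_px) -> np.ndarray:
    """Convert per-pixel pattern shift (pixels) to depth (meters)."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A dot that shifts 29 pixels is about 1.5 m away; a 58-pixel shift is ~0.75 m.
print(depth_from_disparity([29.0, 58.0]))
```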
PointGrab's chore is a bit harder since it uses a single webcam like the one typically built into a laptop. It analyzes the video stream to recognize 3D gestures. Not only can PointGrab recognize 2D movement suitable for emulating a mouse or the pinch-zoom gestures common on smart phone and tablet interfaces, but it can also track movement toward and away from the camera.
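One simple way to recover that third axis from a single camera is to watch the apparent size of the hand: as it moves toward the lens it covers more pixels. The toy sketch below segments skin-colored pixels with OpenCV, uses the largest blob's centroid as a 2D cursor, and treats area changes as a toward/away cue. PointGrab's actual algorithms are proprietary and far more robust than this.

```python
# Rough sketch of single-camera hand tracking: segment skin-colored pixels,
# take the largest blob as the hand, use its centroid for 2D pointing and the
# change in its area as a crude toward/away-from-camera cue.  A toy
# illustration of the idea, not PointGrab's method.
import cv2

cap = cv2.VideoCapture(0)
prev_area = None
for _ in range(300):                          # roughly 10 s of frames
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Very rough skin-tone range in HSV; highly lighting-dependent in practice.
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        continue
    hand = max(contours, key=cv2.contourArea)
    area = cv2.contourArea(hand)
    x, y, w, h = cv2.boundingRect(hand)
    cx, cy = x + w // 2, y + h // 2           # 2D "cursor" position
    if prev_area and area > 1.2 * prev_area:
        print("hand moving toward camera, cursor at", cx, cy)
    elif prev_area and area < 0.8 * prev_area:
        print("hand moving away from camera, cursor at", cx, cy)
    prev_area = area
cap.release()
```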
The proliferation of cameras paired with more powerful computing platforms is opening up vision to a host of new applications. This is just the starting point.