Taking a Look at Event-Based Vision Systems

Feb. 10, 2025
Prophesee’s camera provides a data stream, which includes a pixel location and color values when a change is detected, that’s used directly with tools based on spiking neural networks.

What you’ll learn:

  • What is event-based vision?
  • How do event-based vision systems work?
  • Why are event-based vision systems useful?

 

There’s more than one way to digitize an image. The conventional approach using CMOS sensors captures an entire frame. Another way is to track changes to individual pixels and deliver that information instead. This is called an event-based vision system. I talked with Dr. Luca Verre, Founder of Prophesee, about this technology and the company’s implementation (watch the video above).

Converting Event-Based Vision Capture to Frames

Prophesee’s camera provides a data stream that includes a pixel location and change values whenever a change is detected. This information can be used directly with tools like BrainChip’s neuromorphic computing platform based on spiking neural networks (SNNs). SNNs are event-based machine-learning (ML) models that provide artificial-intelligence (AI) functionality similar to other neural-network ML models. An SNN used for image processing typically requires frame-based data to be converted into an event-based stream, whereas it can consume event-based camera data directly.

On the flip side, many ML models are built around frame-based image data. An event-based data stream can be used to generate a frame-based image (Fig. 1). Essentially, changes are applied to a frame buffer as new event data arrives.
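The conversion can be sketched in a few lines. This is a minimal illustration, not Prophesee's actual format: it assumes a generic event record with pixel coordinates, a timestamp, and a polarity bit, and simply nudges a mid-gray frame buffer up or down as events arrive.

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t: int         # timestamp in microseconds
    polarity: int  # +1 brightness increase, -1 decrease

def events_to_frame(events, width, height):
    """Accumulate a batch of events into a grayscale frame buffer."""
    frame = np.full((height, width), 128, dtype=np.int16)  # mid-gray baseline
    for ev in events:
        # Each event nudges its pixel up or down; values clamp to 0..255.
        frame[ev.y, ev.x] = np.clip(frame[ev.y, ev.x] + 32 * ev.polarity, 0, 255)
    return frame.astype(np.uint8)

# A moving edge produces a short burst of events along its path.
events = [Event(10, 5, 1000, +1), Event(11, 5, 1250, +1), Event(10, 5, 1500, -1)]
frame = events_to_frame(events, width=32, height=16)
```

A real pipeline would instead integrate events over fixed time windows so that downstream frame-based ML models see a steady frame rate, but the principle is the same: only the changed pixels touch the buffer.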

Low-Power Object Tracking Using Event-Based Vision Systems

Event-based cameras can report movement (Fig. 2) using very little power. A system can determine movement, including its direction, which helps with tasks like gesture recognition. Detecting change, and hence movement, while ignoring the rest of the scene can greatly reduce the overhead of processing the scene.
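One simple way to get a direction estimate, sketched below under the same assumed (x, y, timestamp) event format as above, is to compare the centroid of the events in the first half of a time window against the centroid in the second half; the drift between the two gives a coarse motion vector without processing any full frames.

```python
import numpy as np

def movement_direction(events):
    """Estimate coarse motion direction from a window of events.
    events: iterable of (x, y, t) rows, t in microseconds.
    Returns (dx, dy): centroid drift from the early half to the late half."""
    ev = np.asarray(events, dtype=float)
    mid = (ev[:, 2].min() + ev[:, 2].max()) / 2
    early = ev[ev[:, 2] <= mid, :2].mean(axis=0)  # centroid of older events
    late = ev[ev[:, 2] > mid, :2].mean(axis=0)    # centroid of newer events
    dx, dy = late - early
    return dx, dy  # positive dx suggests rightward motion, etc.

# A hand sweeping left-to-right triggers events whose centroid drifts in +x.
sweep = [(x, 8, 1000 * x) for x in range(1, 21)]
dx, dy = movement_direction(sweep)
```

Production gesture recognizers use more robust techniques (e.g., event-based optical flow or an SNN classifier), but even this centroid trick shows why the sparse stream is cheap to process: only active pixels contribute any work.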

The eye-tracking demonstration places a camera in the glasses that tracks eye movement (Fig. 3). In this example, the yellow dot on the laptop display shows where the user is looking, in this case at their forefinger. The augmented-reality (AR) glasses also have a forward-facing camera that generated the image on the laptop.

High-Speed Tracking Using Event-Based Vision

One application in which the event-based technology particularly shines involves recognizing fast-moving items. A frame-based camera usually operates at a frame rate of 30 or 60 frames/s. That's insufficient for accurately tracking fast-moving objects like the water whirlpool in Figure 4.

The on-screen display tracks the fast-moving particles at a much higher rate than a frame-based system could manage. Of course, tracking too many objects can generate more event data than a frame-based stream would, but even then, a frame-based system would have a tough time extracting the object information at these speeds.
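The data-rate tradeoff can be made concrete with some back-of-the-envelope arithmetic. The numbers below are illustrative assumptions, not Prophesee specifications: a 1280 x 720 sensor delivering raw 8-bit frames at 60 frames/s, versus an event stream of 8-byte events where some fraction of pixels fires once per millisecond.

```python
# Illustrative numbers only (not Prophesee specs).
width, height, fps = 1280, 720, 60
frame_bandwidth = width * height * fps  # bytes/s for raw 8-bit frames

def event_bandwidth(active_fraction):
    """Bytes/s if `active_fraction` of pixels each fire one 8-byte event
    per millisecond."""
    events_per_second = width * height * active_fraction * 1000
    return events_per_second * 8

sparse = event_bandwidth(0.001)  # quiet scene: well below the frame stream
busy = event_bandwidth(0.01)     # very busy scene: can exceed the frame stream
```

With 0.1% of pixels active, the event stream is roughly an order of magnitude lighter than the frame stream; at 1% activity it overtakes it, which is the crossover the paragraph above alludes to.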

Each type of image-capture system has advantages and disadvantages. Their functionality can overlap, but an event-based approach is often better for tracking many types of objects or state changes, with faster response times and lower power consumption.

>>Check out more of our CES 2025 coverage
About the Author

William G. Wong | Senior Content Director - Electronic Design and Microwaves & RF

I am Editor of Electronic Design focusing on embedded, software, and systems. As Senior Content Director, I also manage Microwaves & RF and I work with a great team of editors to provide engineers, programmers, developers and technical managers with interesting and useful articles and videos on a regular basis. Check out our free newsletters to see the latest content.

You can send press releases for new products for possible coverage on the website. I am also interested in receiving contributed articles for publishing on our website. Use our template and send it to me along with a signed release form.

Check out my blog, AltEmbedded on Electronic Design, as well as my latest articles on this site that are listed below.


I earned a Bachelor of Electrical Engineering at the Georgia Institute of Technology and a Master's in Computer Science from Rutgers University. I still do a bit of programming using everything from C and C++ to Rust and Ada/SPARK. I do a bit of PHP programming for Drupal websites and have posted a few Drupal modules.

I still get hands-on with software and electronic hardware. Some of this can be found on our Kit Close-Up video series. You can also see me on many of our TechXchange Talk videos. I am interested in a range of projects, from robotics to artificial intelligence.
