An engineer testing an autonomous driving system from General Motors, one of the companies that has agreed to buy Mobileye's EyeQ4 SoCs. Those chips will not be released until 2018, but the machine vision company is already laying out plans for their successor, the EyeQ5. (Image courtesy of General Motors)

Mobileye Reveals Specs for Future Autonomous Driving Chip

May 23, 2016
Mobileye says that the new system will not only enable safety features like blind spot warning and automatic lane changes, but will also lay the groundwork for fully autonomous cars.

If the BlueBox engine from NXP Semiconductors was the answer to Nvidia’s autonomous driving platform, then Mobileye wanted the last word. Only one day after NXP revealed its computing platform, the machine vision company said it would start laying out the next version of its flagship chip, the EyeQ5.

For years, Mobileye has been an evangelist for machine vision as the central facet of autonomous cars. But the EyeQ5 will focus more on fusing data from sensors, including radar and cameras. Even though samples will not be available until 2018, Mobileye says that the new computer brain is being designed not only for safety features like blind spot warning and lane change assist, but also for fully autonomous driving.

“We have agreements with several leading automakers on systems that will allow drivers to fully disengage during highway driving beginning in 2018,” said Ziv Aviram, chief executive of Mobileye. “The new EyeQ5, expected to launch in 2020, will be the flagship system that turns the concept of fully autonomous cars we envision into a reality.”

Mobileye is working with STMicroelectronics on the new system, which could be built on the 10-nm process node or smaller. The EyeQ5 will feature eight CPU cores coupled with 18 of Mobileye’s vision processor cores, according to the chip's block diagram. These processors “are optimized for a wide variety of computer-vision, signal-processing, and machine learning tasks,” the company said in a statement.

EyeQ5 will also contain four different types of accelerators, each of which can be programmed for a specific group of algorithms, saving power and computation time. The new system will deliver around 12 trillion operations per second while keeping power consumption below 5 watts. That is a major improvement over the roughly 2.5 trillion operations per second clocked by the previous-generation EyeQ4, which will be released in 2018.
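For scale, those figures work out to nearly a fivefold jump in raw throughput and a floor of about 2.4 trillion operations per second per watt. A quick sketch of that arithmetic, using only the numbers quoted above (the 5-W figure is a stated ceiling, so real-world efficiency would be somewhat higher):

```python
# Back-of-envelope math on the figures quoted in the article.
EYEQ5_TOPS = 12.0     # trillion operations per second
EYEQ4_TOPS = 2.5
EYEQ5_POWER_W = 5.0   # stated upper bound on power draw

speedup = EYEQ5_TOPS / EYEQ4_TOPS              # generational jump
efficiency_floor = EYEQ5_TOPS / EYEQ5_POWER_W  # TOPS per watt, worst case

print(f"EyeQ5 vs. EyeQ4 throughput: {speedup:.1f}x")             # 4.8x
print(f"EyeQ5 efficiency floor: {efficiency_floor:.1f} TOPS/W")  # 2.4 TOPS/W
```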

EyeQ5 places Mobileye in competition with other companies that are trying to find the most efficient approach to autonomous driving. The available systems are capable of merging sensor data from around the vehicle and supporting machine learning algorithms—but all emphasize these features to different extents.

Nvidia has developed the Drive PX computing platform, which contains machine vision processors that support local machine learning. It is capable of merging data from 12 different sensors located around the vehicle and processing 2.3 trillion operations per second.

Nvidia is trying to make the platform the best possible location for the machine learning algorithms that automakers like Tesla, General Motors, and Ford are spending huge amounts of money to develop. Auto companies are pouring billions of dollars into hiring software engineers and buying autonomous driving startups.

“Nvidia is pretty much betting the company on deep learning,” said Paul Teich, an analyst with technology research firm Tirias Research. “That’s their hammer, and it is a really good hammer for self-driving cars,” he said.

An earlier version of the chip, the EyeQ2. (Image courtesy of Mobileye)

On the other end of the spectrum is NXP’s BlueBox engine, which has focused more on sensor fusion while merely adding support for machine learning. The platform contains eight computing cores, providing about 90 billion operations per second. According to NXP, BlueBox is already being tested by major automakers and can be used to develop cars that handle all aspects of driving.

“What stuck out in the NXP release was the lack of machine learning—deep learning—that is almost the inverse of Nvidia’s platform,” said James Hodgson, an analyst for autonomous driving at ABI Research. Hodgson added that sensor fusion has been at the center of NXP's autonomous driving strategy because it requires the largest amount of silicon and is already widely used in advanced driver-assistance systems (ADAS).

Mobileye occupies a kind of middle ground between the two extremes. It had been closer to Nvidia on that spectrum, focusing primarily on machine vision, but is sliding closer to NXP with sensor fusion. The EyeQ4 fuses up to eight separate sensors, while the EyeQ5 will combine up to 20, according to a Mobileye spokesman.
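Mobileye has not published how the EyeQ5 fuses those inputs, but a textbook building block gives a feel for what fusion buys: weighting each sensor's reading by how much it is trusted, so the combined estimate is more certain than either input alone. A minimal, generic sketch with hypothetical radar and camera readings (not Mobileye's algorithm):

```python
# Inverse-variance weighting: a classic single-step sensor fusion.
def fuse(estimate_a, var_a, estimate_b, var_b):
    """Fuse two noisy measurements of the same quantity."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # lower variance than either input
    return fused, fused_var

# Hypothetical readings: radar measures range precisely, camera less so.
radar_range_m, radar_var = 42.0, 0.25   # meters, variance
camera_range_m, camera_var = 40.5, 4.0

range_m, var = fuse(radar_range_m, radar_var, camera_range_m, camera_var)
print(f"Fused range: {range_m:.2f} m (variance {var:.3f})")  # 41.91 m, 0.235
```

The fused estimate leans toward the radar because its variance is lower, which is exactly why a vision-first company adding radar inputs gains accuracy rather than redundancy alone.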

Automakers will be able to create custom algorithms for the EyeQ5 using a complete software development kit from Mobileye. The kit can also be used for prototyping deep neural networks and includes an automotive-grade operating system.
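Mobileye has not disclosed the SDK's interface, so the snippet below is only a generic stand-in for what "prototyping a deep neural network" means in practice: a small forward pass mapping a feature vector to class scores. All names, shapes, and classes here are hypothetical, not part of Mobileye's toolkit:

```python
# Generic sketch of a tiny two-layer classifier in plain NumPy.
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
w1 = rng.standard_normal((64, 16)) * 0.1  # hypothetical trained weights
w2 = rng.standard_normal((16, 3)) * 0.1   # 3 classes: car, pedestrian, clear

patch = rng.standard_normal(64)           # stand-in for a sensor-derived feature vector
logits = relu(patch @ w1) @ w2
probs = np.exp(logits) / np.exp(logits).sum()  # softmax over the 3 classes
print(probs)
```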

Computing engines like the EyeQ5 are finding early traction with automakers trying to build more advanced autonomous cars. NXP Semiconductors said that BlueBox is already being tested by four of the five largest automakers. Mobileye says that Toyota and Daimler are the only two automakers not using the company’s vision chips.

Such claims from chipmakers underline the intensive testing required for self-driving cars. Since autonomous vehicles might not reach the average person for decades, automakers are buying up platforms and tinkering with different options.

Auto companies are primarily relying on sensor data at this point, Teich said, because it makes testing simpler and more transparent. Even though machine learning might be what takes the steering wheel out of human hands, it is not always clear why these programs make certain decisions. With sensor fusion, engineers have a clearer view into the flow of data through the vehicle.

“The players in this market don’t really have to beat Nvidia—they have to be smarter about what their knowledge can tell them about the car,” Teich said. “They can grow into doing more advanced recognition.”

Correction: May 24, 2016

An earlier version of this article misstated the number of operations per second that Mobileye's EyeQ5 and EyeQ4 computer chips could support. The EyeQ5 will be capable of 12 trillion operations per second, not 12 billion. And the EyeQ4 can support 2.5 trillion operations per second, not 2.5 billion.

About the Author

James Morra | Senior Editor

James Morra is a senior editor for Electronic Design, covering the semiconductor industry and new technology trends, with a focus on power electronics and power management. He also reports on the business behind electrical engineering, including the electronics supply chain. He joined Electronic Design in 2015 and is based in Chicago, Illinois.
