Qualcomm agreed to pay $44 billion for NXP Semiconductors in part because of NXP's leading position in automotive chips. But the Eindhoven, Netherlands-based company also stood to gain from the deal: Qualcomm's Snapdragon line of chips could have helped it gain ground in the autonomous driving space, where Nvidia's Xavier and Intel's Mobileye chips had a head start.
But regulatory issues last year derailed the deal. That left NXP on the outside of the emerging autonomous driving market looking in. The company lacks a high-performance line of parallel processors that can handle one of the most important tasks in autonomous driving: building a model of the surroundings from what cameras, radar, and other sensors see.
Now it is turning to another company to fill Qualcomm's shoes. At the Consumer Electronics Show in Las Vegas on Tuesday, the company said that it had partnered with Kalray to build computer systems for autonomous driving. The partnership aims to help customers safely upgrade from current Level 2 and Level 3 technologies to Level 4 and Level 5 driving.
“We were always working on Plan B in case the deal never happened,” Kamal Khouri, NXP’s general manager of advanced driver assistance systems, said. “These sorts of partnerships combined with supporting open standards is the path we wanted to take,” he added. It would have taken too much time and money to create homegrown alternatives to Nvidia’s Xavier and Mobileye’s EyeQ chips, he said.
The company plans to integrate Kalray’s massively parallel processor array into its autonomous driving development platform, BlueBox. Kalray’s MPPA processor will handle the perception and modeling phase of autonomous driving, modeling the car’s surroundings using sensor fusion, object detection and other artificial intelligence chores. It will run Baidu’s Apollo open source software.
NXP will handle the other major element of autonomous driving: path planning. That means plotting the path the vehicle should take based on the immediate surroundings and then telling the vehicle where and how it should drive. The BlueBox system uses one of the company's Cortex-A72 Layerscape processors and an embedded vision chip designed around the Cortex-A53 core.
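The division of labor described above—Kalray's MPPA handling perception and modeling, NXP's processors handling path planning—can be illustrated with a minimal sketch. This is a conceptual toy, not NXP or Kalray code; the function names, data shapes, and the 20-meter braking rule are all invented for illustration:

```python
from dataclasses import dataclass

# Conceptual sketch only: names and data shapes here are hypothetical,
# not NXP or Kalray APIs. It illustrates BlueBox's division of labor:
# a perception stage fuses sensor data into an environment model, and
# a separate planning stage turns that model into driving commands.

@dataclass
class EnvironmentModel:
    obstacle_distances_m: list   # fused detections, as distances in meters
    ego_speed_mps: float

def perceive(camera_objs, radar_objs, ego_speed_mps):
    """Perception/modeling stage (the Kalray MPPA's role in BlueBox)."""
    fused = sorted(set(camera_objs) | set(radar_objs))  # toy sensor fusion
    return EnvironmentModel(obstacle_distances_m=fused,
                            ego_speed_mps=ego_speed_mps)

def plan(model: EnvironmentModel):
    """Path-planning stage (the NXP Layerscape's role in BlueBox)."""
    # Toy rule: brake if anything is closer than 20 m, else hold speed.
    nearest = min(model.obstacle_distances_m, default=float("inf"))
    return "brake" if nearest < 20.0 else "cruise"

model = perceive(camera_objs=[35.0, 18.5], radar_objs=[18.5, 60.0],
                 ego_speed_mps=25.0)
print(plan(model))  # "brake", since one fused object is 18.5 m away
```

Keeping the two stages behind a narrow interface like `EnvironmentModel` is what lets them run on separate chips without interfering with each other, which is the point the companies return to later in the piece.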
NXP has long addressed advanced driver assistance systems, including Level 1—deploying collision warnings, tapping the brakes to help avert potential accidents, and keeping the vehicle a set distance from others on the road—and Level 2 technologies, in which cars steer, brake, and accelerate themselves in limited conditions. The driver, however, must retake control when the situation calls for it.
Giving the car more and more control means giving it more and more compute, too. The company estimates that Level 2 driving demands around a trillion operations per second—1 TOPS—to handle artificial intelligence jobs. Another 20 million general-purpose instructions per second, or 20 MIPS, are required to run the algorithms used in planning the car's route.
Level 3 autonomous cars—in which the driver can completely hand over safety-critical functions in certain situations—demand 50 TOPS and 100 MIPS of compute. Fully autonomous cars that can handle every situation—referred to as Level 5 driving—may need around 200 TOPS and more than 200 MIPS, according to NXP. These cars are also going to need a significant amount of memory.
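NXP's estimates above are worth laying side by side, because the two kinds of compute grow at very different rates. The snippet below just tabulates the figures cited in the article (no Level 4 numbers were given) and works out the growth factors:

```python
# NXP's cited compute estimates per autonomy level, from the article.
# TOPS = trillions of AI operations per second (perception/modeling);
# MIPS = millions of general-purpose instructions per second (planning).
compute_needs = {
    "L2": {"tops": 1,   "mips": 20},
    "L3": {"tops": 50,  "mips": 100},
    "L5": {"tops": 200, "mips": 200},
}

# The step from Level 2 to Level 3 alone multiplies AI compute 50x,
# while planning compute grows far more slowly (5x over the same step).
ai_growth = compute_needs["L3"]["tops"] / compute_needs["L2"]["tops"]
planning_growth = compute_needs["L3"]["mips"] / compute_needs["L2"]["mips"]
print(ai_growth, planning_growth)  # 50.0 5.0
```

That lopsided scaling is the arithmetic behind the architectural argument NXP makes later: perception compute needs to grow much faster than planning compute, so it helps if the two can be scaled independently.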
The challenge is not only giving autonomous cars the ability to take over more and more driving functions but doing so without compromising on safety, security and reliability. Kalray has long focused on building secure chips with high reliability, according to Stephane Cordova, Kalray's head of embedded computing. The company started out serving the aerospace sector when it was founded in 2008.
The company’s latest 28-nanometer chip contains 288 cores capable of running a trillion operations per second. The company’s “freedom from interference” architecture allows each core to work independently: if one fails, another can take over its work without causing the whole system to malfunction. “Safety is the DNA of the architecture,” Cordova told Electronic Design.
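The failover behavior Cordova describes can be sketched in miniature. This toy is not Kalray's actual mechanism—the MPPA's scheduling is proprietary hardware—it only illustrates the idea that cores working on independent task queues let a healthy core adopt a failed core's backlog without disturbing the others:

```python
# Toy illustration (not Kalray's actual mechanism): each "core" owns an
# independent task queue, so a failed core's remaining tasks can be
# adopted by a survivor without touching the other cores' work.

def run_with_failover(tasks_per_core, failed_core):
    results = {}
    orphaned = tasks_per_core.pop(failed_core, [])  # failed core's backlog
    for core, tasks in tasks_per_core.items():
        results[core] = [t * 2 for t in tasks]      # stand-in workload
    # One surviving core adopts the orphaned tasks; others are unaffected.
    takeover = next(iter(tasks_per_core))
    results[takeover] += [t * 2 for t in orphaned]
    return results

out = run_with_failover({"core0": [1, 2], "core1": [3], "core2": [4]},
                        failed_core="core1")
print(out)  # {'core0': [2, 4, 6], 'core2': [8]}
```

The key property is isolation: `core2`'s results are identical whether `core1` failed or not, which is the sense in which one core's fault does not interfere with another's output.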
One challenge facing autonomous driving developers today is proving that computer systems are safe enough for production cars, Khouri said. The current safety shortcomings of today’s experimental cars can be solved using a two-part system like NXP’s, the company said. “Original equipment manufacturers are starting to think about how to move these solutions into volume production,” Khouri added.
Handling both elements of autonomous driving in the same chip is inherently risky, the company said. Perception, modeling and planning functions running on the same Intel Xeon or Nvidia Xavier chip could end up “clobbering” each other, Khouri said. These chips also have to run additional layers of software to account for a lack of functional safety, wasting memory and processing resources, he said.
“Our platform offers the performance and functional safety needed for reliable autonomous driving as opposed to the risky and power-hungry consumer grade solutions that are currently being tested in vehicles,” Khouri added in a statement. Without hardware isolation, car companies could also have a tougher time upgrading to higher and higher levels of autonomous driving, he said.
“If you want to add more sensors for redundancy’s sake, that means more data for perception and modeling but not necessarily for path planning,” he said. “If you have a monolithic chip, you have to raise the performance of the whole thing. If you separate it out, you can raise the performance on [perception] while keeping [path planning] the same. With our architecture, you can scale one and reuse the other."