What you’ll learn:
- A look at recent upgrades to ADAS applications, including advanced monitoring, warning, braking, and steering systems.
- Why more companies are turning to so-called “4D” imaging radar as an effective way to achieve partial autonomy at Levels 2 and 3, and why it may become one of the primary sensors in Level 4 and 5 autonomous vehicles.
- A glimpse at the efforts of non-automotive firms like Sony that are placing ADAS-equipped test cars on the road.
We can agree, without debate, that advanced driver-assistance system (ADAS) platforms must adapt as technology evolves at both the industry and OEM levels. It’s also undeniable that OEMs and Tier 1 suppliers are delivering new multi-purpose solutions ranging from basic safety-compliance functions to advanced levels of automation. All of which are reasons why the ADAS option in an automobile is selling like beer in a college town.
To meet this growing demand, tech companies are scrambling to deliver next-generation products emphasizing both information and safety. Here are a few examples.
Aptiv
Aptiv announced its next-generation Level 1-3 ADAS platform that can be updated over the air (OTA). The sensor suite includes radar, vision, and LiDAR. Among these sensors are the company’s sixth-generation corner/side and forward-facing radars, as well as its first 4D imaging radar, which is said to provide twice the detection range of what’s available on the market today.
The forward-facing radars developed by Aptiv can detect objects 300 meters away and determine their height. Its corner/side radars double the detection range from the previous generation to 200 meters, while also doubling range resolution. According to the company, the vertical field of view is three times as high, and angular resolution is tripled.
Aptiv’s Satellite Architecture approach to sensor fusion takes the intelligence out of the sensors and centralizes it into an active safety domain controller. Signals collected by radars, cameras, and LiDAR feed back to the active safety domain controller, which is a computing platform dedicated to interpreting those signals and executing decisions based on what’s seen by the vehicle. This fuses the data in one step, reducing latency. Its real-time embedded neural network is said to be able to classify dozens of objects within milliseconds.
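Aptiv hasn’t published its fusion code, but the idea of combining raw detections from several sensors in a single domain-controller step can be sketched in a few lines. Everything below — the class names, the sensor confidence values, and the confidence-weighted averaging — is an illustrative assumption, not Aptiv’s implementation:

```python
from dataclasses import dataclass


@dataclass
class Detection:
    """One raw detection reported by a sensor (illustrative fields)."""
    sensor: str        # "radar", "camera", or "lidar"
    x: float           # longitudinal position, meters
    y: float           # lateral position, meters
    confidence: float  # sensor's confidence in this detection, 0..1


def fuse(detections: list[Detection]) -> tuple[float, float]:
    """Single-step fusion in the domain controller: combine all raw
    detections of one object into a confidence-weighted position estimate,
    rather than letting each sensor track objects independently."""
    total = sum(d.confidence for d in detections)
    x = sum(d.x * d.confidence for d in detections) / total
    y = sum(d.y * d.confidence for d in detections) / total
    return x, y


# Radar, camera, and LiDAR each report the same pedestrian slightly differently.
reports = [
    Detection("radar",  x=50.2, y=1.1, confidence=0.9),
    Detection("camera", x=49.8, y=0.9, confidence=0.6),
    Detection("lidar",  x=50.0, y=1.0, confidence=0.9),
]
fused = fuse(reports)
```

Because the raw detections are fused in one pass rather than per sensor, the controller avoids a second reconciliation stage — the latency-reduction point made above.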
Machine learning allows the system to improve radar range by 50%, enabling the tracking of small objects that are more than 200 meters away. The system is better able to figure out whether an object is something that the vehicle can drive over or under.
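The drive-over/drive-under decision described above can be illustrated with a simple height-based rule. The thresholds and function names here are assumptions made for illustration, not Aptiv’s actual logic:

```python
def classify_clearance(object_height_m: float, object_bottom_m: float,
                       vehicle_height_m: float = 1.6,
                       ground_clearance_m: float = 0.15) -> str:
    """Decide whether a detected object can be driven under, driven over,
    or must be avoided, using the radar's height estimates."""
    if object_bottom_m >= vehicle_height_m:
        return "drive_under"   # e.g., an overhead sign or bridge girder
    if object_bottom_m == 0.0 and object_height_m <= ground_clearance_m:
        return "drive_over"    # e.g., a flat piece of road debris
    return "avoid"             # a genuine obstacle in the path


print(classify_clearance(0.5, 4.5))   # bridge girder well overhead
print(classify_clearance(0.1, 0.0))   # low debris on the roadway
print(classify_clearance(1.2, 0.0))   # stopped vehicle ahead
```

A production system would of course learn these boundaries from data rather than hard-code them, which is exactly where the machine-learned range and resolution gains come in.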
Aptiv’s Interior Sensing Platform includes radars, ultrasonic sensing, and cabin cameras. Driver-state sensing functions, which allow OEMs to account for driver distraction and availability, can be integrated into the ADAS domain controller. The system can verify that the driver’s eyes are on the road, and it also recognizes and responds to body positioning and gestures. Through standardized application programming interfaces (APIs), the company is able to provide information about how the safety system is performing to the infotainment human-machine interface (HMI), which in turn can present that information to the driver.
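The safety-system-to-HMI flow might look something like the sketch below. The field names, thresholds, and alert strings are hypothetical — Aptiv’s actual API surface isn’t public — but the shape of the exchange (a driver-state snapshot in, an HMI alert out) follows the description above:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DriverState:
    """Driver-monitoring snapshot (illustrative fields, not Aptiv's API)."""
    eyes_on_road: bool
    hands_on_wheel: bool
    distraction_level: float  # 0.0 (attentive) .. 1.0 (fully distracted)


def hmi_message(state: DriverState) -> Optional[str]:
    """Map the safety system's driver-state output to an HMI alert,
    or None when no alert is warranted."""
    if not state.eyes_on_road and state.distraction_level > 0.7:
        return "Eyes on the road!"
    if not state.hands_on_wheel:
        return "Please keep hands on the wheel"
    if state.distraction_level > 0.5:
        return "Stay attentive"
    return None
```

Keeping this exchange behind a standardized API is what lets any OEM’s infotainment stack consume the safety system’s output without knowing its internals.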
The new ADAS platform can incorporate future technologies and features, including those developed in collaboration with Motional, Aptiv’s autonomous driving joint venture with Hyundai Motor Group. Motional aims to be among the first to put fully driverless vehicles on public roads. Cox Automotive and Motional recently announced a partnership to deploy Cox’s Pivet as Motional’s fleet service provider. The partnership will begin with Motional’s self-driving fleet of robo-taxis in Las Vegas (Fig. 1).
Pivet will coordinate daily, weekly, and monthly cleaning and disinfection, as well as mechanical work, for the Las Vegas fleet, enabling Motional to focus on serving passengers and entering new markets. The company plans to significantly grow its fleet and launch a multimarket service with Lyft in major U.S. cities in 2023.
Magna and Fisker
Automotive parts and systems company Magna is collaborating with Fisker to develop ADAS for the Fisker Ocean sport utility vehicle, which is expected to launch in late 2022. The Fisker Ocean, which will have a low starting price ($37,499) and a driving range of up to 300 miles, will initially be manufactured exclusively by Magna in Europe. The vehicle will leverage Magna’s EV architecture combined with the Fisker-Flexible Platform Adaptive Design (FF-PAD) to create a lightweight, aluminum-intensive platform for the Fisker Ocean. Fisker claims its FF-PAD design can be adapted to any electric-car platform.
Magna and Fisker will join forces to develop new ADAS features. In addition to leveraging cameras and ultrasonic sensors, the ADAS package includes digital-imaging radar technology. Co-developed with Austin, Texas-based technology startup Uhnder, Icon Radar is said to be the first digital-imaging single-chip radar solution for the automotive marketplace. As part of the cooperation, Fisker will issue Magna warrants to purchase shares currently representing approximately 6% of its equity.
StradVision and D3
StradVision, a supplier of AI-based camera perception software for ADAS and autonomous vehicles (AVs), announced its collaboration with D3 Engineering at CES 2021. The solution integrates a D3 DesignCore platform for product development with StradVision's deep-learning-based front camera software “SVNet.”
StradVision is accelerating the advancement of autonomous vehicles through SVNet, an AI-based object recognition software. According to the two companies, compared to competitors, SVNet achieves higher efficiency in memory usage and energy consumption and can be customized and optimized to any system-on-chip (SoC), thanks to its patented Deep Neural Network. The software also works with other sensors such as LiDAR and radar to achieve surround vision.
SVNet is currently used in mass-production models of ADAS and AVs that support safety function Levels 2 to 4, and it will be deployed in more than 8.8 million vehicles worldwide. In this collaboration, StradVision provides six key SVNet perception features—Lane Detection, Traffic Sign Detection, Traffic Light Detection, Vehicle Detection, Pedestrian Detection, and Free Space Detection—on the virtual demo platform.
The solution could be viewed at CES 2021 on D3 Engineering's virtual D.A.V.E.—Demonstrator of Autonomous Vehicle Equipment—platform.
LeddarTech and Renesas
LeddarTech, a member of Renesas' R-Car Consortium, announced a new collaboration with the semiconductor supplier on the development and promotion of an automotive ADAS reference platform. This platform combines LeddarTech’s raw data sensor-fusion stack and LiDAR technology with Renesas’ newly launched R-Car V3U—an ASIL D SoC for ADAS and autonomous systems.
Renesas is already investing in LeddarTech SoC development and production for the LeddarEngine, consisting of integrated LiDAR SoCs—the LCA2 and LCA3—and accompanying LiDAR measurement software. These SoCs power what LeddarTech calls “the most cost-efficient LiDARs enabling mass deployment of advanced level 2 and 3 passenger-car ADAS applications and a broad range of mobility and industrial applications.”
This automotive ADAS reference platform expands the companies’ collaboration to the system level with a sensor-fusion solution applicable to camera- and radar-based systems, as well as systems that add LiDAR, to deliver higher safety and performance. These improvements are achieved with a software-centric, extensible architecture compatible with Renesas’ R-Car V3U processor and roadmap.
AImotive and Sony
AImotive, based in Budapest, Hungary, is working with Sony Corp. to pair its automated driving software stack with Sony's VISION-S prototype. According to Sony, the VISION-S integrates level 2+ ADAS capabilities and automated driving systems with the latest EV technology and an innovative entertainment system.
Later in Q1 2021, AImotive will announce new and enhanced versions of its production-ready aiDrive software stack for level 2 to level 4 automated driving, as well as its aiSim development, simulation, and validation tools.
While the car isn't meant for production, Sony announced that its VISION-S prototype vehicle has begun public road tests in Europe (Fig. 2). Sony further said it will continue to develop the vehicle as a showcase of the company's sensors and imaging technologies, plus infotainment tech planned for the auto industry. Sony also plans to conduct driving tests in other regions going forward.