TI’s technology, including Ethernet PHYs, controller area network (CAN) transceivers, FPD-Link interfaces, and PCIe communications, can increase vehicle safety while reducing system size and cost, optimizing high-speed data transfer, and meeting International Organization for Standardization (ISO) 26262 requirements.
Vision Perception
Much of the good news from vehicle sensor development comes from cameras. Tesla, for example, has equipped its cars from the beginning with eight surround cameras that provide 360 degrees of visibility around the car at up to 250 meters of range. The car incorporates data from its cameras, identifies important elements of its surrounding environment, and fuses these elements with data from other sensors.
Vision-based perception technology can be used in ADAS applications to interpret the environment and provide information to drivers, helping to move toward a collision-free future. In fact, in the world of autonomous vehicles, computer vision is often simply called “perception,” because cameras are often the primary sensors a vehicle uses to perceive its environment: a front-view camera for collision avoidance and pedestrian recognition, a rear-view camera for backup protection, and surround-view cameras for parking assistance.
In the last few years, deep neural networks (DNNs) have emerged as the dominant approach to working with camera video and images. These networks learn from data: to teach a DNN what a stop sign looks like, it’s fed many labeled stop-sign images, and it gradually learns to recognize one.
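To make that concrete, here’s a minimal training-loop sketch in PyTorch. Everything in it is illustrative: random tensors stand in for labeled stop-sign images, and the tiny network is nothing like a production ADAS model.

```python
# Minimal sketch: teaching a small CNN to flag "stop sign" vs. "no stop sign".
# Illustrative only -- random tensors stand in for real labeled camera frames.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 2),                    # two classes: stop sign / not a stop sign
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    images = torch.randn(8, 3, 64, 64)   # placeholder for camera crops
    labels = torch.randint(0, 2, (8,))   # placeholder for human labels
    loss = loss_fn(model(images), labels)  # how wrong is the network?
    optimizer.zero_grad()
    loss.backward()                      # gradients: how to be less wrong
    optimizer.step()                     # nudge the weights in that direction
```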
Anyone who has pondered what deep learning could do for ADAS performance in recognizing and classifying objects should look into StradVision’s SVNet. It runs on the high-performance TDA4AL-Q1 system-on-chip (SoC), using camera data to recognize and classify objects.
SVNet is a lightweight solution built on compact DNN-based machine-learning algorithms that mimic the brain’s information processing. These algorithms minimize the computation required per frame, which in turn reduces the memory and power consumed.
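SVNet’s internals aren’t public, so the sketch below only illustrates one generic technique that compact DNNs commonly use to cut per-frame computation: replacing a standard convolution with a depthwise-separable pair. The layer dimensions are assumptions chosen for illustration.

```python
# One common way compact DNNs cut per-frame computation: replace a standard
# convolution with a depthwise + pointwise pair. SVNet's actual architecture
# isn't public; these numbers just illustrate the general technique.

def standard_conv_macs(h, w, c_in, c_out, k):
    # multiply-accumulates for a k x k convolution over an h x w feature map
    return h * w * c_in * c_out * k * k

def separable_conv_macs(h, w, c_in, c_out, k):
    depthwise = h * w * c_in * k * k      # one k x k filter per input channel
    pointwise = h * w * c_in * c_out      # 1 x 1 conv mixes channels
    return depthwise + pointwise

h, w, c_in, c_out, k = 128, 128, 64, 128, 3   # assumed layer dimensions
std = standard_conv_macs(h, w, c_in, c_out, k)
sep = separable_conv_macs(h, w, c_in, c_out, k)
print(f"standard: {std:,} MACs, separable: {sep:,} MACs, "
      f"reduction: {std / sep:.1f}x")    # roughly 8x fewer operations here
```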
Delivering Current Beyond 100 A to an ADAS Processor
ADAS isn’t just about sensor technology. Moving toward higher autonomy levels will require more real-time computing power to deliver higher resolution and faster responses. Embedding features such as artificial intelligence further drives the need for power-hungry ADAS SoC processors to enable higher levels of autonomous driving.
Meeting these needs requires delivering more than 100 A to the processor in these ECUs. The design challenges associated with higher power include achieving efficiency on high-current rails, controlling thermal performance and load transients at full load, and meeting functional-safety requirements.
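To see why efficiency dominates at these currents, consider a rough back-of-the-envelope calculation. All values below are assumptions for illustration, not specifications for any particular device:

```python
# Back-of-the-envelope thermals for a >100-A core rail. All numbers are
# assumptions for illustration, not specs for any particular device.

v_core = 0.8          # assumed SoC core voltage (V)
i_load = 120.0        # assumed full-load current (A)
efficiency = 0.90     # assumed converter efficiency at full load

p_out = v_core * i_load                     # power delivered to the SoC (W)
p_loss = p_out * (1.0 / efficiency - 1.0)   # heat dissipated in the regulator

print(f"output power: {p_out:.1f} W, regulator loss: {p_loss:.1f} W")
# ~96 W out, ~10.7 W of heat -- every efficiency point matters at this scale
```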
The TI TPS62876-Q1 buck converter helps designers address currents beyond 30 A with a novel stackability feature (Fig. 2). It achieves the high currents necessary to power an SoC such as the multicore TDA4VH-Q1 processor.
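A quick sizing sketch shows how stacking converters in parallel scales up to such loads. The 30-A per-device rating below is an assumption for illustration; consult the datasheet for actual limits and current-sharing behavior:

```python
# Sizing a stacked (multiphase) buck solution. The 30-A per-device rating is
# an assumption for illustration; check the actual datasheet limits.
import math

i_load = 120.0        # assumed target rail current (A)
i_per_device = 30.0   # assumed current rating of one stackable converter

n_devices = math.ceil(i_load / i_per_device)   # converters stacked in parallel
i_shared = i_load / n_devices                  # ideal equal current sharing

print(f"{n_devices} devices, ~{i_shared:.0f} A each")   # 4 devices, ~30 A each
```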