
FPGAs and eFPGAs Accelerate ML Inference at the Edge

May 18, 2021
With ML models still evolving rapidly, there is a need for flexible hardware architectures that can accelerate changing models. Learn why FPGAs, which combine high performance with flexibility, are an ideal solution for edge inference applications.

Many industries are rapidly adopting machine learning (ML) to gain insights from the ever-increasing volume of data generated by billions of connected devices. Combined with the demand for low latency, this is driving a growing push to move inference hardware closer to where the data is created. This white paper describes why FPGA-based hardware accelerators are needed to eliminate network dependencies, significantly increase performance, and reduce the latency of ML applications.
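To make the latency argument concrete, the minimal sketch below compares the end-to-end response time of sending data to a remote inference service against running inference locally on edge hardware. All function names and numeric values are illustrative assumptions, not figures from this white paper; the point is only that a cloud path adds network round-trip and queueing time that local inference avoids.

import argparse

def cloud_latency_ms(network_rtt_ms: float, queueing_ms: float,
                     server_inference_ms: float) -> float:
    # Data leaves the device, crosses the network, waits in a queue,
    # is processed remotely, and the result travels back.
    return network_rtt_ms + queueing_ms + server_inference_ms

def edge_latency_ms(local_inference_ms: float) -> float:
    # Data is processed where it is produced; no network dependency.
    return local_inference_ms

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Illustrative latency comparison")
    # Hypothetical default values for illustration only.
    parser.add_argument("--rtt", type=float, default=40.0, help="network round trip, ms")
    parser.add_argument("--queue", type=float, default=10.0, help="server queueing, ms")
    parser.add_argument("--server", type=float, default=5.0, help="server inference, ms")
    parser.add_argument("--edge", type=float, default=8.0, help="local edge inference, ms")
    args = parser.parse_args()

    cloud = cloud_latency_ms(args.rtt, args.queue, args.server)
    edge = edge_latency_ms(args.edge)
    print(f"Cloud round trip: ~{cloud:.0f} ms, local edge inference: ~{edge:.0f} ms")

Even with a fast remote accelerator, the network terms dominate under these assumed values, which is the motivation for placing FPGA-based inference at the edge.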
