NVIDIA’s Jetson AGX Orin Development Kit is a great embedded platform, delivering up to 275 TOPS of raw AI performance. It targets applications like robotics, but it works equally well for any embedded application that needs artificial-intelligence (AI) and machine-learning (ML) support in the field.
The Jetson AGX Orin module (see figure) is built around a dozen 64-bit Arm Cortex-A78AE v8.2 cores with access to 3 MB of L2 and 6 MB of L3 cache, plus a GPU based on NVIDIA's Ampere architecture. The GPU includes 2,048 CUDA cores and 64 Tensor cores, and it's backed by a pair of NVIDIA Deep Learning Accelerator (NVDLA) v2.0 engines for AI/ML workloads. The Programmable Vision Accelerator (PVA) streamlines processing of multiple video streams, a common task for this platform.
The hardware is impressive, but it’s the encompassing NVIDIA software that really makes the difference. The NVIDIA JetPack SDK is built on Ubuntu Linux and includes a UEFI-based bootloader as well as OP-TEE as the Trusted Execution Environment. The dev kit comes with everything installed, which makes getting started much easier. It includes support for the underlying CUDA, cuDNN, TensorRT, and DeepStream tools.
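To give a feel for how those pieces fit together, here’s a minimal sketch using the TensorRT Python API (as shipped with a JetPack 5-era TensorRT 8.x) to turn an ONNX model into an optimized engine; the model.onnx filename is a placeholder, and the DLA lines illustrate how the NVDLA engines can be targeted.

```python
# Minimal sketch: build a TensorRT engine from an ONNX model on Jetson.
# Assumes TensorRT 8.x (JetPack 5) and a placeholder "model.onnx" file.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("Failed to parse ONNX model")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)  # use the Ampere GPU's FP16 path

# Optionally offload supported layers to one of the two NVDLA v2.0 engines,
# falling back to the GPU for anything the DLA can't run.
config.default_device_type = trt.DeviceType.DLA
config.DLA_core = 0
config.set_flag(trt.BuilderFlag.GPU_FALLBACK)

engine = builder.build_serialized_network(network, config)
if engine is None:
    raise SystemExit("Engine build failed")
with open("model.engine", "wb") as f:
    f.write(engine)
```

The resulting serialized engine can then be loaded by the TensorRT runtime, or fed to a DeepStream pipeline for multi-stream video inference.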
Two items that stand out in the latest kit are the NVIDIA Train, Adapt, and Optimize (TAO) Toolkit and NVIDIA Riva. The latter targets speech AI, while TAO builds on NVIDIA’s pre-trained models, which can be customized in Microsoft’s Azure cloud.
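As a rough illustration of what the speech-AI side looks like in practice, the sketch below uses the nvidia-riva-client Python package to send an offline transcription request to a Riva server. It assumes the Riva services are already running on the default local gRPC port, and sample.wav is a placeholder 16-kHz mono recording; both are assumptions for the example, not part of the kit’s defaults.

```python
# Minimal sketch: offline speech-to-text request against a running Riva server.
# Assumes the nvidia-riva-client package, a Riva ASR service on localhost:50051,
# and a placeholder 16-kHz mono PCM file named "sample.wav".
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
asr = riva.client.ASRService(auth)

config = riva.client.RecognitionConfig()
config.encoding = riva.client.AudioEncoding.LINEAR_PCM
config.sample_rate_hertz = 16000
config.language_code = "en-US"
config.max_alternatives = 1
config.audio_channel_count = 1

with open("sample.wav", "rb") as f:
    audio = f.read()

response = asr.offline_recognize(audio, config)
for result in response.results:
    print(result.alternatives[0].transcript)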