Image credit: Qualcomm

Qualcomm Wants to Put AI Everywhere with New Software Stack

June 27, 2022
The U.S. chip giant is banking on its unified AI stack to secure its spot in the edge-computing market.

Qualcomm is trying to lend a helping hand to anyone who wants to run artificial-intelligence (AI) workloads on its expanding lineup of chips, in a bid to boost its ambitions in connected intelligent edge devices.

The U.S. semiconductor giant rolled out the Qualcomm AI Stack, which unifies all of its existing AI software and tools into a single package, giving developers an end-to-end solution for deploying AI on a range of different chips.

The stack aims to let manufacturers and developers take AI models they have trained and optimized for one type of device, such as a smartphone, and port them to another piece of hardware, such as a laptop or augmented-reality (AR) headset, instead of having to start over from scratch.

For now, however, the new package of AI software works only with the company’s connected intelligent edge products.

“Hardware is critical,” said Ziad Asghar, VP of product management at Qualcomm Technologies, in a recent briefing with reporters and analysts, “but increasingly, AI software is absolutely critical.” He added, “We think this is the leadership stack for the connected intelligent edge.”

“Full Stack” Ambitions

CEO Cristiano Amon has said he aims to transform Qualcomm into a “full stack” company that can address a broad swath of the edge-computing market. The company is using its expertise in the high-end mobile-phone chip arena—where battery life is the biggest priority—to win customers in new markets, including PCs and the IoT. It is also building more powerful AI accelerators, such as its neural processing unit (NPU), into its processors.

The company is rapidly becoming a central player in the automobile market. It rolled out its Snapdragon Ride family to be the brains of GM’s autonomous-driving feature and other chips to drive the digital dashboards in cars. The company wants to take a shot at other areas, too, including data centers, base stations, and robots for use on factory floors.

For Qualcomm, the strategy shift is ratcheting up the competition with the likes of AMD, Intel, and NVIDIA. They are all betting on special-purpose hardware as AI becomes more widespread from the data center out to the edge.

But rather than simply offering chips and letting customers figure out what to do with them, these rivals are also investing in software tools to program them, such as Intel’s oneAPI, NVIDIA’s CUDA, and AMD’s upcoming Unified AI Stack.

Qualcomm hopes to stand out from the crowd with a unified software stack that covers every processor in its portfolio. With it, developers gain the ability to develop an AI-powered feature and then move it across different products and tiers.

The company said consolidating its AI software assets into a single stack mirrors its strategy in the hardware department. This involves leveraging the building blocks in its flagship Snapdragon family of mobile chips to expand into new markets. 

“We have now made the same leap [with software], where essentially the same offering is able to cover each and every business we have today,” said Asghar.

Stacked for the Edge

By reducing the amount of work required to adapt AI features from one type of Qualcomm chip to another, the firm promises to save customers engineering resources and shorten time to market.

“As we have scaled into new businesses, we are now giving OEMs the ability to do the same, without having to spend a lot more in terms of [engineering resources],” said Asghar.

Qualcomm said the unified AI Stack supports a wide range of different AI frameworks and widely used runtimes, including PyTorch, TensorFlow, and ONNX, plus various common libraries, services, compilers, debugging tools, and programming languages, as well as system interfaces, accelerator drivers, and other tools. “There is support at every different level of the stack,” noted Asghar.
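
As an illustration of how a trained model might be handed to such a stack, the sketch below exports a small PyTorch model to the ONNX interchange format; the toy model, file name, and input shape are placeholders rather than part of Qualcomm’s tooling.

```python
import torch
import torch.nn as nn

# A toy classifier standing in for a model trained in PyTorch.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)
model.eval()

# Export to ONNX, one of the interchange formats the stack supports.
# The dummy input fixes the tensor shapes recorded in the exported graph.
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model,
    dummy_input,
    "classifier.onnx",  # placeholder file name
    input_names=["image"],
    output_names=["logits"],
)
```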

The operating systems supported include Windows, Android, Linux, Ubuntu, and CentOS, as well as several automotive-grade and embedded real-time operating systems such as QNX and Zephyr.

The stack provides direct access to the Qualcomm AI Engine and the dedicated AI cores on its Cloud AI 100 accelerator. The new offering, called AI Engine Direct, will now scale across every AI accelerator in its processor families.

The AI Engine is a software library that delegates and deploys existing models directly to the AI accelerators at the heart of its hardware. With AI Engine Direct, companies can program software even closer to the silicon to eke out extra performance or reduce power.
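
Qualcomm’s AI Engine Direct API itself isn’t shown here, but the general delegate pattern can be sketched with TensorFlow Lite, where supported operations are handed off to an accelerator library; the model file and the delegate library name are assumptions for illustration only.

```python
import numpy as np
import tensorflow as tf

# Hand supported operations to an AI accelerator via a delegate library.
# The library name is an assumption; on Snapdragon-based Android devices,
# TensorFlow Lite's Hexagon delegate is one such library.
delegate = tf.lite.experimental.load_delegate("libhexagon_delegate.so")

# "model.tflite" is a placeholder for an already converted model.
interpreter = tf.lite.Interpreter(
    model_path="model.tflite",
    experimental_delegates=[delegate],
)
interpreter.allocate_tensors()

# Run one inference with dummy input data; ops the delegate supports run
# on the accelerator, and the rest fall back to the CPU.
input_info = interpreter.get_input_details()[0]
interpreter.set_tensor(
    input_info["index"],
    np.zeros(input_info["shape"], dtype=input_info["dtype"]),
)
interpreter.invoke()
output = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
print(output.shape)
```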

Tools in the Toolbox

The unified stack includes a wide range of unique tools to make it easier for developers to deploy AI across its portfolio of chips—in a way that surpasses the software its rivals are bringing to the table, Qualcomm said.

One of the standout features of the Qualcomm AI Stack is the AI Model Efficiency Toolkit. Asghar explained that the tool can condense a power-hungry AI model trained in the cloud at 32-bit floating-point precision into an 8-bit integer format so that it runs more efficiently on battery-powered devices. “You are able to bring huge benefits in terms of power consumption,” up to a 4X improvement in many cases, he said.
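
The AI Model Efficiency Toolkit’s own API is beyond the scope of this article, but the basic FP32-to-INT8 idea can be sketched with PyTorch’s built-in post-training dynamic quantization; the toy model below is a stand-in, and the actual power savings depend on hardware support for integer math.

```python
import torch
import torch.nn as nn

# A toy FP32 model standing in for one trained in the cloud.
model_fp32 = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model_fp32.eval()

# Convert the Linear layers' weights to 8-bit integers; activations are
# quantized dynamically at inference time.
model_int8 = torch.quantization.quantize_dynamic(
    model_fp32, {nn.Linear}, dtype=torch.qint8
)

# Both models accept the same input; the INT8 version trades a small
# amount of accuracy for lower memory and power on supported hardware.
x = torch.randn(1, 128)
print(model_fp32(x).shape, model_int8(x).shape)
```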

Another key component is the Neural Architecture Search tool co-developed with Google. It lets developers optimize AI models against various constraints, such as higher accuracy, lower power, or lower latency. That way, the same AI model can be tuned for power efficiency in smartphones or IoT devices, where long battery life is vital, or for low latency in industrial robots, where delays are a safety risk.
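
Real architecture search is far more sophisticated, but the underlying idea of picking a model that meets a deployment constraint can be sketched in a few lines; the candidate space, cost model, and latency budget below are entirely hypothetical.

```python
# Hypothetical candidate space: width and depth choices for a small network.
candidates = [
    {"width": w, "depth": d}
    for w in (32, 64, 128)
    for d in (2, 4, 8)
]

def estimated_latency_ms(cfg):
    # Stand-in cost model: wider and deeper networks are assumed slower.
    return 0.05 * cfg["width"] * cfg["depth"]

def estimated_accuracy(cfg):
    # Stand-in quality model: bigger networks are assumed more accurate.
    return 0.70 + 0.0005 * cfg["width"] + 0.01 * cfg["depth"]

LATENCY_BUDGET_MS = 20.0  # e.g., a robot-control deadline

# Keep the most accurate architecture that fits within the latency budget.
feasible = [c for c in candidates if estimated_latency_ms(c) <= LATENCY_BUDGET_MS]
best = max(feasible, key=estimated_accuracy)
print("Selected architecture:", best)
```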

“We think that with the Qualcomm AI stack, we are enabling developers and OEMs to be able to do so much more with the AI capabilities that we are baking into our devices,” said Asghar.

Customers can get access to these tools through a graphical user interface (GUI) for AI development.

Domain-Specific SDKs

Qualcomm has previously rolled out most of the underlying software in its AI Stack. Many of its software development kits (SDKs), including those for cars (Snapdragon Ride) and for AR and VR (Snapdragon Spaces), are powered by elements of it.

It is also a major building block in the company’s Neural Processing SDK, which it said remains a popular tool for device manufacturers to port AI models to a broader range of Qualcomm-powered hardware.

Unifying all of these building blocks into a single stack is a “leap forward” for its customers, said Asghar. But it gives Qualcomm a lot more flexibility as well: The company said the unified AI stack will serve as the foundation for it to roll out domain-specific SDKs as new markets come into focus and customers request new and better tools. 

About the Author

James Morra | Senior Editor

James Morra is a senior editor for Electronic Design, covering the semiconductor industry and new technology trends, with a focus on power electronics and power management. He also reports on the business behind electrical engineering, including the electronics supply chain. He joined Electronic Design in 2015 and is based in Chicago, Illinois.
