In a 2013 conversation with The Atlantic, outgoing Intel CEO Paul Otellini shared his thoughts on the evolution of the semiconductor industry.1 During the interview, he sketched out the “history of the computer industry in one chart,” illustrating how advances in computing have driven the growth of the semiconductor industry in general and Intel in particular.

Indeed, this one chart succinctly illustrates how the increasing ubiquity of computing has driven down the cost structure of the semiconductor industry over decades. Otellini noted that an aggressive, exponential drop in the price per unit has enabled an equally exponential increase in the number of computing units sold over time.

Starting from mainframes that cost tens of thousands of dollars each and shipped only in thousands of units per year, the industry evolved to shipping well over 300 million PC units at less than $1000 per unit. Smartphones and tablets are on track to ship well over 3 billion units at less than $100 per unit. As is evident in market trends, we are on the cusp of a new wave that will make computing even more ubiquitous by shipping over 30 billion units at less than $10 per unit (see the figure).

Known as the Internet of Things (IoT), this new wave of computing will be powered by ultra-cheap, ultra-low-power, and highly integrated chips connected to a cloud or a network and embedded in virtually every physical object around us. Many analysts expect this market to reach well over 20 billion units by the end of the decade.2

The semiconductor industry has enabled successive waves of computing by re-architecting silicon technology for each wave. Companies that have embraced the new architecture early have been able to support the aggressive reduction in price point and establish leadership. On the other hand, incumbents that have failed to quickly adapt have lost ground to disruptors and have struggled to recover. This observation also holds true in the software industry, where a re-architecting of the operating system (OS) and application software has enabled each wave of computing (Table 1).

For the first time in the history of the semiconductor industry, the next wave of computing will be driven not by the most advanced transistor node, but by a lagging one. “Energy-per-operation” will increasingly become the industry’s driving metric, and chip- and system-level integration will become the key enablers. Together, these emerging trends should enable the semiconductor industry to support a $10 unit price point for the next wave of computing.
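
As a back-of-the-envelope illustration of the metric (the power and throughput figures here are assumptions chosen for easy arithmetic, not measurements of any particular chip), energy per operation is simply power divided by operation rate:

$$E_{\text{op}} = \frac{P}{R} = \frac{10\ \text{mW}}{100\ \text{MOPS}} = \frac{10^{-2}\ \text{W}}{10^{8}\ \text{op/s}} = 100\ \text{pJ per operation}$$

Viewed this way, an older, cheaper process node that delivers comparable energy per operation can be the better choice, which is why the most advanced transistor no longer decides the race.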

Transition 1: Mainframe to PC ($1000 Price Point)

The first transition occurred in the early 1980s with the development of a very simple single-chip central processing unit (CPU), Intel’s 80286, which initially ran at just 6 MHz. Its x86 line was enhanced over the next three decades through aggressive transistor scaling and a relentless focus on increasing performance (frequency).

Gradually, additional chips such as the graphics processing unit (GPU), memory, and connectivity were added to the system to increase functionality. The CPU-driven hardware ecosystem complemented a mainstream software ecosystem (initially MS-DOS and eventually Windows) and enabled computing on a mass scale (more than 300 million units) with a price point of less than $1000 per unit.

The combination of the CPU and these added chips delivered higher performance and more functionality. The CPU and system architecture were designed for performance; power was a secondary metric. Intel dominated this wave through an aggressive pursuit of Moore’s Law and the predominance of the x86 architecture, while Microsoft dominated through a virtual monopoly of Windows in the software ecosystem.

Transition 2: PC to Mobile ($100 Price Point)

The consolidation of functionality on a single chip, the system-on-chip (SoC), enabled the emergence of the smartphone in the late 2000s. The smartphone and tablet deliver desktop-class functionality with a chipset comprising far fewer chips. This wave of computing established the dominance of the SoC and connectivity solutions, delivering even smaller form factors within a much lower power envelope and at a much lower cost.

The proliferation of mobile computing was also enabled by dramatically cheaper (eventually free) software ecosystems such as iOS (Apple) and Android (Google) that were specifically designed for mobile (SoC) hardware and not for legacy PC (CPU) hardware. Companies like Apple and Qualcomm dominated the early rise of this wave through the aggressive pursuit of chip-level integration, and the SoC emerged as the clear winner over the standalone CPU.

Transistor scaling and Moore’s Law further enhanced the mobile wave by enabling more capable and more compact SoCs.3 Companies like MediaTek are driving hard to dominate the maturing mobile wave by offering the entire SoC for under $5, supporting a $100 unit price point.

Transition 3: Mobile to IoT ($10 Price Point)

Trends suggest that every tenfold increase in unit volume has been enabled by a tenfold reduction in unit price. So what is the right system/chip architecture to enable more than 30 billion units at a $10 price point? The answer may lie in even higher levels of on-chip integration.
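
One way to see this is to multiply the unit prices and volumes cited earlier: from the PC wave onward, the product of price and volume has hovered around the same order of magnitude, so each tenfold price cut must be answered by a tenfold volume gain:

$$P \cdot V \approx \$1000 \times 3\times10^{8} \approx \$100 \times 3\times10^{9} \approx \$10 \times 3\times10^{10} \approx \$3\times10^{11}$$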

To support the very small form factors and aggressively low power envelopes required for sensor hubs, many system-level functionalities will need to be integrated on a single chip or package, eventually leading to a system-in-package (SiP) or computer-on-chip (CoC). This trend is already evident in the basic requirements of an IoT chip.

For example, a generic wearable chip may need to deliver a combination of logic computing (CPU), connectivity (radio/Bluetooth/GPS), non-volatile memory (flash), and various analog and mixed-signal functions, as well as a variety of sensors. The critical technology metric for such an IoT platform will be its total power envelope, which will need to be as much as 10 times lower than that of a mobile SoC platform.
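
A rough battery calculation suggests why. Suppose, purely as an illustrative assumption, that a wearable runs on a 150-mAh, 3.7-V battery (about 555 mWh) and must last a week (168 hours) between charges. Its average power draw is then bounded by:

$$P_{\text{avg}} \le \frac{150\ \text{mAh} \times 3.7\ \text{V}}{168\ \text{h}} \approx \frac{555\ \text{mWh}}{168\ \text{h}} \approx 3.3\ \text{mW}$$

That is well below the budget of a typical mobile SoC platform, consistent with the tenfold (or greater) reduction cited above.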

Semiconductor companies looking to establish leadership in the IoT will need to focus their efforts on functional integration far more than transistor scaling. Several companies are trying to establish early leadership in this space, including fabless and fab-lite players (Qualcomm, Broadcom, Apple, Texas Instruments, STMicroelectronics, NXP Semiconductors), integrated device manufacturers, or IDMs (Samsung, Intel), and microelectromechanical-systems (MEMS) specialists (STMicroelectronics, Texas Instruments, InvenSense). Foundries, too, will need to rapidly adapt their transistor technology roadmaps to create a silicon ecosystem that supports such integration.

In addition to a new silicon architecture and ecosystem, IoT computing will rely on cheap, widely available standard software that lets devices communicate with one another and with the cloud while allowing an independent and robust developer ecosystem to proliferate. Qualcomm is attempting to take an early lead in this space with AllJoyn, while Samsung is promoting its Tizen OS. Google recently released Android Wear for developers in the wearable space. Many others are vying for leadership in this emerging market.