Gartner Dataquest To EDA Industry: “It’s The Software, Stupid!”

July 24, 2006

The 43rd DAC kicked off in traditional fashion late Sunday afternoon with the annual Gartner Dataquest reception and briefing. Held in the San Francisco Marriott, the Dataquest gathering is when we all know for sure that DAC has begun. The reception is a great opportunity to touch base with old friends, make a few new ones, and to gather a few data points to test over the course of DAC itself. But primarily, it’s a platform for Gartner’s semiconductor and design analysts to deliver their assessments of how the semiconductor and EDA industries are expected to fare, and of the trends they see emerging that’ll influence their predictions.

As always, the presentations begin with an introduction from the chairman of the Electronic Design Automation Consortium (EDAC). Wally Rhines of Mentor Graphics has passed the chairman’s baton to Synopsys’ Aart de Geus, who, in his typical fashion, managed to mix a little humor into his otherwise straightforward rundown of EDAC’s spheres of influence. Suffice it to say, we eventually learned just why de Geus insisted that Gartner’s chief EDA analyst, Gary Smith, wear a white suit for the occasion.

ASICs Heading Toward 32 nm

Batting leadoff for Gartner’s analyst team was Research VP Bryan Lewis, who covers ASICs, SoCs, and FPGAs. Lewis introduced one of the subthemes that ran through many of the afternoon’s presentations: the notion of restrictive design rules (RDRs) becoming prevalent at the 32-nm process node, possibly by 2010. RDR, also known as structured regular silicon, is a concept that, according to Gary Smith, has already been proven at IBM at the 45-nm node. It centers on creating chips from wafers of pre-designed arrays of logic cells. According to Lewis, nodes below 45 nm may need a more rigid architecture if designers are to increase yield predictability. As would be pointed out later, the use of RDRs at 32 nm may have repercussions for the EDA industry.

In discussing platform-ASIC alternatives, Lewis noted that cell-based ASICs with a PLD partition have not yet taken off. By contrast, purely cell-based options, exemplified by Texas Instruments’ OMAP platform, are doing quite well.

Lewis confirmed what was already obvious regarding structured ASICs, namely, that they’re even more of a niche implementation option than Gartner had concluded in its 2005 forecast. LSI Logic abandoned the structured-ASIC market earlier this year. Hence, Lewis has halved his forecast for the structured-ASIC market, rolling his 2007 estimate back to $564 million (from $894 million) and his 2008 number to $700 million (from $1.4 billion). Yet, Lewis claims, structured ASICs are not dead.

Overall, ASIC design starts are down 6.4% for 2006. Starts have leveled off from the precipitous declines of past years, but Lewis sees them continuing to slip at a 5% to 6% annual rate for the next few years.

Lewis then shifted gears to talk about what he calls “second-generation SoCs,” and in so doing, he introduced yet another subtheme that ran through the presentations. In Gartner’s terminology, first-generation SoCs consist of a compute engine, memory, and associated logic subsystems. The emergence of multiple instances of that lineup on the same chip, with each compute engine possibly running a different operating system and a top software layer tying the chip together, constitutes a “second-generation” SoC. Philips’ Nexperia, TI’s OMAP, and Matsushita’s UniPhier devices are early examples of these multiprocessing SoCs.

For Lewis, the overriding questions with the emergence of second-generation SoCs concern cost and software. These platforms can cost $1 billion to develop. Who will take up the challenge of developing software for them? What is the return on investment one can expect for doing so? Lewis pointed out that in the cases of the Nexperia and OMAP platforms, Philips and TI undertook software development themselves. But in Lewis’ view, it’s clear that 2G SoCs will require a great deal of IP reuse as well as increased application of electronic system-level (ESL) design tools.

Don’t Forget The Software!

Gartner’s second speaker, Research VP Daya Nadamuni, sounded an alarm for the industry on the topic of software development. In noting that our cellphones are now MP3 players that can also deliver our e-mail while giving us directions to a dinner across town via GPS, Nadamuni was slyly pointing out that much of that expanded functionality comes through software.

It’s been Moore’s Law that’s driven much of the development in the electronics industry for the past 40 years or so. But Nathan Myhrvold’s (of Microsoft fame) First Law of Software states that software is a gas: it expands to fill the available hardware resources. It’s that propensity of software developers to use everything the hardware designers give them that, in turn, drives the hardware teams to whip up new hardware with even higher performance. But that performance now comes at a price: greater power consumption and more heat.

The answer to the heat problems, claims Nadamuni, is multicore platforms (Lewis’s 2G SoCs). While the individual cores may run slower than the compute engines on first-generation SoCs, they’ll team to yield more overall processing power. Hence, many first-generation SoCs will be redesigned as 2G SoCs by 2010, Nadamuni projects.

A proliferation of multicore platforms will, however, spawn a new crisis on the software side. Massively parallel architectures, a fixture on the server side of the industry, are an enigma to embedded-system coders. The problem is compounded by the embedded world’s traditionally hardware-centric perspective. Even something as simple as incrementing a shared counter becomes a trap on a multicore part, as the sketch below illustrates.
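To make that parallel-programming trap concrete, here is a minimal, hypothetical C sketch (ours, not from the briefing): two threads increment a shared counter, and without the mutex the classic data race silently loses updates.

```c
/* Hypothetical sketch of the classic shared-memory data race that
 * trips up coders moving from single-core to multicore parts.
 * Build with: gcc -pthread race.c */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;  /* shared state */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static int use_lock = 0;  /* set to 1 to fix the race */

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 1000000; i++) {
        if (use_lock) pthread_mutex_lock(&lock);
        counter++;  /* read-modify-write: racy when unlocked */
        if (use_lock) pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    /* Unlocked, the result is typically well short of 2,000,000. */
    printf("counter = %ld (expected 2000000)\n", counter);
    return 0;
}
```

On a single core, code like this often appears to work; on a true multicore SoC, the lost updates show up immediately, which is precisely the adjustment embedded teams now face.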

Nadamuni concluded by asking, “Will we follow Nathan’s First Law of Software? Can software continue to push hardware designers, thereby keeping Moore’s Law intact?”

32 nm And EDA: Will They Mesh?

Making her DAC debut was Gartner’s Mary Ann Olsson, a research VP who’s recently moved over to the design side of Gartner’s house from the semiconductor side. In discussing the evolving semiconductor landscape, Olsson noted a disconnect between the semiconductor and EDA industries. She asked, and endeavored to answer, several pertinent questions: Will EDA tools developed for the 65- and 45-nm nodes scale down to 32 and 22 nm? Is the semiconductor industry’s bread and butter still being made in 130-nm and 90-nm production, despite the emergence of 65-nm production silicon? Will restrictive design rules (RDRs) change the business model for the industry at the 32-nm node? And what, if any, is the effect of RDRs on the EDA industry’s future?

First, Olsson stated that 450-mm wafer fabs will become a reality by 2014. The bad news, though, is that only about 25 companies worldwide will be able to afford the price tag of such a fab, pegged at $3.8 billion to $5 billion.

Nonetheless, says Olsson, almost all of the large integrated device manufacturers (IDMs) are investigating RDRs. Because of the inherently rigid architectures of RDR, widespread adoption of that technology will result in less costly, easier-to-use EDA tools.

It’s expected that some percentage of designs at 32 nm will use RDR. In Olsson’s view, if that percentage exceeds 40%, the result will be hampered growth in DFM tools. This pronouncement couldn’t have gone over too well with the DAC crowd, made up largely of EDA executives, many of whom have staked their futures on the growth of the DFM market.

Think Out Of The Box

As always, the Gartner presentations closed with Gary Smith’s talk. Smith tied together many of the threads that had emerged in the earlier presentations: RDR, software, and ESL. And, as usual, Smith challenged the assembled EDA luminaries to “think out of the box” to drive growth in their perennially stagnant industry.

Smith asked a pointed question: Is the EDA industry doing a good job? On the hardware side, sure. As he pointed out, the cost of designing an IC has held pretty steady ($10 million to $20 million) since 1997. The same can be said for verification. But why, then, is the overall cost of design rising so sharply? “It’s the software, stupid!” exclaimed Smith. As Daya Nadamuni had stated, more powerful hardware means more code, and the software-development community has never been strong on efficiency. Yet, Smith maintains, programmability has replaced power as the key impediment to the continuing dominion of Moore’s Law.

How, then, is the EDA industry going to shake itself out of the doldrums that have pervaded it for so long? Well, it won’t be RTL tools, which have become a commodity, just as gate-level tools had by 1991. According to Smith, 5% growth in that arena is “optimistic.”

Today, it’s DFM-compatible tools that drive growth in the IC CAD market. But Smith echoed Mary Ann Olsson’s comments in stating that if RDR takes hold by the 32-nm node, that growth would slow considerably.

One area that the EDA industry must look to for growth is the in-house-developed tools that are growing in prevalence at IDMs. Last year, Smith reported that such tools were used by 27% of design engineers. That figure has ballooned to 38%. The EDA industry needs to investigate which tools IDMs are building themselves and then figure out how to build them better for commercial sale. As an example, Smith mentioned Philips, which builds most of its own analog/RF tools in house as well as its ESL tools.

For many years now, Smith has held up ESL as a potential growth area for EDA. Calling 2006 “Year Two” of the ESL methodology in practical usage, Smith still sees ESL as a growth area, but he has focused on two “killer apps” for it. One, a concurrent software compiler, is a must-have for the success of Bryan Lewis’s 2G SoCs. Smith believes that the EDA industry has a better shot at developing a concurrent software compiler than do embedded-system software developers. For one thing, EDA vendors have a much better understanding of concurrency. For another, compilation is a core competency for the EDA industry.
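Smith didn’t spell out what a concurrent compiler would emit, but the flavor of the job is automating transformations like the ones below. This is a hypothetical C sketch of ours, with OpenMP pragmas standing in for the dependence analysis such a compiler would perform on its own.

```c
/* Hypothetical sketch: the kind of loop-level parallelization a
 * concurrent software compiler would infer automatically. OpenMP
 * pragmas stand in here for the compiler's own analysis. */
#include <stdio.h>

#define N 1000000

static float a[N], b[N], c[N];

int main(void)
{
    double sum = 0.0;

    for (int i = 0; i < N; i++) { a[i] = i * 0.5f; b[i] = i * 0.25f; }

    /* Iterations are independent: safe to spread across cores. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        c[i] = a[i] * b[i];

    /* A reduction: each core keeps a partial sum, combined at the end.
     * Spotting this dependence pattern is exactly the compiler's job. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += c[i];

    printf("sum = %f\n", sum);
    return 0;
}
```

The point of Smith’s “killer app” is that the programmer would write only the serial loops; proving which iterations can run concurrently, and on which cores, would be the tool’s job.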

A second “killer ESL app” is what Smith terms the “architectural workbench.” As an example, think of The MathWorks and its architectural exploration tools and Matlab language. However, says Smith, the user of the architectural workbench isn’t a hardware or software engineer. It’s an electrical engineer. In Smith’s words, “develop this kind of tool for the engineers in Brazil or Bulgaria and it’ll be a winner.”
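As a toy illustration of the kind of trade-off such a workbench would let an electrical engineer explore, the hypothetical C sketch below (ours, not Smith’s) sweeps core counts under Amdahl’s law, comparing multicore candidates against a single faster core. Real workbenches would model far more, of course.

```c
/* Hypothetical architectural-workbench calculation: sweep core count
 * under Amdahl's law to compare multicore candidates against one fast
 * core. speedup = 1 / ((1 - p) + p/n), where p = parallel fraction. */
#include <stdio.h>

static double speedup(double p, int cores)
{
    return 1.0 / ((1.0 - p) + p / cores);
}

int main(void)
{
    const double p = 0.90;           /* assume 90% of the workload parallelizes */
    const double clock_scale = 0.7;  /* assume 2G-SoC cores are clocked slower */

    printf("cores  speedup  net vs. one fast core\n");
    for (int n = 1; n <= 16; n *= 2) {
        double s = speedup(p, n);
        printf("%5d  %7.2f  %21.2f\n", n, s, s * clock_scale);
    }
    return 0;
}
```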

Smith concluded by exhorting the EDA industry to “take off its IC-design blinders.” The industry, he says, should develop tools for the entire design flow, in which software design remains a major roadblock.
