Tasked with squeezing billions of transistors onto fingernail-sized slabs of silicon, chip designers are asking whether machine learning can help.
In the view of electronic design automation firms, machine learning tools could chisel the rough edges off complex chips, improving productivity, optimizing trade-offs like power consumption and timing, and verifying that chips are ready for manufacturing.
Though chip design is still a deeply creative process, engineers need tools that abstract the massive number of variables in modern chips. Using statistics, the software generates models fitted to simulations that replicate how physical chips will work. The tools would seem to be prime candidates for machine learning, which can be trained to find hidden insights in data without explicit programming.
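In rough terms, the approach amounts to fitting a fast statistical stand-in to a handful of slow simulation runs, then querying the cheap model instead of re-simulating. The Python sketch below illustrates the basic pattern with invented numbers and a made-up response curve; it is not drawn from any vendor's tool.

```python
# A minimal illustration: fit a simple statistical model to a few (expensive)
# simulation results, then query the cheap model instead of re-simulating.
# The numbers and the quadratic form are invented for illustration only.
import numpy as np

# Supply voltages (V) at which a hypothetical circuit simulator was run,
# and the gate delays (ps) it reported.
voltages = np.array([0.70, 0.80, 0.90, 1.00, 1.10])
delays   = np.array([182.0, 151.0, 131.0, 117.0, 106.0])

# Fit a quadratic response surface to the simulation data.
model = np.poly1d(np.polyfit(voltages, delays, deg=2))

# Predict delay at an operating point that was never simulated.
print(f"estimated delay at 0.85 V: {model(0.85):.1f} ps")
```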
But these teachable tools are still rare, said Elyse Rosenbaum, a professor of electrical and computer engineering at the University of Illinois Urbana-Champaign, in a telephone interview. Most of those that do exist are used to confirm that chips match specifications and will be manufactured without flaws.
Rosenbaum, who helps lead the Center for Advancing Electronics with Machine Learning (CAEML), said that most EDA applications will require humans on the training side of the equation. That contrasts with image recognition and cancer-detection programs, which excel with unsupervised forms of machine learning.
Designing chips creates lots of data – and sometimes more than engineers know what to do with. “We need to stop providing chip makers with more data,” Ting Ku said on a panel at the Design Automation Conference last month in Austin, Texas. “We need tools to make some decisions.”
Ku, a senior director of engineering at Nvidia, said that the company is already using machine learning to gain insights into manufacturing variations that could affect its graphics chips. Those variations are growing more unpredictable with the shift toward smaller process nodes like 10 nanometers.
But this is still virgin ground for an industry that only a few years ago signed onto big data analytics. “We can smell machine learning problems, but we can’t just take a course in it,” said Jeff Dyck, vice president of technical operations at Solido Design Automation. “We cannot back designs on guesses; we need higher levels of confidence.”
Solido’s tools are representative of how the industry is dipping its toes into machine learning. The firm recently released characterization tools that, after being trained on circuit simulations, can predict faster than conventional tools how standard cells and memories, for instance, will react to higher-than-normal voltages.
Amit Gupta, Solido’s chief executive, said in an email that the software “automatically determines and runs specific simulations, which are used as the training data to build the machine learning models in real time. The models then predict results with brute-force accuracy. We find that building design-specific models per run is effective.”
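Gupta’s description suggests something in the spirit of an active-learning loop, in which the model itself picks the next simulations to run and folds each result back into its training data. The sketch below is a generic illustration of that idea, not Solido’s algorithm; the run_spice function and its parameters are hypothetical stand-ins for a real simulator.

```python
# Generic active-learning sketch: the surrogate model chooses which simulation
# to run next (the point where it is least certain) and retrains on the result.
# Not Solido's algorithm; run_spice and its parameters are hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def run_spice(vdd, temp):
    """Hypothetical stand-in for a SPICE run returning a cell delay (ps)."""
    return 100.0 / vdd + 0.05 * temp

# Candidate operating points the tool could simulate (voltage, temperature).
candidates = np.array([[v, t] for v in np.linspace(0.7, 1.1, 9)
                              for t in np.linspace(-40, 125, 9)])

# Seed the model with a few randomly chosen simulations.
rng = np.random.default_rng(1)
idx = list(rng.choice(len(candidates), size=5, replace=False))
X = candidates[idx]
y = np.array([run_spice(v, t) for v, t in X])

model = GaussianProcessRegressor(normalize_y=True)
for _ in range(10):
    model.fit(X, y)
    # Pick the candidate point where the model is least certain...
    _, std = model.predict(candidates, return_std=True)
    next_i = int(np.argmax(std))
    # ...and "run" that specific simulation, adding it to the training data.
    X = np.vstack([X, candidates[next_i]])
    y = np.append(y, run_spice(*candidates[next_i]))

print(f"surrogate built from {len(y)} targeted simulations")
```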
Solido claims that its other tools can verify memory, analog, and other circuits against statistical process variation faster than conventional software. Solido, which recently started a program called ML Labs to build special tools for customers, says that more than 40 companies use the variation-aware design tools to cut power consumption and die size.
Lots of other possibilities exist, though. Dave Kelf, vice president of marketing for OneSpin Solutions, said in an interview that the company is looking to apply machine learning to formal verification, which uses mathematical analysis to locate errors missed by simulation. Manish Pandey, Synopsys’ chief architect for new technologies, has floated the same concept.
Eric Hall, chief technology officer at consulting firm E3 Data Science and a former master engineer at Broadcom, said that models trained under human supervision could also be useful for estimating logic gate power and timing in chips, as well as for modeling reliability and non-linear circuit responses.
Plunify, a Singapore-based software start-up, has targeted a new pattern recognition tool at FPGAs. The tool, called Kabuto, locates inelegant code that hurts performance or causes timing problems. When the final version is released, it will suggest RTL fixes, such as pipelining, that chip architects can apply automatically.
Software that learns to design chips from scratch will not arrive anytime soon, if ever, experts said. In the short term, the EDA industry is targeting tools that act like an editor making grammatical and spelling changes to the first draft of a novel, optimizing interconnects and other circuits much faster than traditional software.
The usefulness of convolutional neural networks – which have been used to classify images, make financial decisions, and play intuitive board games like Go – is still under debate. The approach might not fit electronic design automation at all, said Paul Franzon, a professor of electrical and computer engineering at North Carolina State University.
“For most problems you want another model, not a classification,” said Franzon, who oversees CAEML along with Rosenbaum and Madhavan Swaminathan of Georgia Tech, in a recent interview. “You can distinguish a Persian cat from one sitting on a fence, but that is not a concept that fits many EDA problems.”
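Franzon’s distinction can be seen in a toy example: a classifier only labels a timing path as pass or fail, while a regression model predicts the actual slack that an optimization loop can act on. The data and model choices below are invented purely for illustration.

```python
# Toy contrast, with invented data: a classifier gives only a pass/fail label,
# while a regression model predicts the actual timing slack a designer can use.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)
X = rng.uniform(0.7, 1.1, size=(60, 1))                   # supply voltage (V)
slack = 40.0 * (X[:, 0] - 0.9) + rng.normal(0, 1, 60)      # timing slack (ps)

classifier = LogisticRegression().fit(X, slack > 0)         # pass/fail label
regressor = LinearRegression().fit(X, slack)                # continuous slack

point = np.array([[0.85]])
print("classifier says pass:", bool(classifier.predict(point)[0]))
print(f"regressor predicts slack: {regressor.predict(point)[0]:.1f} ps")
```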
Learning how machine learning fits into chip design is the mission of CAEML, which launched last year with National Science Foundation support. The center has partnered with chip suppliers like Samsung and Qualcomm, software firms Cadence and Synopsys, and server companies like Hewlett Packard Enterprise.
CAEML projects include the use of machine learning to reuse intellectual property, optimize power delivery networks, model high-speed links, and create modular algorithms to speed up verification. Another will test deep learning software for checking that chip layouts match specifications.
Sorin Dobre, a senior technology director at Qualcomm who has designed digital chips from 180nm down to 7nm, said that such tools could not only make life easier for senior engineers but also make chip design more accessible to those without decades of experience.
Dobre also said that machine learning could improve how companies manage massive amounts of design data. IC Manage is already moving in that direction with software that analyzes tape-outs to predict bugs in new chips. The Campbell, Calif.-based firm also sells software for accelerating EDA tools, which lets companies cut back on the servers needed to handle their data.
Few experts doubt that machine learning is part of the industry’s future, keeping engineers from getting bogged down in the complexity of multicore processors and systems-on-chip. “In the longer term, we hope to push good design into software, so that it moves with creativity on the human side,” Franzon said.