Semiconductor fabs, among the most heavily instrumented manufacturing facilities in today’s industrial landscape, generate ever-growing volumes of data throughout the manufacturing process, making “big data” a familiar term in the industry. Billions of chips are processed every year, and with the ever-increasing focus on quality and reliability, engineers gather data at every node of the manufacturing process.
Semiconductor chip manufacturing operations produce terabytes of data. The biggest challenge in utilizing this big-data environment is analyzing data across operations, extracting systematic signals from the noise, and generating actionable insights in real time. With the focus on quality and specialization, the semiconductor manufacturing process has become fragmented, with numerous vendors around the globe managing different stages of the process. The challenge lies in collecting, reconciling, and analyzing that data to produce trends and recommendations that increase operational efficiency, reduce time to market, and uncover cost savings in a highly competitive market.
The story doesn’t end here. With advances in data analysis techniques and the development of new algorithms, semiconductor companies are taking methods proven in one area and applying them in others. The biggest example of this transfer of intellectual capital is the extension of part average testing (PAT) beyond the automotive industry, which developed PAT as a statistical outlier-detection technique based on Automotive Electronics Council (AEC) guidelines. The aerospace and medical industries were among the first outside automotive to adopt PAT, and it is now rapidly being accepted as the industry standard for outlier detection. PAT captures every die whose parametric characteristics fall outside a statistically calculated pass/fail limit.
An outlier is a chip that has passed the original manufacturing tests but shows abnormal characteristics relative to the rest of its lot, making it more likely to fail in the field. This concept of the outlier forms the basis of the PAT methodology. Die-level PAT screening for outliers can be either static or dynamic.
The static part average testing, or SPAT, technique uses a list of tests and population data from numerous batches. It computes the population mean (µ) and standard deviation (σ) and defines the static PAT limits as µ ± 6σ; dice that fall outside this boundary are screened out. These static PAT limits, called the lower specification limit (LSL) and upper specification limit (USL), are updated every six months or every eight wafer lots, whichever comes first.
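As a rough illustration of how SPAT limits are derived and applied, consider the minimal Python/NumPy sketch below. The helper name static_pat_limits and the measurement values are illustrative assumptions, not part of any real test program: the idea is simply to pool a baseline population, set µ ± 6σ limits, and screen a new lot against them.

```python
import numpy as np

def static_pat_limits(values, k=6.0):
    """Static PAT limits: mean +/- k*sigma of a pooled baseline population.
    (Hypothetical helper for illustration only.)"""
    mu = np.mean(values)
    sigma = np.std(values, ddof=1)  # sample standard deviation
    return mu - k * sigma, mu + k * sigma

# Baseline: one parametric measurement (e.g., a leakage current) pooled
# from several historical lots. The values here are synthetic.
rng = np.random.default_rng(0)
baseline = rng.normal(loc=1.0, scale=0.05, size=10_000)

lsl, usl = static_pat_limits(baseline)
print(f"Static PAT limits: LSL={lsl:.4f}, USL={usl:.4f}")

# Screening: any die whose reading falls outside [LSL, USL] is rejected,
# even if it would have passed the ordinary datasheet test limits.
new_lot = rng.normal(loc=1.0, scale=0.05, size=500)
screened_out = (new_lot < lsl) | (new_lot > usl)
print(f"Dice screened out: {screened_out.sum()} of {new_lot.size}")
```

In practice, the baseline and the resulting limits would be recomputed on the six-month or eight-lot schedule described above.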
The dynamic PAT, or DPAT, methodology, on the other hand, calculates the limits for each wafer test dynamically, computing the mean and standard deviation for each test batch individually. It is the preferred method because the limits are calculated from the actual material being tested. A die showing values outside the dynamic PAT limits but within the LSL and USL is considered an outlier.
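The per-wafer recalculation can be sketched the same way. Assuming the same µ ± 6σ rule is applied to each wafer’s own distribution (again with a hypothetical helper name and synthetic data), a die counts as a DPAT outlier only when it sits inside the static LSL/USL window but outside the limits derived from its own wafer:

```python
import numpy as np

def dpat_outliers(wafer_values, lsl, usl, k=6.0):
    """Flag dice that pass the static LSL/USL window but fall outside
    dynamic PAT limits computed from this wafer's own distribution.
    (Hypothetical helper for illustration only.)"""
    mu = np.mean(wafer_values)
    sigma = np.std(wafer_values, ddof=1)
    d_lo, d_hi = mu - k * sigma, mu + k * sigma
    inside_static = (wafer_values >= lsl) & (wafer_values <= usl)
    outside_dynamic = (wafer_values < d_lo) | (wafer_values > d_hi)
    return inside_static & outside_dynamic  # boolean mask over the dice

rng = np.random.default_rng(1)
wafer = rng.normal(loc=1.0, scale=0.04, size=400)  # synthetic wafer data
wafer[10] = 1.28  # within the static window below, but ~7 sigma from its wafer's mean

mask = dpat_outliers(wafer, lsl=0.70, usl=1.30)
print(f"DPAT outliers on this wafer: {mask.sum()}")  # flags the injected die
```

Because the limits track each wafer, a die that would slip through the fixed static window is still caught when it deviates from its immediate population.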
This ability to analyze the data against tighter statistical limits enables earlier detection of failures, avoids high downstream costs, improves quality, and gives customers confidence, which are some of the major reasons PAT has been extended beyond the automotive industry.