
Virtual sensor innovation for higher productivity

Phillip Jones | Senior Staff Data Analyst
Brian Williams | Deposition Engineering Manager
June 1, 2022 | Industry, Technology

It takes many different types of process tools to make chips: deposition, lithography, etch, and cleaning tools, among others. Large-scale production requires chipmakers to run large fleets of the same chamber type to perform a particular processing step, such as the fin etch used in making 3D transistors. Ideally, wafer lots would be processed identically across the fleet, meaning every chamber would behave exactly like every other. In practice, however, performance varies from chamber to chamber due to slight differences in the many control parameters that determine the success of the process. These parameters, including pressure, temperature, power delivery, and surface conditions, must all be co-optimized.

The chamber matching challenge

The process of bringing performance closer together is called chamber matching. As chip device sizes shrink and process tolerances become more stringent, the challenges associated with chamber matching grow. Traditional methods include a ‘golden chamber’ approach and sub-component matching. The ‘golden chamber’ method involves defining one chamber as ideal and attempting to tune all others to produce the same results. Sub-component matching focuses on the hardware subsystems and defines rigorous tolerance specifications that each chamber must satisfy; the assumption is that if all components of each chamber match, then the chambers as a whole should match. Both of these traditional methods have limitations when it comes to the complex physical and chemical interactions of advanced plasma processing.

Lam’s Data Analyzer: a proven solution

The Equipment Intelligence® Data Analyzer has been used broadly for fleet matching and big data analytics of etch chambers on the 2300® platform. Fabs around the world have reported dramatic improvements in chamber matching performance, faster troubleshooting of chamber issues, and improvements in uptime and MTBC (mean time between cleans).

The approach is to analyze large sets of tool sensor data from wafer processing, identify the natural distribution within a fleet of chambers to detect mismatched chambers, and then drill down to root cause and correction. This is a big data, multivariate machine learning method that examines many signals within a chamber or within a chamber subsystem.
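As a rough illustration of the idea (a sketch, not Lam's proprietary algorithm), outlier detection against a fleet's natural distribution can be done with robust per-signal statistics: summarize each chamber's sensor output into a feature vector, then flag chambers that sit far outside the fleet median. The signal names and data below are hypothetical.

```python
import numpy as np

def find_mismatched_chambers(features, threshold=3.0):
    """Flag chambers whose sensor summaries sit outside the fleet's
    natural distribution.

    features: (n_chambers, n_signals) array of per-chamber summary
    statistics (e.g., mean chamber pressure, RF power, electrode
    temperature). Uses a robust z-score (median and MAD) per signal
    so a drifting chamber cannot mask itself by inflating the fleet
    mean and standard deviation.
    """
    med = np.median(features, axis=0)
    mad = np.median(np.abs(features - med), axis=0)
    z = 0.6745 * (features - med) / mad  # ~standard normal for matched chambers
    return np.where(np.max(np.abs(z), axis=1) > threshold)[0]

# Hypothetical fleet of six chambers, four summarized sensor signals;
# chamber 4 has drifted on every signal.
fleet = np.array([
    [5.02, 150.3, 60.1, 1.21],
    [4.98, 149.8, 59.9, 1.19],
    [5.01, 150.1, 60.2, 1.20],
    [4.99, 150.4, 59.8, 1.22],
    [5.60, 158.0, 63.5, 1.35],
    [5.00, 149.9, 60.0, 1.20],
])
print(find_mismatched_chambers(fleet))
```

In practice a production system would work with many more signals per chamber and would point engineers from the flagged chamber down to the specific subsystem signals driving the deviation.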

An illustration of a fleet of Lam 2300® etch systems (each with six separate chambers) and their process distribution before and after Lam’s Equipment Intelligence® Data Analyzer big data machine learning analysis

Now available for multiple station process modules

A new version of the Equipment Intelligence® Data Analyzer designed for plasma-enhanced chemical vapor deposition (PECVD) and atomic layer deposition (ALD) chambers has been deployed at multiple sites. The software has been adapted to accommodate the wafer flow differences between PECVD/ALD chambers and etch chambers (multi-pedestal vs. single-wafer chambers). New parsing schemes have been added so that individual wafer movement can be tracked across multiple-station process modules.

Accommodations were also implemented in the software to deal with recipe blocks, sub-recipes, and multiple recipe iterations, all unique characteristics of PECVD/ALD wafer flows.
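To make the tracking idea concrete, here is a minimal sketch of reconstructing per-wafer station sequences from time-ordered move events in a multi-station module. The event format and names are hypothetical, not the Data Analyzer's actual parsing scheme.

```python
from collections import defaultdict

# Hypothetical event records: (timestamp, wafer_id, station) emitted as
# wafers index through a four-station process module.
events = [
    (0, "W1", 1), (10, "W1", 2), (10, "W2", 1),
    (20, "W1", 3), (20, "W2", 2), (20, "W3", 1),
    (30, "W1", 4), (30, "W2", 3), (30, "W3", 2), (30, "W4", 1),
]

def station_paths(events):
    """Group time-ordered station visits per wafer, so each wafer's
    sensor traces can later be attributed to the station (and recipe
    block) it occupied at each point in the flow."""
    paths = defaultdict(list)
    for _, wafer, station in sorted(events):
        paths[wafer].append(station)
    return dict(paths)

print(station_paths(events))
```

With per-wafer paths in hand, sensor data from each station interval can be sliced by recipe block or sub-recipe and compared station-to-station across the fleet.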

Different views of a Lam PECVD process tool with four quad station (four wafers at a time) chambers

Virtual sensor innovation

The Equipment Intelligence® Data Analyzer is currently being utilized at multiple high volume manufacturing (HVM) customer sites on production data from VECTOR® Strata® PECVD fleets, where it is accelerating progress on the most critical customer requirements, such as extending preventive maintenance (PM) cycles and improving uptime. This includes the creation and deployment of predictive control charts based on regression models, the use of classification models to predict key metrics, and notifications sent to key personnel for early warning and rapid resolution of potential problems. Virtual metrology-based regression models allow production performance to be predicted and make it easy to drill down to the root cause of a production trend. The Data Analyzer has also been used reactively to quickly diagnose critical tool issues such as chamber mismatch or unscheduled downtime.
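The combination of a virtual metrology regression model with a predictive control chart can be sketched as follows. This is a deliberately simplified illustration with synthetic data and an ordinary least squares model, not Lam's actual models; the feature names and targets are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: per-run sensor features (e.g., RF power,
# pressure, step time deviations) and measured film thickness (nm).
X_train = rng.normal(size=(200, 3))
true_w = np.array([4.0, -2.0, 1.0])
y_train = 300.0 + X_train @ true_w + rng.normal(scale=0.5, size=200)

# Fit a linear virtual-metrology model with ordinary least squares.
A = np.column_stack([np.ones(len(X_train)), X_train])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Control limits from training residuals (classic 3-sigma chart).
resid = y_train - A @ coef
sigma = resid.std()
ucl, lcl = 3 * sigma, -3 * sigma

def out_of_control(X_new, target):
    """Flag runs whose *predicted* thickness deviates from the target
    beyond the 3-sigma limits learned from training residuals,
    enabling an early warning before physical metrology is run."""
    preds = np.column_stack([np.ones(len(X_new)), X_new]) @ coef
    dev = preds - target
    return np.where((dev > ucl) | (dev < lcl))[0]

# Three new runs; the second has a large sensor excursion.
X_new = np.array([[0.05, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 0.1, 0.0]])
print(out_of_control(X_new, 300.0))
```

Because the prediction is a known function of the sensor features, a flagged run can immediately be decomposed into per-feature contributions, which is what enables the drill-down to root cause described above.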

One of the key objectives of applying big data machine learning approaches such as the Data Analyzer in HVM semiconductor manufacturing is to let process tools achieve higher productivity (more good wafers out at lower cost) than ever before, and this is now being achieved at multiple customer sites around the world.
