
Controlling Variability and Cost at 3 nm and Beyond

| June 10, 2019 | Industry, Technology

Richard Gottscho, executive vice president and CTO of Lam Research, sat down with Semiconductor Engineering to talk about how to utilize more data from sensors in manufacturing equipment, the migration to new process nodes, and advancements in ALE and materials that could have a big impact on controlling costs. What follows are excerpts of that conversation.

SE: As more sensors are added into semiconductor manufacturing equipment, what can be done with the data generated by those sensors?

Gottscho: You can imagine doing a lot of compensation in different process control schemes, particularly to get around the variability problem. The industry is just at the beginning stages of that. If you look at the finFET, the three-dimensional nature of that device has challenged us and the industry to come up with robust solutions for high-volume manufacturing. You have to worry about residues in little corners, the selectivity of etching one material versus another, the conformality of the depositions. Everything has become more complicated. And when you start the next generation of gate-all-around at 3nm and below, that’s another order of magnitude in complexity. At first, it looks like a modification of a finFET. But the requirements are getting tightened, and the complexity of that gate-all-around architecture is significantly greater than the finFET. It’s a more complex device than we’ve ever seen, and we keep saying that node after node. Yet we as an industry keep moving forward. Along with that are so many sources of variability, and all of them will matter.

SE: Can you discern variation between different pieces of equipment and within different chambers in the same equipment?

Gottscho: Maybe. The reason the outputs are not the same may be because the integrations are not the same, and if you want to make them the same then the customers have to run the same processes. As soon as they deviate in their choice of materials, the thickness of a film, the critical dimension, or the sequence of operations, you’ll end up with different cost equations. That’s why some customers are more cost-competitive than others.

SE: Does data at this level allow you to say that you don’t need to move the critical dimension forward as fast as if you were doing it a different way?

Gottscho: We’re not there yet, but that’s an ambition. We don’t have that level of maturity in terms of the data mining or the applications.

SE: Some of this is additive, too. So one thing may not be a problem by itself, but in conjunction with other sources of variability, for example, it may get worse. How does data help that?

Gottscho: It’s always been a collaborative process between Lam and our customers, and between our suppliers and Lam. We can’t create equipment and solutions in a vacuum. We really need to understand what the customer is trying to do, so they need to open up to us. And they need to understand the tool’s capabilities, so we need to open up to them. This is where data hoarding becomes a problem sometimes. If you look at chamber matching, that has always been a big challenge because the tolerances keep shrinking. So chamber matching solutions from the past don’t work in the future. The complexity has always been there, but because the tolerances were greater you weren’t as sensitive to the interactions between one process and another. There’s an upstream lithography process that has variability in it. There’s an upstream dep process that has variability in it. All of that impacts what comes out of one etch chamber versus another etch chamber. And then, what about your metrology? Every one of these steps that is necessary to define a result that you want to match to a fleet, or another chamber in that fleet, is actually a composite of all the variation upstream, including the measurement. So now, what does it mean to match the etch result? How do you do that without taking into account everything upstream? And in order to do that, we need to collaborate with upstream suppliers and with our customer. That’s all about data from disparate sources in different formats. Putting that together into an algorithm that allows you to break this into its constituent pieces and feed back to each piece appropriately is a very challenging problem.
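The point about the measured etch result being a composite of upstream variation can be illustrated with a toy statistical model. This is not Lam's algorithm or any real fab data, just a hypothetical sketch: if the variation contributed by lithography, deposition, etch, and metrology were independent and additive, their variances would sum, and a chamber-matching metric computed on the raw measurement alone would conflate all four sources.

```python
import random
import statistics

random.seed(0)

# Hypothetical, illustrative model: the measured etch CD (in nm, relative
# to target) is the sum of independent contributions from upstream steps
# plus metrology noise. The standard deviations below are made up.
N = 100_000
litho = [random.gauss(0, 0.8) for _ in range(N)]  # litho CD variation
dep   = [random.gauss(0, 0.5) for _ in range(N)]  # dep thickness variation
etch  = [random.gauss(0, 0.6) for _ in range(N)]  # etch-chamber variation
metro = [random.gauss(0, 0.3) for _ in range(N)]  # metrology noise

measured = [l + d + e + m for l, d, e, m in zip(litho, dep, etch, metro)]

total_var = statistics.pvariance(measured)
component_var = sum(statistics.pvariance(x) for x in (litho, dep, etch, metro))

# For independent sources, variances add. The "etch" signature visible in
# the measurement is a composite; attributing it all to the etch chamber
# would over- or under-correct that chamber.
print(f"total variance of measurement: {total_var:.3f}")
print(f"sum of component variances:    {component_var:.3f}")
```

Under these assumptions the two printed numbers agree (up to sampling noise), which is why decomposing the measurement back into its constituent pieces, as described above, requires data from each upstream step rather than the final measurement alone.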

Continue reading this article at Semiconductor Engineering.
