This month, we celebrate the 60th anniversary of the integrated circuit (IC). On September 12, 1958, Jack Kilby produced a circuit containing both active and passive components fabricated from semiconductor material and connected with “flying wires.” At the time, scientists and engineers were working diligently to find a substitute for the large, unreliable, energy-hungry vacuum tubes from which the first computers were built. Transistors, which had been invented 10 years earlier, were small, reliable, and energy-efficient, but a production-worthy means for large-scale fabrication and interconnection had proven elusive. Kilby’s device, followed soon by Jean Hoerni’s invention of the planar manufacturing process and Robert Noyce’s conception of a “monolithic IC,” laid the groundwork for the semiconductor manufacturing industry that we know today.
Photo courtesy of Texas Instruments
Back in 1958, the Cold War was at its height, and electronic development and manufacturing efforts were driven primarily by the defense industry’s desire for airborne computational power for navigational and targeting systems in planes and missiles. The Soviet Union’s 1957 launch of Sputnik, and the subsequent space race, would continue this drive. Though military interests still play a driving role in the industry today, they were long ago superseded by the market demands of consumers and commercial interests. In the 1970s, it was calculators and minicomputers; in the 80s and 90s, personal computers became mainstream. The first decade of the new millennium saw a shift to entertainment and personal electronics, with must-have products like the PlayStation 2, the iPod, and the iPhone. We are currently in the decade of connections, with innovation focused on smartphones, wearables, the internet of things, 5G cellular technology, cloud computing, assisted driving, and much more. Today, we walk around with more computing power in our pockets than it took to send humans to the moon and back. And technologies like AI, VR/AR, and fully autonomous automobiles are emerging quickly.
ICs themselves have changed from Kilby’s device with critical dimensions measured in millimeters to today’s bleeding-edge designs with structures only a few dozen atoms across. Device architectures have evolved from simple planar structures to complicated 3D FinFET designs. The first microprocessors contained a few thousand transistors; today’s boast a few billion. It was Nobel Laureate Richard Feynman who famously observed that “there’s plenty of room at the bottom.” That was in 1959, the year after Kilby’s invention. Have we bottomed out? Have we finally reached the end of Gordon Moore’s equally famous law that the number of transistors on a chip would double every year or two? Its death has been proclaimed more than once, always prematurely. Somewhat ironically, the explosion of applications in diverse new markets has fostered a revival of legacy manufacturing technologies for devices that do not need the capabilities of the most advanced processes, extending the lifetimes of older fabs and creating an active market for upgrades and updates that incorporate lessons learned at the bleeding edge. Companies like Lam Research offer performance-proven non-leading-edge products to increase production capacity at a lower economic investment.
What has all this brought us? Clearly, we live in a more connected world, though understanding how to use those connections for our mutual benefit remains a work-in-progress. Kilby, Hoerni, and Noyce, and all the other early contributors, would surely be amazed to see the world we have wrought as we’ve integrated their circuits into almost every aspect of our lives.
Lam Research would like to thank the pioneers in this industry who set the stage for innovation and progress over the past 60 years.