American engineers invented the transistor, perfected the integrated circuit, and built the world-leading firms that powered the computing revolution. But America no longer leads in the semiconductor industry, thanks to decades of complacent economic policy summarized in an infamous remark by the chair of George H. W. Bush’s Council of Economic Advisers: “Potato chips, computer chips, what’s the difference?”
Quite a lot, it turns out.
While U.S. policymakers held tight to the belief that it doesn’t matter who makes what where and trusted “comparative advantage” to leave each country specializing where it could naturally excel, foreign governments placed big bets on the value of dominating the digital future. The Reagan administration fended off Japan’s challenge in the 1980s, but no such defender of the national interest stood ready to take action as Taiwan and South Korea surged forward in the past two decades.
Combined, those countries now manufacture more advanced chips than the United States does and account for a greater share of the industry’s value added. Their national champions, Taiwan Semiconductor Manufacturing Company (TSMC) and Samsung, have surpassed Intel in manufacturing prowess. China, meanwhile, continues to invest aggressively in its effort to overtake them all.
In 2018, Congress created the National Security Commission on Artificial Intelligence “to consider the methods and means necessary to advance the development of artificial intelligence, machine learning, and associated technologies to comprehensively address the national security and defense needs of the United States.” The commission’s final report, issued in March 2021, put the matter succinctly: “After decades leading the microelectronics industry, the United States is now almost entirely reliant on foreign sources for production of the cutting-edge semiconductors that power all the AI algorithms critical for defense systems and everything else. Put simply: the U.S. supply chain for advanced chips is at risk without concerted government action.”