America could have the world’s most advanced chips and still fall behind China

Nobel Prize-winning economist Robert Fogel examined the once-popular theory that a single industry – railroads – explained America's explosive economic growth in the 19th century. Fogel concluded that economies are complex, built upon many industries, and that it is the sum of progress across all these industries that explains growth and economic power.  Fogel's conclusions appear to be forgotten in today's discussion of semiconductors.

This forgetfulness has led to an unfortunate reductionism in many discussions of competition with China.  While semiconductors are a strategic industry and chips are foundational for modern economies, the benefits of having the most advanced chips (or denying them to an opponent) are grossly exaggerated and even irrelevant to some aspects of this competition.  They certainly were irrelevant in the conflict with the Taliban, a far less advanced opponent who, despite lacking any access to AI and GPUs, drove the U.S. from their country.

Technological reductionism takes complex issues and reduces them to a few variables.  Likely sources of this reductionism are a confused understanding of economics and markets combined with equally confused thinking about technology and military power.  The initial assumption – that industrialization creates power – is correct, but it is only a starting point.  The latest phase of industrialization, driven by GPUs and AI, is a "digital revolution" that is reshaping the economies of the US and other countries.  As with railroads, chips play a part in this, but there are many steps between the ability to make advanced chips and turning that ability into commercial or military advantage.  If chips alone determined national success, Taiwan would be the most powerful country in the world.

It is in the US interest to continue to lead this digital revolution and build its infrastructure, even in China. A precedent for understanding why leading the AI and data center infrastructure boom is essential can be found in late 19th century Britain, which built the global infrastructure for telecommunications.  This provided it with strategic advantage and helped grow both Britain's economy and those of other trading nations.  The world is more complicated now than in 1900, and AI is a much more complex tool than telegraphy, but being the infrastructure builder remains a compelling source of national power.

Note that Britain relied on innovation, not export controls, for its success.  Another example of why building a global infrastructure is important comes from the Huawei/5G story. China built most of the world's 4G telecom infrastructure, giving it commercial, technological, and intelligence advantages. It would be better for American security to avoid a similar outcome with the build-out of AI infrastructure around the world.

What is important is how the AI services created by chips are used. Companies must find business uses that justify their AI investment.  This process of discovery and experimentation will accelerate over the next decade, the same way that the internet began as a toy and ended up reshaping global business. The chain begins with acquiring chips, then incorporating them into a device, writing software that takes advantage of the chip, and finally finding competitive commercial advantage from these applications.

The same is true for military uses.  The hype cycle does not apply only to business.  Chips by themselves are not lethal.  A chip enables software that is then incorporated into a weapon. Only if the weapon is used to advance an effective strategy does a better chip provide benefits. There are many cases, such as in 1940 when superior French tanks faced blitzkrieg, where having the best weapons proved inadequate against an opponent with a better strategy. Advanced chips can create better military technologies, but these will make only a marginal contribution if they are not embedded in effective strategy and tactics and buttressed by adequate logistics, intelligence, and training.  The Houthis, for example, seem to be doing just fine without GPUs.

Another example involves hypersonic missiles. China has the best hypersonic missiles and out-tests and outproduces the U.S. in missiles, and this worrisome lead does not depend on semiconductors or AI.  In the missile competition, the US is disadvantaged by its cumbersome arms acquisition processes, and AI will not change this.

Properly applied, AI will increase lethality, survivability, and precision.  It may even improve decision making. But it will not compensate for badly conceived strategy. Military use of AI does not yet reduce the need for better weapons, more weapons, or better plans (especially if the opponent has "good enough" weapons).  As AI tools for warfare are developed, they can reduce the chaos and fog of warfare, but many other factors will still determine outcomes, including the skill of commanders and troops, the quantity and quality of weapons, the availability of informational tools for intelligence and communications, and the competence of the civilian political leadership.

The US has advantages in some but not all of these, and AI (or the availability of the GPUs needed for AI) will not change the current military equation with China.  The US economy is strong because of the ability of its companies and entrepreneurs to use new technology to offer new goods and services.  Commercial application, not AI or chips per se, is the key to this success. In both commercial and military use, break any link in the chain between chip and ultimate outcome, and the contribution from the chip becomes marginal, no matter how advanced it is.  The race is not to develop the most powerful chips but to create the applications that take advantage of them.