The American economy is living through an odd tension that most headlines ignore. Through 2025, GDP growth was robust, yet monthly job growth was barely scraping 15,000 positions. That disparity should strike anyone who has spent time watching economic cycles as odd. Not particularly frightening, but peculiar, the way a building feels when the foundation beneath it is being quietly altered.
Because the data is ambiguous, most economists cautiously reach for productivity as the explanation. In particular, the kind of productivity that comes from a technology so fundamental that it not only enhances our work but changes how we think about it entirely. Everyone is trying to explain artificial intelligence, especially the generative kind, while admitting they still lack the vocabulary for it.
| Category | Detail |
|---|---|
| Subject | Artificial Intelligence & Global Economic Growth |
| Key Technology | Generative AI / Large Language Models (LLMs) |
| Primary Economy Affected | United States GDP (World’s Largest Economy) |
| Investment Growth (Q3 2025) | Information-processing equipment & software: +16.5% year-over-year |
| Projected Global CapEx by 2030 | $6.7 Trillion in data center infrastructure |
| Share of GDP Exposed to AI | Estimated ~40% of current economic activity |
| Peak AI Productivity Contribution | +0.2 percentage points annually (projected: 2032) |
| Long-Term GDP Impact | +1.5% by 2035 / +3.7% by 2075 (cumulative level increase) |
| Most Exposed Workers | Occupations near the 80th earnings percentile (~50% of tasks automatable) |
| Potential Deficit Reduction (2026–2035) | Estimated $400 billion (US federal budget, preliminary) |
| Key Monitoring Bodies | Bureau of Economic Analysis, Federal Reserve, Institute of International Finance |
| Reference Sources | IMF — AI & the Economy · BEA GDP Data · IIF Capital Flows |
Stroll through any mid-size American city with a technology corridor, such as Austin, Raleigh, or parts of the Gulf Coast, and you will notice things: office parks with abnormally high electricity consumption, cranes looming over future data centers, fiber crews working odd hours. It doesn’t look like a revolution. It looks like infrastructure. And that’s rather the point. AI is behaving more like electricity than like a software update, the kind of enabling technology that requires years of physical investment before anyone can truly gauge its benefits.
Investment in information-processing hardware and software rose 16.5% in the third quarter of 2025 compared with a year earlier. Strip that out, and GDP would have looked much weaker. While manufacturing, construction, and rate-sensitive industries are at best walking, the economy’s AI-intensive sectors are sprinting. This is not a cohesive boom. It is a two-speed economy wearing a single outfit.
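The claim that GDP would have looked weaker without this component can be sketched with standard growth-accounting arithmetic: a component's contribution to headline growth is roughly its GDP share times its own growth rate. The 16.5% figure is from the text; the share and the growth rate of the rest of the economy below are illustrative assumptions, not BEA figures.

```python
# Hypothetical decomposition of headline GDP growth. Only the 16.5%
# component growth rate comes from the article; the 4% GDP share and
# the 1.4% growth of everything else are illustrative assumptions.
ai_share, ai_growth = 0.04, 16.5        # info-processing equipment & software
rest_share, rest_growth = 0.96, 1.4     # assumed growth of the rest of GDP

headline = ai_share * ai_growth + rest_share * rest_growth
without_ai = rest_share * rest_growth

print(f"headline growth:          {headline:.2f}%")   # ≈ 2.00%
print(f"without the AI component: {without_ai:.2f}%") # ≈ 1.34%
```

Under these stand-in numbers, a sector that is only a few percent of GDP accounts for roughly a third of headline growth, which is the two-speed pattern the paragraph describes.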

It’s possible that what we’re witnessing is the precursor to a larger shift in productivity, the same kind that took nearly a decade to become evident in the data after personal computers came into widespread use. Economists who study these questions recall how long it took the internet’s productivity benefits to show up in aggregate statistics. Something similar seems to be happening now, albeit faster in some areas and slower in others than anyone expected.
However, caution is necessary. Productivity is, to put it simply, what remains after all other factors have been accounted for. It is a residual, which means it absorbs both real gains and measurement error. A portion of the apparent productivity increase in 2025 may reflect changes in the composition of the labor force, specifically in who keeps working after changes in immigration law.
If lower-productivity workers leave, the average improves even when nobody who remains produces any more than before. That’s not a boom. That’s math.
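The composition effect described above is pure arithmetic, and a toy example makes it concrete. The workforce sizes and output figures below are entirely hypothetical, chosen only to show the mechanism.

```python
# Illustrative composition effect with hypothetical numbers, not
# actual labor data: ten workers, eight producing 100 units each
# and two producing 50.
workforce = [100] * 8 + [50] * 2

avg_before = sum(workforce) / len(workforce)   # 90.0 units per worker

# The two lower-productivity workers exit the labor force;
# no remaining worker changes their output at all.
remaining = [w for w in workforce if w >= 100]

avg_after = sum(remaining) / len(remaining)    # 100.0 units per worker

gain = (avg_after / avg_before - 1) * 100
print(f"apparent productivity gain: {gain:.1f}%")  # ≈ 11.1%
```

Measured output per worker jumps about 11% while total output actually falls, which is exactly why a residual like productivity has to be read with care.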
The underlying issue is that the statistical frameworks used to assess contemporary economies were designed for a world of factories and machinery. They record research expenditures and software investments only partially. The costs of training large AI models, improving datasets, and iterating on applications are frequently expensed rather than capitalized. In official accounts, the months of engineering and accumulated proprietary insight that go into a semiconductor are largely invisible.
By counting the physical investment while ignoring the spillovers entirely, GDP data may be overstating AI’s immediate contribution and understating its wider impact. This is not a novel statistical paradox. The IT revolution of the 1990s produced the same confusion before the mist cleared.
What sets this cycle apart is its extreme capital intensity. Building and running generative AI infrastructure requires not just servers and chips but large amounts of power. Grid capacity and electricity production have quietly become macroeconomic variables in a way not seen since the industrial era. Analysts predict that data centers could require $6.7 trillion in capital expenditures worldwide by 2030. That is not a software rollout. That is a rearrangement of physical geography, with Texas, Southeast Asia, and portions of India serving as nodes in a new global computing lattice.
The Federal Reserve is watching all of this with a perplexed expression. If AI is quietly raising the economy’s potential output (its speed limit, in effect), then the economy may not be as hot as it seems, and tightening policy would be a mistake. But if infrastructure strain and energy bottlenecks are setting a new floor under inflation, easing too soon would be equally problematic. Neither scenario is tidy. Depending on the sector and the quarter you measure, both can be true at once.
The situation is equally uneven on a global scale. Although they have stabilized, Europe and Japan still heavily rely on loose monetary conditions. Lower yields and AI-related capital inflows are helping emerging markets, especially those with what Institute of International Finance researchers refer to as “digital depth”—the ability to create and export digital goods integrated into local value chains in addition to using imported AI tools.
As a result, nations like China, South Korea, and India are drawing more long-term foreign investment. Others continue to use technology that they did not develop, leaving them vulnerable to the fluctuations of global liquidity cycles.
The honest answer to the central question, whether this is a real increase in productivity, is that we do not fully know yet. Estimates put AI’s peak contribution to total factor productivity growth around 2032, adding roughly 0.2 percentage points per year at its height. Compounded over decades, GDP levels could be 1.5 percent higher by 2035 and closer to 3.7 percent higher by 2075.
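A quick back-of-envelope check shows these projections hang together. Assuming the published figures are cumulative level gains relative to a no-AI baseline (an assumption; the source does not show its model), we can back out the constant annual growth contribution each one implies.

```python
# Back out the constant annual growth contribution (in percentage
# points) implied by a cumulative GDP level gain over N years.
# The level gains are from the text; the constant-rate framing is
# a simplifying assumption for illustration.
import math

def implied_annual_pp(level_gain_pct, years):
    """Constant annual contribution that compounds to the given gain."""
    return (math.exp(math.log(1 + level_gain_pct / 100) / years) - 1) * 100

# +1.5% level gain by 2035: roughly ten years of compounding from 2025.
print(f"{implied_annual_pp(1.5, 10):.3f} pp/yr")   # ≈ 0.149

# +3.7% level gain by 2075: roughly fifty years of compounding.
print(f"{implied_annual_pp(3.7, 50):.3f} pp/yr")   # ≈ 0.073
```

The implied averages, about 0.15 points per year over the first decade and 0.07 over the half-century, are consistent with a contribution that ramps up toward the projected 0.2-point peak around 2032 and fades thereafter.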
These are not insignificant figures. However, they need broad diffusion across industries, complementary investments in institutions and skills, and policy frameworks that don’t completely misread the moment—things that history has seldom delivered swiftly.
As we watch this develop, it’s difficult not to feel as though we’re in the midst of something important but can’t quite see its boundaries. The factories, or more accurately, the data centers, are operating. The investment is genuine. The question that will define the next ten years of economic life is whether productivity follows and who benefits when it does.
