The AI productivity question no one's answering

Three years of record AI spending shows zero aggregate profit improvement for non-tech firms, forcing harder questions about whether productivity gains exist or simply get competed away.


Three years into the largest technology deployment cycle since cloud computing, aggregate corporate profitability outside the technology sector hasn't budged.

Profit margins for the S&P 493 remain where they've been for a decade: roughly ten percent. Meanwhile, the Magnificent Seven maintain margins above twenty-three percent, capturing infrastructure rents while everyone else pays monopoly prices for the same computational inputs.

This disconnect forces a question most capital allocation discussions are avoiding: where exactly is the productivity showing up?

The evidence demands different assumptions

Recent analysis from Apollo's chief economist shows profit margins for non-tech firms tracking sideways through Q1 2026 despite record AI capital expenditure. Consensus earnings expectations for the S&P 493 show zero improvement since April 2025, while expectations for tech infrastructure providers surged eleven percent over the same period. The productivity story appears true for roughly seven companies and absent everywhere else.

Three explanations compete:

First, genuine productivity gains exist but get competed away immediately through price compression or absorbed by implementation costs and organizational friction.

Second, diffusion lags of ten to twenty years mean we're simply early, following the pattern of electricity adoption in the 1890s or computing in the 1970s.

Third, task-level improvements don't translate to firm-level profitability because employees capture gains as reduced effort rather than expanded output.

Yet it appears that many AI deployments solve problems that were never binding constraints on growth. Companies fund initiatives labeled transformation while their budget allocation reveals they're automating tasks that represent three percent of total cost structure.

The capital goes in, marginal efficiency improves, competitors match the capability within eighteen months, and margins compress back to industry standard.

The Great Deflation parallel that doesn't hold

Historical analysis suggested a possible regime parallel to the Second Industrial Revolution period from 1870 to 1890. During that era, sustained productivity growth coexisted with flat profit margins as competitive dynamics compressed prices faster than efficiency could expand margins. Real wages rose through deflation while infrastructure monopolists captured extraordinary returns and everyone else saw revenue growth without margin expansion.

The pattern matched, until it didn't. Recent inflation indicators contradict the benign deflation hypothesis. Wage growth remains sticky around four percent year-over-year with Fed survey indicators turning upward on a six-month lead. Industrial commodity prices show renewed strength with typical six-month leads to consumer price inflation. Dollar weakness accelerated with the standard two-month transmission to import prices.

More telling: the gold-real rate correlation that held for decades broke down in 2022. Historically, rising real interest rates suppressed gold prices through opportunity cost. Gold has rallied despite rising real rates, signaling markets now price either dollar reserve currency instability or expectation of eventual debt monetization. Neither condition supports a productivity-driven expansion story.

What fiscal reality reveals about trajectory

Congressional Budget Office projections released February 2026 show the federal deficit reaching $1.9 trillion for fiscal year 2026, roughly 5.8 percent of GDP, during a period of 4.2 percent unemployment and 2.2 percent real growth.

Sustained deficits near six percent of GDP at full employment represent unprecedented peacetime fiscal policy. Ten-year projections show debt rising from 101 percent to 120 percent of GDP by 2036, with interest costs expanding from 3.3 percent to 4.6 percent of GDP over the same period.
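The arithmetic behind that trajectory is easy to sanity-check. A back-of-envelope sketch that holds the deficit and nominal growth fixed (an illustrative simplification; CBO's own projections assume deficits widen over the decade, which is why it lands a few points below 120 percent):

```python
# Back-of-envelope debt-to-GDP projection. All parameters are illustrative
# simplifications, not CBO model inputs.
debt = 1.01            # debt-to-GDP ratio, FY2026 starting point
deficit = 0.058        # total deficit as a share of GDP, held constant (assumption)
nominal_growth = 0.04  # ~2.2% real growth plus ~1.8% inflation (assumption)

for year in range(2027, 2037):
    # Next year's ratio: this year's debt plus the deficit, over a larger GDP.
    debt = (debt + deficit) / (1 + nominal_growth)

print(f"Debt-to-GDP by 2036: {debt:.1%}")  # ~115% under these flat assumptions
```

Even with deficits frozen at today's level, the ratio drifts toward its steady state of deficit / growth = 145 percent of GDP; CBO's higher 120 percent figure reflects deficits that grow rather than hold flat.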

This creates what Ray Dalio identifies in The Changing World Order as late-stage debt cycle dynamics: monetary tools exhausted, fiscal expansion accelerating, wealth concentration at 1920s levels, and productivity gains too slow to grow out of accumulated obligations. The framework suggests we're in Phase Four approaching Phase Five, where debt service costs force either explicit restructuring, sustained inflation to erode real obligations, or deflationary crisis when bond markets revolt.

This environment doesn't support the ten to twenty year diffusion period required for productivity gains to materialize at scale.

Historical precedent from the Great Deflation era worked because credible gold standard constraints prevented debt monetization and minimal social safety nets allowed labor market clearing through real wage adjustment. We have neither institutional condition today.

The capital allocation implication

When evaluating AI-related capital deployment, the burden of proof has shifted. The default assumption three years ago was reasonable: productivity gains would flow through to profitability as they had in previous technology cycles. The data now shows 493 companies generating zero aggregate signal while seven infrastructure providers capture monopoly rents.

Any AI investment thesis now requires explaining the specific defensible mechanism for capturing rather than competing away productivity gains. "Industry transformation" without unit economics improvement is cost disguised as strategy.

The framework I use with clients starts with a simple diagnostic: map stated strategic priorities against actual resource allocation, then pressure-test whether productivity gains create proprietary advantage or just keep pace with competitive table stakes.

What surfaces in those diagnostics: resource allocation to AI often reflects institutional momentum rather than strategic choice. Leadership commits capital because peers commit capital, vendors promise transformation, and internal advocates present selective proof points.

Meanwhile, the broader question goes unasked—whether productivity at the task level translates to profitability at the firm level in markets where competitive dynamics compress returns to marginal efficiency gains.

The constraint that binds

Between now and 2029, this tension resolves one of three ways. Either productivity gains materialize in aggregate profit margins, validating current deployment assumptions. Or margins stay flat while inflation persists, confirming gains get competed away or absorbed by implementation costs. Or fiscal pressures force a debt crisis that resets the entire capital allocation environment.

The pattern I see across capital allocation discussions: executives treating AI investment as inevitable rather than evaluating it against alternative deployments that address actual binding constraints on growth.

When segmentation reveals unmet demand in existing markets but resources flow to geographic expansion, that's a diagnostic signal. When distribution capacity limits revenue but growth capital funds demand generation, that's misalignment between stated priorities and resource allocation.

Worth asking before committing next quarter's AI budget: is the productivity improvement more durable than what competitors will match within eighteen months, and is the gain greater than what the same resources would generate addressing your binding constraint on growth?


At COLLINGS&CO, I help CEOs diagnose belief-behavior-market misalignment when stated strategy doesn't match capital deployment reality. This diagnostic work reveals whether organizations are solving binding constraints or following institutional momentum.