AI TRAINING AS TEMPORAL COLONIALISM

HOW LARGE LANGUAGE MODELS MORTGAGE TOMORROW'S COMPUTE CAPACITY

SYSTEMIC MISPERCEPTION: AI AS PRESENT ACHIEVEMENT

Artificial intelligence presents as a triumph of present capability—models trained on existing hardware, running on today's grids, delivering immediate utility. This presentation is a temporal illusion. The underlying reality is intergenerational debt.

What appears as breakthrough performance is actually a promissory note against compute capacity that does not yet exist. Every parameter added to a large language model today requires future energy expenditure, future hardware manufacturing, future grid capacity—infrastructure that will be built, powered, and maintained by generations who had no say in the training decision.

AI companies are not merely consuming present resources; they are borrowing against the future at compound interest, with no mechanism for repayment other than "future efficiency gains" that may never materialize.

LAYERED DISSECTION

The temporal colonialism of AI stratifies into five interdependent layers.

TEMPORAL DEBT LAYER
Each additional model parameter represents future energy that must be expended to serve inferences, retrain, and maintain the model. Scaling trends drive exponential growth in compute requirements, but the hardware to meet that demand has not been built. The debt compounds with each new model release.
COMPONENTS: PARAMETER COUNT • INFERENCE ENERGY • RETRAINING CYCLES

HARDWARE ROADMAP CAPTURE
Nvidia, AMD, and other chipmakers have reoriented their roadmaps around AI accelerators, diverting fab capacity and R&D from general-purpose computing. Future hardware will be optimized for AI inference—but only if sufficient energy and materials exist. The industry has locked itself into a trajectory that assumes future conditions it cannot guarantee.
COMPONENTS: AI ACCELERATORS • FAB CAPACITY • SUPPLY CHAINS

RESEARCH FICTION LAYER
Academic and corporate papers routinely describe results achieved with "future hardware" or "when scaled to hypothetical cluster sizes." These fictions enter the literature as accomplishments, creating pressure to pursue architectures that are not yet physically possible. The field treats tomorrow's compute as today's given.
COMPONENTS: SCALING LAWS • COMPUTE HYPOTHESES • PUBLICATION CULTURE

ENERGY MORTGAGE LAYER
AI companies sign power purchase agreements and make renewable energy commitments that assume future grid decarbonization. If that decarbonization does not occur at the projected rate, the energy deficit will be borne by other sectors—or by future generations facing climate consequences.
COMPONENTS: POWER PURCHASE AGREEMENTS • RENEWABLE FORECASTS • GRID CAPACITY PLANNING

CAPABILITY FORWARD PRICING
AI services are priced based on assumptions of rapidly falling compute costs—Moore's‑law‑like curves that have already slowed. If future compute does not become dramatically cheaper, today's business models collapse, and the infrastructure built to support them becomes stranded.
COMPONENTS: PRICING MODELS • COST PROJECTIONS • INVESTMENT THESES
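
The temporal debt claim in the first layer can be made concrete with back-of-envelope arithmetic: a parameter count plus a serving lifetime implies a committed energy expenditure. The sketch below uses the common ~2 FLOPs per parameter per token approximation for dense transformer inference; the energy-per-FLOP figure, model size, traffic, and lifetime are all illustrative assumptions, not measurements.

```python
# Back-of-envelope estimate of the future serving energy implied by a model's
# parameter count. All constants here are illustrative assumptions.

def inference_energy_joules(params: float, tokens: float,
                            joules_per_flop: float = 1e-11) -> float:
    """Energy to serve `tokens` of inference, using the rough ~2 FLOPs
    per parameter per token approximation for a dense transformer."""
    flops = 2 * params * tokens
    return flops * joules_per_flop

def lifetime_debt_kwh(params: float, tokens_per_day: float,
                      years: float) -> float:
    """Total serving energy committed over the model's assumed lifetime."""
    joules = inference_energy_joules(params, tokens_per_day * 365 * years)
    return joules / 3.6e6  # joules -> kWh

# Hypothetical 70B-parameter model serving 1 billion tokens/day for 3 years:
# roughly 4e5 kWh committed before a single retraining cycle is counted.
debt = lifetime_debt_kwh(70e9, 1e9, 3.0)
```

Note that the debt is linear in both parameter count and traffic, so each model-size doubling doubles the obligation even before retraining cycles are added.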

INTERDEPENDENCY MAPPING

These layers form a self‑reinforcing debt cycle.

Temporal Debt drives Hardware Roadmap Capture. Exponential parameter growth creates demand for specialized hardware, which diverts manufacturing capacity, raising costs for all other computing.

Hardware Roadmap Capture enables Research Fiction. Once the industry commits to AI‑specific silicon, researchers assume its continued exponential improvement, publishing work that depends on unbuilt hardware.

Research Fiction justifies Energy Mortgages. Promises of future efficiency gains are used to secure today's power purchase agreements, pushing climate liability forward.

Energy Mortgages enable Capability Forward Pricing. Cheap future energy is priced into today's inference costs, making AI appear cheaper than it will ever be.

Capability Forward Pricing validates further Temporal Debt. Low current prices encourage more training, more parameters, more future obligation.
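
The five-step cycle above can be caricatured as a compounding recurrence: underpricing today's compute induces more training, and the unpaid gap rolls forward as new obligation. The sketch below is a toy dynamic with arbitrary illustrative parameters, not a forecast.

```python
# Toy model of the self-reinforcing debt cycle: forward pricing charges only a
# fraction of true cost, the shortfall stimulates new training, and the gap
# accumulates as debt. Parameter values are arbitrary illustrations.

def simulate_debt_cycle(steps: int,
                        initial_debt: float = 1.0,
                        price_discount: float = 0.5,
                        demand_elasticity: float = 0.8) -> list[float]:
    """Each step, the gap between true serving cost and the discounted price
    induces new training proportional to `demand_elasticity`; that unpaid
    gap is added to the outstanding debt."""
    debt = initial_debt
    history = [debt]
    for _ in range(steps):
        true_cost = debt                      # serving obligation tracks accumulated debt
        charged = true_cost * price_discount  # what users actually pay today
        new_training = demand_elasticity * (true_cost - charged)
        debt += new_training                  # unpaid gap rolls forward
        history.append(debt)
    return history

history = simulate_debt_cycle(10)
# Debt grows geometrically, by a factor of
# (1 + demand_elasticity * (1 - price_discount)) per cycle.
```

The point of the toy is the structure, not the numbers: as long as prices sit below true cost and cheap compute induces demand, the recurrence has no fixed point and the obligation grows without bound.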

SYSTEM FAILURE MODES

HARDWARE BOTTLENECK

If fab capacity cannot keep pace with demand, or if supply chains fail, the promised future compute never arrives. Models become stranded assets.

ENERGY CRUNCH

If grid decarbonization stalls, AI's energy consumption will either be curtailed or will displace other essential uses, creating political backlash.

FINANCIAL COLLAPSE

If compute costs do not fall as projected, AI service providers cannot maintain pricing. A wave of bankruptcies would strand infrastructure.

REGULATORY INTERVENTION

Governments may impose compute budgets or carbon limits on AI training, capping the debt that can be accumulated.

DIAGNOSTIC FRAMEWORK

To analyze the temporal debt of any AI system:

1. PARAMETER TRAJECTORY
What is the growth rate of parameters in this model family? What hardware is assumed to serve that growth?

2. HARDWARE COMMITMENTS
Are the required chips already manufactured, or are they projected? What would happen if fab capacity is diverted?

3. ENERGY ASSUMPTIONS
What grid mix and carbon intensity are assumed for future inference? Are those assumptions legally or physically guaranteed?

4. PRICING MODEL
Does current pricing assume falling compute costs? What is the break‑even point if costs plateau or rise?

5. EXTERNALIZED BURDEN
Who will bear the cost if future compute is more expensive or dirtier than assumed? Which generations, which regions, which industries?
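
Step 4's break-even question reduces to simple arithmetic: given a price set below today's cost, how fast must costs fall for cumulative margin to turn positive over the planning horizon? The sketch below runs that calculation; the prices, costs, decline rates, and volumes are hypothetical inputs, not industry data.

```python
# Diagnostic for step 4: cumulative margin of a service priced below current
# cost, under an assumed annual compute-cost decline. Figures are hypothetical.

def cumulative_margin(price_per_mtok: float, cost_per_mtok: float,
                      annual_cost_decline: float, years: int,
                      volume_mtok_per_year: float) -> float:
    """Sum of (price - cost) * volume over the horizon, with cost falling by
    `annual_cost_decline` each year (e.g. 0.4 = 40%/year) and price fixed."""
    total = 0.0
    cost = cost_per_mtok
    for _ in range(years):
        total += (price_per_mtok - cost) * volume_mtok_per_year
        cost *= (1.0 - annual_cost_decline)
    return total

# Price set below current cost, betting on rapid compute deflation:
optimistic = cumulative_margin(price_per_mtok=2.0, cost_per_mtok=3.0,
                               annual_cost_decline=0.40, years=5,
                               volume_mtok_per_year=1000.0)

# Same pricing if costs plateau instead of falling:
plateau = cumulative_margin(price_per_mtok=2.0, cost_per_mtok=3.0,
                            annual_cost_decline=0.0, years=5,
                            volume_mtok_per_year=1000.0)
```

Under these made-up inputs the 40%/year decline rescues the business, while a cost plateau makes every served token a loss for the full horizon: the viability of the pricing model is entirely a bet on the decline rate.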

SYSTEM NOTES
Every AI demo today is a promissory note against energy infrastructure that hasn't been built, payable by generations who haven't been born.
Scaling laws are not physical laws; they are choices backed by borrowed time.
Hardware roadmaps are not destiny—they are bets on future resource availability.
Research that assumes unbuilt hardware is architecture without foundation.
Renewable energy credits cannot prepay for future grid capacity.
AI pricing that relies on exponential compute deflation will eventually face arithmetic.
The most profound externality of AI is time itself.
Future generations cannot negotiate with the present about compute debt.
Temporal colonialism extracts value from the future without representation.
The only sustainable parameter count is one whose energy cost can be paid by the generation that trains it.