SYSTEMIC MISPERCEPTION: AI AS PRESENT ACHIEVEMENT
Artificial intelligence presents as a triumph of present capability—models trained on existing hardware, running on today's grids, delivering immediate utility. This presentation is a temporal illusion. The underlying reality is intergenerational debt.
What appears as breakthrough performance is actually a promissory note against compute capacity that does not yet exist. Every parameter added to a large language model today requires future energy expenditure, future hardware manufacturing, future grid capacity—infrastructure that will be built, powered, and maintained by generations who had no say in the training decision.
AI companies are not merely consuming present resources; they are borrowing against the future at compound interest, with no mechanism for repayment other than "future efficiency gains" that may never materialize.
LAYERED DISSECTION
The temporal colonialism of AI stratifies into five interdependent layers: Temporal Debt, Hardware Roadmap Capture, Research Fiction, Energy Mortgages, and Capability Forward Pricing.
INTERDEPENDENCY MAPPING
These layers form a self‑reinforcing debt cycle.
Temporal Debt drives Hardware Roadmap Capture. Exponential parameter growth creates demand for specialized hardware, which diverts manufacturing capacity, raising costs for all other computing.
Hardware Roadmap Capture enables Research Fiction. Once the industry commits to AI‑specific silicon, researchers assume its continued exponential improvement, publishing work that depends on unbuilt hardware.
Research Fiction justifies Energy Mortgages. Promises of future efficiency gains are used to secure today's power purchase agreements, pushing climate liability forward.
Energy Mortgages enable Capability Forward Pricing. Cheap future energy is priced into today's inference costs, making AI appear cheaper than it will ever be.
Capability Forward Pricing validates further Temporal Debt. Low current prices encourage more training, more parameters, more future obligation.
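The five-step cycle above can be sketched as a toy feedback loop: cheap apparent prices drive parameter growth, each new parameter adds future obligation, and unpaid obligation compounds. This is a minimal illustrative model, not an empirical estimate; every rate and unit below is an assumption chosen only to show the shape of the dynamic.

```python
# Toy model of the self-reinforcing debt cycle described above.
# All rates are illustrative assumptions, not empirical estimates.

def simulate_debt_cycle(rounds: int,
                        params: float = 1.0,              # model size, arbitrary units
                        debt: float = 0.0,                # accumulated future obligation
                        growth_per_round: float = 1.6,    # parameter growth while prices look cheap
                        obligation_per_param: float = 0.5,# future energy/compute owed per unit of params
                        interest: float = 0.10):          # deferred capacity gets costlier over time
    history = []
    for _ in range(rounds):
        # Capability Forward Pricing: apparent cost ignores accumulated debt,
        # so training always looks affordable and parameters keep growing.
        params *= growth_per_round
        # Temporal Debt: new parameters add obligation; existing debt compounds.
        debt = debt * (1 + interest) + params * obligation_per_param
        history.append((params, debt))
    return history

for i, (p, d) in enumerate(simulate_debt_cycle(5), 1):
    print(f"round {i}: params={p:.2f}, future obligation={d:.2f}")
```

Under these assumed rates, the obligation grows faster than capability: the debt-to-parameter ratio rises every round, which is the "compound interest with no repayment mechanism" dynamic in numerical form.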
SYSTEM FAILURE MODES
If fab capacity cannot keep pace with demand, or if supply chains fail, the promised future compute never arrives. Models become stranded assets.
If grid decarbonization stalls, AI's energy consumption will either be curtailed or will displace other essential uses, creating political backlash.
If compute costs do not fall as projected, AI service providers cannot sustain today's prices. A wave of bankruptcies would strand the infrastructure built on those projections.
Governments may impose compute budgets or carbon limits on AI training, capping the debt that can be accumulated.
DIAGNOSTIC FRAMEWORK
To analyze the temporal debt of any AI system:
1. PARAMETER TRAJECTORY
What is the growth rate of parameters in this model family? What hardware is assumed to serve that growth?
2. HARDWARE COMMITMENTS
Are the required chips already manufactured, or merely projected? What happens if fab capacity is diverted elsewhere?
3. ENERGY ASSUMPTIONS
What grid mix and carbon intensity are assumed for future inference? Are those assumptions legally or physically guaranteed?
4. PRICING MODEL
Does current pricing assume falling compute costs? What is the break‑even point if costs plateau or rise?
5. EXTERNALIZED BURDEN
Who will bear the cost if future compute is more expensive or dirtier than assumed? Which generations, which regions, which industries?
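Question 4 of the framework is the most directly quantifiable. As one hedged sketch, suppose a provider charges a price below its current unit cost on the assumption that compute costs decline at a fixed annual rate; the break-even year is the first year projected cost falls to the price charged today, and it never arrives if costs plateau. The figures below are hypothetical, chosen only to illustrate the calculation.

```python
import math

# Break-even sketch for diagnostic question 4 (PRICING MODEL).
# price and current_cost are per inference-unit; decline_rate is the
# assumed annual fall in compute cost. All figures are illustrative.

def break_even_year(price: float, current_cost: float, decline_rate: float):
    """Return the first year in which projected unit cost falls to the
    price charged today, or None if costs plateau (decline_rate <= 0)."""
    if current_cost <= price:
        return 0                       # already profitable
    if decline_rate <= 0:
        return None                    # costs never fall: no break-even
    # Smallest t with current_cost * (1 - decline_rate)**t <= price.
    return math.ceil(math.log(price / current_cost) / math.log(1 - decline_rate))

print(break_even_year(price=1.0, current_cost=2.0, decline_rate=0.30))  # → 2
print(break_even_year(price=1.0, current_cost=2.0, decline_rate=0.0))   # → None
```

The asymmetry is the point: a 30 percent annual cost decline reaches break-even in two years, but the same pricing with a cost plateau never breaks even at all, which is exactly the stranded-infrastructure failure mode described above.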