Higher-Order Effects: The Four Scarcities

Tracing AI's economic impact through four orders of knock-on effects reveals a convergent structure: four resources become scarce, and education emerges as the pivotal intervention.

Causal Chain Visualization


[Interactive diagram: causal chains run from constraint removal through second- and third-order effects to the Four Scarcities and Values/Meaning. Legend: Hypothesis, Interaction, Convergence, Speculative, Convergent validation.]

Beyond the Immediate

The differential diagnosis identifies the mechanism: constraint removal → demand explosion → bottleneck migration → bifurcation. But first-order effects rarely capture the full picture. The deeper story emerges when we trace knock-on effects through multiple orders.

Order 1: What Becomes Scarce

When software labor ceases to be the binding constraint, four resources become the new bottlenecks:

1. Judgment

“Does this work correctly?” Evaluating AI output — determining whether generated code, analysis, or content meets the need — requires human judgment that AI cannot self-apply. Judgment develops through experience: making decisions, observing consequences, receiving feedback.

Why it becomes scarce: As AI handles more tasks, fewer entry-level roles provide the decision-making practice through which judgment develops. The resource becomes scarce precisely because the technology that creates demand for it also eliminates the pipeline that produces it.

A worked example: A law firm uses AI to draft initial legal research memos. Before AI, junior associates wrote these memos — learning to evaluate sources, weigh precedents, and construct arguments through the work itself. With AI handling the drafts, junior associates review AI output instead. But reviewing requires the judgment that writing develops. The firm needs senior attorneys to review AI output, while the pipeline that produces senior attorneys — years of writing, making mistakes, receiving feedback — narrows. Five years later, the firm has excellent AI tools and fewer people who can direct them wisely.

2. Specification

“What should we build?” Translating human needs into requirements that AI systems can act on requires domain knowledge, empathy, and communication precision. AI generates solutions; someone must define the problem correctly.

Why it becomes scarce: Specification skill combines technical literacy with domain expertise. As AI handles the technical execution, the bottleneck shifts to people who understand what to build — not how to build it. A hospital administrator who can precisely describe what a patient scheduling system needs to accomplish — including edge cases, accessibility requirements, and integration constraints — creates more value than a programmer who can build whatever gets specified. The specification gap widens as AI capability grows: the more powerful the tool, the more precisely someone must define what it should accomplish.

3. Attention (Curation)

“Which of a million options?” When AI makes production cheap, abundance floods every domain. The scarce resource shifts from production to curation — helping people navigate overwhelming choice.

Why it becomes scarce: Attention does not scale with abundance. A human can evaluate a limited number of options regardless of how many exist. The person or platform that effectively curates AI-generated abundance captures enormous value.

4. Energy

Physical computation requires physical energy. AI capital expenditure of $527 billion in 2026 translates to massive energy infrastructure demand. Energy represents the one physical constraint that no amount of software can eliminate.

Why it becomes scarce: AI demand for energy grows with adoption. Data centers compete with residential and industrial users. Energy costs affect every sector of the economy.

Order 2: How Scarcities Interact

The four scarcities do not operate independently. They generate four critical interactions:

Value Migration (Order 2-A)

Software itself commoditizes. The value moves to judgment (evaluating AI output) and specification (defining what to build). A third leg emerges: curation (navigating abundance). This triad — judgment, specification, curation — defines where economic value concentrates in the AI economy.

ICESCR implication: Workers whose value came from execution (writing code, producing analysis, creating content) face declining returns. Workers whose value comes from judgment, specification, and curation see increasing returns. Article 6 (work) and Article 13 (education) engage directly.

Energy-Quality Feedback (Order 2-B)

Energy becomes expensive. Premium software — optimized, efficient — uses less energy per unit of value. Commodity software — bloated, unoptimized — uses more. The economics create selection pressure: energy costs incentivize quality, partially self-correcting the quality erosion problem (H6 — more AI output, lower average quality).

ICESCR implication: Energy costs affect living standards (Article 11). The energy-quality feedback creates economic incentive for better AI products — but only for users who can afford the premium tier.
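The selection pressure in Order 2-B reduces to back-of-envelope arithmetic. Everything below (development costs, energy per request, request volume) is a hypothetical assumption chosen only to make the break-even visible, not data from the analysis:

```python
# Back-of-envelope sketch of the energy-quality feedback (Order 2-B).
# All figures below are hypothetical, chosen only to illustrate the mechanism.

def annual_cost(dev_cost, kwh_per_request, requests, price_per_kwh):
    """Yearly cost: amortized engineering spend plus energy to serve the load."""
    return dev_cost + kwh_per_request * requests * price_per_kwh

REQUESTS = 10_000_000  # requests served per year (assumed)

def optimization_pays(price_per_kwh):
    # Premium build: more engineering spend, less energy per request (assumed).
    premium = annual_cost(500_000, 0.02, REQUESTS, price_per_kwh)
    # Commodity build: cheap to produce, energy-hungry (assumed).
    commodity = annual_cost(100_000, 0.10, REQUESTS, price_per_kwh)
    return premium < commodity

# Break-even where the extra $400k of engineering equals the energy saved:
# 400_000 = (0.10 - 0.02) * 10_000_000 * p  =>  p = $0.50 per kWh.
print(optimization_pays(0.10))  # False: with cheap energy, commodity wins
print(optimization_pays(1.00))  # True: expensive energy selects for quality
```

The specific break-even price is an artifact of the assumed numbers; the point is only that rising energy prices flip the comparison in favor of optimized software.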

Platform Recurrence (Order 2-C)

High confidence — strong historical precedent. When any domain floods with abundance, platform gatekeepers emerge to curate access. Google for information. App stores for software. Social media for content. The same pattern will recur for AI-generated software and services.

Whoever controls the curation platform captures disproportionate value. Access to the platform determines who benefits from AI’s abundance.

ICESCR implication: Article 15 (right to benefit from science) engages directly. If platforms control access to AI’s benefits, the “diffusion of science” obligation requires ensuring platform access does not depend solely on ability to pay.

The Judgment-Diffusion Paradox (Order 2-D)

The critical finding.

Technology diffuses globally over time. AI tools will spread from early adopters to the broader economy. Access will equalize — eventually.

Judgment does not diffuse the same way. Judgment develops through practice: making decisions, experiencing consequences, receiving mentorship from experienced practitioners. You cannot download judgment. You cannot train it in a weekend.

When AI eliminates junior roles — the entry-level positions through which people develop judgment — the pipeline breaks. Even after AI tools become universally available, the ability to use them effectively concentrates among those who already developed judgment.

The result: permanent stratification between the judgment-rich and the judgment-poor, persisting even after technology access equalizes. This differs fundamentally from previous technology divides. The digital divide (internet access) closed as infrastructure expanded. The AI divide promises to close similarly — tools will become universal. But the judgment divide cannot close through infrastructure alone. It requires the developmental experience that AI adoption itself eliminates. The paradox generates a self-reinforcing cycle: AI removes the conditions that produce the human capability AI requires.
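The self-reinforcing cycle can be sketched as a toy simulation. The diffusion rate, the judgment-accrual rate, and the assumption that tool adoption erodes junior roles one-for-one are all illustrative choices, not empirical estimates:

```python
# Toy simulation of the judgment-diffusion paradox (Order 2-D).
# All parameters are illustrative assumptions, not empirical estimates.

def simulate(years=20, diffusion_rate=0.6):
    access = 0.10             # share of the economy with AI tools
    incumbent_judgment = 1.0  # cohort trained before AI; assumed fully developed
    newcomer_judgment = 0.0   # judgment accumulated by the post-AI cohort
    for _ in range(years):
        # Tools diffuse logistically toward universal access.
        access += diffusion_rate * access * (1 - access)
        # Assumption: adoption erodes entry-level roles one-for-one.
        junior_roles = 1 - access
        # Judgment accrues only through practice in those roles.
        newcomer_judgment += 0.05 * junior_roles
    return access, incumbent_judgment, newcomer_judgment

access, incumbent, newcomer = simulate()
print(f"tool access:        {access:.2f}")    # approaches 1.0: access divide closes
print(f"incumbent judgment: {incumbent:.2f}") # unchanged
print(f"newcomer judgment:  {newcomer:.2f}")  # stays low: judgment divide persists
```

Under these assumptions, access saturates while the newcomer cohort's judgment plateaus far below the incumbents': the divide that remains is exactly the one infrastructure cannot close.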

Article 13 emerges as pivotal for exactly this reason. If education does not adapt to produce judgment — not just knowledge, not just skills, but the capacity to evaluate, decide, and specify — then technology diffusion will equalize access without equalizing capability. The bifurcation shifts from “AI access vs. no access” to “judgment-rich vs. judgment-poor” — a harder divide to cross.

Order 3: Convergence

At the third order, the analysis converges on a structural finding:

Two of the four scarcities depend directly on education (judgment and specification). A third connects through cultural and scientific literacy (curation). Only energy lies outside the educational domain.

This means Article 13 (right to education) addresses three of the AI economy's four binding constraints (75%). No other ICESCR article carries this scope.

The convergence also reveals that Article 15 (right to benefit from science) addresses the distribution question: even if education produces judgment for everyone, who benefits from the resulting AI economy? Platform recurrence threatens to concentrate benefits among platform controllers. Article 15’s diffusion mandate provides the legal framework for ensuring broad access.

Together, Articles 13 and 15 form the co-pivotal pair: education produces capability (Article 13) and the right to benefit ensures capability translates to shared prosperity (Article 15).

Order 4: Productive Exhaustion

At the fourth order, remaining questions concern:

  • Values and meaning: What do people choose to do with judgment capability? (Speculative; confidence here is deliberately low.)
  • Structural norms: Does AI create new forms of social organization? (Horizon-level, beyond prediction)

The analysis terminates at Order 4 because remaining effects represent implementation details or speculation, not architectural risks. The structural finding — education as pivotal, benefit-sharing as essential — holds across all plausible scenarios at Orders 1-3.

The Map

Order | Binding Constraint | Primary Article | What Must Happen
----- | ------------------ | --------------- | ----------------
0 | Software labor (removed by AI) | Art. 15 | Ensure access to AI
1 | Judgment, specification, curation, energy | Art. 13, 11, 15 | Transform education; manage energy; ensure access
2 | Pipeline breaks, platform gatekeepers | Art. 13, 15, 6 | Preserve judgment development; regulate platforms
3 | Judgment as stratifier | Art. 13 | Education must produce judgment, not just knowledge
4 | Values, purpose, meaning | Preamble: Dignity | ICESCR philosophical foundation

Live Evidence: The Human Rights Observatory tracks which UDHR rights the tech community discusses most — revealing where attention concentrates and where the scarcity gaps this analysis predicts appear in real-time discourse.