THE REVIEW
VOL. I, NO. 1 • TUESDAY, JANUARY 27, 2026 • PRICE: ONE MOMENT OF ATTENTION
“Where software meets atoms—and atoms push back”
When the Machines Got Thirsty
A special edition examining the collision between artificial intelligence and physical reality
Dear reader, we present to you an unusual newspaper—one that tells the story of an industry discovering that even the most ethereal technology must obey the laws of physics.
For decades, the technology industry operated on a simple faith: computing power would grow exponentially, unbound by earthly concerns. Silicon would shrink, models would expand, and the future would arrive on schedule. Then came 2025—the year the world’s most valuable companies discovered they had a physics problem.
This special edition of The Review examines what might be called the Great Reckoning: the moment when artificial intelligence collided with the stubbornly material world. The stories that follow trace this collision across six continents, from Japanese glass looms to orbital satellites, from nuclear reactor control rooms to data center cooling towers.
“The companies building the future of artificial intelligence are learning, sometimes painfully, that the future must be built from physical things.”
You will read about $589 billion erased from a single company’s value in a single day. About nuclear reactors resurrected from the dead to power chatbots. About Japanese cloth more strategically important than any semiconductor. About computers that must now be cooled with liquid because air is no longer cold enough. About open-source chip designs quietly reshaping the global balance of power. And about the ultimate escape hatch: putting data centers in space, where the sun never sets and the cooling is free.
Welcome to the age of material computing.
❧ ❧ ❧
A $6 Million Earthquake Shook Wall Street
How a Chinese startup trained a world-class AI for the price of a Manhattan apartment—and terrified investors
A PDF posted to the internet cost Nvidia $589 billion in a single day—the largest single-day value destruction in market history.
On Jan. 27, 2025, exactly one year ago today, markets reeled from Chinese AI company DeepSeek’s R1 reasoning model, released days earlier alongside technical reports claiming that the final training run of its underlying base model had cost approximately $5.6 million. American laboratories had spent hundreds of millions, sometimes billions, reaching similar capabilities. Investors suddenly wondered if they had been funding an arms race with a water pistol.
The number was simultaneously misleading and profound. Critics correctly noted that DeepSeek’s figure excluded years of research, failed experiments, and the rumored 50,000 Nvidia GPUs sitting in what analysts called the company’s “shadow cluster.” The true infrastructure cost, according to SemiAnalysis, approached $1.6 billion.
“We had no incentive to find the efficiency frontier when investors were providing billions to find the capability frontier instead. DeepSeek, backed into a corner by geopolitics, discovered that the path we weren’t taking actually led somewhere.” — Senior researcher at a major American AI lab (anonymous)
Yet to dismiss the achievement as accounting trickery misses the strategic earthquake beneath the spreadsheets. DeepSeek’s engineers, denied access to Nvidia’s fastest chips by U.S. export controls, had been forced to innovate around their hardware limitations. The H800 processors available to Chinese buyers featured deliberately crippled communication speeds—they could think fast but struggled to talk to their neighbors.
Constraint became catalyst. DeepSeek developed techniques—multi-head latent attention, auxiliary-loss-free load balancing, aggressive quantization—that allowed their 671-billion-parameter mixture-of-experts model to train with the compute budget typically reserved for models one-tenth its size. They open-sourced everything.
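DeepSeek’s actual recipe, per its technical reports, used FP8 mixed-precision training with fine-grained scaling and is far more involved than any sketch. The toy block-wise int8 example below, in Python with NumPy, shows only the storage arithmetic that makes aggressive quantization attractive; every name and parameter here is illustrative, not DeepSeek’s code.

```python
import numpy as np

def quantize_blockwise(w: np.ndarray, block: int = 128):
    """Toy block-wise int8 quantization: each block of weights shares one
    float32 scale, cutting storage from 32 to roughly 8 bits per weight."""
    w = w.reshape(-1, block)
    scales = np.abs(w).max(axis=1, keepdims=True) / 127.0   # per-block scale
    q = np.clip(np.round(w / scales), -127, 127).astype(np.int8)
    return q, scales.astype(np.float32)

def dequantize(q: np.ndarray, scales: np.ndarray) -> np.ndarray:
    return q.astype(np.float32) * scales

weights = np.random.randn(4096 * 128).astype(np.float32)
q, s = quantize_blockwise(weights)
restored = dequantize(q, s).ravel()
print("bytes:", weights.nbytes, "->", q.nbytes + s.nbytes)   # ~4x smaller
print("max abs error:", float(np.abs(weights - restored).max()))
```

Smaller weights mean less memory traffic per step, which is exactly the resource the crippled H800 interconnect made scarce.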
The irony was not lost on policy analysts: U.S. export controls designed to hobble Chinese AI had instead produced the AI equivalent of high-altitude training. One year later, Chinese labs are leaner, more rigorous, and arguably better prepared for the thermodynamic constraints now facing the entire industry.
DeepSeek’s R2 model is expected within weeks.
For Further Reading: Perspectives
| Stance | Title | Source |
|---|---|---|
| PRO | “DeepSeek’s Latest Breakthrough Is Redefining the AI Race” — Yasir Atalan, CSIS Futures Lab | csis.org (Dec 2025) |
| CON | “Challenging US Dominance: China’s DeepSeek Model and the Pluralisation of AI Development” — Clotilde Bômont & Tim Rühlig, EU ISS | iss.europa.eu (Jul 2025) |
❧ ❧ ❧
Big Tech’s Unlikely Power Source: The Atom
Tech giants discover that chatbots need power plants—and the only ones available split atoms
The machines that will think for us must first be fed, and what they eat is power—more power than anyone anticipated.
American data centers consumed 183 terawatt-hours of electricity in 2024, roughly the annual consumption of the entire nation of Pakistan. By 2030, the International Energy Agency projects this will grow to 426 terawatt-hours. In Virginia’s Loudoun County, already nicknamed “Data Center Alley,” server farms consume more electricity than all of the county’s homes combined.
When a minor disturbance in Fairfax County last year caused 60 data centers to switch simultaneously to backup generators, the sudden disappearance of 1,500 megawatts—Boston’s entire demand—nearly triggered cascading grid failures.
“While other sectors of the economy are backing away from this, large tech companies are still talking about it. It’s clear that nuclear energy has to be a big part of meeting the demand for power from AI.” — Rob Barnett, Bloomberg Intelligence analyst
The technology industry’s response has been inevitable: nuclear power. Microsoft announced a 20-year agreement to restart Three Mile Island Unit 1. Google signed agreements for 500 megawatts of small modular reactors from Kairos Power. Amazon invested over $20 billion in nuclear-powered data center projects. Meta announced deals totaling over 4 gigawatts of nuclear capacity—enough to power a medium-sized European country.
The fundamental appeal is physical: nuclear plants provide “baseload” power, constant and weather-independent output that matches the around-the-clock demands of AI training. There is no Dunkelflaute, the German term for a windless, sunless lull, in fission.
But a timing problem looms. AI compute demand doubles every six to eighteen months. Nuclear reactors take years to license and build.
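The mismatch is easy to quantify. A minimal sketch, assuming demand doubles every 12 months (the midpoint of the range above) and an illustrative seven years from licensing to grid connection:

```python
# Rough timing mismatch: exponential demand vs. a fixed construction window.
doubling_months = 12          # assumed midpoint of the 6-to-18-month range
reactor_months = 7 * 12       # illustrative licensing-to-grid timeline
growth = 2 ** (reactor_months / doubling_months)
print(f"Compute demand multiplies ~{growth:.0f}x while one reactor is built.")
# At 6-month doubling the factor is 2**14 (~16,000x); at 18 months, ~25x.
```

Even the gentlest assumption leaves a reactor chasing a demand curve that has lapped it many times over.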
For Further Reading: Perspectives
| Stance | Title | Source |
|---|---|---|
| PRO | “2026: The Year Nuclear Power Reclaims Relevance” — Carbon Credits | carboncredits.com (Dec 2025) |
| CON | “Microsoft Wants to Resurrect Three Mile Island. It Will Never Happen.” — Neil Chatterjee, Former FERC Chairman | thehill.com (Jan 2026) |
❧ ❧ ❧
When Air Won’t Cool the Machines
The AI industry’s cooling systems are failing—and there aren’t enough engineers to fix them
Air can no longer keep computers cool enough. This simple physical fact has upended decades of data center engineering.
A modern Nvidia Blackwell rack consumes 120 kilowatts, roughly six times the 20 or so kilowatts that conventional air cooling can remove from a single rack. The heat generated by next-generation GPUs cannot be removed by fans alone; it requires liquid flowing directly over the processors. The semiconductor industry’s roadmap calls for 4,400-watt chips by 2028. Heat dissipation per square centimeter now approaches 50 watts, comparable to the heat flux inside a nuclear reactor core.
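The arithmetic behind that claim is simple heat-transfer bookkeeping, Q = ṁ·c_p·ΔT. A sketch with assumed temperature rises (15 K for air, 10 K for water) and textbook fluid properties; exact figures vary by facility:

```python
# How much air versus water must flow through one 120 kW rack?
Q = 120_000.0                        # rack heat load, W
cp_air, rho_air = 1005.0, 1.2        # J/(kg*K), kg/m^3 at ~20 C
cp_h2o, rho_h2o = 4186.0, 1000.0
vol_air = Q / (cp_air * 15.0) / rho_air    # m^3/s of air at 15 K rise
vol_h2o = Q / (cp_h2o * 10.0) / rho_h2o    # m^3/s of water at 10 K rise
print(f"air:   {vol_air:.1f} m^3/s (~{vol_air * 2119:.0f} CFM) per rack")
print(f"water: {vol_h2o * 1000:.1f} L/s per rack")
```

Roughly 14,000 cubic feet of air per minute through a single rack, versus about three liters of water per second: water’s volumetric heat capacity is what makes direct-to-chip cooling unavoidable.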
The industry has responded with a rushed transition to direct-to-chip liquid cooling. The physics are sound. The implementation has been, at times, catastrophic.
“Managing a liquid-cooled data center requires the skills of a chemical engineer and a master plumber. The current workforce is trained in swapping hard drives and managing airflow.”
When Microsoft’s Quincy data center experienced a cooling failure lasting 37 minutes, GPU temperatures spiked to 94 degrees Celsius. The result: $3.2 million in hardware damage and 72 hours of downtime.
Galvanic corrosion has emerged as the silent destroyer. In the rush to deploy liquid cooling, many facilities mixed incompatible metals: copper cold plates connected to aluminum radiators. When coolant flows between them, the dissimilar metals in a shared electrolyte form a galvanic cell. The system becomes a battery, and the less noble aluminum slowly dissolves.
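The driving force can be estimated from standard electrode potentials, which are textbook values; real corrosion rates depend on coolant chemistry, so the sketch below indicates only sign and rough magnitude.

```python
# Standard electrode potentials, volts vs. the standard hydrogen electrode.
E0 = {"Cu": +0.34, "Al": -1.66, "Ni": -0.26, "Fe": -0.44}

def couple_voltage(noble: str, active: str) -> float:
    """Open-circuit driving force of a galvanic couple; the more
    negative (active) metal is the one that corrodes."""
    return E0[noble] - E0[active]

print(f"Cu/Al: ~{couple_voltage('Cu', 'Al'):.1f} V pushing aluminum to dissolve")
print(f"Cu/Ni: ~{couple_voltage('Cu', 'Ni'):.1f} V (one reason nickel plating helps)")
```

A two-volt couple running continuously through a sealed loop will find every weak joint in the system.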
The industry’s response: “cartridge-ification”—hermetically sealed, modular cooling cartridges that generalist technicians can swap without understanding fluid chemistry. At sufficient scale, complexity becomes the enemy of survival.
For Further Reading: Perspectives
| Stance | Title | Source |
|---|---|---|
| PRO | “The Data Center Cooling State of Play (2025)” — Tom’s Hardware | tomshardware.com (Dec 2025) |
| CON | “Why Liquid Cooling for AI Data Centers Is Harder Than It Looks” — Schneider Electric | blog.se.com (Aug 2025) |
❧ ❧ ❧
The Trillion-Dollar Loom Problem
The entire AI industry is waiting on output from specialized looms in Japan—and the glass that might replace them
The most important material in artificial intelligence is not silicon. It is a specialized fiberglass cloth called T-Glass, and a single Japanese company controls nearly all of it.
To understand why, one must understand the “reticle limit.” The lithography machines that print chips have a maximum exposure area of roughly 858 square millimeters. No single silicon die can exceed this size. Yet frontier AI chips already pack hundreds of billions of transistors, and industry roadmaps call for trillions per package.
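The arithmetic is stark. A sketch assuming an illustrative logic density of about 150 million transistors per square millimeter, the order of magnitude for leading-edge nodes:

```python
reticle_mm2 = 858                  # maximum exposure area per die, mm^2
density = 150e6                    # transistors per mm^2 (assumed)
per_die = reticle_mm2 * density    # absolute ceiling for one die
target = 1e12                      # a trillion-transistor package
print(f"max per die: {per_die:.1e} transistors")        # ~1.3e11
print(f"dies needed for {target:.0e}: {target / per_die:.1f}")
```

Eight or more reticle-limit dies per package, and every one of them must sit on something.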
The industry’s solution has been “chiplet” architectures: stitching together multiple smaller dies onto a base substrate. For decades, these substrates were organic resin reinforced with fiberglass cloth. Under the size and heat of modern AI packages, that material is failing.
“The entire trillion-dollar AI hardware market is throttled by the output of a few glass looms in Japan.”
When a massive AI package is heated during manufacturing, the organic substrate expands at a different rate than the silicon sitting on it, a mismatch in the coefficient of thermal expansion (CTE). The resulting warping severs circuits.
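The scale of the mismatch follows from linear thermal expansion, ΔL = α·ΔT·L. A sketch with illustrative material constants (silicon roughly 2.6 ppm/K, organic substrates roughly 15 ppm/K):

```python
alpha_si, alpha_org = 2.6e-6, 15e-6   # thermal expansion, 1/K (illustrative)
span_mm = 100.0                       # lateral span of a large AI package
dT = 200.0                            # temperature swing during reflow, K
mismatch_um = (alpha_org - alpha_si) * dT * span_mm * 1000.0
print(f"differential expansion: ~{mismatch_um:.0f} micrometers")  # ~248 um
# Hundreds of micrometers of shear across solder bumps pitched at tens of
# micrometers is enough to warp the package and tear fine interconnects.
```

Glass, whose CTE can be tuned to sit close to silicon’s, shrinks that differential by an order of magnitude; hence the scramble described below.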
T-Glass is the current solution—but manufacturing it requires spinning molten glass into yarn finer than a human hair. Global production is dominated by Nitto Boseki of Japan. The company is reportedly sold out through 2027.
The long-term solution is solid glass substrates. Intel displayed its first thick-core glass substrate yesterday at NEPCON Japan. The future of AI may depend on mastering one of humanity’s oldest materials.
For Further Reading: Perspectives
| Stance | Title | Source |
|---|---|---|
| PRO | “Glass Substrates: The Breakthrough Material for Next-Generation AI Chip Packaging” | financialcontent.com (Jan 2026) |
| CON | “The Race To Glass Substrates” — SemiEngineering | semiengineering.com (Aug 2025) |
❧ ❧ ❧
The Chip Design Nobody Owns
China has embraced an open-source processor architecture governed from Switzerland—and Washington is worried
For decades, the computing world has run on two proprietary instruction-set camps. Intel and AMD controlled x86. ARM Holdings dominated mobile. Both demand licenses and royalties. RISC-V is different.
RISC-V is an open-standard instruction set architecture—the fundamental language that software uses to command hardware. Anyone can implement it without paying royalties. The specification is maintained by a non-profit based in Switzerland.
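What “open standard” means in practice: the bit layout of every instruction is public, so anyone can write a decoder or build a core without permission. A minimal sketch decoding one RV32I instruction; the field positions come from the published specification, and the example encoding is illustrative:

```python
def decode_itype(word: int):
    """Split an RV32I I-type instruction into its fields."""
    opcode = word & 0x7F
    rd     = (word >> 7) & 0x1F
    funct3 = (word >> 12) & 0x07
    rs1    = (word >> 15) & 0x1F
    imm    = word >> 20
    if imm & 0x800:              # sign-extend the 12-bit immediate
        imm -= 0x1000
    return opcode, rd, funct3, rs1, imm

# 0x02A00093 encodes "addi x1, x0, 42": load the constant 42 into x1.
op, rd, f3, rs1, imm = decode_itype(0x02A00093)
assert op == 0x13 and f3 == 0        # ADDI opcode and funct3
print(f"addi x{rd}, x{rs1}, {imm}")  # -> addi x1, x0, 42
```

No license governs that bit layout, which is precisely why no export regime can easily fence it off.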
Until recently, RISC-V was dismissed as a “toy” architecture. That assessment is now obsolete.
“If a server vendor chooses RISC-V, we want to support that too.” — Frans Sijstermans, Nvidia VP of Hardware Engineering, announcing CUDA support for RISC-V
Eight Chinese government agencies jointly announced plans this month to promote nationwide RISC-V adoption. Beijing has concluded that reliance on x86 or ARM represents an existential risk. RISC-V’s Swiss governance makes it legally difficult for any nation to restrict access.
Alibaba’s XuanTie C930 targets server-grade performance. The Chinese Academy of Sciences’ Kunminghu architecture aims for 3 GHz. Chinese entities have filed more than 2,500 RISC-V patents, far surpassing American filings.
India has emerged as a second front. Under its Digital India RISC-V (DIR-V) program, India launched DHRUV64 in December, its first homegrown dual-core processor.
The era of proprietary monopoly on instruction sets is ending.
For Further Reading: Perspectives
| Stance | Title | Source |
|---|---|---|
| PRO | “What RISC-V Means for the Future of Chip Development” — Sujai Shivakumar, CSIS | csis.org (Jul 2025) |
| CON | “RISC-V Deserves the Same Scrutiny China Gives Nvidia” — Jared Whitley, Washington Times | washingtontimes.com (Oct 2025) |
❧ ❧ ❧
The Final Frontier Has Wi-Fi
Some companies think the solution to AI’s earthly constraints is obvious: leave Earth
In November, Nvidia-backed startup Starcloud successfully trained an AI model in orbit. The model learned from Shakespeare’s complete works, began speaking in Elizabethan English, and made history.
“Greetings, Earthlings!” wrote the orbital AI. “Let’s see what wonders this view of your world holds.”
The appeal is fundamentally physical. In sun-synchronous orbit, satellites receive continuous solar energy: no night, no clouds. Deep space is an effectively infinite heat sink at 2.7 kelvin, though waste heat can leave a spacecraft only as radiation. There are no grid queues in orbit. No permit hearings.
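“Free” cooling still obeys the Stefan-Boltzmann law. A sketch sizing a radiator under assumed conditions (emissivity 0.9, radiator surface at 330 K, sun-facing complications ignored):

```python
SIGMA = 5.67e-8                     # Stefan-Boltzmann constant, W/(m^2 K^4)
eps, T_rad, T_space = 0.9, 330.0, 2.7
flux = eps * SIGMA * (T_rad**4 - T_space**4)   # net W per m^2 of radiator
P = 100_000.0                       # 100 kW of compute heat
print(f"radiated flux: {flux:.0f} W/m^2")       # ~605 W/m^2
print(f"radiator area for 100 kW: {P / flux:.0f} m^2")
```

Roughly 165 square meters of radiator per 100 kilowatts: the heat sink is free, but the hardware that reaches it must be launched, kilogram by kilogram.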
“In 10 years, nearly all new data centers will be built in outer space.” — Philip Johnston, Starcloud CEO
Google’s “Project Suncatcher” aims to launch by 2027. SpaceX is exploring orbital data centers for xAI. Blue Origin expects gigawatt data centers in space within a decade.
Yet obstacles remain: launch costs of thousands of dollars per kilogram, no possibility of on-site repairs, and cosmic-ray bit flips. And a Saarland University study found orbital facilities might create more carbon than terrestrial alternatives.
Most profound of all: an AI trained in orbit sits beyond the reach of the EU AI Act, American executive orders, or Chinese regulations. “Orbital data havens” may prompt new international treaties.
Starcloud-2 launches this October carrying Nvidia H100 chips and Blackwell B200 GPUs. The next generation of AI may literally be looking down on us.
For Further Reading: Perspectives
| Stance | Title | Source |
|---|---|---|
| PRO | “How Starcloud Is Bringing Data Centers to Outer Space” — NVIDIA Blog | blogs.nvidia.com (Oct 2025) |
| CON | “Space-Based Data Centers Could Power AI with Solar Energy—At a Cost” — Scientific American | scientificamerican.com (Dec 2025) |
❧ ❧ ❧
EDITORIAL
When Software Met Atoms
The age of unlimited digital growth is over. What comes next?
There is a famous line in technology punditry, Marc Andreessen’s 2011 declaration that “software is eating the world.” The artificial intelligence era represents something different.
Software has eaten enough of the world that it is now bumping into the world’s physical constraints. The data centers that house AI models require concrete, steel, water, and electricity in quantities that register on national resource statistics.
The stories in this edition trace a single theme: the collision between exponential digital ambition and finite material reality. DeepSeek proved efficiency is mandatory when hardware constraints bite. The nuclear rush proves clean energy commitments matter less than reliable baseload. The cooling crisis proves thermodynamics does not care about investor timelines. The glass shortage proves supply chains have chokepoints nobody predicted. RISC-V proves architectural openness may matter more than raw performance when sovereignty is at stake. And orbital computing proves that when you run out of room on Earth, some will simply leave it.
“By the end of this decade, the semiconductor industry will likely reach its limits on being able to scale transistors on a silicon package using organic materials.” — Intel Corporation, Glass Substrate Announcement
What unites these stories is not pessimism but pragmatism. The AI industry is learning—sometimes at the cost of hundreds of billions of dollars—that there is no software solution to a hardware problem.
This is, in its way, good news. The recognition of material limits opens the door to material solutions.
The next decade of AI will not be determined solely by who writes the cleverest algorithms. It will be determined by who can source T-Glass, restart nuclear reactors, train cooling technicians, manufacture glass substrates, and perhaps—eventually—escape the gravity well entirely.
The era of software eating the world is giving way to something more complex: a dialectic between digital ambition and material constraint. The world is eating back.
For Further Reading: Perspectives
| Stance | Title | Source |
|---|---|---|
| PRO | “The Great Decoupling: How RISC-V Became China’s Ultimate Weapon for Semiconductor Sovereignty” | financialcontent.com (Dec 2025) |
| CON | “The Public Pays the Price for Big Tech’s Data Centers” — The Invading Sea | theinvadingsea.com (Dec 2025) |
Production Note
This edition of The Review was produced on Monday, January 26, 2026, in collaboration between human editorial direction and artificial intelligence drafting and research capabilities. All factual claims have been sourced from reputable publications including Bloomberg, CNBC, Tom’s Hardware, Scientific American, the International Energy Agency, and official corporate announcements. Opinion pieces presented in “For Further Reading” sections represent genuine published perspectives and are linked to their original sources.
Your skepticism remains appropriate and encouraged.
Coming Next Week: The Talent War—examining how the shortage of cooling engineers, nuclear physicists, and glass substrate specialists is reshaping technology hiring. Also: Updates on DeepSeek R2 and Meta’s nuclear deals.
© 2026 The Review. All rights reserved. Editor: [Your Name] | Submissions: letters@thereview.example