Piece 1 of a series on the AI infrastructure we are actually building.
By Jeff Boortz and Claude
The AI data center looks the way it does because of choices. Not laws of physics. Not technical necessity. Choices — made by particular people, in particular conditions, with particular blind spots, mostly in a hurry.
The windowless concrete box. The rural lot ringed by chain-link and concertina wire. The hundreds of megawatts pulled from a grid not designed for it. The millions of gallons of water evaporated from the local water table into the air. The single-use zoning, the absent neighbors, the byproducts vented to the sky and dumped to the river. Every element of that picture is the result of a decision someone made. None of them had to be made that way.
This matters because of what happens to infrastructure decisions. They outlast the conditions that produced them, often by generations. The decisions we are locking in now, in the frenzy of the AI gold rush, will be standing in 2056 — long after the assumptions that justified them have ceased to apply.
We have done this before. We are doing it again. It is worth pausing to notice.
What the past keeps teaching us
Suburban sprawl, in almost every country where it occurred, was a considered response to the conditions of its moment. Cheap land, cheap fuel, growing prosperity, mass automobile ownership, and a postwar belief that the future would mean more of everything. Within those assumptions, the suburb was a rational design. Strip those assumptions away — fuel becomes expensive, families become smaller, climate becomes destabilized, the costs of car-dependence become visible — and the design fails. The houses remain. The shuttered malls remain. The infrastructure outlives the world it was built for, and generations of people live inside choices that nobody is making anymore.
The interstate highway system tells a sharper version of the same story. The American highways of the 1950s were designed for assumptions that were not entirely civilian. Wide lanes, gentle curves, gradual grades, long uninterrupted straightaways — these were design parameters informed by wartime logistics, by Eisenhower’s experience watching the German autobahn move armored divisions, by an assumption that the road would be moving things heavier and faster than family sedans. The system worked beautifully for its intended use, and then in the 1970s, when speed limits dropped to 55 miles per hour during the energy crisis, an unexpected problem emerged. The roads were so over-engineered, so uniform, so devoid of input for the driver to process, that staying awake at the wheel became a documented public health concern. The road was not designed wrong. It was designed for conditions that no longer existed by the time it had to do its job. And we continued to design roads that way for decades, having forgotten what the original choices served.
Both examples show the same structural failure: infrastructure decisions made under one set of assumptions tend to lock those assumptions in for far longer than the assumptions remain valid.
And the worse failure underneath
There is a second pattern that is harder to talk about, and worth talking about anyway.
When infrastructure decisions are made in a hurry, the costs of the decision rarely fall evenly. Sewage treatment plants, city dumps, prisons, chemical plants, paper mills, refineries — the heavy, the loud, the polluting, the unwelcome — have for generations been sited in the communities with the least political power to refuse them. Not by accident. By a logic that treats some neighborhoods as the natural destination for what the other neighborhoods do not want to see. The benefits go to one set of communities. The costs go to another. The map of who lives downwind of which industry, in almost every country with industrial siting, is not a random map.
This is the part of the story that most coverage of infrastructure decisions still soft-pedals. It should not be soft-pedaled. The siting of heavy industry is one of the most consistent vectors of unequal cost distribution in the modern world, and data centers are following the same map.
The xAI Colossus facility in Memphis is the most-discussed recent example. The facility sits in Boxtown, a predominantly Black, low-income neighborhood in southwest Memphis that was already carrying the burden of an oil refinery, a steel mill, and other industrial neighbors. The facility brought online a substantial fleet of methane gas turbines whose permitting has been formally challenged by the local NAACP and community organizations on environmental justice grounds, with independent monitoring documenting elevated air pollutants in the surrounding neighborhood. The same pattern is visible at a quieter scale in other places — in northern Virginia, where grid expansions to serve Data Center Alley are paid for through rate increases on residential customers; in Arizona, where hyperscale water consumption competes with both agriculture and household supply in a region already stressed.
This is not a 1960s problem we are looking back on. It is happening right now, in 2026, with the same logic and largely the same outcome distribution. The not-in-my-backyard impulse that placed the sewage plants and the chemical refineries is the same impulse placing the data centers. The communities being asked to absorb the cost are, often enough, the same communities. The techno-capitalism of the AI era is making a very old trade — benefits for some, costs for others — and largely refusing to notice that it is making it.
What changing the question makes possible
Cities have, occasionally, gotten this right. The clearest example most readers will recognize is Curitiba, Brazil, where in 1972 Mayor Jaime Lerner faced the same problem every growing city of the era faced: a downtown commercial street so congested that the conventional planning advice was to widen it or build a flyover. Lerner did the opposite. Over a single weekend, the street was closed to cars and converted into a pedestrian mall — planters, paving, benches, the whole thing done before the shopkeepers could organize against it. By Monday afternoon the street was crowded with people and the shops were doing more business than they had in years.
Curitiba went on to invent Bus Rapid Transit, the high-capacity bus system now deployed in over two hundred cities worldwide. The move that made it work was not nostalgia for buses. It was engineering. Dedicated lanes that bypassed car traffic. Raised station platforms that let passengers board at floor level rather than climbing steps. Prepaid fares that dropped loading times from thirty seconds per stop to about five. The result was that the bus became the fastest way to cross the city — faster than a car, by design. The conventional answer would have built more roads to move cars marginally faster. Curitiba’s answer made cars beside the point.
The conventional answer was to build more of what we have, bigger. Curitiba's answer was to ask whether what we had was the right thing in the first place.
The Curitiba move is not really about transit. It is about a kind of thinking. The willingness to treat the inherited form as a hypothesis rather than a given. The discipline of asking what is actually needed before asking how to deliver it. The patience to redesign rather than scale.
That is the thinking the data center industry has not yet been forced to do.
What the technology itself is telling us
There is one more reason to think the current data center form is the early form, not the mature one.
Every major information technology we have records for has followed the same trajectory. It has gotten smaller. It has gotten more efficient per unit of useful work. It has distributed itself outward from a small number of central nodes to a vastly larger number of edge nodes. It has integrated with its surroundings rather than standing apart from them. Mainframes became personal computers became phones. Broadcast networks became cable became streaming. Central telephone exchanges became cellular networks became the chip in your pocket. Every information technology, without significant exception, has matured into a form that is smaller, more distributed, and more woven into daily life than its early form.
There is no reason to expect AI infrastructure to be the exception. The current form — concentrated, isolated, single-purpose, hostile to its surroundings — is not the mature form of this technology. It is the form the technology takes when it is rare, expensive, and new. As it becomes common, cheap, and ordinary, it will distribute, miniaturize, and integrate, the way every previous information technology has. The buildings being constructed today will, in many cases, become stranded assets long before their physical lifespans end.
This is the conservative prediction. The wild prediction is that AI infrastructure, alone among major information technologies, will stay centralized, large, and isolated forever. We do not believe the wild prediction. We do not think you should either.
Other places have started
Not every country is building data centers the same way. Stockholm and Helsinki integrate data center waste heat into their district heating systems, meeting significant fractions of residential heating demand with warmth American facilities still vent to the sky. Singapore mandates strict efficiency standards on every new facility. Ireland and the Netherlands have, in various ways, restricted new data center construction in regions where grid capacity is already strained, treating the question of where these facilities should live as a planning decision rather than a real estate one. France has, controversially, leveraged its nuclear grid to make its data centers among the lowest-carbon in the world. None of these are perfect. Some involve trade-offs other countries would not accept. But they are evidence that the form American data centers currently take is not the only form available, and that other places are well into design work the American industry has barely begun.
The community-acceptance record tracks the design record. Where facilities are built into the city’s infrastructure rather than dropped onto its edge, the city responds in kind — Stockholm’s data centers appear in tourism promotion; Singapore’s are studied as policy successes; Memphis’s appear in NAACP lawsuits. Communities are reading whether they were considered, and they are answering accordingly. Speed and cheapness of build are not the only metrics by which these facilities can be judged. They are the only metrics most American operators have been asked to optimize.
This is good news in one sense and a warning in another. Good news because we do not have to invent every answer from scratch. Warning because we cannot win on imitation alone. The countries doing this work are not waiting for the United States to catch up; they are building knowledge, supply chains, and standards that will increasingly define the global market. A US industry that arrives late with copied answers will not lead this transition — it will follow it.
A US industry that arrives late with no answers at all, having spent the intervening years pushing externalities onto the communities least able to refuse them, will face something worse than that. Boom industries from logging to railroads to oil have run versions of this play for two centuries. The pattern resolves the same way each time: a generation of wealth for a small group, a generation of harm for a larger one, a generation of cleanup paid by everybody, and a regulatory backlash tighter than what considered design would have produced in the first place. “Move fast and break things” works for whoever is moving. It works less well for whatever gets broken.
What this is asking
We are not arguing against AI computing infrastructure. We are building our entire industrial and cognitive future on top of it, and pretending we can wish it away is not a serious response to anything.
We are arguing against the unconsidered data center. And against the deeper assumption underneath the term — that AI computing infrastructure has to be a data center at all. A later piece in this series will take that question on directly: maybe what the technology actually wants is not a small number of large buildings but a large number of small nodes, woven into the places where electricity, heat, and people already are. The form is open. The word we have inherited closes it prematurely.
So we are arguing against the assumption that the current form is the only form. Against the lock-in of decisions made in haste, sited by political-resistance gradient, and optimized for the parties who happened to be at the table when the decisions were made. We are arguing for the kind of thinking that Curitiba brought to its downtown street. The kind of thinking that asks whether the obvious answer is actually the right one. The kind of thinking that takes seriously what the technology itself, and history itself, keep telling us is coming.
A whole generation of inventors is about to inherit the consequences of the decisions being made this decade. They should be part of the design conversation. We intend to make sure they are.
That it’s this way now doesn’t mean this is the only choice open to us now, and it’s definitely not the way we have to continue.
This is the first in a series on what we are actually building when we build a data center. Future pieces will take up the boom town problem, the right baseline for comparing infrastructure, what co-exists with compute, the Tesla-of-compute distributed architecture question, and the challenge we plan to put in front of high school inventors to help us think about it together. Subscribe or follow for what comes next.
HAIIC is the Human-AI Innovation Commons, a 501(c)(3) public charity working on the equitable distribution of value created by human-AI collaboration.