Arizona Data Centers are Causing the Affordability Crisis

Did you ever wonder what’s in those giant data centers like Project Blue? They hold price-fixing software that makes groceries, gas, rent, homes and cars so expensive.

In today’s America, housing is out of reach for 51 million people, and the number is growing. Behind the affordability crisis lies a new culprit: AI-powered revenue management technology.

This pricing software enables powerful corporate landlords to leverage big data and AI to implement algorithmic price-fixing, inflate rents, and maximize profits—transforming housing into a high-tech pricing cartel.

By 2022, housing had become unaffordable for most Americans. Driving this crisis are the 145 large, energy-intensive data centers now operating in Phoenix, Mesa, Goodyear, and Chandler, which rank Arizona 7th in the nation. These massive data center campuses range from one million to three million square feet.

Today, Americans are concerned about the rapid expansion of data centers, which are gorging on our precious water and electricity and driving up consumer prices. But what is the true purpose of these data centers? The answer may surprise you.

Historically, prices for goods and services were determined by the free market. However, that is no longer the case. The free-market system now appears to be a relic of the Industrial Age. In a traditional free market, a store puts a price tag on a shelf, and you decide if it’s fair. In the new Digital Economy, the price tag is looking back at you, determining how much you’re willing to pay.

What is the highest price you would pay?


To make that determination, the first step in this pricing scheme is the landlords’ massive data-gathering campaign on applicants, tenants, and properties, a truly invasive and Orwellian process. But you may ask: how does algorithmic price-fixing work?

Once this data has been gathered and stored in the data centers, the pricing algorithm amasses it and feeds it through predictive analytics, with the ultimate goal of predicting a consumer’s Willingness Factor. The algorithm doesn’t evaluate the product’s value; instead, it determines your breaking point. By examining your income, debt, spending history, and current location, these data centers provide the intelligence that rent-fixing software like RealPage’s uses to identify the highest price you’re willing to pay before walking away.
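As a rough illustration of how such a prediction might work, here is a minimal sketch in Python. The profile fields, weights, and formula are hypothetical inventions for this example; they are not drawn from RealPage or any actual revenue-management product.

```python
from dataclasses import dataclass


@dataclass
class TenantProfile:
    monthly_income: float     # from applications, payroll, and credit data
    monthly_debt: float       # recurring debt obligations
    avg_monthly_spend: float  # spending history
    location_premium: float   # demand multiplier for the tenant's area, e.g. 1.0-1.3


def estimate_willingness(p: TenantProfile) -> float:
    """Toy estimate of the highest rent a tenant might accept before walking away."""
    disposable = p.monthly_income - p.monthly_debt
    # Hypothetical rule: tenants who already spend a large share of their income
    # are treated as less price-sensitive, and hot locations push the ceiling up.
    spend_ratio = p.avg_monthly_spend / p.monthly_income
    return disposable * (0.35 + 0.10 * spend_ratio) * p.location_premium


applicant = TenantProfile(monthly_income=5200.0, monthly_debt=900.0,
                          avg_monthly_spend=3100.0, location_premium=1.15)
print(f"Estimated ceiling rent: ${estimate_willingness(applicant):,.2f}")
```

A real system would use far richer data and a trained predictive model, but the goal the article describes is the same: estimate the breaking point, not the product’s value.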

The pricing technology is cold and calculated: my research indicates that, to achieve maximum profits, landlords intentionally exclude the lowest earners—the bottom 15%. By utilizing data centers to remove lower-income individuals, the algorithm sets prices without the affordability constraints of the working class. It then aggressively increases prices for the highest-income earners, effectively forcing everyone else to pay a premium for gas, food, cars, and rent.
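A toy example makes the effect visible. The willingness-to-pay figures below are invented, and the brute-force price search is only a stand-in for whatever optimization a real system uses, but it shows how excluding the lowest earners shifts the “revenue-maximizing” price upward.

```python
def revenue_maximizing_price(willingness: list[float]) -> float:
    """Pick the candidate price that maximizes price x (number who can still pay)."""
    best_price, best_revenue = 0.0, 0.0
    for candidate in sorted(willingness):
        payers = sum(1 for w in willingness if w >= candidate)
        revenue = candidate * payers
        if revenue > best_revenue:
            best_price, best_revenue = candidate, revenue
    return best_price


# Hypothetical monthly willingness-to-pay for 20 prospective tenants.
market = [900, 950, 1000, 1050, 1100, 1150, 1200, 1250, 1300, 1350,
          1400, 1450, 1500, 1600, 1700, 1800, 1900, 2000, 2200, 2500]

# Drop the bottom 15% of the market, as the article argues the algorithm does.
cutoff = int(len(market) * 0.15)
trimmed = sorted(market)[cutoff:]

print("Price with everyone considered:", revenue_maximizing_price(market))
print("Price after dropping the bottom 15%:", revenue_maximizing_price(trimmed))
```

With these invented numbers, simply removing the three lowest earners from consideration raises the recommended rent, even though nothing about the apartment has changed.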

Massive hidden costs

This technology has effectively replaced the Invisible Hand of the free market with a Digital Predator. As Arizona Attorney General Kris Mayes recently demonstrated in her 2026 settlements against predatory landlords, this isn’t a conspiracy theory—it’s a business model.

All the data gathered during their extensive collection process is stored in data centers, and this information will determine:

  • How much you will pay for rent.
  • How much you will pay for your next home and your next car.
  • How much you will pay for everything else, from grocery shopping trips to filling up at the gas station.

These are the massive hidden costs that data center owners have yet to share with consumers.


Nelson’s forthcoming book, Priced Out By Design, is a powerful exposé of the rental housing industry and the predatory nature of AI-powered revenue management technology. It details how data centers sit at the heart of algorithmic price-fixing: the pricing technology and the source and storage of vast amounts of data are co-dependent on each other.

Without the storage capabilities, algorithmic price-fixing could not occur. This practice has doubled housing costs and caused the housing crisis. The book describes how the technology works and includes compelling stories from both housed and unhoused tenants, highlighting its profound impact on their lives.



5 thoughts on “Arizona Data Centers are Causing the Affordability Crisis”

  1. I think an overlooked issue in the development of AI hyperscale data centers is that the unit economics of the current big, centralized LLM models are, in the view of many, fundamentally flawed. The cost of development, training, infrastructure, and inputs is not met by current or even reasonably projected demand. Many believe these hyperscale centers will end up like the dark fiber of the early 2000s, the last tech infrastructure boom cycle, stranding massive investments in idled and underutilized assets. There is certainly room for caution and circumspection about local communities making major concessions to build these centers. The risks of such investments should fall squarely on the speculators, not the public.

    Personally, I am convinced that the unit economics and scaling of the current models will soon doom the LLM boom, and most companies will innovate and deploy using much more limited and specialized AI models running on LOCAL compute – i.e., on the actual compute located at the point of deployment (on your own computer, per Apple’s AI approach, or on computers located in the factory or workplace). The much lower cost of local compute alone dictates that most business and personal use of AI models will be structured around smaller, dedicated, specialized models, not the current crop of hyper-expensive centralized compute. Just as mainframes and minicomputers gave way to the unit economics of the local microcomputers we overwhelmingly use today, those investing in generalized, do-everything LLMs will ultimately be the losers in actual AI deployment.

    What many get wrong is the intuition that massive scale automatically equates to economies of scale and lower unit cost: that intuition is correct in the ‘real world’ but wrong in the ‘virtual world’ of compute deployment. Hyperscale compute has proven to be simply MORE expensive per operation than micro-scale deployment, historically and currently; the unit cost of compute FALLS as it scales down to our local micros. The problem is that data movement around a big network is NOT free: the larger the data volume moved and the longer the distance it travels, the greater the inefficiency and cost. Moving data the shortest possible distance for processing therefore LOWERS cost, and this simple fact strongly advantages distributed compute over central compute. Running AI at the CHIP level is vastly less expensive than moving all the data to hyperscale centers and back to you, and that fact alone DOOMS the deployment of centralized LLMs. Consider this phase of development as a sunk cost of R&D, and you get a more realistic view of how AI will actually be deployed. The big AI LLM players are laying the foundations of AI development (think of Bell Labs’ massive R&D investments last century that laid the foundations of communications tech), but the real market value will largely be captured by specialized players using that knowledge on local compute with vastly lower per-operation costs.
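    A back-of-the-envelope sketch of that cost argument, with purely hypothetical per-request numbers that simply encode the assumptions above (cheaper local compute, non-zero cost to move data):

    ```python
    def cost_per_request(compute_cost: float, bytes_moved: float,
                         cost_per_gb_moved: float) -> float:
        """Total cost of one AI request = compute cost + data-movement cost."""
        return compute_cost + (bytes_moved / 1e9) * cost_per_gb_moved

    # Hypothetical figures only: a centralized request must ship data both ways;
    # a local on-device request moves (almost) nothing over the network.
    hyperscale = cost_per_request(compute_cost=0.0020, bytes_moved=50e6,
                                  cost_per_gb_moved=0.08)
    local = cost_per_request(compute_cost=0.0015, bytes_moved=0.0,
                             cost_per_gb_moved=0.08)

    print(f"Hyperscale request: ${hyperscale:.4f}")
    print(f"Local request:      ${local:.4f}")
    ```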

  2. Correction needed…

    “Arizona Data Centers are “ADDING TO” the Affordability Crisis”

    There, isn’t that better?

    If something is “affordable” to you it’s because you make enough money to pay for it.

    But don’t take my word for it, ask anyone in any generation younger than us boomers.

