What Defines a Data Center Project (vs Other Real Estate or Energy Projects)?
Part I of Mastering the Data Center Development Process
This piece lays the foundation for how these projects are actually built, from capital deployment to land acquisition and the activation of each megawatt. If you do not understand the foundation, nothing else in this industry will make sense.
The Inherited Framework Is Wrong
The associate opened the cap rate model.
His first data center deal was in front of him.
He was looking for a comparable: long lease, creditworthy tenant, stable cash flows.
He found one.
The analysis was wrong before it began.
Most investors encountering a data center deal apply real estate or energy infrastructure logic.
Both produce systematic mispricing.
The defining variables and binding constraints differ.
The underwriting question that determines project viability differs from either framework.
The correct lens must be built from first principles.
This piece does that.
How the Misclassification Was Built
The classification error has structural origins.
Colocation operators adopted REIT structures in the early 2000s because no better vehicle existed. Institutional capital entered through property allocations because that was the closest available bucket.
Zoning codes filed facilities under industrial or commercial because no regulatory category existed for a building that functioned as a utility.
Each decision was reasonable. Together, they embedded real estate as the default analytical framework for an asset class that is not, in any operative sense, a real estate asset.
That inheritance shaped the vocabulary: cap rates, net operating income per square foot, comparable sales. It did not shape the underlying economics. The gap was manageable when rack densities were modest.
Then AI workloads arrived. Enterprise racks ran at 10–15 kW, but AI clusters now reach 50–100 kW at near-continuous utilization. Power demands evolved faster than analytical frameworks could adapt, turning a small gap into a structural constraint.
The Building Is Not the Asset
The capital structure is where the distinction becomes undeniable.
In office, industrial, or logistics projects, the building shell accounts for 70–80% of capex, while in data centers it is only 20–30%, with 60–70% in mechanical and electrical systems.
A 100MW data center costs $900M–$1.5B before servers, including $450M–$750M in electrical systems.
Traditional real estate is priced per square foot, while data centers are priced per kilowatt, typically $100–$150 per kW per month plus a 15–20% power pass-through margin. Leases run 10–15 years with take-or-pay terms, functioning as infrastructure off-take agreements.
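The pricing difference is easy to make concrete. A minimal sketch, using hypothetical midpoints of the ranges above (the figures are illustrative, not underwriting guidance):

```python
# Back-of-the-envelope economics for a hypothetical 100 MW facility.
# All inputs are illustrative midpoints of the ranges cited above.
it_load_kw = 100 * 1_000               # 100 MW of critical IT load

capex_total = 1_200_000_000            # midpoint of $900M-$1.5B
electrical_capex = capex_total * 0.50  # midpoint of $450M-$750M
shell_capex = capex_total * 0.25       # shell at 20-30% of capex

lease_rate = 125                       # $/kW/month, midpoint of $100-$150
annual_revenue = it_load_kw * lease_rate * 12

print(f"Electrical systems:   ${electrical_capex / 1e6:,.0f}M")  # ~$600M
print(f"Building shell:       ${shell_capex / 1e6:,.0f}M")       # ~$300M
print(f"Annual lease revenue: ${annual_revenue / 1e6:,.0f}M")    # ~$150M
```

The shell is a minority line item, and the revenue line is indexed to kilowatts, not square feet.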
The repurposability assumption in real estate does not apply. Office assets can be re-tenanted, but data centers are engineered for specific power and thermal loads that limit alternative use.
Interconnection positions and power agreements are not transferable with the building, making the asset structurally non-replicable through acquisition alone.
The asset is the interconnection position, the power agreements, and the operational delivery record. None of that is embodied in the structure itself.
Pricing the Container, Missing the Asset
The narrative that brought institutional capital into this sector was not wrong. Stable cash flows. Long leases. Creditworthy tenants. It described the investable thesis accurately. It described the asset incorrectly.
The investable thesis is underwritten against the power position. The cash flows are stable because the power delivery obligation is contractually enforced through service level agreements that carry financial penalties for non-performance.
The leases are long because the tenant is securing compute capacity at a facility whose interconnection position took years to establish.
You are not pricing a building. You are pricing the guarantee the building was built to deliver.
The Product Is Compute, Not Electrons
A power plant generates electrons, a transmission line moves them, and a data center consumes them.
A 100MW data center draws the load of 75,000–80,000 homes and converts it into compute, storage, and low-latency data transfer. The product is not the electron, but what it enables.
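The homes-equivalent figure is simple division. Assuming an average US household draw of roughly 1.25 kW (an assumption, not a figure from this piece):

```python
# Rough check on the homes-equivalent claim.
facility_load_kw = 100_000    # 100 MW drawn near-continuously
avg_home_load_kw = 1.25       # assumed average US household draw

print(f"{facility_load_kw / avg_home_load_kw:,.0f} homes")  # -> 80,000
```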
Location logic follows the product.
Energy assets prioritize fuel and transmission; data centers require grid headroom, fiber density, latency, and permitting speed. A site optimal for power generation may be unusable for compute.
Power is the binding constraint: grid interconnection queues in primary markets now run three to five years. Without secured power, a site is only an option. Hyperscalers are responding by funding grid upgrades directly to bypass the queue.
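A minimal sketch of why the queue, not the build, sets the delivery date. Both durations are hypothetical:

```python
# Energization is gated by whichever workstream finishes last.
construction_months = 24    # hypothetical shell + MEP fit-out
grid_queue_months = 48      # within the 3-5 year queues cited above

months_to_first_revenue = max(construction_months, grid_queue_months)
gate = "grid queue" if grid_queue_months > construction_months else "construction"
print(f"First revenue in {months_to_first_revenue} months, gated by the {gate}")
```

Compressing the construction schedule does nothing until the interconnection milestone moves.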
The Binding Constraint by Investor Type
Independent operators face a specification problem the AI transition has made acute.
The shift to AI workloads has rendered obsolete facilities designed for legacy air-cooled rack densities of 10 to 15 kilowatts.
Retrofitting to liquid cooling (direct-to-chip, rear-door heat exchangers, immersion) requires structural engineering changes and capital the original underwriting did not contemplate.
The constraint is not occupancy. It is technical fitness for the workload that commands premium rates. A facility can be fully occupied at the wrong density and still be the wrong asset.
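A rack-count sketch shows the scale of the mismatch. The hall size and both design points are hypothetical:

```python
# The same 10 MW hall supports very different rack populations
# depending on the design density. Figures are illustrative.
hall_kw = 10_000

legacy_rack_kw = 12    # air-cooled enterprise design point
ai_rack_kw = 80        # liquid-cooled AI cluster design point

print(hall_kw // legacy_rack_kw)  # ~833 racks the hall was built for
print(hall_kw // ai_rack_kw)      # ~125 racks the workload now demands
# A hall full of 12 kW racks cannot host 80 kW racks without new
# structural, electrical, and liquid-cooling work.
```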
Private equity and infrastructure investors face an underwriting reframe at entry.
In traditional real estate, the first question is location and comparable occupancy. In data center development, the question that determines whether projected returns are achievable is whether the megawatts deliver on schedule.
That depends on grid interconnection timing, not tenant creditworthiness. Development-stage IRRs of 25 to 40 percent in Tier 1 markets reflect genuine execution risk concentrated in the power delivery sequence.
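Where those IRRs come from is easiest to see in a stylized cash flow. A sketch with hypothetical numbers (two years of development spend, lease-up, then exit), with the IRR solved by bisection so nothing beyond the standard library is needed:

```python
# Hypothetical development-stage cash flows in $M: capex in years 0-1,
# lease-up revenue in years 2-3, sale of the stabilized asset in year 4.
cash_flows = [-400, -500, 60, 140, 1_800]

def irr(cfs, lo=-0.99, hi=10.0, tol=1e-6):
    """Solve NPV(rate) = 0 by bisection; NPV falls as the rate rises."""
    npv = lambda r: sum(cf / (1 + r) ** t for t, cf in enumerate(cfs))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if npv(mid) > 0 else (lo, mid)
    return (lo + hi) / 2

print(f"Development IRR: {irr(cash_flows):.1%}")  # ~27% on these inputs
```

The 25 to 40 percent range reflects how sensitive that exit value is to the power delivery sequence landing on schedule.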
Public equity investors face a structural signal in recent transaction data. Blackstone’s $16 billion acquisition of AirTrunk in 2024 confirmed that institutional capital has already priced the Tier 1 scarcity premium.
The forward signal is not where capital is already concentrated, but where infrastructure is being built ahead of demand in secondary markets where grid capacity, fiber, and regulation are being prepared before hyperscalers arrive.
Price the Megawatts, Not the Building
Global data center capacity demand is growing at 22%+ CAGR through 2030.
Supply is constrained by interconnection delays, cooling transitions, and permitting backlogs that keep delivered capacity below announced capital commitments.
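Compounding at that rate roughly triples demand over six years. The 2024 base is an assumed round number, purely for illustration:

```python
# 22% CAGR compounded over six years (2024 -> 2030).
base_gw = 60    # assumed 2024 demand, illustrative only
print(f"{base_gw * 1.22 ** 6:.0f} GW by 2030")  # -> ~198 GW, ~3.3x
```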
Three concepts govern correct underwriting at every stage.
The building is the container.
The asset is the contracted megawatts, the interconnection position, the fiber diversity, and the uptime delivery guarantee.
The building matters to the extent it houses those inputs reliably.
The binding constraint is power position.
A site without secured grid interconnection is an option. The development timeline that matters is the interconnection queue, not the construction schedule.
The product is compute capacity. The tenant is contracting against a service delivery obligation. The framework that prices the service delivery obligation correctly produces correct returns.
Return to those three concepts whenever the inherited frameworks reassert themselves. They will.