Episodes

  • Why Utilities Are the Quiet Kingmakers of AI
    Jan 9 2026

    The AI buildout depends on decisions made by regulated utilities — entities most tech investors have never studied. Utilities are the quiet kingmakers of this cycle.

    Utilities operate under regulatory frameworks that prioritize reliability and rate stability. Adding 200 megawatts of load for a single customer requires new substations, transmission upgrades, and generation procurement — all approved by state regulators. A rough sense of that scale is sketched after this entry.

    Utility strategy is becoming a differentiator in AI infrastructure deployment. Companies that understand how to work within regulated frameworks have structural advantages that pure capital cannot replicate.

    3 min
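
    A minimal sketch of the scale behind the 200-megawatt figure above. The 200 MW load comes from the episode; the average US household consumption of roughly 10,500 kWh per year (about 1.2 kW of continuous demand) is an assumption added for comparison.

        # Back-of-envelope: a 200 MW single-customer load in household terms.
        DATA_CENTER_MW = 200                 # from the episode
        HOUSEHOLD_KWH_PER_YEAR = 10_500      # assumption: average US household consumption
        HOURS_PER_YEAR = 8_760

        household_avg_kw = HOUSEHOLD_KWH_PER_YEAR / HOURS_PER_YEAR        # ~1.2 kW continuous
        equivalent_households = DATA_CENTER_MW * 1_000 / household_avg_kw

        print(f"Average household demand: {household_avg_kw:.2f} kW")
        print(f"200 MW is roughly the continuous demand of {equivalent_households:,.0f} households")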
  • How Power Pricing Shapes Data Center Geography
    Jan 8 2026

    Data centers don't just need power — they need cheap power. And the geography of electricity pricing is reshaping where AI infrastructure gets built.

    For a 100-megawatt facility running 24/7, each cent per kilowatt-hour represents roughly $8.7 million in annual operating cost. A 5-cent differential means $43 million per year — nearly a billion dollars over a 20-year horizon. The arithmetic is sketched after this entry.

    Power pricing is creating durable geographic advantages that no amount of tax credits can offset. Capital follows the electrons.

    3 min
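
    A quick check of the cost figures above, using only the numbers stated in the episode (100 MW, 24/7 operation, a 5 cent/kWh differential, a 20-year horizon).

        # Annual energy cost sensitivity for a 100 MW facility running 24/7.
        FACILITY_MW = 100
        HOURS_PER_YEAR = 8_760

        annual_kwh = FACILITY_MW * 1_000 * HOURS_PER_YEAR     # 876,000,000 kWh

        per_cent_per_year = annual_kwh * 0.01                 # cost of each 1 cent/kWh
        five_cent_gap_per_year = annual_kwh * 0.05            # 5 cent/kWh differential
        twenty_year_gap = five_cent_gap_per_year * 20

        print(f"Each 1 cent/kWh:  ~${per_cent_per_year / 1e6:.2f}M per year")      # ~$8.7M, as in the episode
        print(f"5 cent/kWh gap:   ~${five_cent_gap_per_year / 1e6:.1f}M per year") # ~$43M, as in the episode
        print(f"Over 20 years:    ~${twenty_year_gap / 1e9:.2f}B")                 # nearly a billion dollars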
  • Why Grid Interconnection Is the Real Bottleneck
    Jan 7 2026

    You can buy land. You can order GPUs. You can raise capital. But you cannot buy a faster grid connection. Interconnection is the binding constraint on AI infrastructure.

    In PJM — the grid operator for the Mid-Atlantic — the interconnection queue now exceeds 2,500 projects totaling over 250 gigawatts. The average wait time has stretched beyond 4 years. Developers who secured interconnection agreements 3-5 years ago now hold strategic assets. A short sketch after this entry puts those queue numbers in context.

    Capital is repricing assets based on interconnection status, not just location or capacity. The race for AI compute is increasingly a race for queue position.

    3 min
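
    The queue figures above, worked through. The project count and queued gigawatts come from the episode; PJM's peak load of roughly 150 GW is an assumption added for comparison.

        # Put the PJM interconnection-queue figures in context.
        QUEUED_PROJECTS = 2_500    # from the episode
        QUEUED_GW = 250            # from the episode
        PJM_PEAK_GW = 150          # assumption: rough PJM summer peak load

        avg_project_mw = QUEUED_GW * 1_000 / QUEUED_PROJECTS
        queue_vs_peak = QUEUED_GW / PJM_PEAK_GW

        print(f"Average queued project: ~{avg_project_mw:.0f} MW")
        print(f"Queued capacity: ~{queue_vs_peak:.1f}x the assumed PJM peak load")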
  • What Hyperscalers Care About That Startups Don't
    Jan 6 2026

    Hyperscalers and AI startups operate in the same industry but play entirely different games. The difference is infrastructure.

    Hyperscalers sign 10-15 year power purchase agreements, coordinate grid interconnection at utility scale, deploy liquid cooling systems, and plan 5-10 years ahead. Microsoft has nuclear agreements. Google invests in geothermal. Amazon backs utility-scale renewables.

    Startups rent the result. Hyperscalers own the moat. In AI infrastructure, scale isn't an advantage — it's the only advantage.

    2 min
  • Why Land Is Back in the AI Equation
    Jan 5 2026

    For a decade, software ate the world. Now AI is making land matter again.

    Meta's 2-gigawatt facility requires hundreds of acres. Microsoft's expansion spans millions of square feet globally. These aren't software deployments — they're industrial developments with 30-year time horizons requiring transmission adjacency, water rights, and favorable zoning.

    Developers with optioned land near transmission corridors are commanding premium valuations. Land isn't just back — it's becoming a rate-limiting input to the AI buildout.

    3 min
  • The Real Reason Data Centers Are Moving Locations
    Jan 2 2026

    Data centers are relocating. Not for tax incentives. Not for talent. For power.

    Virginia's data center corridor consumed 26% of state electricity in 2023. Dominion Energy is struggling to keep pace. Grid stress events and connection delays stretching to years are pushing capital to Texas, Ohio, Indiana, Southeast Asia, and the Nordics.

    Capital is following power geography, not network geography. The next decade of AI infrastructure will be built where grid capacity exists — not where the internet already concentrates.

    3 min
  • Why Compute Is Constrained by Electricity, Not Chips
    Jan 1 2026

    The GPU shortage narrative is fading. The real constraint on AI compute is electricity — and it's not even close.

    Chip fabrication is scaling — TSMC, Intel, and Samsung are investing billions. But grid capacity doesn't scale like silicon. A single NVIDIA H100 draws 700 watts. AI training clusters now exceed 100 megawatts. Goldman Sachs projects a 165% increase in global data center power demand by 2030. A rough GPU count for that power envelope is sketched after this entry.

    The problem isn't that chips don't exist. The problem is you can buy the chips, but you can't plug them in.

    3 min
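
    A rough sense of what a 100-megawatt envelope buys in GPU terms. The 700 W per H100 figure comes from the episode; the PUE of 1.25 and the per-GPU server overhead factor of 1.5 (host CPUs, networking, storage) are assumptions.

        # How many H100-class GPUs fit inside a 100 MW power envelope?
        CLUSTER_MW = 100
        GPU_WATTS = 700            # H100 draw, from the episode
        SERVER_OVERHEAD = 1.5      # assumption: non-GPU server power per GPU
        PUE = 1.25                 # assumption: cooling and facility overhead

        watts_per_gpu_all_in = GPU_WATTS * SERVER_OVERHEAD * PUE        # ~1,300 W
        gpu_count = CLUSTER_MW * 1_000_000 / watts_per_gpu_all_in

        print(f"All-in power per GPU: ~{watts_per_gpu_all_in:,.0f} W")
        print(f"A {CLUSTER_MW} MW cluster supports roughly {gpu_count:,.0f} GPUs")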