AI deployment is accelerating capital expenditure at a scale few technologies have ever reached. Yet regardless of how quickly AI adoption unfolds, one constraint remains constant: power. According to Josh Dienstag, CIO of Carbon Direct Capital, the gap between AI ambitions and grid capacity is creating immediate, durable demand for technologies that either reduce power intensity or expand clean, firm electricity supply. For energy transition investors attending SuperReturn Energy Transition, this is a defining investment theme.
AI data centres, capital expenditure, and energy system constraints
Investors are grappling with both the magnitude of capex on AI data centres and the resulting strain on the energy system.
Hyperscalers are set to invest $670 billion in AI data centres in 2026 alone [1], and potentially $1.5 trillion by 2030 [2]. The implications for the energy system are profound. U.S. electricity generation has grown at only roughly 1% per year over the past decade. Data centres currently account for around 4% of U.S. electricity demand; we project that share will grow to 10% or more by 2030 [3].
In absolute terms, U.S. data centre capacity is projected to grow from 25 GW to 67 GW, or over 40 GW of new demand on a grid that has barely expanded in the last decade. The global story is similar, with data centre capacity expected to increase from 52 GW to 120 GW by 2030 [4]. Further, over 80% of projected AI power demand will serve inference [5], the continuous operation of deployed AI models, rather than training, which is largely a one-time expenditure. This means that baseload power needs to scale durably to meet the demands of AI.
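The scale of this demand can be sanity-checked with simple arithmetic. The sketch below is illustrative only; the 90% utilization factor and the ~4,200 TWh figure for current annual U.S. generation are our own assumptions, not figures from the projections cited above:

```python
# Back-of-the-envelope check on 40 GW of new data centre demand.
# Assumptions (ours, for illustration): near-continuous operation at a
# 90% utilization factor, and ~4,200 TWh of current annual U.S. generation.
HOURS_PER_YEAR = 8760
new_load_gw = 40        # projected new U.S. data centre demand
utilization = 0.90      # assumed average utilization of that load
us_generation_twh = 4200  # assumed current annual U.S. generation

# Convert continuous power (GW) into annual energy (TWh).
new_demand_twh = new_load_gw * utilization * HOURS_PER_YEAR / 1000
share_of_generation = new_demand_twh / us_generation_twh

print(f"New annual demand: ~{new_demand_twh:.0f} TWh")
print(f"Share of current U.S. generation: ~{share_of_generation:.1%}")
```

Under these assumptions, 40 GW of new load implies roughly 315 TWh of additional annual demand, around 7-8% of today's U.S. generation, arriving on a grid that has grown only about 1% per year.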
"U.S. data center capacity is projected to grow from 25 GW to 67 GW, or over 40 GW of new demand on a grid that has barely expanded in the last decade.”
- Josh Dienstag
How should investors approach AI power constraints in the private markets?
For private market investors, the question is not whether AI exceeds or falls short of consensus expectations, but how to invest through the power constraint.
Carbon Direct Capital approaches this challenge through growth-stage equity investing in technologies that meet both commercial and emissions goals for hyperscalers. A defining feature of its investment process is deep scientific and technical diligence, supported by its affiliated advisory firm, Carbon Direct Inc., which employs over 70 scientists.
"Whether AI deployment meets, exceeds, or falls short of consensus estimates, one thing is certain: the power constraint is real and immediate.”
- Josh Dienstag
Investing in AI chip efficiency technologies
One investment strategy is to invest in chip efficiency technologies that dramatically reduce the energy intensity of AI computations.
An example is our recent investment in Neurophos' oversubscribed $110 million Series A. Neurophos is a Texas-based designer of photonic chips for AI inference. A photonic chip can perform the same AI computations as an incumbent chip with roughly 1/90th of the electricity.
Neurophos' Optical Processing Units (OPUs) are drop-in replacements for any data centre, whether terrestrial or orbital. Scientists at Carbon Direct Inc. have tracked chip architectures for a decade and provided input during our technical evaluation of Neurophos. Microsoft's corporate venture arm, M12, participated in Neurophos' fundraising, reflecting broader interest in next-generation computing architectures.
Investing in clean firm power technologies for AI infrastructure
A second investment strategy is to invest in scalable, clean firm power technologies, which deliver low-carbon electricity around the clock and can be brought online quickly. Examples include geothermal, advanced power generation, and hybrid approaches that pair conventional generation with carbon capture.
These solutions are increasingly attractive to hyperscalers because they align with the commercial imperatives of speed, reliability, and cost. Recently, we co-led Sage Geosystems’ $97 million Series B with Ormat, the geothermal incumbent.
Sage's proprietary 'pressure geothermal' technology harnesses both heat and subsurface pressure from hot, dry rock formations found worldwide, unlocking an estimated 130x [6] more geothermal resource potential in the U.S. than conventional approaches. Sage has already secured a 150 MW power purchase agreement with Meta.
We gained conviction in Sage’s approach following a detailed technology comparison of geothermal pathways with assistance from Carbon Direct Inc.
Technical fluency as an investment advantage in AI infrastructure
Technical fluency is a defining edge when investing in the AI pick-and-shovel ecosystem.
Companies competing to serve AI data centres operate at the intersection of energy, materials science, and advanced engineering: analyzing them as an investor requires deep scientific expertise. Carbon Direct Inc.'s technical advisory services provide CD Capital with this depth, allowing us to identify companies well positioned to scale.
Why AI power constraints create durable investment opportunities
Whether AI deployment meets, exceeds, or falls short of consensus estimates, one thing is certain: the power constraint is real and immediate. U.S. grid infrastructure cannot support 40+ GW of new data centre demand without fundamental innovation in both efficiency and generation.
In a scenario where AI adoption exceeds expectations, clean firm power providers like Sage will be essential to meeting unprecedented demand. In a scenario where AI growth disappoints, chip efficiency technologies like Neurophos become even more critical, allowing operators to maximize output from constrained power budgets and underutilized infrastructure.
The investment opportunity lies not in predicting AI's future, but in recognizing that the gap between today's grid capacity and tomorrow's AI ambitions creates immediate, tangible demand for technologies that either reduce power intensity or expand clean firm power supply. Whether the AI revolution unfolds rapidly or gradually, we believe these companies have tailwinds.
"The investment opportunity lies not in predicting AI's future, but in recognizing
that the gap between today's grid capacity and tomorrow's AI ambitions
creates immediate, tangible demand.”
- Josh Dienstag
Why this matters for LPs and GPs
- The AI investment debate is secondary to the power constraint. Regardless of whether AI deployment exceeds or undershoots consensus expectations, energy infrastructure limitations are already shaping outcomes. This creates durable demand for efficiency and generation solutions across scenarios.
- Energy intensity and clean firm power are converging investment themes. LPs and GPs evaluating AI-linked exposure increasingly need to assess both sides of the equation: technologies that reduce electricity consumption and those that expand reliable, low carbon supply.
- Technical diligence is becoming a competitive advantage. As AI data centre infrastructure sits at the intersection of energy, materials science and advanced engineering, managers with deep technical underwriting capabilities are better positioned to manage risk and capture value.
- Power availability is becoming a gating factor for growth. For hyperscalers and their capital partners, constrained grids are no longer a distant risk but an immediate operational reality, reshaping capital allocation decisions today.
References
[1] Source: Wall Street Journal, 2-7-26.
[2] Source: CD Capital estimates. Note: 2030E data centre capacity.
[3] Source: CD Capital estimates.
[4] Source: CD Capital estimates.
[5] Source: Tensormesh.ai.
[6] Source: Sage Geosystems management.

