AI Power Crunch Solution: Unlocking 50% More Compute from Existing Grids
- 50% more AI compute capacity from existing data center infrastructure
- 20 to 50 times more energy required by modern AI workloads compared to traditional IT equipment
- Under 20 milliseconds latency in Karman's power orchestration system
Industry executives argue that AI-driven power orchestration, such as Utilidata's Karman platform, offers a critical answer to the power bottleneck in AI infrastructure, putting stranded capacity to immediate use while supporting sustainable growth.
ANN ARBOR, MI – March 12, 2026 – As the artificial intelligence boom strains global power grids, a new collaboration aims to solve the industry’s most critical bottleneck. Embedded AI leader Utilidata and European AI cloud provider NexGen Cloud have announced the deployment of a novel power orchestration platform, Karman, which promises to unlock up to 50% more AI compute capacity from existing data center infrastructure.
This partnership represents a pivotal shift in how the industry addresses the insatiable energy appetite of AI. Instead of waiting years for new power plants and grid upgrades, this solution looks inward, using AI to reclaim vast amounts of unused energy already flowing into data centers. The initiative will begin at a showcase facility in Montreal before a wider rollout, potentially setting a new standard for scaling AI infrastructure efficiently and sustainably.
The Billion-Dollar Power Bottleneck
The rapid proliferation of generative AI has created an unprecedented demand for processing power, and by extension, electrical power. Modern AI workloads, running on clusters of high-performance GPUs, can require 20 to 50 times more energy per rack than traditional IT equipment. This has pushed rack power densities from a few kilowatts to over 50 kW, with future systems projected to exceed 100 kW. This surge has made power availability, not just computing hardware, the primary constraint on AI growth.
Data center operators face a daunting challenge: the demand for AI capacity is immediate, but the timeline for securing more power from the grid can span years. This leads to a significant amount of “stranded capacity”—power that has been provisioned and paid for but cannot be safely or efficiently utilized due to conservative design margins and the unpredictable, spiky nature of AI workloads. Industry estimates suggest billions of dollars in energy capacity sit idle within data centers worldwide.
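The arithmetic behind stranded capacity is straightforward. A minimal sketch, with hypothetical numbers and a hypothetical `stranded_capacity_kw` helper (not any vendor's actual model), illustrates how a static safety margin plus conservative worst-case provisioning leaves paid-for power unused:

```python
# Hypothetical illustration of stranded capacity: operators provision
# for worst-case rack draw, then apply a static safety margin, while
# real AI workloads rarely sustain that peak across all racks at once.

def stranded_capacity_kw(provisioned_kw: float,
                         observed_peak_kw: float,
                         safety_margin: float = 0.10) -> float:
    """Power that is provisioned and paid for, but never drawn.

    `safety_margin` models the conservative design buffer; all
    parameters and values here are illustrative assumptions.
    """
    usable = provisioned_kw * (1.0 - safety_margin)
    return max(0.0, usable - observed_peak_kw)

# Example: a 1 MW hall sized for worst-case draw, where telemetry
# shows only a 600 kW sustained peak.
headroom = stranded_capacity_kw(provisioned_kw=1000.0, observed_peak_kw=600.0)
print(f"{headroom:.0f} kW stranded")  # → 300 kW stranded
```

In this toy example, nearly a third of the hall's provisioned power sits idle; the article's claim is that real-time orchestration can safely reclaim much of that margin.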
“Power is the single biggest constraint for AI growth today,” said Josh Brumberger, CEO of Utilidata, in a statement. “Billions of dollars of existing capacity are underutilized in data centers worldwide. Karman unlocks that trapped capacity immediately while ensuring new data centers are optimized from day one.”
Orchestrating Electrons with AI
Utilidata’s Karman platform tackles this problem by embedding an AI-driven intelligence layer directly into the data center’s power distribution infrastructure. Running on a custom module co-developed with NVIDIA, Karman acts as a high-speed traffic controller for electricity. It samples power data over a million times per second, allowing it to see and react to fluctuations with a latency of under 20 milliseconds—as fast as the servers themselves.
This real-time visibility and control enables Karman to dynamically orchestrate power delivery to each server rack. Instead of relying on static, over-provisioned power limits, the system can safely push the infrastructure closer to its true operational limits. It intelligently balances performance, efficiency, and reliability, allowing operators to deploy dramatically more GPUs within their existing power envelope.
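The orchestration idea can be sketched in a few lines. This is an illustrative toy only, with an invented `orchestrate` function and parameters; Karman's actual control logic is proprietary and far more sophisticated. The sketch shows the general principle: rather than pinning every rack to a static cap, redistribute a shared power budget toward racks that are actually drawing power, without ever exceeding the bus limit:

```python
# Illustrative sketch of dynamic power orchestration (not Karman's
# actual algorithm): allocate a shared bus budget in proportion to
# each rack's instantaneous demand, guaranteeing a per-rack floor.

def orchestrate(bus_limit_kw: float,
                demands_kw: list[float],
                floor_kw: float = 5.0) -> list[float]:
    """Return per-rack power caps that sum to at most `bus_limit_kw`.

    Each rack keeps at least `floor_kw`; leftover budget is split in
    proportion to demand above the floor. All names and values here
    are hypothetical.
    """
    n = len(demands_kw)
    budget = bus_limit_kw - n * floor_kw          # budget above the floors
    excess = [max(0.0, d - floor_kw) for d in demands_kw]
    total = sum(excess)
    if total <= budget or total == 0.0:
        # Enough headroom: every rack gets exactly what it asks for.
        return [floor_kw + e for e in excess]
    scale = budget / total                        # shrink demands to fit
    return [floor_kw + e * scale for e in excess]

# Example: a 100 kW bus shared by four racks with spiky demands.
caps = orchestrate(100.0, [50.0, 45.0, 20.0, 5.0])
print([round(c, 1) for c in caps])
```

A static scheme would cap every rack at 25 kW here; the proportional split lets busy racks draw more while the idle rack contributes its headroom back, which is the essence of reclaiming stranded capacity in real time.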
This approach complements the server-level optimizations already performed by GPUs. While GPUs optimize compute tasks, Karman optimizes the electrical delivery that fuels them, creating a parallel intelligence layer that ensures the entire system, from the grid connection to the processor, is working in concert. This is a significant departure from traditional Data Center Infrastructure Management (DCIM) software, which typically focuses on higher-level monitoring and analytics rather than active, millisecond-scale control of the power flow itself.
A Strategic Edge in the AI Cloud Wars
For NexGen Cloud, a major European player that has invested $1 billion to build an AI Supercloud, this technological edge is a significant commercial advantage. By deploying Karman, the company can deliver more AI capacity to its customers at a lower price point and with enhanced power-related services through its Hyperstack platform.
“For infrastructure providers like NexGen Cloud servicing such projects, the next critical factor is energy management—specifically, maximizing the capacity already on-site,” noted Chris Starkey, CEO of NexGen Cloud. “Karman gives us a significant commercial advantage by unlocking that trapped capacity.”
The initial deployment in Montreal serves as a proof of concept for NexGen Cloud’s ambitious expansion plans, which include new “AI Factory” deployments across North America and the Nordics, with capacity targets reaching hundreds of megawatts. By extracting more performance from its current and future sites, the company can accelerate its growth trajectory without being entirely dependent on the slow pace of traditional utility upgrades.
This ability to scale faster and more cost-effectively could be a disruptive force in the competitive AI cloud market. As enterprises and startups alike seek powerful and affordable AI infrastructure, providers who can offer more compute for less cost will be positioned to capture significant market share.
The Path to Sustainable AI Growth
Beyond the commercial benefits, this partnership highlights a crucial pathway toward more sustainable AI development. NexGen Cloud already operates its European and Canadian data centers on 100% renewable energy. The Karman platform amplifies the impact of this commitment by ensuring that every green-sourced watt is used to its maximum potential.
By increasing the computational output per watt, this power optimization technology directly reduces the overall energy footprint required for a given AI task. This concept, often called “Green AI,” is becoming increasingly important as the industry grapples with its environmental impact. The International Energy Agency has warned that electricity demand from data centers could more than double by 2030, largely driven by AI.
Furthermore, by maximizing the use of existing facilities, the technology helps delay or even eliminate the need for new data center construction and grid expansion, avoiding the significant embedded carbon and resource consumption associated with building new infrastructure. This strategy of densification—packing more compute into the same physical and electrical footprint—is a cornerstone of sustainable scaling. It demonstrates that technological innovation can provide solutions not only for performance and cost but also for the pressing environmental challenges posed by the AI revolution.