

DG Matrix, a North Carolina-based innovator in solid-state power electronics, has raised $60 million to address one of the most pressing bottlenecks in the AI revolution: data center energy management. In an era where generative AI workloads demand unprecedented power density and reliability, traditional electrical infrastructure falls short, leading to skyrocketing costs, space constraints, and grid strain. DG Matrix’s Power Router platform—featuring multi-port solid-state transformers (SSTs)—promises to consolidate legacy systems into compact, ultra-efficient units, enabling hyperscalers and AI factories to scale without compromise.
Data centers now consume more electricity than small countries, with AI training runs alone rivaling nuclear plant outputs. Conventional setups chain uninterruptible power supplies (UPS), power distribution units (PDUs), switchgear, and transformers, achieving only 82-90% end-to-end efficiency while occupying vast footprints. As rack densities climb past 100 kW, in line with NVIDIA's AI Factory guidelines, these legacy chains waste energy as heat, demand excessive cooling, and delay deployments amid transformer shortages and utility backlogs.
DG Matrix reimagines this with Interport, a single 4×4-foot device handling 2.4 megawatts. It routes power from diverse sources—grid, solar, batteries—directly to 12 high-density racks, eliminating UPS layers and slashing space by 75%. At 95-98% efficiency, it cuts losses by two-thirds at peak loads, freeing floorspace for revenue-generating servers. Liquid-cooled sealed units ensure reliability under pulse workloads, while microsecond dynamic response stabilizes AI’s erratic demands.
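The loss math behind those efficiency figures is easy to check. A quick sketch, using the article's quoted ranges (82-90% for legacy chains, 95-98% for the SST platform) with illustrative midpoints of 88% and 96% as assumptions:

```python
# Conversion losses at Interport's rated 2.4 MW IT load.
# Efficiency values are assumed midpoints of the ranges cited above,
# not vendor-published figures.

IT_LOAD_MW = 2.4  # Interport's rated capacity

def losses_mw(efficiency: float, it_load_mw: float = IT_LOAD_MW) -> float:
    """Power drawn from the source minus power delivered to the racks."""
    return it_load_mw / efficiency - it_load_mw

legacy_loss = losses_mw(0.88)  # mid-range legacy UPS/PDU/transformer chain
sst_loss = losses_mw(0.96)     # mid-range solid-state transformer path

print(f"Legacy chain loss: {legacy_loss * 1000:.0f} kW")
print(f"SST loss:          {sst_loss * 1000:.0f} kW")
print(f"Reduction:         {1 - sst_loss / legacy_loss:.0%}")
```

At these midpoints the legacy chain dissipates roughly 330 kW as heat against about 100 kW for the SST path, a reduction of around two-thirds, consistent with the claim above; every kilowatt not lost is also a kilowatt the cooling plant never has to remove.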
The Power Router family—Interport, Power SideCar, Power Bridge—leverages SSTs to merge AC/DC ecosystems. Unlike mechanical transformers, solid-state designs use high-frequency switching for precise control, supporting 400-1500 VDC standards from OpenCompute’s Mount Diablo architecture. One unit replaces 10-20 devices, integrating renewables behind-the-meter to bypass grid upgrades.
Key specs include 2-5X power density, enabling more GPUs per rack; future-proof voltage agility for evolving standards; and multi-source orchestration, such as blending 600 kW solar with batteries for 24/7 uptime. Recent deals, like powering Exowatt’s solar-plus-storage containers, validate real-world deployment at gigawatt scales.
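Multi-source orchestration of the kind described above can be pictured as a priority dispatch: serve the load from solar first, then batteries, then grid. A minimal toy sketch; the greedy priority order and the function name are illustrative assumptions, not DG Matrix's actual control logic:

```python
# Toy priority dispatch for a multi-port power router: fill the load
# from solar, then battery, then grid. All values in kW.

def dispatch(load_kw: float, solar_kw: float, battery_kw: float):
    """Return (solar_used, battery_used, grid_used), greedily preferring
    on-site sources before drawing from the grid."""
    solar_used = min(load_kw, solar_kw)
    remaining = load_kw - solar_used
    battery_used = min(remaining, battery_kw)
    grid_used = remaining - battery_used
    return solar_used, battery_used, grid_used

# Midday: the 600 kW solar figure cited above covers most of a 750 kW load
print(dispatch(750, solar_kw=600, battery_kw=400))  # (600, 150, 0)

# Night: no solar, so the battery bridges and the grid supplies the rest
print(dispatch(750, solar_kw=0, battery_kw=400))    # (0, 400, 350)
```

A real controller would also re-run this decision continuously (the microsecond dynamic response mentioned earlier) and manage battery state of charge, but the source-blending idea is the same.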
The $60 million round draws marquee investors including ABB, which took a minority stake to co-develop SSTs for AI data centers, microgrids, and EV charging. ABB’s Smart Power division sees DG Matrix as pivotal for “intelligent electricity,” aligning with electrification megatrends. Co-founder and CEO Haroon Inam highlighted the platform’s TCO reduction—half the footprint, lower OpEx—while CTO Subhashish Bhattacharya emphasized seamless energy filtering for GenAI’s volatility.
Proceeds will accelerate commercialization, targeting hyperscalers, AI factories, and campus-scale builds racing toward multi-hundred-megawatt capacities. Partnerships mitigate risks such as utility delays and stranded assets, shortening commissioning timelines to months.
For cloud giants, Interport unlocks white-space efficiency: consolidate power layers, activate off-grid assets, and dynamically allocate amid rack evolution. Industrial electrification benefits from resilient microgrids, while edge AI deployments gain compact reliability. In MarTech contexts, efficient power underpins analytics platforms processing petabytes—think agentic AI like Amplitude’s agents thriving on stable, dense compute without energy waste.
Hyperscalers deploying gigawatt factories (e.g., xAI, Oracle) prioritize such solutions to meet 2026 timelines amid transformer lead times that now stretch to multiple years. DG Matrix's modularity scales predictably, supporting NVIDIA 800 VDC blueprints.
This raise arrives amid surging AI power needs, with data center demand projected to grow 160% by 2030, colliding with renewables mandates and grid constraints. In the US, the second Trump administration is accelerating permitting for AI infrastructure; India's gigawatt ambitions, exemplified by the L&T-NVIDIA partnership, mirror this via sovereign compute. DG Matrix sits at this intersection: ABB's global reach accelerates adoption in Europe's green data centers and Asia's hyperscale boom.
The chief challenge, higher upfront capex, is offset by lifecycle savings: the company claims 50% TCO cuts via efficiency and space gains. Maturing solid-state technology mitigates early reliability concerns, as demonstrated in the Exowatt pilots.
DG Matrix differentiates from UPS incumbents (Eaton, Vertiv) via SST integration, outpacing pure-play power electronics. Roadmaps target 98.5% efficiency, AI-specific pulse management, and OCP compliance. Expansions into industrial EVs and renewables position it beyond data centers.
For content strategists tracking enterprise SaaS, this underscores infrastructure's return as a moat: power optimization enables denser AI stacks, fueling MarTech's agentic era. As new silicon, including India's fabless designs, pushes inference to the edge, DG Matrix ensures the juice flows smarter.
DG Matrix's raise signals power as AI's unsung hero. Investors betting on compute (the NVIDIA ecosystem) are now hedging energy: $60 million validates SSTs as the next power shift, much as flash storage displaced spinning disks. Enterprises gain agility: deploy AI factories without grid waitlists, monetize stranded renewables.
In a world where electricity defines progress, DG Matrix doesn’t just fundraise—it rewires the future. By making megawatts manageable, it propels the AI economy from constraint to abundance, one efficient electron at a time.