Quick Update | AI Training Is Straining the Power Grid: Niv-AI Secures $12 Million Seed Round to Tackle the GPU Power-Surge Challenge


Image source: Niv-AI

Electricity is the key raw material for artificial intelligence, but the power demands of new processors have outpaced data-center operators' ability to manage their draw from the grid, forcing some to cut power usage by up to 30%.

"These AI factories are wasting too much power," NVIDIA CEO Jensen Huang said during his keynote at the company's annual GTC customer conference. "Every watt of unused power represents lost revenue."

Today, Tel Aviv-based startup Niv-AI has emerged from stealth mode after raising $12 million in seed funding to tackle this challenge by precisely measuring GPU power consumption with new sensors and building tools to manage energy use more efficiently.

The company was founded last year by CEO Tomer Timor and CTO Edward Kizis. Backers include Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners.

As cutting-edge labs coordinate thousands of GPUs to train and run advanced models, rapid, millisecond-level power-demand spikes occur as processors switch between computing tasks and communicating with other GPUs.

These spikes make it difficult for data centers to manage the power drawn from the grid. To avoid shortages, centers either pay for temporary energy storage to cover the surges or restrict GPU usage. Either solution reduces the return on investment for expensive chips.
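The aggregate effect of those synchronized phases can be sketched with a few lines of arithmetic. This is purely illustrative: the GPU counts and per-GPU wattages below are assumed round numbers, not figures from Niv-AI or NVIDIA.

```python
# Illustrative only: hypothetical per-GPU wattages, not vendor data.
# When thousands of GPUs alternate in lockstep between compute bursts
# and network communication, the facility-level load swings by the
# full difference, not a smooth average.

N_GPUS = 10_000
COMPUTE_W = 700   # assumed per-GPU draw during a compute burst
COMM_W = 300      # assumed per-GPU draw while waiting on collectives

peak = N_GPUS * COMPUTE_W / 1e6    # MW when every GPU computes at once
trough = N_GPUS * COMM_W / 1e6     # MW when every GPU is communicating
swing = peak - trough              # MW the grid sees appear/vanish

print(f"peak {peak:.1f} MW, trough {trough:.1f} MW, swing {swing:.1f} MW")
# → peak 7.0 MW, trough 3.0 MW, swing 4.0 MW
```

Because the phase changes happen in milliseconds, the grid experiences the full multi-megawatt swing as a step change rather than as averaged-out load, which is what forces operators to buffer with storage or throttle the GPUs.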

"We can no longer keep building data centers in the current way," said Lior Handelman, partner at Grove Ventures and a member of Niv’s board.

Niv’s road map begins with understanding the status quo. The company is deploying rack-level sensors that monitor the power usage of its own GPUs and partner systems at millisecond resolution. The goal is to map the specific power footprint of different deep-learning tasks and then develop mitigation technologies that let data centers unlock more of their existing computing capacity.

Down the line, engineers expect to use the data to train an AI model that can predict and synchronize the data-center power load—a "co-pilot" for facility engineers.

The company plans to roll out operational systems to a handful of U.S. data centers in the next six to eight months. It’s an attractive fix for hyperscalers struggling with land-use hurdles and supply-chain bottlenecks. Founders envision their ultimate product as the missing "intelligent layer" between data centers and the grid.

"The grid is actually worried about data centers drawing too much power at peak moments," Timor told TechCrunch. "We’re addressing both ends of the problem—helping data centers utilize more GPUs and use the power they already pay for more efficiently, and building a more responsible power-allocation scheme between data centers and the grid."

https://techcrunch.com/2026/03/17/niv-ai-exits-stealth-to-wring-more-power-performance-out-of-gpus/


AINews · AI News Aggregation Platform
© 2026 AINews. All rights reserved.