As AI scales, inefficient data movement is turning high-performance infrastructure into a major source of hidden energy waste.
NeuReality, a pioneer in purpose-built AI inference infrastructure, today released new research, “Why Scaling AI Breaks Your Network,” showing that improving networking efficiency in AI systems can significantly reduce data center carbon emissions, positioning network optimization as a critical lever for sustainable AI at scale.
Coinciding with Earth Day 2026, the white paper identifies inefficient data movement across distributed systems as a major source of energy waste in AI environments. As workloads scale, networking bottlenecks leave compute resources idle while still consuming power, increasing overall energy usage and emissions.
According to the research, and based on standard industry calculations, a single high-performance AI rack operating continuously can generate approximately 600 tons of CO₂ annually. Improving system utilization by just 33% through more efficient networking can reduce emissions by roughly 200 tons of CO₂ per rack per year, a significant reduction for enterprises and hyperscalers operating at scale.
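The release's per-rack figures reduce to a simple proportion, sketched below. The 600-ton baseline and the 33% utilization improvement are taken from the paper; everything else is illustrative arithmetic, not NeuReality's actual methodology.

```python
# Back-of-envelope carbon math using the figures quoted in the release.
# Assumption: emissions avoided scale linearly with the utilization gain,
# since better-utilized racks do the same work with fewer rack-hours.

BASELINE_TONS_CO2_PER_RACK_YEAR = 600  # per the release, one rack running continuously
UTILIZATION_IMPROVEMENT = 0.33         # per the release, gain from efficient networking

savings_per_rack = BASELINE_TONS_CO2_PER_RACK_YEAR * UTILIZATION_IMPROVEMENT
print(f"~{savings_per_rack:.0f} tons CO2 avoided per rack per year")

# At fleet scale the numbers compound quickly (hypothetical 1,000-rack fleet):
fleet_racks = 1000
print(f"~{savings_per_rack * fleet_racks:,.0f} tons CO2 avoided per year across the fleet")
```

Multiplying out, 600 × 0.33 ≈ 198 tons, which matches the "roughly 200 tons of CO₂ per rack per year" cited in the research.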
“If your AI infrastructure isn’t optimized for efficient data movement, you’re not only leaving performance on the table, you’re also burning energy unnecessarily,” said Moshe Tanach, NeuReality CEO. “It’s time to stop chasing compute to meet demand. The real opportunity is unlocking idle capacity, up to 2× more output, by fixing system-level inefficiencies with an inference operating system like NR-NEXUS to lower cost and carbon per token at scale.”
Where AI Infrastructure Wastes Energy Today
As AI systems become more distributed, performance increasingly depends on how efficiently data moves between compute resources. In many environments, delays in data transfer and coordination cause accelerators to sit idle while still drawing power.
At the core of the opportunity is a single factor: utilization of existing infrastructure. NeuReality’s research shows that improving networking efficiency can:
- Cut carbon per token while increasing output from existing infrastructure.
- Eliminate idle energy waste by keeping accelerators fully utilized.
- Unlock up to 2× more throughput without adding power or hardware.
- Delay or avoid new data center builds by unlocking existing capacity.
In practice, this makes networking efficiency one of the most immediate and underutilized levers for reducing both cost and emissions in AI systems. NeuReality addresses it with a unified system-level architecture spanning its NR2 AI-SuperNIC for high-efficiency data movement and NR-NEXUS, an inference operating system that orchestrates distributed workloads to maximize utilization; the two can be deployed independently or together.
These findings point to a structural shift in AI infrastructure, where performance, cost, and sustainability are driven by the same underlying factor: how effectively infrastructure is utilized.
By fixing networking inefficiencies, organizations can unlock more output from existing infrastructure, avoid unnecessary expansion of power-intensive data centers, and significantly reduce both cost and emissions.
“Sustainability doesn’t require trade-offs,” added Tanach. “When you fix system-level inefficiencies, performance, cost, and carbon all move in the same direction. What’s good for the business is good for the planet.”
To view the full whitepaper, visit Why Scaling AI Breaks Your Network.
About NeuReality
Founded in 2019, NeuReality is a pioneer in purpose-built inference infrastructure for AI factories. Based on an open, standards-based approach, NR-NEXUS® and the NR2® AI-SuperNIC, built on the foundation of the NR1® AI-CPU and NR1® Inference Appliance, are designed to operate across heterogeneous AI environments and support the agentic future with robust, multi-purpose token-serving infrastructure. NeuReality has offices in Israel, Poland, and the United States. To learn more, visit http://www.neureality.ai.
View source version on businesswire.com: https://www.businesswire.com/news/home/20260422292099/en/
Contacts
Media Contact:
Joe Livarchik
Voxus PR
NeuReality@VoxusPR.com
