DataCentreNews Ireland - Specialist news for cloud & data centre decision-makers

Aria launches Deep Networking platform with USD $125 million in funding

Wed, 8th Apr 2026

Aria Networks has launched its Deep Networking platform, now generally available, and announced USD $125 million in funding.

The round includes Sutter Hill Ventures, Atreides Management, Valor Equity Partners and Eclipse Ventures. Gavin Baker, managing partner and CIO at Atreides Management, has joined the board alongside Stefan Dyckerhoff of Sutter Hill Ventures and the founding team.

Aria is entering the AI data centre market, where networking is becoming a bigger part of cluster design as training and inference systems grow. The company argues that network performance directly affects model utilisation and the cost of producing tokens, an increasingly common commercial metric in AI services.

The platform combines SONiC-based switching software, telemetry collected across switches, transceivers and hosts, and software agents that monitor and adjust network behaviour. It is designed for 800GbE and 1.6T Ethernet deployments in both liquid-cooled and air-cooled environments.

Aria says operators can replace more static network management methods with software that tunes routing, load balancing, congestion handling and failover in real time. Customers can also use natural-language queries to investigate alerts and work with Aria's software agents on operational decisions.

The company says it has customer orders and is already deploying the product. Aria was founded in 2025 and is based in Palo Alto, California.

Board changes

The investment adds a new board member from one of Aria's backers and raises the profile of a young infrastructure supplier seeking to establish itself in a crowded AI market. Investors have focused heavily on chips and large-scale computing platforms, but networking is drawing more attention as AI clusters become more distributed and bandwidth demands rise.

According to Aria, Ethernet has become the main fabric for new AI back-end deployments, alongside growing adoption of liquid cooling and a faster shift toward 1.6T ports. The company is positioning itself around open Ethernet rather than proprietary interconnects.

That places Aria in a part of the market where hardware choices, software control and operational visibility increasingly overlap. Operators running large GPU or accelerator fleets are under pressure to improve utilisation rates as the cost of AI infrastructure remains high.

"The network has become a key obstacle in AI infrastructure. Deep Networking changes that - and the economics prove it: a 10% gain in tokens per second is a 10% gain in revenue. What this team has built and shipped in such a short time is extraordinary," said Mansour Karam, founder and CEO of Aria Networks.
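For usage-billed inference, tokens per second maps directly to revenue, which is the arithmetic behind Karam's figure. A minimal illustrative sketch, using assumed pricing and throughput numbers rather than any figures from Aria:

```python
# Illustrative sketch of the tokens-per-second economics cited in the quote.
# All inputs are assumed for illustration, not Aria's or any customer's figures.

price_per_million_tokens = 2.00   # USD per million tokens billed (assumed)
tokens_per_second = 50_000        # cluster-wide serving throughput (assumed)
seconds_per_month = 30 * 24 * 3600

def monthly_revenue(tps: float) -> float:
    """Revenue if every generated token is billed at the assumed price."""
    return tps * seconds_per_month * price_per_million_tokens / 1_000_000

base = monthly_revenue(tokens_per_second)
boosted = monthly_revenue(tokens_per_second * 1.10)  # 10% throughput gain

print(f"baseline: ${base:,.0f}/month")     # baseline: $259,200/month
print(f"+10% tps: ${boosted:,.0f}/month")  # +10% tps: $285,120/month
```

The proportionality holds only while demand absorbs the extra throughput and per-token pricing stays fixed; the sketch simply makes the linear relationship explicit.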

Investor view

Baker's appointment ties the financing directly to oversight of Aria's next stage of growth. Atreides joined the round as the company moves from product development to deployments.

"Networking is one of the most consequential bets in the AI infrastructure stack - and Aria is getting it right. They identified a real problem, built a differentiated solution around a metric that operators actually care about, and they already have customer orders in hand. Deep Networking is a category-defining approach, and I'm excited to help Aria deliver it," said Baker.

Dyckerhoff also pointed to early commercial traction.

"Aria has done something rare - identified a measurable problem and built a differentiated solution that customers are already ordering. Our conviction is grounded in the technology, the momentum, and above all, the founders," he said.

Market focus

Industry partners and customers linked the offering to a broader shift in AI cluster operations, where visibility into network traffic and failures can affect utilisation and costs. Their comments reflect growing concern among operators that conventional tools are ill-suited to bursty AI workloads distributed across thousands of accelerators.

"Managing AI clusters at scale exposed a critical blind spot: the networking layer. Without visibility into the fabric, performance issues remain unresolved, costs spiral fast and traditional tools simply can't keep up. An AI-first, autonomous approach finally gives us control and measurable savings," said Dali Kilani, founder and CTO of Blackfuel.ai.

Aria's strategy also depends on partnerships across the supply chain, including switch silicon, network interface cards, servers and connectivity components. Supportive comments in the announcement came from Broadcom, AMD, Supermicro, Amphenol, Positron and San Francisco Compute.

The company's central claim is that better telemetry and automated intervention can improve model and bandwidth utilisation enough to justify network spending in large AI clusters. Aria says a 1% improvement in Model FLOPs Utilisation (MFU) can recoup the entire cost of the network.