Aries PCIe® Smart Gearbox

Now sampling the industry’s first purpose-built PCIe Gearbox

Industry’s first PCIe® 6 Gearbox Portfolio

Intelligently bridges the performance gap between cutting-edge PCIe 6 and established PCIe 5 ecosystems for maximum data throughput

  • Solves bandwidth bottlenecks that arise when integrating the latest PCIe 6 with existing PCIe 5 infrastructure
  • Optimizes PCIe lane utilization and reach-extension for demanding AI workloads and cloud-scale deployments
  • Ensures a reliable, scalable, and customizable interconnect solution for diverse cloud environments, maximizing I/O bandwidth
  • Comprehensive solution, complete with COSMOS software suite for advanced Link, Fleet & RAS telemetry & diagnostics

Gearbox Highlights

Bridging the multi-generational PCIe ecosystem

Increase effective CPU or GPU lanes when attaching PCIe 6 devices to PCIe 5 devices, reducing TCO

Robust Signal Integrity

64GT/s PAM4 SerDes and DSP customized for demanding AI server channels

Enhanced Diagnostics & Telemetry

Extended capabilities through in-band and out-of-band management

Why Use Aries Gearbox

Enhanced portfolio with years of learnings from cloud-scale deployments

Maximize Throughput

  • Flexible Protocol

    Supports a comprehensive range of PCIe rates (64 GT/s down to 2.5 GT/s), ensuring full utilization of all available PCIe lanes in mixed-generation setups; see the lane-matching sketch after this list.

  • Seamless Protocol Translation

    Enables efficient 1-to-1 communication between the latest PCIe 6 (Flit-Mode) and older PCIe 5 (Non-Flit-Mode) devices, ensuring maximum data transfer rates are maintained across the interconnected ecosystem.

  • Extended Reach

    Advanced long-reach SerDes maintains signal integrity and performance across high-loss channels, maximizing bandwidth over extended distances.
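
To make the lane-matching claim concrete, the short sketch below works through the raw-bandwidth arithmetic behind a gearbox conversion. The per-lane rates are the signaling rates defined by the PCIe base specifications; the helper function is illustrative only, and encoding and Flit overhead are ignored.

    # Illustrative lane-matching arithmetic (not vendor code).
    # Raw per-lane signaling rates, in GT/s, by PCIe generation.
    GT_PER_LANE = {6: 64.0, 5: 32.0, 4: 16.0, 3: 8.0, 2: 5.0, 1: 2.5}

    def raw_bandwidth(gen: int, lanes: int) -> float:
        """Raw link bandwidth in GT/s, ignoring encoding and protocol overhead."""
        return GT_PER_LANE[gen] * lanes

    # 8 lanes of PCIe 6 carry the same raw bandwidth as 16 lanes of PCIe 5,
    # which is the conversion listed for the PG60161LR (8L @ PCIe 6 to 16L @ PCIe 5).
    assert raw_bandwidth(6, 8) == raw_bandwidth(5, 16) == 512.0

In other words, a Gen6-capable root port needs only half as many lanes to feed a given Gen5 endpoint at full bandwidth, which is where the effective lane increase comes from.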

Quick Deploy

  • Purpose-built Design

    A small package footprint and flexible clocking modes simplify routing and reduce solution size

  • Quick Debug

    Built-in protocol analyzer with Link state history and timestamps, full non-destructive eye scan for RX Lane margining, and self-test features minimize link downtime and accelerate fault isolation

  • SW-defined Architecture 

    COSMOS, together with the Aries software-defined architecture, adjusts protocol and state transitions to support an open and diverse PCIe ecosystem

Increase Uptime

  • Deep Diagnostics

    Advanced Error Reporting (AER) and FW-driven link health monitoring alert the system to potential link performance issues (a host-side example follows this list)

  • Robust Error Handling

    Integrates Host and Downstream Port Containment (DPC) and hardware interrupt generation for uncorrectable errors, ensuring link stability and data integrity

  • In-field Upgrade

    COSMOS software upgrades enable seamless deployment of new features into existing infrastructure in the field
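
Independently of COSMOS, the AER counters themselves are also visible to the host OS. The sketch below is a minimal host-side example, assuming a Linux host and a placeholder bus/device/function address, that polls the standard AER counter files the kernel exposes in sysfs for any AER-capable PCIe device.

    # Minimal host-side sketch (not COSMOS): poll the standard Linux AER
    # counters exposed in sysfs for AER-capable PCIe devices.
    from pathlib import Path

    def read_aer_counters(bdf: str) -> dict[str, int]:
        """Return {counter name: value} for a device such as '0000:3a:00.0'."""
        counters: dict[str, int] = {}
        dev = Path("/sys/bus/pci/devices") / bdf
        for attr in ("aer_dev_correctable", "aer_dev_nonfatal", "aer_dev_fatal"):
            for line in (dev / attr).read_text().splitlines():
                name, _, value = line.rpartition(" ")
                if name and value.isdigit():
                    counters[f"{attr}/{name.strip()}"] = int(value)
        return counters

    if __name__ == "__main__":
        # The address below is a placeholder; substitute the device's actual BDF.
        for name, value in read_aer_counters("0000:3a:00.0").items():
            if value:
                print(f"{name}: {value}")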


Ordering Information

Orderable Part Number | Documents | From (Lanes @ PCIe Gen) | To (Lanes @ PCIe Gen) | Total Lanes | Ordering | Status
PG60161LR | Portfolio Brief | 8L @ PCIe 6, 8L @ PCIe 5, 8L @ PCIe 4 | 16L @ PCIe 5, 16L @ PCIe 4, 16L @ PCIe 3 | 24 | Contact Us | Sampling
PG60321LR | Portfolio Brief | 8+8L @ PCIe 6, 8+8L @ PCIe 5, 8+8L @ PCIe 4 | 16+16L @ PCIe 5, 16+16L @ PCIe 4, 16+16L @ PCIe 3 | 48 | Contact Us | Sampling

Astera Labs Completes Acquisition of aiXscale Photonics

Today, Astera Labs completed its acquisition of aiXscale Photonics GmbH, following the definitive agreement announced in October 2025. This marks an important step in our journey to deliver comprehensive connectivity solutions for AI Infrastructure 2.0. Why Photonics Matter for Scale-Up AI: As AI systems scale to hundreds of tightly integrated accelerators per rack, optical connectivity… Read more

Building the Software Stack for AI Infrastructure 2.0: Why Standards-Based Connectivity Management Matters

How our collaboration with ASPEED and Insyde on OpenBMC support advances the vision of open, interoperable AI rack infrastructure. We’re at an inflection point in AI infrastructure. As I watch hyperscalers architect their next-generation AI racks, I see a fundamental shift happening—one that goes far beyond just faster GPUs and higher bandwidth connections. We’re witnessing the… Read more

Breaking the 100 GB/s Barrier: Astera Labs and Micron Demonstrate Production-Ready PCIe 6 Storage Performance at FMS 2025

The future of AI infrastructure arrived at the Future of Memory and Storage (FMS) 2025 conference, where Astera Labs and Micron achieved a groundbreaking milestone: over 100 GB/s of sustained throughput using production-ready PCIe 6 technology. This industry-first demonstration represents a transformative advancement in AI storage performance, featuring four Micron 9650 PCIe Gen6 SSDs and… Read more

Astera Labs Announces Financial Results for the Third Quarter of Fiscal Year 2025

Record quarterly revenue of $230.6 million, up 20% QoQ and 104% YoY. Strong Q3 revenue growth driven by new AI platform ramps featuring multiple product families. Scorpio fabric switch design wins expand to several platforms at multiple hyperscaler customers. SAN JOSE, CA, U.S. – November 4, 2025 – Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions… Read more

Astera Labs to Acquire aiXscale Photonics

Deal is expected to help enable scale-up photonic chiplets in AI Infrastructure 2.0 and accelerate Astera Labs’ deployment of rack-scale solutions. SAN JOSE, Calif.—October 22, 2025—Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced that it has entered into a definitive agreement to acquire aiXscale… Read more

Astera Labs Joins Arm Total Design to Accelerate Custom AI Infrastructure Solutions 

Collaboration addresses growing rack-scale infrastructure demands with custom AI chiplet architecture connectivity solutions. SAN JOSE, Calif.–October 14, 2025–Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced it has joined Arm® Total Design, a comprehensive ecosystem dedicated to accelerating… Read more

Astera Labs Showcases Rack-Scale AI Ecosystem Momentum at OCP Global Summit

Comprehensive collaborations spanning GPU, CPU, cables & connectors, ODM, software management, and IP/design & verification providers show growing support for open standards driving AI Infrastructure 2.0. SAN JOSE, Calif.—October 13, 2025—Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced… Read more

COSMOS and OpenBMC Demo at OCP 2025

Through collaboration with ASPEED Technology and Insyde Software, Astera Labs has extended COSMOS features into production AI systems to standardize management, monitoring and diagnostics of retimers and scale-up switches with OpenBMC and DMTF Redfish APIs. COSMOS now offers: standard APIs for configuration, monitoring and lifecycle control; actionable insights with granular visibility into… Read more
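
As a rough illustration of what standards-based, out-of-band access looks like, the sketch below walks a generic DMTF Redfish sensor collection on a BMC. The BMC address, credentials, and chassis ID are placeholders, and the exact resource layout for retimer or gearbox telemetry depends on the OpenBMC integration, so treat this as a generic Redfish example rather than the COSMOS API itself.

    # Generic DMTF Redfish sketch (placeholder endpoint and credentials),
    # not the COSMOS API: enumerate the sensors a BMC exposes for a chassis.
    import requests

    BMC = "https://bmc.example.com"   # hypothetical BMC address
    AUTH = ("admin", "password")      # hypothetical credentials

    def list_sensors(chassis_id: str = "1") -> list[dict]:
        url = f"{BMC}/redfish/v1/Chassis/{chassis_id}/Sensors"
        collection = requests.get(url, auth=AUTH, verify=False, timeout=10).json()
        sensors = []
        for member in collection.get("Members", []):
            detail = requests.get(BMC + member["@odata.id"], auth=AUTH,
                                  verify=False, timeout=10).json()
            sensors.append({"name": detail.get("Name"),
                            "reading": detail.get("Reading"),
                            "units": detail.get("ReadingUnits")})
        return sensors

    if __name__ == "__main__":
        for sensor in list_sensors():
            print(sensor)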

CXL Memory Innovation at OCP 2025

Rack-scale memory innovation is unlocked with CXL. At OCP Global Summit 2025, Astera Labs demonstrated how Leo CXL Smart Memory Controllers can eliminate AI infrastructure bottlenecks in memory capacity and bandwidth: 3x concurrent LLM instances at higher AI throughput; 3x increased throughput w/ CXL at higher user count; 3x lower latency with CXL; higher GPU utilization on average with… Read more

AI Open Rack Demo at OCP 2025

As AI workloads exceed the limits of traditional servers, Astera Labs is making Open Rack Architecture a reality. Seamless integration of compute and switch platforms at rack scale is key to unlocking a unified AI engine. Together with our ecosystem partners, Astera Labs is driving this vision forward—transforming concept into real-world impact. Watch Brian Deng showcase our AI Open Rack… Read more