Aries PCIe® Smart Gearbox

Now sampling the industry’s first purpose-built PCIe Gearbox

Industry’s first PCIe® 6 Gearbox Portfolio

Intelligently bridges the performance gap between cutting-edge PCIe 6 and established PCIe 5 ecosystems for maximum data throughput

  • Solves bandwidth bottlenecks that arise when integrating the latest PCIe 6 with existing PCIe 5 infrastructure
  • Optimizes PCIe lane utilization and reach-extension for demanding AI workloads and cloud-scale deployments
  • Ensures a reliable, scalable, and customizable interconnect solution for diverse cloud environments, maximizing I/O bandwidth
  • Comprehensive solution, complete with COSMOS software suite for advanced Link, Fleet & RAS telemetry & diagnostics

Gearbox Highlights

Bridging multi-generational PCIe ecosystems

Increase effective CPU or GPU lanes by 2x when attaching PCIe 6 devices to PCIe 5 devices, reducing TCO

Robust Signal Integrity

64 GT/s PAM4 SerDes and DSP customized for demanding AI server channels

Enhanced Diagnostics & Telemetry

Extended capabilities through in-band and out-of-band management

Why Use Aries Gearbox

Enhanced portfolio with years of learnings from cloud-scale deployments

Maximize Throughput

  • Flexible Protocol

    Supports a comprehensive range of PCIe rates (64 GT/s down to 2.5 GT/s), ensuring full utilization of all available PCIe lanes in mixed-generation setups.

  • Seamless Protocol Translation

    Enables efficient 1-to-1 communication between the latest PCIe 6 (Flit-Mode) and older PCIe 5 (Non-Flit-Mode) devices, maintaining maximum data transfer rates across the interconnected ecosystem (the lane and rate arithmetic is illustrated in the sketch after this list).

  • Extended Reach

    Advanced long-reach SerDes maintains signal integrity and performance across high-loss channels, maximizing bandwidth over extended distances.
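
A worked example of the lane and rate arithmetic behind this 1-to-1 translation: doubling the lane count on the lower-generation side offsets the halved per-lane rate, so aggregate raw bandwidth is preserved. The sketch below is illustrative only (not Astera Labs software) and ignores encoding and protocol overheads.

```python
# Illustrative only: raw per-lane signaling rates by PCIe generation (GT/s, per direction).
# Encoding (128b/130b, FLIT/FEC) and protocol overheads are ignored.
PCIE_RATE_GTS = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0, 6: 64.0}

def aggregate_gts(lanes: int, gen: int) -> float:
    """Aggregate raw signaling rate of a link, in GT/s per direction."""
    return lanes * PCIE_RATE_GTS[gen]

# Gearbox-style conversion: a x8 PCIe 6 host port paired with a x16 PCIe 5 device port.
host = aggregate_gts(lanes=8, gen=6)     # 8 x 64 GT/s  = 512 GT/s
device = aggregate_gts(lanes=16, gen=5)  # 16 x 32 GT/s = 512 GT/s
print(f"x8 Gen6 host: {host} GT/s, x16 Gen5 device: {device} GT/s")
assert host == device  # doubled lanes compensate for the halved per-lane rate
```

The same relationship holds one generation further down (x8 Gen5 to x16 Gen4, and so on), which is why the ordering table below lists matched From/To lane configurations.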

Quick Deploy

  • Purpose-built Design

    A small package footprint and flexible clocking modes simplify routing and reduce solution size

  • Quick Debug

    Built-in protocol analyzer with link-state history and timestamps, full non-destructive eye scan for RX lane margining, and self-test features minimize link downtime and accelerate fault isolation (a basic host-side link check is sketched after this list)

  • SW-defined Architecture 

    COSMOS, paired with the Aries software-defined architecture, adjusts protocol and state transitions to support an open and diverse PCIe ecosystem
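
The gearbox’s built-in analyzer, eye-scan, and margining features are accessed through COSMOS, whose API is not shown here. As a complementary host-side check, the sketch below (assuming a Linux host; the device address is a hypothetical placeholder) reads the negotiated link speed and width that the kernel exposes in sysfs, a quick way to confirm a link trained at the expected rate after deployment.

```python
# Host-side sanity check on Linux: read negotiated PCIe link speed/width from sysfs.
# The BDF used in the example is a hypothetical placeholder for a port behind the gearbox.
from pathlib import Path

def link_status(bdf: str) -> dict:
    """Return current/max link speed and width as reported by the kernel."""
    dev = Path("/sys/bus/pci/devices") / bdf
    attrs = ("current_link_speed", "current_link_width",
             "max_link_speed", "max_link_width")
    return {a: (dev / a).read_text().strip() for a in attrs if (dev / a).exists()}

if __name__ == "__main__":
    # Example output (varies by platform): {'current_link_speed': '32.0 GT/s PCIe', ...}
    print(link_status("0000:17:00.0"))
```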

Increase Uptime

  • Deep Diagnostics

    Advanced Error Reporting (AER) and firmware-driven link health monitoring alert the system to potential link performance issues (see the monitoring sketch after this list)

  • Robust Error Handling

    Integrates Host and Downstream Port Containment (DPC) and hardware interrupt generation for uncorrectable errors, ensuring link stability and data integrity

  • In-field Upgrade

    COSMOS software upgrades enable seamless deployment of new features into existing infrastructure in the field
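
As a companion to the on-chip diagnostics, the sketch below shows one way a fleet agent might watch the AER counters Linux exposes in sysfs and flag links whose error totals are climbing. It is illustrative only and is not part of COSMOS; the paths assume a recent kernel with AER statistics enabled, and the device address is a hypothetical placeholder.

```python
# Illustrative AER polling loop using the per-device counter files Linux exposes in sysfs
# (aer_dev_correctable / aer_dev_nonfatal / aer_dev_fatal). Not an Astera Labs API.
import time
from pathlib import Path

AER_FILES = ("aer_dev_correctable", "aer_dev_nonfatal", "aer_dev_fatal")

def read_aer_totals(bdf: str) -> dict:
    """Collect the TOTAL_* summary line from each AER counter file, if present."""
    dev = Path("/sys/bus/pci/devices") / bdf
    totals = {}
    for name in AER_FILES:
        path = dev / name
        if path.exists():
            for line in path.read_text().splitlines():
                if line.startswith("TOTAL_"):
                    key, value = line.split()
                    totals[f"{name}:{key}"] = int(value)
    return totals

def watch(bdf: str, interval_s: float = 10.0) -> None:
    """Warn whenever any AER total increases between polls."""
    last = read_aer_totals(bdf)
    while True:
        time.sleep(interval_s)
        now = read_aer_totals(bdf)
        for key, value in now.items():
            if value > last.get(key, 0):
                print(f"AER counter {key} increased to {value} on {bdf}")
        last = now

if __name__ == "__main__":
    watch("0000:17:00.0")  # hypothetical BDF of a port behind the gearbox
```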

Resources

Ordering Information

Orderable Part Number | Documents | From (Lanes @ PCIe Gen) | To (Lanes @ PCIe Gen) | Total Lanes | Ordering | Status
PG60161LR | Portfolio Brief | 8L @ PCIe 6 / 8L @ PCIe 5 / 8L @ PCIe 4 | 16L @ PCIe 5 / 16L @ PCIe 4 / 16L @ PCIe 3 | 24 | Contact Us | Sampling
PG60321LR | Portfolio Brief | 8+8L @ PCIe 6 / 8+8L @ PCIe 5 / 8+8L @ PCIe 4 | 16+16L @ PCIe 5 / 16+16L @ PCIe 4 / 16+16L @ PCIe 3 | 48 | Contact Us | Sampling

Breaking the 100 GB/s Barrier: Astera Labs and Micron Demonstrate Production-Ready PCIe 6 Storage Performance at FMS 2025

The future of AI infrastructure arrived at the Future of Memory and Storage (FMS) 2025 conference, where Astera Labs and Micron achieved a groundbreaking milestone: over 100 GB/s of sustained throughput using production-ready PCIe 6 technology. This industry-first demonstration represents a transformative advancement in AI storage performance, featuring four Micron 9650 PCIe Gen6 SSDs and…

Read more

Astera Labs at FMS 2025: Accelerating Storage and Memory Innovation in the AI Infrastructure 2.0 Era

As AI models push computational boundaries with breakthrough reasoning capabilities, storage and memory must also evolve to optimize AI Infrastructure 2.0. From training massive models to enabling real-time inference, every part of the AI workflow relies on seamless connectivity between diverse storage and memory technologies. Join Astera Labs at the Future of Memory and Storage Summit…

Read more

Astera Labs at OCP APAC Summit: Advancing Open AI Infrastructure 2.0 Through Rack-Scale Connectivity

As AI training clusters scale to 200,000+ GPUs, traditional server architectures require a fundamental paradigm shift to handle this unprecedented scale. Join Astera Labs at the OCP APAC Summit, August 5-6 in Taipei, as we put a spotlight on the transition to AI Infrastructure 2.0—where the rack is replacing the server as the new unit of compute. This transformation isn’t just evolutionary—it’s…

Read more

Astera Labs Announces Third Quarter 2025 Financial Conference Participation

SAN JOSE, CA, U.S. – Aug. 20, 2025 – Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced its participation in financial conferences for the third quarter 2025. Deutsche Bank 2025 Technology Conference on Aug. 28, 2025. Astera Labs’ presentation is scheduled for 12:30 pm PT. Citi’s 2025 Global…

Read more

Astera Labs Announces Financial Results for the Second Quarter of Fiscal Year 2025

Record quarterly revenue of $191.9 million, up 20% QoQ and 150% YoY, and record operating cash flow generation of $135.4 million. Industry leading PCIe 6 connectivity portfolio ramping in volume on customized rack-scale AI systems. Scorpio Fabric Switch design wins expand across multiple new customers and applications. SAN JOSE, CA, U.S. – August 5, 2025 – Astera Labs, Inc. (Nasdaq:…

Read more

Astera Labs Opens New Global Headquarters in San Jose to Accelerate AI Infrastructure Innovation

900-employee campus powers Astera Labs’ mission to usher in the rack-scale computing era. SAN JOSE, Calif. – July 18, 2025 – Astera Labs, Inc. (Nasdaq: ALAB), a provider of semiconductor connectivity solutions for AI and cloud infrastructure, today announced the opening of its new corporate headquarters in San Jose, California. Designed to accommodate up to 900 employees, the new…

Read more

Astera Labs Announces Conference Call to Review Second Quarter 2025 Financial Results

SAN JOSE, Calif., July 08, 2025 — Astera Labs, Inc. (Nasdaq: ALAB), a global leader in semiconductor-based connectivity solutions for AI and cloud infrastructure, today announced that it will release its financial results for the second quarter 2025 after the close of market on Tuesday, Aug. 5, 2025. Astera Labs will host a corresponding conference call at 1:30 p.m. Pacific Time, 4:30…

Read more

Leading the AI Infrastructure 2.0 Era

AI has outgrown the server. The rack is now the unit of compute — a tightly integrated, AI-optimized system. Welcome to AI Infrastructure 2.0. Astera Labs’ co-founders share how the company is leading the transition to AI Infrastructure 2.0 with a complete portfolio of purpose-built silicon hardware and software solutions grounded in open standards.

Read more

Accelerate AI with Real-World CXL Platforms

We collaborated with SMART Modular Technologies to showcase our Leo CXL™ Smart Memory Controller in a real-world setup that delivered up to 2TB of additional CXL memory with SMART CXL add-in cards. In the demo, Large Language Model inferencing tasks were run using FlexGen, achieving 5.5× higher throughput and 90% GPU utilization—demonstrating the efficiency gains CXL brings to AI workloads….

Read more

Boost AI with 100+ GB/s PCIe 6

Building on our successful DesignCon 2025 demonstration with Micron that achieved 54 GB/s with two SSDs, we’ve now scaled to four Micron 9650 PCIe Gen6 SSDs with three Scorpio P-Series Fabric Switches to achieve over 100 GB/s aggregate performance. This marks another step in expanding PCIe 6 adoption as Astera Labs continues to validate interoperability across an increasing range of endpoints,…

Read more