Leo CXL® Smart Memory Controllers

Purpose-built memory expansion, sharing, and pooling for AI and cloud platforms

Memory Solutions for the AI Era

  • Accelerates AI and cloud infrastructure with memory expansion, sharing, and pooling for enhanced performance
  • Eliminates bandwidth/capacity bottlenecks, reduces total cost of ownership, and optimizes memory utilization
  • Ensures end-to-end data integrity and protection with best-in-class industry standard security features
  • Delivers server-grade customizable RAS, advanced diagnostics, and fleet management via COSMOS suite

Leo Highlights

Accelerating AI with CXL Memory

  • Increase Memory Capacity: multiple DDR5 channels at up to 5600 MT/s
  • Seamless Interoperability: stress tested with all major xPUs and memory vendors
  • Enhanced Diagnostics & Telemetry: advanced capabilities through in-band and out-of-band management

Leo Delivers Proven Application Value with CXL

Astera Labs’ Leo CXL Smart Memory Controller is the industry’s first purpose-built solution that supports both memory expansion and memory pooling to solve performance bottlenecks and capacity constraints in cloud servers.
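From host software, expansion through a CXL controller such as Leo typically appears as additional system memory on a CPU-less NUMA node, so standard NUMA tooling applies. The sketch below is a minimal illustration under that assumption and is not Astera-provided code; the node id 2 is a placeholder, and the real topology should be checked with `numactl --hardware`. It simply places a 1 GiB buffer on the CXL-backed node with libnuma.

```c
/*
 * Minimal sketch, not Astera reference code: allocate a buffer on a
 * CXL-attached NUMA node. Assumes a Linux host where the Leo-attached
 * memory shows up as CPU-less NUMA node 2 (an assumption; check
 * `numactl --hardware` for the real topology).
 * Build: gcc cxl_expand.c -lnuma
 */
#include <stdio.h>
#include <string.h>
#include <numa.h>   /* libnuma: numa_available, numa_alloc_onnode, numa_free */

#define CXL_NODE 2  /* assumed NUMA node backed by CXL memory */

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "libnuma/NUMA support not available\n");
        return 1;
    }

    size_t sz = 1UL << 30;                        /* 1 GiB working buffer */
    void *buf = numa_alloc_onnode(sz, CXL_NODE);  /* place pages on the CXL node */
    if (buf == NULL) {
        fprintf(stderr, "allocation on node %d failed\n", CXL_NODE);
        return 1;
    }

    memset(buf, 0xA5, sz);   /* touch the pages so they are actually faulted in */
    printf("1 GiB placed on NUMA node %d\n", CXL_NODE);

    numa_free(buf, sz);
    return 0;
}
```

Because the expanded memory is presented as ordinary system memory, applications generally need no code changes, which is the basis for the application-level results below.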

  • AI Inferencing (Recommendation Systems): 73% more recommendations per second
  • AI Inferencing (Chatbot Services): 40% faster time to insights with LLM
  • HPC (Computer Aided Engineering): 50% more iterations per second
  • In-Memory Databases (Transaction Processing): 150% more transactions per second

E-Series

  • Memory Expansion

P-Series

  • Memory Expansion
  • Memory Pooling
  • Memory Sharing

Leo A-Series Hardware Solutions

Leo A-Series CXL Smart Memory Hardware solutions offer all the benefits of Leo Controllers and enable quick plug-and-play deployment with faster time-to-market for system OEMs and data centers.

  • PCIe x16 CEM add-in card form factor
  • Up to 4x DDR5 RDIMMs supporting up to 2TB
  • On-board debug connectors for fleet management on cloud servers
  • Temperature and health monitoring of the Leo controller and attached memory (see the sketch after this list)
  • RDIMM fault isolation and error correction
  • High volume production-qualified solutions with robust supply chain
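Related to the monitoring bullet above, the sketch below shows one way basic in-band inventory data can be read from the host OS, assuming a Linux kernel with the CXL core driver loaded. It is not the COSMOS interface, and the sysfs attribute names it reads are assumptions to verify against the target kernel; out-of-band paths and the COSMOS suite provide far richer telemetry than this simple example.

```c
/*
 * Minimal sketch, not the COSMOS API: enumerate CXL memory devices from the
 * host via Linux sysfs. Assumes a kernel with the CXL core driver loaded;
 * the attribute names below ("serial", "firmware_version") vary by kernel
 * version and should be treated as assumptions.
 * Build: gcc cxl_inventory.c
 */
#include <dirent.h>
#include <stdio.h>
#include <string.h>

/* Print one sysfs attribute of a memdev if the file exists. */
static void print_attr(const char *dev, const char *attr)
{
    char path[512], val[128];
    snprintf(path, sizeof(path), "/sys/bus/cxl/devices/%s/%s", dev, attr);
    FILE *f = fopen(path, "r");
    if (!f)
        return;                               /* attribute absent on this kernel */
    if (fgets(val, sizeof(val), f))
        printf("  %-18s %s", attr, val);      /* sysfs values end with '\n' */
    fclose(f);
}

int main(void)
{
    DIR *d = opendir("/sys/bus/cxl/devices");
    if (!d) {
        perror("opendir /sys/bus/cxl/devices");
        return 1;
    }

    struct dirent *e;
    while ((e = readdir(d)) != NULL) {
        if (strncmp(e->d_name, "mem", 3) != 0)
            continue;                         /* keep only memN (memory devices) */
        printf("%s\n", e->d_name);
        print_attr(e->d_name, "serial");
        print_attr(e->d_name, "firmware_version");
    }
    closedir(d);
    return 0;
}
```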

Use Cases


Memory Disaggregation

Expand, pool and share memory between multiple servers to increase memory bandwidth and capacity while providing the option to reclaim stranded or under-utilized…
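As a concrete illustration of the bandwidth side of this use case, the sketch below interleaves a buffer across a local DRAM node and a CXL-attached node so that traffic is spread over both memory paths. The node ids "0,2" and the file name are assumptions for illustration; pooling and sharing across multiple servers additionally involve a fabric manager, which is outside the scope of this snippet.

```c
/*
 * Minimal sketch, not Astera reference code: interleave a buffer across a
 * local DRAM node and a CXL-attached node so accesses are spread over both
 * memory paths. Node ids "0,2" are assumptions for illustration; adjust to
 * the topology reported by `numactl --hardware`.
 * Build: gcc cxl_interleave.c -lnuma
 */
#include <stdio.h>
#include <string.h>
#include <numa.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "libnuma/NUMA support not available\n");
        return 1;
    }

    /* Nodes to stripe over: local DRAM (0) and CXL memory (2), both assumed. */
    struct bitmask *nodes = numa_parse_nodestring("0,2");
    if (!nodes) {
        fprintf(stderr, "invalid node string\n");
        return 1;
    }

    size_t sz = 1UL << 30;  /* 1 GiB buffer, striped page-by-page across nodes */
    void *buf = numa_alloc_interleaved_subset(sz, nodes);
    if (!buf) {
        fprintf(stderr, "interleaved allocation failed\n");
        numa_bitmask_free(nodes);
        return 1;
    }

    memset(buf, 0, sz);     /* touch pages so the interleave policy takes effect */
    printf("1 GiB interleaved across the selected nodes\n");

    numa_free(buf, sz);
    numa_bitmask_free(nodes);
    return 0;
}
```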

Resources

Ordering Information

A1000-1254AB (Leo A1000 CXL™ Smart Memory Add-in Card)
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 16x32G | Memory: 4x DDR5-5600 RDIMM slots | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: Yes
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/a1000-1254ab/

CM5082E-* (Leo E-Series CXL 2.0 x8 Smart Memory Controller)
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 8x32G | Memory: 2ch DDR5 up to 5600 MT/s | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: No
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/cm5082e/

CM5162E-* (Leo E-Series CXL 2.0 x16 Smart Memory Controller)
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 16x32G | Memory: 2ch DDR5 up to 5600 MT/s | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: No
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/cm5162e/

CM5162P-* (Leo P-Series CXL 2.0 x16 Smart Memory Controller)
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 16x32G | Memory: 2ch DDR5 up to 5600 MT/s | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: Yes
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/cm5162p/

Breaking the 100 GB/s Barrier: Astera Labs and Micron Demonstrate Production-Ready PCIe 6 Storage Performance at FMS 2025

The future of AI infrastructure arrived at the Future of Memory and Storage (FMS) 2025 conference, where Astera Labs and Micron achieved a groundbreaking milestone: over 100 GB/s of sustained throughput using production-ready PCIe 6 technology. This industry-first demonstration represents a transformative advancement in AI storage performance, featuring four Micron 9650 PCIe Gen6 SSDs and…


Astera Labs at FMS 2025: Accelerating Storage and Memory Innovation in the AI Infrastructure 2.0 Era

As AI models push computational boundaries with breakthrough reasoning capabilities, storage and memory must also evolve to optimize AI Infrastructure 2.0. From training massive models to enabling real-time inference, every part of the AI workflow relies on seamless connectivity between diverse storage and memory technologies. Join Astera Labs at the Future of Memory and Storage Summit…


Astera Labs at OCP APAC Summit: Advancing Open AI Infrastructure 2.0 Through Rack-Scale Connectivity

As AI training clusters scale to 200,000+ GPUs, traditional server architectures require a fundamental paradigm shift to handle this unprecedented scale. Join Astera Labs at the OCP APAC Summit, August 5-6 in Taipei, as we put a spotlight on the transition to AI Infrastructure 2.0—where the rack is replacing the server as the new unit of compute. This transformation isn’t just evolutionary—it’s…


Astera Labs Announces Financial Results for the Second Quarter of Fiscal Year 2025

Record quarterly revenue of $191.9 million, up 20% QoQ and 150% YoY, and record operating cash flow generation of $135.4 million. Industry-leading PCIe 6 connectivity portfolio ramping in volume on customized rack-scale AI systems. Scorpio Fabric Switch design wins expand across multiple new customers and applications. SAN JOSE, CA, U.S. – August 5, 2025 – Astera Labs, Inc. (Nasdaq:…


Astera Labs Opens New Global Headquarters in San Jose to Accelerate AI Infrastructure Innovation

900-employee campus powers Astera Labs’ mission to usher in the rack-scale computing era. SAN JOSE, Calif. – July 18, 2025 – Astera Labs, Inc. (Nasdaq: ALAB), a provider of semiconductor connectivity solutions for AI and cloud infrastructure, today announced the opening of its new corporate headquarters in San Jose, California. Designed to accommodate up to 900 employees, the new…


Astera Labs Announces Conference Call to Review Second Quarter 2025 Financial Results

SAN JOSE, Calif., July 08, 2025 — Astera Labs, Inc. (Nasdaq: ALAB), a global leader in semiconductor-based connectivity solutions for AI and cloud infrastructure, today announced that it will release its financial results for the second quarter 2025 after the close of market on Tuesday, Aug. 5, 2025. Astera Labs will host a corresponding conference call at 1:30 p.m. Pacific Time, 4:30…


Astera Labs and Alchip Announce Strategic Partnership to Advance Silicon Ecosystem for AI Rack-Scale Connectivity

Hyperscalers benefit from seamless integration of purpose-built compute and connectivity solutions to rapidly deploy AI infrastructure at scale. SANTA CLARA, Calif. and TAIPEI, Taiwan – June 16, 2025 – Astera Labs, a leading provider of purpose-built connectivity solutions for AI and cloud infrastructure, and Alchip Technologies, the high-performance ASIC leader, today announced a…


Demo: COSMOS Developer Kit Diagnostics Tools

Astera Labs’ COSMOS Developer Kit provides user-friendly test and debug capabilities that accelerate validation workflows. The Developer Kit supports device discovery, configuration, security attestation, firmware updates, scripting, and automation, making it a versatile solution for system designers and integrators.


First Look Demo: Scorpio X-Series Fabric Switch

Scorpio X-Series Fabric Switches are architected to deliver the highest back-end bandwidth for AI scale-up (GPU-to-GPU communications). Learn how Scorpio X-Series enables direct memory access across the fabric, allowing accelerators to read and write data to each other without CPU intervention. This design significantly enhances data parallelism, reduces latency, and improves scalability for…


First Look Demo: Aries 6 Smart Gearbox

See the industry’s first purpose-built PCIe gearbox solution that intelligently bridges the performance gap between the latest PCIe 6 devices and the existing PCIe 5 ecosystem. Learn how Aries 6 Gearbox solves the challenge of degraded performance in mixed-generation systems, ensuring full utilization of high-speed lanes and accelerating the deployment of next-generation AI platforms while…
