Leo CXL® Smart Memory Controllers

Purpose-built memory expansion, sharing, and pooling for AI and cloud platforms

Memory Solutions for the AI Era

  • Accelerates AI and cloud infrastructure with memory expansion, sharing, and pooling for enhanced performance
  • Eliminates bandwidth/capacity bottlenecks, reduces total cost of ownership, and optimizes memory utilization
  • Ensures end-to-end data integrity and protection with best-in-class industry standard security features
  • Delivers server-grade customizable RAS, advanced diagnostics, and fleet management via COSMOS suite

Leo Highlights

Accelerating AI with CXL Memory

  • Increase Memory Capacity: multiple DDR5 channels at up to 5600 MT/s
  • Seamless Interoperability: stress tested with all major xPUs and memory vendors
  • Enhanced Diagnostics & Telemetry: advanced capabilities through in-band and out-of-band management

Leo Delivers Proven Application Value with CXL

Astera Labs’ Leo CXL Smart Memory Controller is the industry’s first purpose-built solution that supports both memory expansion and memory pooling to solve performance bottlenecks and capacity constraints in cloud servers.

AI Inferencing: Recommendation System

73% more recommendations per second

AI Inferencing: Chatbot Services

40% faster time to insights with LLM

HPC: Computer Aided Engineering

50% more iterations per second

In-Memory Databases: Transaction Processing

150% more transactions per second
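
The gains above come from relieving capacity and bandwidth pressure with CXL-attached DDR5 behind the Leo controller. On Linux hosts, such expanded memory typically appears as a CPU-less NUMA node, so applications can place large working sets on it explicitly. The sketch below is illustrative only, not Astera Labs software; it assumes libnuma is installed and that the CXL memory is exposed as NUMA node 2 (CXL_NODE), which should be confirmed with `numactl --hardware` on the target system.

```c
/*
 * Minimal sketch (not Astera Labs software): placing a large working set
 * on CXL-attached memory that Linux exposes as a CPU-less NUMA node.
 * CXL_NODE is an assumption; confirm the real node with `numactl --hardware`.
 * Build: gcc cxl_alloc.c -o cxl_alloc -lnuma
 */
#include <numa.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define CXL_NODE 2                 /* assumed NUMA node backed by Leo-attached DDR5 */
#define BUF_SIZE (1UL << 30)       /* 1 GiB working buffer */

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "libnuma reports NUMA is not available on this system\n");
        return EXIT_FAILURE;
    }

    long long free_bytes = 0;
    long long total_bytes = numa_node_size64(CXL_NODE, &free_bytes);
    printf("node %d: %lld bytes total, %lld bytes free\n",
           CXL_NODE, total_bytes, free_bytes);

    /* Bind this allocation to the CXL expansion node instead of local DRAM. */
    char *buf = numa_alloc_onnode(BUF_SIZE, CXL_NODE);
    if (buf == NULL) {
        fprintf(stderr, "allocation on node %d failed\n", CXL_NODE);
        return EXIT_FAILURE;
    }

    memset(buf, 0, BUF_SIZE);      /* touch pages so they are actually placed */
    /* ... run the memory-hungry stage of the workload against buf here ... */

    numa_free(buf, BUF_SIZE);
    return EXIT_SUCCESS;
}
```

Building with `gcc cxl_alloc.c -o cxl_alloc -lnuma` and running it places the buffer on the expansion node; the same placement can also be achieved for an unmodified application by launching it under `numactl --membind=<node>`.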

E-Series

  • Memory Expansion

P-Series

  • Memory Expansion
  • Memory Pooling
  • Memory Sharing

Leo A-Series Hardware Solutions

Leo A-Series CXL Smart Memory Hardware solutions offer all the benefits of Leo Controllers and enable quick plug-and-play deployment with faster time-to-market for system OEMs and data centers.

  • PCIe x16 CEM add-in card form factor
  • Up to 4x DDR5 RDIMMs supporting up to 2TB
  • On-board debug connectors for fleet management on Cloud Servers
  • Temperature and health monitoring of Leo controller and memory (see the sketch after this list)
  • RDIMM fault isolation and error correction
  • High volume production-qualified solutions with robust supply chain
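
As a rough illustration of the monitoring bullet above, the sketch below polls a board temperature through a Linux hwmon attribute. It is a minimal sketch under stated assumptions: the sysfs path is hypothetical and depends on the platform's driver stack, and in many deployments this data would instead be gathered out-of-band through the BMC and the COSMOS suite.

```c
/*
 * Minimal sketch (assumptions marked): polling the board temperature of a
 * Leo add-in card through a Linux hwmon attribute. The sysfs path below is
 * hypothetical; the actual path depends on the platform's driver stack, and
 * this data may instead be reported out-of-band via the BMC / COSMOS suite.
 * Build: gcc leo_temp.c -o leo_temp
 */
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical attribute; hwmon reports temperatures in millidegrees Celsius. */
#define LEO_TEMP_ATTR "/sys/class/hwmon/hwmon4/temp1_input"

int main(void)
{
    FILE *f = fopen(LEO_TEMP_ATTR, "r");
    if (f == NULL) {
        perror("open temperature attribute");
        return EXIT_FAILURE;
    }

    long millideg = 0;
    if (fscanf(f, "%ld", &millideg) != 1) {
        fprintf(stderr, "unexpected attribute format\n");
        fclose(f);
        return EXIT_FAILURE;
    }
    fclose(f);

    printf("Leo card temperature: %.1f C\n", millideg / 1000.0);
    return EXIT_SUCCESS;
}
```

A fleet agent could run a poll like this on an interval and feed the readings into the same telemetry pipeline used for the rest of the server.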

Use Cases

Memory Disaggregation

Expand, pool and share memory between multiple servers to increase memory bandwidth and capacity while providing the option to reclaim stranded or under-utilized… Read more

Resources

Ordering Information

Orderable Part Number: A1000-1254AB
Description: Leo A1000 CXL™ Smart Memory Add-in Card
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 16x32G
  • Memory: 4x DDR5-5600 RDIMM slots | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: Yes
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/a1000-1254ab/

Orderable Part Number: CM5082E-*
Description: Leo E-Series CXL 2.0 x8 Smart Memory Controller
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 8x32G
  • Memory: 2ch DDR5 up to 5600 MT/s | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: No
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/cm5082e/

Orderable Part Number: CM5162E-*
Description: Leo E-Series CXL 2.0 x16 Smart Memory Controller
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 16x32G
  • Memory: 2ch DDR5 up to 5600 MT/s | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: No
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/cm5162e/

Orderable Part Number: CM5162P-*
Description: Leo P-Series CXL 2.0 x16 Smart Memory Controller
  • CXL Spec: CXL 1.1/2.0 | CXL Link: 16x32G
  • Memory: 2ch DDR5 up to 5600 MT/s | Capacity: 2TB
  • Expansion: Yes | Pooling / Sharing: Yes
  • Documents: https://www.asteralabs.com/wp-content/uploads/2025/05/Leo_CXL_Smart_Memory_Controllers_Portfolio_Brief.pdf
  • Ordering: https://www.asteralabs.com/product-details/cm5162p/

Building the Software Stack for AI Infrastructure 2.0: Why Standards-Based Connectivity Management Matters

How our collaboration with ASPEED and Insyde on OpenBMC support advances the vision of open, interoperable AI rack infrastructure. We’re at an inflection point in AI infrastructure. As I watch hyperscalers architect their next-generation AI racks, I see a fundamental shift happening—one that goes far beyond just faster GPUs and higher bandwidth connections. We’re witnessing the… Read more

Breaking the 100 GB/s Barrier: Astera Labs and Micron Demonstrate Production-Ready PCIe 6 Storage Performance at FMS 2025

The future of AI infrastructure arrived at the Future of Memory and Storage (FMS) 2025 conference, where Astera Labs and Micron achieved a groundbreaking milestone: over 100 GB/s of sustained throughput using production-ready PCIe 6 technology. This industry-first demonstration represents a transformative advancement in AI storage performance, featuring four Micron 9650 PCIe Gen6 SSDs and… Read more

Astera Labs at FMS 2025: Accelerating Storage and Memory Innovation in the AI Infrastructure 2.0 Era

As AI models push computational boundaries with breakthrough reasoning capabilities, storage and memory must also evolve to optimize AI Infrastructure 2.0. From training massive models to enabling real-time inference, every part of the AI workflow relies on seamless connectivity between diverse storage and memory technologies. Join Astera Labs at the Future of Memory and Storage Summit… Read more

Astera Labs Joins Arm Total Design to Accelerate Custom AI Infrastructure Solutions 

Collaboration addresses growing rack-scale infrastructure demands with custom AI chiplet architecture connectivity solutions SAN JOSE, Calif.–October 14, 2025–Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced it has joined Arm® Total Design, a comprehensive ecosystem dedicated to accelerating… Read more

Astera Labs Showcases Rack-Scale AI Ecosystem Momentum at OCP Global Summit

Comprehensive collaborations spanning GPU, CPU, cables & connectors, ODM, software management, and IP/design & verification providers show growing support for open standards driving AI Infrastructure 2.0. SAN JOSE, Calif.—October 13, 2025—Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced… Read more

Astera Labs Announces Third Quarter 2025 Financial Conference Participation

SAN JOSE, CA, U.S. – Aug. 20, 2025 – Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced its participation in financial conferences for the third quarter 2025. Deutsche Bank 2025 Technology Conference on Aug. 28, 2025. Astera Labs’ presentation is scheduled for 12:30 pm PT. Citi’s 2025 Global… Read more

Astera Labs Announces Financial Results for the Second Quarter of Fiscal Year 2025

Record quarterly revenue of $191.9 million, up 20% QoQ and 150% YoY, and record operating cash flow generation of $135.4 million. Industry leading PCIe 6 connectivity portfolio ramping in volume on customized rack-scale AI systems. Scorpio Fabric Switch design wins expand across multiple new customers and applications. SAN JOSE, CA, U.S. – August 5, 2025 – Astera Labs, Inc. (Nasdaq:… Read more

Introduction to COSMOS Explorer

COSMOS Explorer is a powerful device management and monitoring tool designed to simplify the process of configuring, debugging, and validating link stability for all of Astera Labs’ products. It offers a rich set of system configuration options to help you achieve optimal performance and diagnostic capabilities. To show this is a real environment, the demo set-up features a host connected… Read more

Rack-Scale AI Connectivity with UALink

When the ecosystem works together, innovation happens faster. As a founding board member of the UALink™ Consortium, learn how Astera Labs – in collaboration with XPU partners – plans to deliver a complete portfolio of UALink products to support scale-up AI infrastructure. Read more

Leading the AI Infrastructure 2.0 Era

AI has outgrown the server. The rack is now the unit of compute — a tightly integrated, AI-optimized system. Welcome to AI Infrastructure 2.0. Astera Labs’ co-founders share how the company is leading the transition to AI Infrastructure 2.0 with a complete portfolio of purpose-built silicon hardware and software solutions grounded in open standards. Read more