Building the Software Stack for AI Infrastructure 2.0: Why Standards-Based Connectivity Management Matters

How our collaboration with ASPEED and Insyde on OpenBMC support advances the vision of open, interoperable AI rack infrastructure.

We’re at an inflection point in AI infrastructure. As I watch hyperscalers architect their next-generation AI racks, I see a fundamental shift happening—one that goes far beyond just faster GPUs and higher bandwidth connections. We’re witnessing the…

Breaking the 100 GB/s Barrier: Astera Labs and Micron Demonstrate Production-Ready PCIe 6 Storage Performance at FMS 2025

The future of AI infrastructure arrived at the Future of Memory and Storage (FMS) 2025 conference, where Astera Labs and Micron achieved a groundbreaking milestone: over 100 GB/s of sustained throughput using production-ready PCIe 6 technology. This industry-first demonstration represents a transformative advancement in AI storage performance, featuring four Micron 9650 PCIe Gen6 SSDs and…
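As a rough sanity check on the headline number, a back-of-envelope estimate shows how four Gen6 drives can aggregate past 100 GB/s. The link width per SSD and the usable-efficiency figure below are assumptions for illustration (x4 is typical for NVMe SSDs); the announcement itself does not spell out the demo topology beyond the four drives.

```python
# Back-of-envelope PCIe 6 aggregate bandwidth estimate (illustrative only).
# Assumptions (not from the announcement): each SSD sits on a x4 link and
# sustains roughly 90% of raw link bandwidth after protocol/flit overhead.

GT_PER_S_PER_LANE = 64   # PCIe 6.x raw signaling rate per lane (64 GT/s)
LANES_PER_SSD = 4        # assumed NVMe SSD link width
NUM_SSDS = 4             # four Micron 9650 drives in the demo
EFFICIENCY = 0.90        # assumed usable fraction after overhead

# 64 GT/s is ~64 Gb/s per lane per direction, i.e. 8 GB/s per lane.
raw_per_ssd = GT_PER_S_PER_LANE * LANES_PER_SSD / 8   # GB/s raw per SSD
usable_per_ssd = raw_per_ssd * EFFICIENCY
aggregate = usable_per_ssd * NUM_SSDS

print(f"Raw per SSD:      {raw_per_ssd:.1f} GB/s")
print(f"Usable per SSD:   {usable_per_ssd:.1f} GB/s (assumed {EFFICIENCY:.0%})")
print(f"Aggregate (x{NUM_SSDS}):  {aggregate:.1f} GB/s")
# -> roughly 115 GB/s usable across four drives, consistent with the
#    >100 GB/s sustained figure reported at FMS 2025.
```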

Astera Labs at FMS 2025: Accelerating Storage and Memory Innovation in the AI Infrastructure 2.0 Era

As AI models push computational boundaries with breakthrough reasoning capabilities, storage and memory must also evolve to optimize AI Infrastructure 2.0. From training massive models to enabling real-time inference, every part of the AI workflow relies on seamless connectivity between diverse storage and memory technologies. Join Astera Labs at the Future of Memory and Storage Summit…

Astera Labs Announces Financial Results for the Third Quarter of Fiscal Year 2025

Record quarterly revenue of $230.6 million, up 20% QoQ and 104% YoY
Strong Q3 revenue growth driven by new AI platform ramps featuring multiple product families
Scorpio fabric switch design wins expand to several platforms at multiple hyperscaler customers

SAN JOSE, CA, U.S. – November 4, 2025 – Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions…

Astera Labs to Acquire aiXscale Photonics

Deal is expected to help enable scale-up photonic chiplets in AI Infrastructure 2.0 and accelerate Astera Labs’ deployment of rack-scale solutions

SAN JOSE, Calif.—October 22, 2025—Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced that it has entered into a definitive agreement to acquire aiXscale…

Astera Labs Joins Arm Total Design to Accelerate Custom AI Infrastructure Solutions 

Collaboration addresses growing rack-scale infrastructure demands with custom AI chiplet architecture connectivity solutions

SAN JOSE, Calif.–October 14, 2025–Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced it has joined Arm® Total Design, a comprehensive ecosystem dedicated to accelerating…

Astera Labs Showcases Rack-Scale AI Ecosystem Momentum at OCP Global Summit

Comprehensive collaborations spanning GPU, CPU, cables & connectors, ODM, software management, and IP/design & verification providers show growing support for open standards driving AI Infrastructure 2.0

SAN JOSE, Calif.—October 13, 2025—Astera Labs, Inc. (Nasdaq: ALAB), a leader in semiconductor-based connectivity solutions for rack-scale AI infrastructure, today announced…

COSMOS and OpenBMC Demo at OCP 2025

Through collaboration with ASPEED Technology and Insyde Software, Astera Labs has extended COSMOS features into production AI systems to standardize management, monitoring and diagnostics of retimers and scale-up switches with OpenBMC and DMTF Redfish APIs. COSMOS now offers:

Standard APIs for configuration, monitoring and lifecycle control
Actionable insights with granular visibility into…
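The demo summary does not spell out the exact Redfish resource layout COSMOS exposes, but because Redfish is a standard DMTF REST/JSON model, a minimal telemetry-polling client looks roughly like the sketch below. The BMC address, credentials, and the use of the standard Chassis/Sensors collections as a stand-in for retimer and switch telemetry are assumptions for illustration, not details from the announcement.

```python
# Minimal sketch of polling device telemetry over DMTF Redfish.
# BMC address, credentials, and how COSMOS actually models retimer/switch
# telemetry are placeholders, not taken from the source material.
import requests

BMC = "https://bmc.example.local"   # hypothetical OpenBMC endpoint
AUTH = ("admin", "password")        # placeholder credentials

def get(path):
    """Fetch a Redfish resource as JSON (verify=False only for lab use)."""
    r = requests.get(f"{BMC}{path}", auth=AUTH, verify=False, timeout=10)
    r.raise_for_status()
    return r.json()

# Walk the standard Chassis collection and print each chassis's sensor readings.
chassis_collection = get("/redfish/v1/Chassis")
for member in chassis_collection.get("Members", []):
    chassis = get(member["@odata.id"])
    sensors_link = chassis.get("Sensors", {}).get("@odata.id")
    if not sensors_link:
        continue
    for entry in get(sensors_link).get("Members", []):
        sensor = get(entry["@odata.id"])
        print(chassis.get("Id"), sensor.get("Name"),
              sensor.get("Reading"), sensor.get("ReadingUnits"))
```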

CXL Memory Innovation at OCP 2025

Rack-scale memory innovation is unlocked with CXL. At OCP Global Summit 2025, Astera Labs demonstrated how Leo CXL Smart Memory Controllers can eliminate AI infrastructure memory capacity and bandwidth bottlenecks:

3x concurrent LLM instances at higher AI throughput
3x increased throughput w/ CXL at higher user count
3x lower latency with CXL
Higher GPU utilization on average with…

AI Open Rack Demo at OCP 2025

As AI workloads exceed the limits of traditional servers, Astera Labs is making Open Rack Architecture a reality. Seamless integration of compute and switch platforms at rack scale is key to unlocking a unified AI engine. Together with our ecosystem partners, Astera Labs is driving this vision forward—transforming concept into real-world impact. Watch Brian Deng showcase our AI Open Rack…