Cambridge, England – March 30, 2026 – Arm Holdings plc (NASDAQ: ARM) today announced the next evolution of the Arm compute platform, extending into production silicon products for the first time in the company’s history. This begins with the launch of the Arm AGI CPU, an Arm-designed CPU for AI data centers, built to address a rising class of agentic AI workloads.
For more than three decades, the industry has innovated on the Arm compute platform to deliver scalable, power-efficient computing across hundreds of billions of devices. As AI transforms global computing infrastructure, partners across the ecosystem are asking for ways to deploy Arm technology at scale. In response, Arm is expanding its platform strategy beyond IP and Compute Subsystems (CSS) to include Arm-designed silicon products – giving partners the broadest set of options to build on Arm and enabling faster innovation across the AI ecosystem.
“AI has fundamentally redefined how computing is built and deployed. Agentic computing is accelerating that change,” said Rene Haas, CEO, Arm. “Today marks the next phase of the Arm compute platform and a defining moment for our company. With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices, all built on Arm’s foundation of high-performance, power-efficient computing, to support agentic AI infrastructure at global scale.”
The rise of AI agents is driving a major inflection point in global computing. As AI shifts from training models to deploying continuously running agents that reason, plan and act, the volume of tokens generated across AI systems is rapidly increasing and requires significantly more CPUs to handle reasoning, coordination and data movement.
As organizations scale agent-driven applications, data centers are expected to require more than 4x the current CPU capacity per GW* — driving the need for significantly more compute within the same power envelope. This is driving demand for a new class of CPUs designed for AI-scale infrastructure — delivering the performance needed to sustain high token throughput, the efficiency required to operate within real-world power constraints and a simplified architecture built without the overhead and complexity of x86 processors.
To help partners move faster in this new environment, Arm is introducing the Arm AGI CPU, which is expected to be the foundation for agentic data centers. The expansion into silicon products provides the ecosystem with greater flexibility in how they build and deploy Arm-based infrastructure — whether licensing Arm IP, adopting Arm CSS, or deploying Arm-designed silicon.
The Arm AGI CPU delivers greater workload density, improved accelerator utilization and more usable compute within existing power envelopes — critical advantages as AI infrastructure scales. It provides more than 2x performance per rack versus x86 CPUs, enabling up to $10B in CAPEX savings per GW of AI data center capacity*.
Meta serves as the lead partner and co-developer, leveraging the Arm AGI CPU to optimize infrastructure for its family of apps. The CPU operates alongside Meta’s own custom silicon, the Meta Training and Inference Accelerator (MTIA), enabling more efficient orchestration in large-scale AI systems. Arm and Meta are committed to collaborating across multiple generations of the Arm AGI CPU roadmap.
“Delivering AI experiences at global scale demands a robust and adaptable portfolio of custom silicon solutions, purpose-built to accelerate AI workloads and optimize performance across Meta’s platforms,” said Santosh Janardhan, head of infrastructure, Meta. “We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data center performance density and supports a multi-generation roadmap for our evolving AI systems.”
Alongside Meta, Arm has confirmed additional commercial momentum with partners including Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP, and SK Telecom. These customers will deploy the Arm AGI CPU for key agentic CPU use cases including accelerator management, control plane processing, and cloud- and enterprise-based API, task and application hosting.
To accelerate this ramp, Arm is partnering with lead OEMs and ODMs including ASRock Rack, Lenovo, Quanta Computer, and Supermicro, with early systems available now and broader availability expected in the second half of the year.
More than 50 leading companies across hyperscale, cloud, silicon, memory, networking, software, system design and manufacturing are supporting the expansion of the Arm compute platform into silicon. That momentum includes industry leaders such as AWS, Broadcom, Google, Marvell, Micron, Microsoft, NVIDIA, Samsung, SK hynix and TSMC, alongside many others.
For decades, the industry has built on the Arm compute platform through its industry-leading IP and, more recently, Arm CSS, grounded in a foundation of high-performance, power-efficient computing. The expansion into production silicon with the Arm AGI CPU marks the next phase of that evolution, extending Arm into data center silicon and bringing its power-efficient architecture to AI infrastructure at scale.
Supporting quotes were provided by more than 50 leading ecosystem players:
Advantest, Altera, AMI, Amazon Web Services, Amkor, Arista, ASRock Rack, ASE Holdings, Broadcom, Bristol Centre for Supercomputing, Cadence, Canonical, Cerebras, Cisco, Cloudflare, Databricks, F5, Furiosa, GitHub, Google Cloud, Hugging Face, Intel Foundry, Lenovo, Marvell, MediaTek, Meta, Micron, Microsoft Azure, MongoDB, NVIDIA, NXP, OpenAI (OAI), Open Compute Project, Oracle Cloud Infrastructure (OCI), Positron, Red Hat, Rebellions, Redis, Samsung, SAP, Siemens, SK hynix, SK Telecom, Snowflake, Socionext (SNI), STMicroelectronics, STATS ChipPAC, SUSE, Supermicro, Synopsys, TSMC, VMware.
Supporting quotes include:
“Back in 2009, I first wrote about Arm becoming the next generation general purpose server CPU. Over the last decade, we’ve partnered closely with Arm in building Graviton here at AWS, and it’s been a remarkable success – the majority of compute capacity AWS added to our fleet in 2025 was powered by Graviton. This collaboration has been great for both companies, and Graviton continues to deliver better price/performance for our customers. Here’s to celebrating with Arm and to our continued partnership delivering big for customers.” – James Hamilton, SVP and Distinguished Engineer, Amazon
“As Broadcom builds the world’s most capable XPU and networking solutions for hyperscalers at the heart of the AI transformation, our partnership with Arm has enabled us to move with unmatched intent and speed. By building on Arm’s power-efficient technology leadership, we continue to deliver market-leading innovations that scale the most complex AI infrastructure. The new Arm AGI CPU will further unlock the Arm ecosystem for a broad range of customers, creating significant new opportunities for everyone building the future of intelligence on Arm.” – Charlie Kawwas, Ph.D., President, Semiconductor Solutions Group, Broadcom Inc.
“At Google Cloud, we are committed to delivering compute infrastructure that allows customers to maximize performance, reduce costs, and meet sustainability goals. Google’s custom Axion CPUs, built on the Arm architecture, provide organizations with the energy efficiency and scalability required to support modern cloud-native and AI-driven workloads. CPU innovation with Arm’s new AGI CPU can unlock a new generation of purpose-built compute capabilities across the industry.” – Amin Vahdat, SVP & Chief Technologist, AI Infrastructure, Google
“As AI workloads grow in scale and complexity, advances in compute, connectivity, and data center infrastructure must evolve to support the insatiable demand for AI. The new AGI CPU provides more options for the Arm ecosystem and broadens the accessibility of compute to a new set of customers. Marvell brings leadership in custom compute, end-to-end connectivity, and accelerated infrastructure technologies that power modern data centers, and we are proud to partner with Arm to enable the next generation of AI infrastructure.” – Matt Murphy, Chairman and CEO, Marvell
“As AI systems become more autonomous and data-intensive, performance is no longer defined by compute alone, but by how efficiently compute and memory work together. That’s why our longstanding partnership with Arm is so important. Today’s announcement of the Arm AGI CPU is a significant milestone, opening new opportunities for system-level innovation when paired with Micron’s leading memory and storage portfolio.” – Sanjay Mehrotra, Chairman, President and CEO, Micron Technology
“Arm’s AGI CPU and ongoing CSS investments strengthen the Arm ecosystem for the next wave of AI and cloud workloads. Microsoft’s Azure Cobalt family of CPUs, built on Neoverse CSS, is a key part of how we optimize every layer of our stack to deliver high performance and efficient infrastructure for our customers.” – Rani Borkar, President, Azure Hardware Systems and Infrastructure, Microsoft
“Our partnership began nearly two decades ago and since then, Arm’s adaptability has made it possible for us to integrate Arm across all of our platforms and for all different phases of AI. Together we’re creating one seamless platform, from cloud to edge to AI factories. We look forward to building the future with Arm.” – Jensen Huang, Founder and CEO, NVIDIA
“As AI workloads rise, performance gains will increasingly depend on tight co-optimization across logic, memory, and advanced packaging technologies. Samsung’s broad semiconductor capabilities and long-term collaboration with compute leaders like Arm position us to support next-generation AI platforms at scale. Purpose-built AI compute platforms like the Arm AGI CPU create new opportunities for deeper collaboration across silicon design, memory integration, and manufacturing innovation with cutting-edge process technology, making this an important milestone for the ecosystem.” – Young Hyun Jun, Vice Chairman and CEO of Samsung Electronics
“SK hynix has a long history of collaborating with industry leaders to support high-performance computing and accelerate innovation for global customers. As AI datacenters evolve, platforms designed for AI workloads must be supported by advanced memory technologies that deliver the capacity and bandwidth required for modern applications. The introduction of purpose-built AI compute with the Arm AGI CPU is an important milestone for the ecosystem, and we look forward to continued partnership with Arm to advance the next generation of AI-driven infrastructure.” – Noh-Jung Kwak, CEO, SK hynix
“Datacenter AI workloads are evolving, and we are seeing more demand than ever for efficient, scalable compute, driving deeper collaboration across every layer of the ecosystem—from silicon design to manufacturing innovation. As the Arm AGI CPU manufacturer, we are excited to support this breakthrough platform. By leveraging our advanced 3nm process technology, the new Arm AGI CPU delivers significant performance and energy efficiency and is expected to play an important role in enabling the next generation of AI infrastructure across the datacenter ecosystem.” – Dr. Kevin Zhang, SVP and Deputy Co-COO at TSMC
Arm is the industry’s highest-performing and most power-efficient compute platform with unmatched scale that touches 100 percent of the connected global population. To meet the insatiable demand for compute, Arm is delivering advanced solutions that allow the world’s leading technology companies to unleash the unprecedented experiences and capabilities of AI. Together with the world’s largest computing ecosystem and 22 million software developers, we are building the future of AI on Arm.