NeuReality Boosts AI Accelerator Utilization With NAPU
By Sally Ward-Foxton, EETimes (April 4, 2024)
Startup NeuReality wants to replace the host CPU in data center AI inference systems with dedicated silicon that can cut total cost of ownership and power consumption. The Israeli startup has developed a class of chip it calls the network addressable processing unit (NAPU), which implements typical host-CPU functions, such as the hypervisor, directly in hardware. NeuReality's aim is to increase AI accelerator utilization by removing bottlenecks caused by today's host CPUs.
NeuReality CEO Moshe Tanach told EE Times its NAPU enables 100% utilization of AI accelerators.
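The utilization argument can be illustrated with a back-of-envelope model: if the host CPU can only pre-process and dispatch requests at a fraction of the accelerator's throughput, the accelerator idles for the remainder. A minimal sketch, with all throughput figures hypothetical (the article gives no specific numbers):

```python
# Hedged illustration (all numbers hypothetical): how a host-CPU dispatch
# bottleneck depresses AI-accelerator utilization, the problem the NAPU targets.

def accelerator_utilization(dispatch_rate: float, accel_capacity: float) -> float:
    """Fraction of accelerator capacity the host pipeline can keep fed."""
    return min(1.0, dispatch_rate / accel_capacity)

# Suppose the accelerator can process 10,000 inferences/s, but the host CPU
# can only pre/post-process and dispatch 4,000 requests/s.
u_cpu_hosted = accelerator_utilization(4_000, 10_000)   # 0.4

# With host functions offloaded to dedicated hardware (the NAPU claim),
# dispatch keeps pace with the accelerator.
u_napu = accelerator_utilization(12_000, 10_000)        # 1.0

# At fixed hardware cost, cost per inference scales inversely with utilization.
relative_cost_reduction = 1 - u_cpu_hosted / u_napu     # 0.6
```

Under these assumed numbers, idle accelerator time accounts for 60% of the cost per inference, which is the kind of gap NeuReality's TCO claim rests on.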