4-/8-bit mixed-precision NPU IP
ENLIGHT, a high-performance neural network processor IP, features a highly optimized network model compiler that reduces DRAM traffic from intermediate activation data through grouped layer partitioning and scheduling. It also supports load-balancing partitioning for multi-core NPU configurations. With the industry's first adoption of 4-/8-bit mixed quantization, ENLIGHT is easy to customize to different core sizes and performance points for the target market applications, achieving significant efficiencies in area, power, performance, and DRAM bandwidth.
A production-proven IP, ENLIGHT has been licensed for a wide range of applications, including IP cameras, IoT devices, ADAS, and more.
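The 4-/8-bit mixed-quantization idea can be illustrated with a minimal sketch: quantize each layer's weights symmetrically, and keep the 4-bit representation only where the reconstruction error stays within a budget, falling back to 8-bit otherwise. This is a generic illustration, not ENLIGHT's actual compiler algorithm; the function names and the 5% error budget are assumptions for the example.

```python
def quantize_symmetric(values, bits):
    """Symmetric linear quantization of a float list to signed `bits`-bit integers."""
    qmax = 2 ** (bits - 1) - 1                            # 127 for 8-bit, 7 for 4-bit
    scale = (max(abs(v) for v in values) / qmax) or 1.0   # guard against all-zero input
    quantized = [max(-qmax - 1, min(qmax, round(v / scale))) for v in values]
    return quantized, scale

def dequantize(quantized, scale):
    """Map quantized integers back to approximate float values."""
    return [q * scale for q in quantized]

def pick_precision(layer_weights, error_budget=0.05):
    """Choose 4-bit when the 4-bit reconstruction error fits the budget, else 8-bit.

    The 5% relative-error budget is an illustrative assumption, not a vendor value.
    """
    q4, s4 = quantize_symmetric(layer_weights, 4)
    err = max(abs(w - d) for w, d in zip(layer_weights, dequantize(q4, s4)))
    rng = max(abs(w) for w in layer_weights) or 1.0
    return 4 if err / rng <= error_budget else 8
```

In a real mixed-precision compiler the per-layer bit-width decision would also account for accuracy impact on the end-to-end network, but the error-budget test above captures the basic trade-off: layers that tolerate coarse 4-bit grids halve their weight storage and DRAM bandwidth relative to 8-bit.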
Block Diagram of the 4-/8-bit mixed-precision NPU IP Core

NPU IP
- Edge AI/ML accelerator (NPU)
- Neural network processor designed for edge devices
- Edge AI Accelerator NNE 1.0
- Neural Network Processor for Intelligent Vision, Voice, Natural Language Processing
- AI accelerator - 4.5K, 9K, or 18K INT8 MACs, 16 to 32 TOPS
- AI accelerator - 36K or 54K INT8 MACs, 32 to 128 TOPS