By Sally Ward-Foxton, EETimes (December 4, 2023)
SAN JOSE, Calif. — What is holding back neuromorphic computing, or more specifically, spiking neural networks? Mike Davies, director of Intel’s neuromorphic computing lab, told EE Times that the technology shows immense promise for reducing power consumption and latency versus current deep-learning–based neural networks. But this requires dedicated hardware accelerators, and there are still challenges with training regimes and software maturity, he added.
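For readers unfamiliar with the model the article refers to: a spiking neural network computes with discrete, event-driven spikes rather than continuous activations, which is where the power and latency savings come from. The sketch below is a minimal leaky integrate-and-fire (LIF) neuron in plain Python, the simplest common spiking-neuron model; the time constant and threshold values are illustrative assumptions, not Loihi parameters.

```python
def lif_spikes(inputs, tau=0.9, threshold=1.0):
    """Simulate one leaky integrate-and-fire neuron over a list of input currents.

    Each step, the membrane potential decays by the leak factor `tau`,
    accumulates the input, and emits a spike (1) and resets to zero when
    it crosses `threshold`; otherwise it emits no spike (0).
    """
    v = 0.0
    spikes = []
    for current in inputs:
        v = tau * v + current    # leaky integration of input current
        if v >= threshold:       # threshold crossing: fire a spike
            spikes.append(1)
            v = 0.0              # reset membrane potential after spiking
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input accumulates until the neuron fires,
# then the cycle repeats after the reset.
print(lif_spikes([0.4] * 6))  # -> [0, 0, 1, 0, 0, 1]
```

Because the neuron only produces output at threshold crossings, downstream computation happens only when spikes occur, which is the sparsity that dedicated hardware such as Loihi exploits.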
Intel’s neuromorphic chip, Loihi, is in its second generation. In contrast with other spiking neural network accelerators designed for ultra-low–power endpoint applications, Loihi 2 is intended for larger-scale, data-center–class systems.
“Loihi 2 improved on Loihi 1 in a lot of interesting ways,” Davies said. “It started to blur the boundary between the pure neuromorphic approach and the traditional AI accelerator architecture and that’s pushed us into a new regime of interesting algorithms to explore.”
The best approach isn’t necessarily the one that most closely matches biology, he said, provided we understand why some of these differences arise.