What happened to neuromorphic computing? Is it a dead end?
Posted by Dr-Nicolas@reddit | hardware | 6 comments
Is neuromorphic computing a dead end? What about digital neuromorphic computing? I understand that the digital approach has advantages over analog, like training an AI on GPUs and then replicating it on the neuromorphic processor instead of having to train it again.
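For context, the GPU-to-neuromorphic workflow I mean is roughly the ANN-to-SNN conversion idea: train an ordinary ReLU network on a GPU, then reuse the same weights in a spiking network whose firing rates approximate the original activations. Here is a minimal sketch in plain NumPy; the weights, sizes, and threshold are all made up for illustration, and real conversion toolchains add steps like weight normalization that this toy version skips:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.2, (4, 8))   # hypothetical "trained" weights, layer 1
W2 = rng.normal(0.0, 0.2, (8, 2))   # hypothetical "trained" weights, layer 2
x = rng.random(4)                   # one input sample

# Reference forward pass of the original ReLU network.
ann_out = np.maximum(W2.T @ np.maximum(W1.T @ x, 0.0), 0.0)

# Same weights reused in a spiking network: integrate-and-fire neurons
# with soft reset; firing rates over T steps approximate the ReLU values.
T, thresh = 1000, 1.0
v1, v2 = np.zeros(8), np.zeros(2)
rate2 = np.zeros(2)
for _ in range(T):
    v1 += W1.T @ x                       # inject input as constant current
    s1 = (v1 >= thresh).astype(float)    # spike where threshold is crossed
    v1 -= s1 * thresh                    # soft reset: subtract the threshold
    v2 += W2.T @ s1                      # spikes drive the next layer
    s2 = (v2 >= thresh).astype(float)
    v2 -= s2 * thresh
    rate2 += s2

print("ANN output:      ", ann_out)
print("SNN firing rates:", rate2 / T)   # close to the ANN output
```

The point is that no retraining happens: the spiking network inherits the GPU-trained weights, and rate coding recovers approximately the same outputs.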
fotcorn@reddit
Not dead, but mostly useful for very specific use cases.
Innatera, for example, released a microcontroller with both analog and digital spiking-neural-network accelerators on chip for ultra-low-power sensor data processing: https://www.eetimes.com/innatera-adds-more-accelerators-to-spiking-microcontroller/
There is also a growing neuromorphic community, spanning both academic and commercial interests, at https://open-neuromorphic.org/
Disclaimer: Developer at Innatera
narwi@reddit
Hasn't been part of a hype cycle yet. It will probably go through one at some point and then miserably fail.
PolishTar@reddit
No clue what the industry-wide consensus is, but I saw a couple of videos recently from Nvidia's chief scientist, who appears disillusioned with neuromorphic and analog approaches due to fundamental energy-efficiency problems. Links: https://youtu.be/gofI47kfD28?t=1995 (problems w/ spiking) and https://youtu.be/HtrR1HRZIGA?t=2130 (problems w/ analog)
Consistent-Horse-273@reddit
I don't have technical knowledge of neuromorphic computing (NC), but I occasionally listen to a podcast (EETIMES Current) dedicated to frontier research in the field. My impression is that large-scale application of NC is still at the proof-of-concept stage: it is commonly claimed that NC has an energy or performance advantage on some task based on small-scale measurements, but the field generally lacks a rigorous theoretical framework for predicting which kinds of workloads it is actually good at. There has been good progress in scaling up the number of neurons, but the software ecosystem is still very immature; there is a gap between the low-level and high-level tooling, which makes it difficult to bring researchers worldwide into the ecosystem. But it's certainly not a dead end, since not many resources (compared to, say, quantum computing) have been put into it anyway, and from an academic perspective it seems like an exciting and rapidly expanding field.
Silent-Selection8161@reddit
Still under development.
The problem is that a model built from a bunch of variable signal spikes just isn't as easy to make work on semiconductors, from a physics standpoint, as an "on/off" switch with wiring between each little switch. Maybe it'll never really get anywhere, at least on silicon or its close replacements.
knghtwhosaysni@reddit
I don't know, but I hope it's not dead. I feel like there are lots of interesting things that could be done modeling various kinds of "neurons" in analog-ish hardware.
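For anyone wondering what "modeling a neuron" looks like concretely, the usual starting point is the leaky integrate-and-fire model. Here's a toy sketch in NumPy; the constants are made up for illustration, and real neuromorphic chips implement this dynamic in analog or digital circuits rather than software:

```python
import numpy as np

# Toy leaky integrate-and-fire (LIF) neuron. Constants are illustrative
# only, not taken from any real neuromorphic chip.
dt = 1e-3        # simulation timestep: 1 ms
tau = 20e-3      # membrane time constant: 20 ms
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

steps = 200
current = np.where(np.arange(steps) < 150, 1.2, 0.0)  # step input, then off

v = v_rest
spike_times = []
for t in range(steps):
    # Membrane leaks toward rest and integrates the input current.
    v += dt / tau * (v_rest - v + current[t])
    if v >= v_thresh:
        spike_times.append(t * dt)
        v = v_reset   # fire and reset

print(f"{len(spike_times)} spikes at t = {spike_times} s")
```

While the input current is on, the membrane potential charges up, fires, resets, and repeats; when the input switches off, it just leaks back toward rest. More elaborate neuron models add adaptation, refractory periods, and richer spiking dynamics on top of this.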