The algorithms that underlie everything from Alexa’s voice recognition to credit card fraud detection typically owe their skills to deep learning, in which the software learns to perform specific tasks by churning through vast databases of examples.
These programs… don’t organize and process information the way human brains do, and they fall short when it comes to the versatile smarts needed for fully autonomous robots, for example.
In place of standard computing architecture, which processes information sequentially, neuromorphic chips emulate the way our brains process information, with myriad digital neurons working in parallel to send electrical impulses, or spikes, to networks of other neurons.
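The spiking behavior described above can be sketched with a leaky integrate-and-fire model, a textbook abstraction of a spiking neuron (this is an illustrative simplification, not Intel's actual neuron circuit; the function name and parameter values below are invented for the example):

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron over a sequence of
    input currents; return the time steps at which it spikes.

    Illustrative parameters only -- real neuromorphic hardware runs
    many such neurons in parallel and routes spikes between them.
    """
    potential = 0.0
    spikes = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # leak a little, then integrate input
        if potential >= threshold:              # threshold crossed: fire a spike
            spikes.append(t)
            potential = 0.0                     # reset after spiking
    return spikes

# A steady input drive makes the neuron fire periodically:
print(lif_neuron([0.4] * 10))  # → [2, 5, 8]
```

Rather than holding a continuous value like a unit in a conventional neural network, the neuron communicates only through the timing of these discrete spikes, which is what lets neuromorphic hardware stay idle (and save power) when nothing is happening.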
…
Intel released Loihi 2, the second generation of its neuromorphic chip. It packs in 1 million artificial neurons, six times more than its predecessor, connected to one another through 120 million synapses.
Other companies, such as BrainChip and SynSense, have also recently rolled out new neuromorphic hardware, with chips that speed tasks such as computer vision and audio processing.
Neuromorphic computing “is going to be a rock star,” says Thomas Cleland, a neurobiologist at Cornell University. “It won’t do everything better. But it will completely own a fraction of the field of computing.”