In some ways, we do work like computers, relying on distributed networks of firing neurons.
…
“I also think that our brains use far different learning algorithms than our current deep learning systems,” said Jay McClelland, a noted cognitive scientist at Stanford University. “I’m taking inspiration from a fairly recently discovered form of learning called Behavioral Time Scale Synaptic Plasticity [BTSP] to think about how our brains might have come to be able to learn with far less training data than we currently need to train contemporary AI systems.”
This Hebbian picture is the basic understanding that underlies the deep learning models used in AI. But the authors of the Cell study propose that in human brains, what’s actually going on is a different form of synaptic plasticity, BTSP, which requires far fewer neuron firings to create a memory; in fact, a single “event” may be enough to produce learning. BTSP works well because it doesn’t need the kind of repeated overlap that Hebbian-style plasticity, the kind used in AI, requires.
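To make the contrast concrete, here is a toy sketch in Python. It is not taken from the Cell study or from McClelland’s work; every name and parameter (the learning rate `eta`, the `plateau` signal, the update rules themselves) is an illustrative assumption. It only shows the qualitative difference: a Hebbian-style rule accumulates an association over many repeated co-activations, while a BTSP-style rule makes a large weight change in a single gated event.

```python
import numpy as np

rng = np.random.default_rng(0)

pre = rng.random(8)   # presynaptic activity for one toy input pattern
post = rng.random(8)  # postsynaptic activity (same pattern each trial)

# Hebbian-style rule: each co-activation nudges the weights a little,
# so building a strong association takes many overlapping firings.
eta = 0.01             # illustrative learning rate
w_hebb = np.zeros(8)
for _ in range(500):   # hundreds of repeated trials
    w_hebb += eta * pre * post

# BTSP-style rule (toy version): a single dendritic "plateau" event
# gates a large one-shot weight change driven by recent presynaptic
# activity, with no need for repeated pre/post overlap.
plateau = 1.0            # one event is enough
w_btsp = plateau * pre   # weights set in a single trial

print("Hebbian weights after 500 trials:", np.round(w_hebb, 2))
print("BTSP-style weights after 1 trial:", np.round(w_btsp, 2))
```

Both runs end up with comparable weights, but the BTSP-style update gets there in one trial rather than hundreds, which is the data-efficiency gap the quote above is pointing at.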