Behavioral timescale synaptic plasticity rewires the brain after an experience (quantamagazine.org)

121 points by ibobev 2 days ago

danwills 34 minutes ago

Only partway through the article but it was a small shock when the word 'rodent' turned up unexpectedly:

"...later, if the rodent reenters that place, the cell will fire"

Totally fair and normal, of course; I just had been imagining human or generic neurons/dendrites up to that point. The test species wasn't mentioned earlier, as far as I can see!

largbae 7 hours ago

It seems obvious that a humanoid robot system or other truly general-purpose AI will need a stack of model types that work in concert. An LLM could be analogous to the conscious part of our brains, while many smaller, frequently updated models might provide "muscle memory" and reflexes.

If that becomes the case, then similarly built humanoid robots might have differentiated capabilities depending on their experience, just like us.
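The split described above can be sketched in code. This is purely illustrative, assuming a hypothetical two-tier agent: a slow, expensive "deliberator" (the LLM in the analogy) and a tiny reflex model that is cheap to query and updated online after each experience. All class and method names here are invented for the sketch.

```python
class ReflexModel:
    """Stand-in for 'muscle memory': a stimulus -> response lookup
    that can be updated cheaply after every experience."""
    def __init__(self):
        self.policy = {}

    def react(self, stimulus):
        # None means no reflex has been learned for this stimulus yet.
        return self.policy.get(stimulus)

    def update(self, stimulus, response):
        self.policy[stimulus] = response  # frequent, low-cost update


class Deliberator:
    """Stand-in for the slow, expensive 'conscious' model.
    Here it's just a rule; the routing interface is the point."""
    def decide(self, stimulus):
        return f"plan-for-{stimulus}"


class Agent:
    """Route to the fast path when a reflex exists; otherwise fall back
    to the slow path and cache its answer as a new reflex."""
    def __init__(self):
        self.reflex = ReflexModel()
        self.brain = Deliberator()

    def act(self, stimulus):
        response = self.reflex.react(stimulus)
        if response is None:
            response = self.brain.decide(stimulus)  # expensive path
            self.reflex.update(stimulus, response)  # "muscle memory" forms
        return response


agent = Agent()
first = agent.act("hot-stove")   # slow path, result cached as a reflex
second = agent.act("hot-stove")  # fast reflex path
print(first == second)           # True: the reflex mirrors the deliberation
```

In this toy version, two such agents with different histories would end up with different reflex tables, which is the "differentiated capabilities depending on experience" idea in miniature.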

harrall 3 hours ago

An LLM is more like the unconscious part of my brain. It’s my gut. It shits out answers using an ungodly amount of parallel processing and it’s often right.

But it also hallucinates thoughts and beliefs, and that's where the conscious parts have to intervene.

But the conscious parts are expensive to run and I can’t multi-task that.

The conscious parts also degrade first when I don’t get enough sleep.

balamatom 3 hours ago

OK AI user.

Did it really take someone else externalizing the mechanics of cognition into a machine for you to notice them and become interested in them?

And then to remain focused on the machine that you see, rather than the machine that you are.

Pitiful.