The AI hype rides on people taking for granted that computational neural nets are a simplified-but-actual digital representation of something like the human brain.
-
Matthew Miller replied to young man yells at the cloud
Pretty much, yeah.
-
They don't need to what?
-
Matthew Miller replied to Matthew Miller
Look at the stunning complexity of the just-mapped fruit fly connectome https://www.nature.com/articles/d41586-024-03190-y... that doesn't even include glial cells at all!
-
They do not need to be a simplified-but-actual digital representation of the human brain. They just need to behave like one.
-
Matthew Miller replied to Remença
I'm not really sure what you're getting at here.
They clearly do not behave like human brains.
-
Matthew Miller replied to Matthew Miller
I mean, I guess you can be hyped up by whatever you want to be.
-
@mattdm In certain tasks they do, in others not yet, and they may never. But still, the aim is the same. You don't need to imitate a brain to show intelligent behavior. You only need to show intelligent behavior. How you do it does not matter.
-
It depends on the intent. If you want to accomplish specific tasks, sure. But you can't get to generalized intelligence with surface-level mimicry.
And, much of the hype is around specifically that — and, as I said, I think a lot of the acceptance of that hype is built on the assumption that we're building some kind of "digital brain".
I see this even in AI skeptics among the non-technical public.
-
@mattdm Then you only need deep mimicry. It is a matter of scale. But as for the claim that you need a brain in order to display intelligence, well, I'd even say that is an unfalsifiable statement.
-
I think you're missing my point.