- cross-posted to:
- technology@lemmy.ml
We are constantly fed a version of AI that looks, sounds and acts suspiciously like us. It speaks in polished sentences, mimics emotions, expresses curiosity, claims to feel compassion, even dabbles in what it calls creativity.
But what we call AI today is nothing more than a statistical machine: a digital parrot regurgitating patterns mined from oceans of human data (the situation hasn’t changed much since it was discussed here five years ago). When it writes an answer to a question, it literally just guesses which token (a word or word fragment) will come next in a sequence, based on the data it’s been trained on.
This means AI has no understanding. No consciousness. No knowledge in any real, human sense. Just pure probability-driven, engineered brilliance — nothing more, and nothing less.
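The "guess the next word from counted patterns" idea the article describes can be sketched in a few lines. This is a deliberately toy bigram model, not how production LLMs work (those use learned neural weights over subword tokens), but it shows the underlying probabilistic mechanism: count what tends to follow what, then emit the most likely continuation. The corpus and function names here are illustrative, not from any real system.

```python
from collections import Counter, defaultdict

# Toy training corpus: a model only "knows" patterns it has seen.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Count, for each word, which words follow it and how often.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word):
    """Return the most frequently observed next word, or None if unseen."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict("sat"))  # "on" — the only word ever seen after "sat"
print(predict("on"))   # "the" — "on the" occurs twice in the corpus
```

Note there is no meaning anywhere in this: `predict("sat")` returns "on" purely because that pairing occurred in the training data, which is the article's point scaled down to a dozen words.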
So why is a real “thinking” AI likely impossible? Because it’s bodiless. It has no senses, no flesh, no nerves, no pain, no pleasure. It doesn’t hunger, desire or fear. And because there is no cognition — not a shred — there’s a fundamental gap between the data it consumes (data born out of human feelings and experience) and what it can do with them.
Philosopher David Chalmers calls the mysterious mechanism underlying the relationship between our physical body and consciousness the “hard problem of consciousness”. Eminent scientists have recently hypothesised that consciousness actually emerges from the integration of internal, mental states with sensory representations (such as changes in heart rate, sweating and much more).
Given the paramount importance of the human senses and emotions for consciousness to “happen”, there is a profound and probably irreconcilable disconnect between general AI (a machine) and consciousness (a human phenomenon).
Philosophers are so desperate for humans to be special. How is outputting things based on things it has learned any different to what humans do?
We observe things, we learn things and when required we do or say things based on the things we observed and learned. That’s exactly what the AI is doing.
I don’t think we have achieved “AGI” but I do think this argument is stupid.
Humans are not probabilistic, predictive chat models. If you think reasoning is taking a series of inputs, and then echoing the most common of those as output then you mustn’t reason well or often.
If you were born during the first industrial revolution, then you’d think the mind was a complicated machine. People seem to always anthropomorphize inventions of the era.
This is great
Do you think most people reason well?
The answer is why AI is so convincing.
I think people are easily fooled. I mean look at the president.
When you typed this response, you were acting as a probabilistic, predictive chat model. You predicted the most likely effective sequence of words to convey ideas. You did this using very different circuitry, but the underlying strategy was the same.
I wasn’t, and that wasn’t my process at all. Go touch grass.
Yes, the first step to determining that AI has no capability for cognition is apparently to admit that neither you nor anyone else has any real understanding of what cognition* is or how it can possibly arise from purely mechanistic computation (either with carbon or with silicon).
*Or “sentience” or whatever other term is used to describe the same concept.
Given? Given by what? Fiction in which robots can’t comprehend the human concept called “love”?
No it’s really not at all the same. Humans don’t think according to the probabilities of what is the likely best next word.
No, you think according to the chemicals and proteins floating around your head. You don’t even know the decisions you’re making when you make them.
https://www.unsw.edu.au/newsroom/news/2019/03/our-brains-reveal-our-choices-before-were-even-aware-of-them--st
You’re a meat based copy machine with a built in justification box.
How could you have a conversation about anything without the ability to predict the word most likely to be best?
Most people, evidently including you, can only ever recycle old ideas. Like modern “AI”. Some of us can conceive new ideas.