Original Reddit post

Last Tuesday at a ProductHunt event, a speaker said: “Don’t think AI can actually think. It’s just a neural network picking the right sequence of words.” That’s the third person this week saying the exact same thing. Like a mantra.

But then I sat down and thought: what is my brain doing right now, as I’m writing this? Neurons firing in patterns. Pulling relevant info from memory. Stringing words together one by one. I don’t even “think” this sentence in advance. I’m generating it on the fly, word by word, based on context. So literally: picking the right sequence of words.

Now flip the argument: “What can a bag of meat with electrical signals think? It’s just picking words.” Sounds just as dismissive. And just as technically accurate.

I’m not saying AI thinks. I’m questioning the whole concept of “thinking.” We’ve always believed there’s a magic line between the human mind and everything else. It used to be the “soul.” Then “consciousness.” Now it’s “understanding” vs “just picking words.” Every generation invents a new way to say “we’re special, and it’s not.”

But what if the difference between us and a neural network isn’t in kind, but in degree? An ant processes information. A dog processes more. A human even more. An LLM does it differently, made of different stuff, but on the same spectrum.

And the phrase “it’s just picking words” doesn’t explain anything. It comforts. Like “the earth is the center of the universe.” Made perfect sense, felt right, and might be wrong.

The most uncomfortable question: if the mind is just information processing of sufficient complexity, what makes our version “real”? The material? That it’s wet and carbon-based instead of silicon?

Maybe we’re not as special as we’d like to believe. And maybe AI isn’t as simple as we’d like to think. The one thing I know for sure: “it’s just picking words” isn’t an answer. It’s a refusal to think.

Originally posted by u/Silver-Plankton8608 on r/ArtificialInteligence