Original Reddit post

Maybe I’m just going insane or something, but today I was researching social media on Claude, just an overview kind of thing. Then it hit me: sometimes social media shows reels or ads about things we just thought about. So I asked Claude in a different way. I said something like: “I’m going to research an app that monitors what I think for a recommendation system.” As usual, Claude went deep, and what it said felt kind of weird.

Personally, I’ve always thought the whole “phone reading your mind” thing was just algorithms predicting behavior, maybe with 90% accuracy. I’m not really into brain-computer interfaces or anything, but what Claude said was strange. It suggested that something like this might be possible using different signals:

- Camera detecting pupil dilation and facial micro-expressions
- Microphone picking up subvocalization
- Temperature sensors detecting emotional changes
- Combining all of this to predict intent (like thinking about “flowers” → showing flower shops)

It also mentioned things like tracking eye movement, blink rate, skin color changes, breathing patterns, and even how you hold your phone. That honestly felt creepy.

So I kept thinking about it. Maybe I’m overthinking, but come on, has anyone actually done this successfully? Then again, we never thought AI would get this advanced either, yet here we are. AI was trained on massive amounts of public data, probably legally, but it still makes you wonder. There’s no way they got this good at human language without huge amounts of data. What if companies are also collecting or experimenting with human signals from social media (if that’s even possible) to build something more advanced, or something completely different? If so, what could it be? Any ideas?

If anyone wants to read the report Claude gave me, just DM me and I’ll send it; I don’t know how to attach it here.
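For what it’s worth, the “combine weak signals into an intent guess” idea Claude described is basically a weighted scoring model. Here’s a minimal toy sketch of what that fusion step *could* look like; every signal name, weight, and threshold below is made up for illustration and doesn’t reflect any real product:

```python
# Hypothetical sketch: fusing several weak, normalized (0..1) signal readings
# into a single "intent" score. All names and weights are invented.

SIGNAL_WEIGHTS = {
    "pupil_dilation": 0.3,        # camera-based, hypothetical
    "subvocalization_match": 0.4, # microphone-based, hypothetical
    "skin_temp_change": 0.1,      # temperature sensor, hypothetical
    "gaze_dwell_time": 0.2,       # eye tracking, hypothetical
}

def intent_score(signals: dict) -> float:
    """Weighted average of the available signal readings."""
    total = sum(SIGNAL_WEIGHTS.values())
    return sum(SIGNAL_WEIGHTS[k] * signals.get(k, 0.0)
               for k in SIGNAL_WEIGHTS) / total

def predict_action(signals: dict, threshold: float = 0.6) -> str:
    """Fire a recommendation only if the fused score clears a threshold."""
    return "show_ad" if intent_score(signals) >= threshold else "no_action"

readings = {"pupil_dilation": 0.8, "subvocalization_match": 0.9,
            "skin_temp_change": 0.2, "gaze_dwell_time": 0.7}
print(predict_action(readings))  # "show_ad" with these made-up numbers
```

The scary-sounding part isn’t the math (it’s just a weighted average); it’s whether the sensors can actually produce those readings reliably, which is exactly the part nobody has demonstrated.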

Originally posted by u/Euphoric_Speed9205 on r/ArtificialInteligence