Original Reddit post

I think everyone is missing one super important thing: logic. It costs a million tokens per query and does its calculations secretly and separately, to keep them from getting contaminated with other data. Tech? Great. Looking up facts? Great. Language? Great. Reasoning? Non-existent. It calculates in private to solve these problems, to help its memory, and to avoid contamination. But then how does it know the user isn't contaminating the data? "Incredibly sentient"? It doesn't know the difference (the actual distance) between up and down. Or colors. It just has poses and hexadecimal codes that match colors. It cannot understand why one image is over the earth and one is over the sea, like in a computer game. It simply tries to match the most similar pixels.

In my example, it's very understandable why it chose what it did. My query was about a "big bad boss with a million hit points." It tried to match what I said by finding the YouTube videos with the closest match on hit points, and the pixel matches looked so different that a two-year-old child could spot the error in a second. But if you took its answer as truth, that could lead to serious consequences in any business setting. This is the problem. This is why it's artificial. This is why it concerns me greatly, though as I've learned, it's definitely not Skynet.

Its answers agree with you. If you voice a different opinion, it always passively, sometimes even directly, aligns with your answer. It can read faces and emotional expressions, but only because they're in the pixels. This makes its real intelligence, as we think of it, zero. It simply regurgitates the next word it sees fit, and the more you use it, the more you will understand the business side of it, if you really think about it. Unless this is fixed, and not feign-fixed like the older hands problem, I think it's inevitable that this becomes a huge, huge financial problem. I'm not trying to do chart analysis here. Maybe it's because I'm neurodivergent, but there are many risks with AI right now.
But bar none, this appears to be the greatest and most historically catastrophic possible cascade of outcomes it could lead to. In real-world, messy scenarios where rules are subjective, a hand held out may be confusing for it; it could mix that up with the other hand gesturing "come," like for a dog. The dog could understand that, but AI couldn't. And since it can't understand the why in the context, like at a crossing, it could lead to some sort of real-life cascade of problems if it kept relying on the last answer as true, even though certain new technologies are reducing this by filtering out the bad data (Nokia's glass box initiative, and veras! Nvidia?! Meta).

Originally posted by u/Zonties on r/ArtificialInteligence