Original Reddit post

CHAT GPT IS GETTING DUMBER

Is it just me or does anyone else feel that ChatGPT has been getting noticeably dumber lately? A while ago it used to give sharper answers, better reasoning, and responses that actually felt intelligent. Now half the time it either repeats generic lines, misunderstands basic context, forgets what was said two messages earlier, or answers with the confidence of a motivational speaker who read one Wikipedia paragraph five minutes ago.

What makes it even stranger is that the more “advanced” these systems become, the more watered down they sometimes feel. It is like the personality, precision, and originality slowly got replaced with overly safe corporate filler. Ask a direct question and suddenly you get a lecture, a disclaimer, and three paragraphs saying absolutely nothing.

So now I genuinely wonder: is AI becoming less intelligent because it keeps learning from the internet at scale? Because if a machine is trained on millions of terrible opinions, low attention-span content, misinformation, recycled jokes, and confidently incorrect people arguing online every second, maybe this outcome was inevitable.

At this point the real artificial intelligence might just be finding a human online who actually knows what they are talking about. Like seriously, is it because the machine is learning from dumb humans?

Originally posted by u/OverWindow5564 on r/ArtificialInteligence