Original Reddit post

… and why is it used so often as a supreme milestone, and that type of shit? We’re barely keeping ourselves under control, as humans. What do you expect to happen when AI surpasses human intelligence? (won’t happen, but keep on hoping lol) All this hype sounds like a poorly written bedtime story, nothing more than that. LLMs were initially presented as some supreme artificial intelligence, when in fact they’re just glorified chatbots and quick bullshit generators. They’re language models — of course language is their strongest point. That’s how they’re supposed to work. All I’m saying is don’t get too excited about these. They are tools and they should be treated that way, not like some monolith from 2001: A Space Odyssey that gives you the answers to life.

Originally posted by u/ionitaxbogdan on r/ArtificialInteligence