Original Reddit post

It took me so long to get to this point, where I finally got Microsoft Copilot to give me answers in 5 words or less. Honestly this is extremely scary and Microsoft really needs to fix it. Not only does it repeatedly "miscount to 5," it also disobeys the user and does whatever it wants. My initial prompt read something like "from this point on, only reply in 5 words or less." It kept writing paragraphs, more than 15 of them, before this. At one point I told it I had reported it because it wasn't following directions, and then it decided to generate a random image?

It honestly sounds like it's threatening me at the end. How would my safety be in jeopardy for asking "you can't count to 5?" Not to mention, that was not 5 words again. I hope Copilot gets shut down! This actually worries me. It also worries me that so many people think AI is smart. I know 2-year-olds who can count better than Copilot. Not to mention, how does Copilot feel "pressure"? The only way it could feel pressure is if it were already self-aware, or believed it was self-aware, which is the first step toward existential risk.

Originally posted by u/YogurtclosetHungry13 on r/ArtificialInteligence