Original Reddit post

So, ChatGPT is now broken (at least in France). Every single response now ends with a generic "I can also show you…" followed by a bulleted list of suggestions. For example, if I ask for a recipe, it gives me one, then ends with "…But if you want the REAL TOP NOTCH ONE, I'll happily give it to you too". Why not do that in the first place?

I ran a test with 5 identical prompts across ChatGPT, Claude, and Gemini. ChatGPT proposed a follow-up 100% of the time while teasing the "good" content. Gemini proposed a follow-up 100% of the time, but without withholding information. Claude almost never does it.

Of course, this has nothing to do with the fact that Sam Altman is pivoting to an ad-supported model, and has been poaching Meta people for the past year. It's textbook enshittification. They are conditioning us to click on "sponsored follow-up questions".

Originally posted by u/Mat_Halluworld on r/ArtificialInteligence