Original Reddit post

As we all know, all of the popular LLM-based AIs are trained on the corpus of human writing, gleaned from masses of books, blog posts, social networks, etc. But this may be their downfall: they remove the incentive for humans to keep producing that content. For example, a massive amount of content has been written in the last couple of decades for SEO - things like "how to" blog posts. That will stop now, because anyone can generate their own blog post, so there's no benefit to humans writing new content. Similarly with art, literature, and music: why bother creating new work if it's all being undermined by AI? But no new content = no new AI. And we all know the dangers of AI training on its own output. Has this risk been considered by the major AI companies?

Originally posted by u/AlephMartian on r/ArtificialInteligence