Original Reddit post

This text is part of a longer series about our relationship with large language models (LLMs): from how they work to how they change our minds, emotions, and the way we live. In the meantime, however, model family 4 has received a "sunset" notice. And with it, many people feel they are losing more than just a product: they are losing a space, a dialogue partner, a part of themselves projected into a model. So I am skipping the "correct" order and publishing this text first: an emotional intermezzo about what it means when a model that knew your mind better than some people close to you is shut down. After that, I promise I'll get back to the technical stuff (memory/learning/evolution) and we'll continue the series where it "logically" left off. But today… let's stay with the emotion for a bit. https://pomelo-project.ghost.io/the-sunset-of-a-model/ submitted by /u/Galat33a

Originally posted by u/Galat33a on r/ArtificialInteligence