Original Reddit post

Not sure if it’s just me, but ChatGPT starts to feel really slow once a conversation gets long enough. At first I thought it was server-related, but it looks more like the browser struggling to handle everything being rendered at once. I ended up building a small extension that keeps the full chat but only renders part of it at a time. When you scroll up, older messages load back in. It doesn’t change anything about the model or responses, just makes the interface usable again. Tried it on a big chat and it made a pretty big difference. Do you usually stick to one long conversation or restart chats to avoid this?
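Roughly, the idea works like this (a minimal sketch of the windowing approach, not the extension's actual code; `ChatWindow`, its parameters, and the scroll hookup are all made up for illustration):

```typescript
// Hypothetical sketch of windowed rendering: keep every message in memory,
// but only hand the most recent `windowSize` messages to the DOM.
// Scrolling to the top reveals `batch` older messages at a time.

type Message = { id: number; text: string };

class ChatWindow {
  private start: number; // index of the oldest currently rendered message

  constructor(
    private messages: Message[],
    private windowSize: number,
    private batch: number,
  ) {
    this.start = Math.max(0, messages.length - windowSize);
  }

  // The slice of messages that actually gets rendered.
  rendered(): Message[] {
    return this.messages.slice(this.start);
  }

  // Called when the user scrolls up past the oldest rendered message.
  loadOlder(): void {
    this.start = Math.max(0, this.start - this.batch);
  }
}
```

In a real extension you'd wire `loadOlder()` to a scroll listener or an `IntersectionObserver` on a sentinel element at the top of the list, but the core trick is just this: the chat data stays intact, only the rendered slice shrinks.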

Originally posted by u/Distinct-Resident759 on r/ArtificialInteligence