[UPDATE March 25] The official Chrome Web Store version is now live. A lot of people wanted to wait for the proper store version instead of installing the ZIP, so here it is: https://chromewebstore.google.com/detail/pclighhhemgemdkhnhejgmdnjnoggfif?utm_source=item-share-cb

If long chats have been making ChatGPT lag, freeze, or become unusable, this is exactly what I built it for. I'd genuinely love to hear whether it fixes that for you.

Hey everyone,

Like many of you, I use ChatGPT for long coding sessions and research threads. After 30–40 messages the whole tab starts crawling: typing lags, scrolling stutters, CPU spikes. Starting a new chat every time isn't a solution when you're mid-project.

Why it happens

ChatGPT renders every single message in the DOM simultaneously. A 200-message chat means your browser is juggling thousands of live elements at once. It has nothing to do with OpenAI's servers; it's entirely a browser rendering problem.

What I built

A Chrome extension that intercepts the conversation data before React renders it and trims it to only the messages you need. It also shows a live speed multiplier so you can see exactly how much faster it's running. My test on a 1554-message chat showed a 48.6x speedup, rendering only 32 messages instead of 1554. Your full history stays intact; just scroll up and click "Load older messages" to browse back anytime.

What it includes

- Live stats showing the speed multiplier, messages rendered vs. total, and a chat health score.
- Four speed modes depending on how aggressive you want the trimming to be.
- Everything runs 100% locally: no data ever leaves your browser, no tracking, no uploads.

Curious if anyone here has the same issue on longer threads and whether this fixes it for you.
Originally posted by u/Distinct-Resident759 on r/ArtificialInteligence
