Original Reddit post

Hi everyone, I’m looking for some technical advice. Over the past couple of years I’ve built up around 850 MB of conversations inside ChatGPT, including long-form writing and ongoing projects that are very important to me. I’ve recently decided to stop using ChatGPT because I’m not comfortable with the company’s decision to collaborate with the Pentagon. Regardless of where people stand politically, for me it’s an ethical line, and I prefer not to financially support tools connected to military infrastructure.

Now I’m trying to figure out:

- What’s the most reliable way to export all conversations in bulk?
- What format does the official export come in (JSON, HTML, etc.)?
- Has anyone successfully migrated large archives into another model (e.g., Claude, Gemini, Grok, open-source LLMs, local models)?
- Are there tools to clean, structure, or vectorize the data so it can be used as long-term memory in another system?
- Any best practices for handling a dataset this large?

If anyone has done something similar at this scale, I’d really appreciate practical guidance. Thanks 🙏
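To make the clean/structure question concrete, here is a minimal sketch of the kind of flattening step that usually comes before any vectorization. It assumes the official export unzips to a `conversations.json` where each conversation carries a `mapping` of message nodes with `author` and `content.parts` fields; that schema is an assumption worth verifying against an actual export, and the `sample` data below is an invented stand-in:

```python
import json  # a real run would json.load() conversations.json from the export zip

def extract_messages(conversation: dict) -> list[dict]:
    """Flatten one conversation into a list of {"role", "text"} records.

    Assumes the export layout described above; unverified against
    every export version, so treat this as a starting point.
    """
    out = []
    for node in conversation.get("mapping", {}).values():
        msg = node.get("message")
        if not msg:
            continue  # root/system nodes may have no message body
        parts = msg.get("content", {}).get("parts", [])
        text = "\n".join(p for p in parts if isinstance(p, str)).strip()
        if text:
            out.append({"role": msg["author"]["role"], "text": text})
    return out

# Invented stand-in for one conversation record from conversations.json
sample = {
    "title": "Demo",
    "mapping": {
        "a": {"message": {"author": {"role": "user"},
                          "content": {"parts": ["Hello there"]}}},
        "b": {"message": {"author": {"role": "assistant"},
                          "content": {"parts": ["Hi! How can I help?"]}}},
    },
}
print(extract_messages(sample))
```

Once conversations are flat records like this, chunking and embedding them for long-term memory in another system becomes a generic RAG problem rather than anything ChatGPT-specific.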

Originally posted by u/IndicationWorldly604 on r/ArtificialInteligence