I’ve been building an automated documentary production engine called Dead Ledger. It orchestrates Claude for scriptwriting, ElevenLabs for narration, Grok/Replicate for AI images and video, and Remotion for rendering, all stitched together with Node.js.

The system takes a topic, generates a full script with 60-70 scenes, picks from a library of 1,265 visual components (data visualizations, dossiers, evidence boards, court transcripts, etc.), generates all assets, and renders a full 13-minute documentary.

Just shipped Episode 2 on the Bre-X Gold Scandal, a 1990s mining fraud in which $6 billion in gold vanished from Borneo: https://www.youtube.com/watch?v=7Ob2M8JbhD0

Episode 1 (The South Sea Bubble) is also up if you want to compare the production-quality improvements between iterations.

Tech stack: Claude Opus for scripts, ElevenLabs TTS, Grok + Replicate Flux for images, Replicate Wan 2.1 for hero video clips, Remotion for composition/rendering, ffmpeg for audio normalization.

Happy to answer questions about the architecture.
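To make the architecture concrete, here is a minimal Node.js sketch of the sequential orchestration described above (topic → script → component selection → asset generation → render). Every function name and return shape here is a hypothetical stand-in I made up for illustration; the real steps would call the Claude, ElevenLabs, Replicate, and Remotion APIs.

```javascript
// Hypothetical step stubs standing in for the real external API calls.
const steps = {
  async writeScript(topic) {
    // Real version: prompt Claude for a 60-70 scene script.
    return { topic, scenes: Array.from({ length: 65 }, (_, i) => ({ id: i })) };
  },
  async pickComponents(script) {
    // Real version: choose one of ~1,265 visual components per scene.
    return script.scenes.map((s) => ({ ...s, component: "evidence-board" }));
  },
  async generateAssets(scenes) {
    // Real version: generate images/video and TTS narration per scene.
    return scenes.map((s) => ({ ...s, asset: `asset-${s.id}.png` }));
  },
  async render(scenes) {
    // Real version: compose and render with Remotion, then normalize audio.
    return { file: "episode.mp4", sceneCount: scenes.length };
  },
};

// Run the pipeline stages strictly in order, each consuming the last output.
async function produceEpisode(topic) {
  const script = await steps.writeScript(topic);
  const scenes = await steps.pickComponents(script);
  const assets = await steps.generateAssets(scenes);
  return steps.render(assets);
}

produceEpisode("Bre-X Gold Scandal").then((out) => console.log(out.sceneCount)); // prints 65
```

A linear async chain like this keeps each stage independently retryable; in practice the asset-generation step is the natural place to fan out per-scene work concurrently.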
Originally posted by u/No_Neighborhood_5817 on r/ArtificialInteligence
