Original Reddit post

Hey r/ArtificialInteligence,

We talk a lot about LLMs writing code, but the bigger shift happening right now is agentic AI stepping into complex data engineering workflows. It's no longer just about generating a quick SQL query; it's about autonomous agents managing the entire pipeline lifecycle.

A few areas where agents are starting to make a massive dent:

- **Automated ETL/ELT:** agents that don't just map data, but dynamically adapt to schema drift on the fly without breaking downstream pipelines.
- **Self-healing architecture:** identifying bottlenecks or failed jobs in orchestration tools and autonomously rerunning, debugging, or patching them.
- **Intelligent data governance:** continuous, context-aware auditing of data lakes to tag PII or flag anomalies in real time, going well beyond traditional rule-based systems.

It feels like we're rapidly moving from "AI as a copilot" to "AI as the pipeline operator."

Are any of you deploying agentic workflows in your data stacks right now? What frameworks are actually holding up in production for you?
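To make the schema-drift point concrete, here's a minimal sketch of a loader that conforms incoming records to an expected schema instead of failing on drift. Everything here (`EXPECTED_SCHEMA`, `conform`, the sample batch) is illustrative and not tied to any specific framework:

```python
# Hypothetical sketch: a loader that tolerates schema drift instead of breaking.
# Unknown columns are dropped, missing columns are filled with None, and values
# are cast to the expected type where possible.

EXPECTED_SCHEMA = {"user_id": int, "event": str, "ts": str}

def conform(record: dict, schema: dict) -> dict:
    """Coerce one record to the expected schema."""
    out = {}
    for col, typ in schema.items():
        val = record.get(col)  # None if the column drifted away
        try:
            out[col] = typ(val) if val is not None else None
        except (TypeError, ValueError):
            out[col] = None  # worth quarantining, but don't break the pipeline
    return out

batch = [
    {"user_id": "42", "event": "click", "ts": "2024-01-01", "new_col": "x"},  # extra column
    {"user_id": 7, "event": "view"},  # missing column
]
clean = [conform(r, EXPECTED_SCHEMA) for r in batch]
```

An agentic version would go further (e.g. proposing a schema migration when a new column keeps appearing), but this is the kind of tolerant boundary it would sit behind.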
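On the self-healing side, the simplest building block is autonomous rerun with backoff before escalating to a human. This is a hedged sketch, not any orchestrator's actual API; `flaky_job` stands in for a real pipeline task:

```python
# Hypothetical sketch: a "self-healing" retry wrapper an agent might apply
# to a failed pipeline task before paging a human.
import time

def run_with_retries(job, max_attempts=3, base_delay=0.01):
    for attempt in range(1, max_attempts + 1):
        try:
            return job()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted retries: escalate instead of looping forever
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

calls = {"n": 0}
def flaky_job():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky_job)  # succeeds on the third attempt
```

A real agent would add diagnosis (reading logs to decide whether a rerun can help at all), but retry-with-escalation is the floor that orchestrators like Airflow already expose as configuration.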
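And for context on the governance point, here's the rule-based baseline the post says agents go beyond: plain pattern matching for PII. The patterns and tags are illustrative only; a context-aware agent would catch PII that no regex can:

```python
# Hypothetical sketch: rule-based PII tagging, the traditional baseline.
# A context-aware agent would also flag PII that doesn't match any pattern.
import re

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # US SSN shape only
}

def tag_pii(text: str) -> list[str]:
    """Return sorted tags for every PII pattern found in the text."""
    return sorted(tag for tag, pat in PII_PATTERNS.items() if pat.search(text))

tags = tag_pii("Contact jane@example.com, SSN 123-45-6789")
```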

Originally posted by u/netcommah on r/ArtificialInteligence