TL;DR: Jobs are not going anywhere, but the titles and duties will definitely change. Learn aggressively to stay ahead of the transition. Critical thinking, accountability and individuality, the things that make us human, can be mimicked but not replaced.

First things first: this is not a fear-mongering post. I've kept it practical, data-oriented and helpful rather than rage-baiting readers into AI doomer horseshit. Let's dive in.

This is what I think is the most likely scenario for AI replacing software developer, legal, and accountant jobs (and others), the narrative every other "AI nerd" and LLM company keeps pushing for views and more funding. Note: I'm a dev, so this might lean toward a dev perspective, but I'll try to keep it inclusive and generalized.

To help you understand, take accounting back in 1979. Before spreadsheets came in, most accountants worked with paper ledgers and hand calculators, and accounting took a really long time. Large companies needed huge accounting departments just to keep the books updated, similar to how large tech companies today need huge tech teams to ship software on time.

The Spreadsheet Shock

First came VisiCalc, then Lotus 1-2-3, and then Excel. They did to accounting exactly what AI is doing to coding (and some other fields) right now: automatic recalculation, instant scenario testing, and financial modelling in minutes instead of days. Work that took hours or days now took minutes, just as building software that took days to months now gets done in minutes or hours. And just like with AI today, there was fear among the accountants who did these tasks. The truth is, a chunk of clerical jobs did disappear, and so will a big chunk of developer (and legal, accounting, and other) jobs whose manual parts AI can now do in minutes.
This is the point where AI doomers and people optimistic about the future diverge. The former think AI is gonna take everything and nothing will be left for humans to do; the latter think AI is not good enough, humans in the loop will always be needed, and the impact won't be as significant as the doomers and LLM companies are preaching. Truth is, they are both right and wrong at the same time. AI is coming for everything. All white-collar jobs (and blue-collar too, it's just a matter of time) will be affected in a direct or indirect way. It would be better if we accept this, because acceptance is the first step to navigating the uncertainty ahead rather than being taken by surprise.

What's ahead? The transition period, the part where everything becomes uncertain: some companies are replacing teams because of AI, while others are backtracking after finding out AI was not as good as they thought. The former kind of news gives you fear and the latter gives you hope. But what you need is acceptance and preparedness to navigate the transition. So, going back to our analogy from history.

The Transition Period

During this phase several things happened simultaneously. Roles like data entry clerk, ledger maintainer, and junior bookkeeping staff disappeared. Productivity skyrocketed: one accountant could now do the work of ten. But the demand for financial analysis exploded (this is the key part for the current AI narrative). Companies started doing far more analysis, creating new higher-value roles, and the role of accountants shifted from bookkeeping to analysis. In the same way, the role of developers will shift from coding to orchestration and system design.

It's the most recurring pattern: when something incredibly useful becomes affordable, demand skyrockets. When computers became inexpensive, personal computing surged, leading to the birth of computer manufacturing companies (which created more jobs than they took).
Similarly, when smartphones became affordable, demand surged again, resulting in smartphone manufacturing companies (which also created more jobs than they took). When data became inexpensive, demand for media, connectivity, productivity, and software increased, leading to the birth of software companies (which again created more jobs than they took).

It's highly likely this trend will continue with AI. As AI becomes affordable (which it already is), personal assistants, single-person companies, faster growth, faster iteration, and AI manufacturing companies will emerge (not just LLM companies but also applied AI companies). When coding can be done at the speed of thought, it opens up a paradigm shift toward hyper-personalization of software. People will want to personalize their LLMs and customize them to their needs, just like they do with smartphones. It has almost always been the case that a major tech disruption creates more jobs than it takes away.

Now, the honest caveat worth addressing: some argue AI is categorically different because it automates cognitive work broadly across domains, not just a narrow slice. But computers did that too, across every domain simultaneously. And the demand-creation argument holds the same way: AI doesn't eliminate the need for humans to extract value, it scales them horizontally. One person can now do what a team did, which means a thousand new people will start companies that previously required a team to even begin. New roles are already emerging alongside the disappearing ones: AI engineers, RAG builders, content orchestration developers. The same pattern as before, just a faster cycle at a higher layer of abstraction.

And so comes the final question: what do you need to do? The hard pill to swallow (but you must) is that your job is not safe as is. No matter where you are, you need to upskill or you'll fall behind.
Another thing to understand from our analogy is that the new jobs created did not necessarily go to the people whose jobs were replaced; they went to the people who developed the new skills required.

If you're a developer (junior or fresher), instead of focusing solely on frontend development (or the other low-hanging fruit that freshers aim for), consider learning these skills:

System design: how to structure an application so it doesn't collapse under its own complexity. This doesn't go away with AI; it becomes more important, because AI-generated code ships faster and breaks in less obvious ways. Someone still needs to understand what was built.

Prompt engineering: understanding how models reason, where they hallucinate, and how to structure context so the output is reliable enough to actually build on.

RAG pipelines and fine-tuning: this is where most real-world AI products actually live. Knowing how to ground a model in your own data, how to structure retrieval, how to evaluate whether it's actually working, and how to customise an LLM for personalized use cases.

Tech agnosticism: gain a working understanding of various programming languages and learn to pick the appropriate stack for each use case. (The era of being proficient in a single programming language is fading.)

Your ultimate goal should be to become a self-sufficient engineering team, because that's what most companies in the future will prioritize.

If you're in finance, these are the skills you should develop:

Financial strategy, capital allocation, risk analysis, and business planning: this is the skill of knowing where to put the money and being able to defend why. AI can model scenarios, but it can't sit across from a board and own the recommendation. That accountability is yours.

Data analytics, financial modeling, forecasting, and scenario simulation: the shift here is from running the numbers to knowing which numbers to run. Anyone can generate a model now.
The value is in knowing what assumptions to stress-test and what the model is hiding.

Regulatory expertise, tax strategy, international compliance, and corporate structuring: this is the area where the cost of being wrong is high enough that companies will always want a human who owns it. AI can surface the rules; it can't take the liability.

Advisory services: helping companies answer questions like whether to expand, acquire another company, or optimize taxes. This is judgment work, reading a business's actual situation versus its numbers on paper. The gap between the two is where advisors earn their keep.

Technology and finance: learn tools like automation, data pipelines, and analytics platforms. Not because you need to become an engineer, but because the finance professionals who can't interrogate their own data tools will become dependent on the ones who can. That's a power dynamic worth avoiding.

If your current job requires no critical thinking or accountability, it is at risk of being replaced by AI. Lawyers and legal professionals are already seeing certain skills lose value, because AI is rapidly improving at standard contract drafting, document review, legal research, and template work. The legal work that survives is the work where being wrong has consequences and someone needs to own that. To stay relevant, lawyers should focus on developing the following skills:

Negotiation: high-stakes negotiations require human judgment and expertise. Negotiation is not just about knowing the law; it's about reading the room, knowing when to push and when to give, and making the other side feel like they won something. That's not a document problem, it's a human one.

Litigation strategy: understanding court strategy, argument framing, and persuasion is crucial for successful litigation.
AI can research precedents faster than any associate, but knowing which argument lands with which judge, how to sequence a narrative for a jury, and when to attack credibility versus when to ignore it, that's pattern recognition built from being in the room, not from training data.

Complex regulatory law: lawyers should specialize in fields like antitrust, international law, technology law, and intellectual property to navigate complex regulatory landscapes. These areas move fast, contradict themselves across jurisdictions, and require someone who can make a call under ambiguity rather than just surface what the rules say.

Business advisory: providing companies with guidance on structuring deals and mitigating risks is a valuable skill. The best lawyers in this space aren't just legal experts; they're trusted by operators to tell them what the contract means for the business, not just whether it's legally sound. That trust is built over time and can't be automated.

I also feel it is not in the interest of AI and LLM companies to replace the majority of jobs without creating equal or more opportunities, because of the capitalist society we live in. The doomer prophecies they make (especially Anthropic's and Nvidia's CEOs) are just a way to secure more funding and higher valuations by promising investors unprecedented profits from these technologies.
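Appendix for the devs: the RAG skill mentioned in the developer section (ground a model in your own data, retrieve the relevant pieces, feed them into the prompt) can be sketched roughly like this. This is a toy illustration only: the bag-of-words "embedding" stands in for a real embedding model, and the document texts and function names are made up for the example.

```python
from collections import Counter
import math

def embed(text):
    # Toy "embedding": bag-of-words counts. A real pipeline would call
    # an embedding model; this just illustrates the retrieval step.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Ground the model: retrieved context goes into the prompt so the
    # answer can be checked against your own data.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical internal documents for the sketch.
docs = [
    "Invoices are processed every Friday by the finance team.",
    "The deployment pipeline runs tests before every release.",
    "Vacation requests must be filed two weeks in advance.",
]
print(build_prompt("When are invoices processed?", docs))
```

The third skill in that bullet, evaluation, is the part this sketch omits: in practice you'd also check whether the retrieved context actually contained the answer, which is where most real RAG work goes.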
Originally posted by u/Aye-caramba24 on r/ArtificialInteligence
