AI won’t make coding obsolete. Coding isn’t the hard part
Posted by Ihodael@reddit | ExperiencedDevs | 253 comments
Long-time lurker here. Closing in on 32 years in the field.
Posting this after seeing the steady stream of AI threads claiming programming will soon be obsolete or effortless. I think those discussions miss the point.
Fred Brooks wrote in the 1980s that no single breakthrough will make software development 10x easier (“No Silver Bullet”). Most of the difficulty lies in the problem itself, not in the tools. The hard part is the essential complexity of the requirements, not the accidental complexity of languages, frameworks, or build chains.
Coding is the boring/easy part. Typing is just transcribing decisions into a machine. The real work is upstream: understanding what’s needed, resolving ambiguity, negotiating tradeoffs, and designing coherent systems. By the time you’re writing code, most of the engineering is (or should be) already done.
That’s the key point often missed when people talk about vibe coding, no-code, low-code, etc.
Once requirements are fully expressed, their information content is fixed. You can change surface syntax, but you can’t compress semantics without losing meaning. Any further “compression” means either dropping obligations or pushing missing detail back to a human.
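The lossless-compression analogy can be made concrete. A toy sketch (my illustration, not OP's; the spec string is made up): compression can shrink redundancy, but anything below the lossless floor is truncation, and truncation drops obligations.

```python
import zlib

# Treat a written spec as data. This one is deliberately repetitive.
spec = ("Rule: orders over $10,000 require manager approval. " * 20).encode()

# Lossless compression squeezes out redundancy (the accidental part)...
packed = zlib.compress(spec)
print(len(spec), "->", len(packed))

# ...but must round-trip exactly: the semantic content is untouched.
assert zlib.decompress(packed) == spec

# Anything smaller than the lossless floor is truncation: drop bytes,
# and the obligations they encoded disappear with them.
lossy = spec[: len(spec) // 2]
assert lossy != spec
```

The ratio you gain here comes entirely from repetition; a spec with no redundancy left would barely compress at all, which is the point.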
So when people say “AI will let you just describe what you want and it will build it,” they’re ignoring where the real cost sits. Writing code isn’t the cost. Specifying unambiguous behavior is. And AI can guess it as much or as little as we can.
If vibe coding or other shorthand feels helpful, that’s because we’re still fighting accidental complexity: boilerplate, ceremony, incidental constraints. Those should be optimized away.
But removing accidental complexity doesn’t touch the essential kind. If the system must satisfy 200 business rules across 15 edge cases and 6 jurisdictions, you still have to specify them, verify them, and live with the interactions. No syntax trick erases that.
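A back-of-envelope count using those hypothetical numbers (illustrative only, not anyone's real system) shows why "no syntax trick erases that":

```python
from itertools import combinations

rules, edge_cases, jurisdictions = 200, 15, 6

# Each rule may need a decision per edge case per jurisdiction:
decisions = rules * edge_cases * jurisdictions
print(decisions)  # 18,000 spec-level decisions before any code exists

# And the pairwise rule interactions alone:
pairs = sum(1 for _ in combinations(range(rules), 2))
print(pairs)  # 19,900 potential rule conflicts to reason about
```

Whatever notation you pick, those ~38,000 decisions have to be made by someone; the notation only changes how cheaply they are written down.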
Strip away the accidental complexity and the boundaries between coding, low-code, no-code, and vibe coding collapse. They’re all the same activity at different abstraction levels: conveying required behavior to an execution engine. Different skins, same job.
And for what it’s worth: anyone who can fully express the requirements and a sound solution is, as far as I’m concerned, a software engineer, whether they do it in C++ or plain English.
TL;DR: The bottleneck is semantic load, not keystrokes. Brooks called it “essential complexity.” Information theory calls it irreducible content. Everything else is tooling noise.
TheyreEatingTheDawgs@reddit
Agree, but I think organisations will value ‘architects’ over devs, which will especially hurt junior devs who are more focussed on execution vs design. When code can easily be written by AI and agents, good design, req definition and prompt engineering will become more valued than teams of coders writing design specs and code. For many in this sub it’ll be a good thing, but for many SDEs who aren’t as creative, or as able to troubleshoot or design from scratch, I worry there will be fewer roles for them in the future.
Independent_Mind9759@reddit
With less time spent worrying about coding syntax I believe 'juniors' will ramp up on architecture and system design faster because they will spend more of their time in that headspace. When people worry about the next generation I think it's hilarious, they should be worried about them kicking their ass
lawrencek1992@reddit
Honestly I may be in the minority, but I enjoy architecting solutions maybe more than writing code. By the time I’m writing code, the problem is 90-100% solved. I do enjoy that actual development means I get to see the thing working, but exploring the problem product wants to solve and building the plan for the system which will make that happen is super engaging to me cause it feels like solving a puzzle. Once I know the solution to the puzzle, it’s not particularly hard to write code that implements the solution.
It does feel like that part, translating the solution into code, will ultimately be abstracted away by agents. Currently agents probably do 70% of it for me. I take over if they go off the rails or when it feels like explaining in English will take the same amount of time as me just writing the code (usually Python, which is already almost plain English).
IAmADev_NoReallyIAm@reddit
If you're in the minority, then so am I. Solving the puzzle for me is the fun part. Coding is the mundane. Don't get me wrong, I don't mind the coding part, but it's the problem solving I enjoy the most. Sometimes that comes in the architecting, sometimes that comes in the coding, figuring out why some small piece of the algo isn't working.
lawrencek1992@reddit
Yes exactly this. No dislike of coding, and sometimes it’s the puzzle. When I was earlier in my career actually writing code was more of a puzzle (cause I wasn’t as good at it then haha), but now a lot of the problem solving lies in planning out the system we can build to solve X.
I do wonder what will happen to juniors. It feels like a lot of the work I can easily assign to them is also work I can easily assign to AI, which is cheaper and doesn’t require mentoring. We haven’t really been hiring juniors as a result, but longterm that doesn’t seem sustainable, because you need to grow the juniors in order to have mids and seniors.
IAmADev_NoReallyIAm@reddit
Yeah, I worry about the juniors as well. Until I figure it out though, I'll keep going the way I've been going, which is to help train them to replace me. That's how I see it. It's my responsibility to see that they replace me at some point. Preferably not any time soon... but you get my point. That's how I got where I did. I replaced my boss/lead. If it wasn't for his leadership and mentoring, I wouldn't be in this position. In fact, I'd probably be unemployed and looking for a job. So I'm doing what I can to make sure the next generation has the same experience, or at least as close as I possibly can get to it.
Less-Fondant-3054@reddit
Just come work with me, then you get to do both at once! Architect the solution while doing active dev because the timeline you were given left no room for stuff like "planning" and "proper procedure" and any of that stuff.
lawrencek1992@reddit
Oof. That's rough. We’ve finallyyyyyyyy pushed back on product enough that they mostly understand that 1) estimates are guesses and 2) when they push back on our estimates they are essentially asking to receive a less accurate guess, but that actual execution time won’t change.
It’s been a bit of a bumpy road getting here, but they just brought me in for planning on the next project they’ll have me on. The outcome, aside from me poking holes in places where requirements weren’t well-defined, was that they agreed I need roughly a month to design and build out a couple of backend services I explained are prerequisites before the main project development can begin.
Being given a “timeline” by people who won’t be in the weeds/have a limited technical understanding sounds like a massive PITA.
Less-Fondant-3054@reddit
Yeah, this project is kind of a total clusterfuck. I fully expect project failure.
EverBurningPheonix@reddit
Hello, I'm a junior, only 6 months in at this point.
I have been learning system design in my free time, going through DDIA and MIT's course for starters. Any other pointers you have on how to become an effective architect?
"Know your domain" is advice that also gets thrown around, but how exactly? Like, any steps or pointers you have?
lawrencek1992@reddit
Owning projects is really helpful. In the beginning you might not be ready for this but expressing you want it can motivate people to throw you less complex projects. When I say owning I mean getting brought in at the planning phase, writing a spec for everything you’ll need to build (and getting that reviewed by other engineers) and then executing that plan. Maybe pairing with someone on the frontend or backend (whichever you lean the opposite of).
All of our engineering specs go into the repo in markdown. Regardless of where yall keep that kind of documentation, ask if you can be included as a reviewer on OTHER people’s work. You might not have amazing feedback to give, but think about which parts of their work aren’t clear to you. Like it makes total sense they plan to do A. But even though B also makes sense, you’d never have thought of it yourself. Ask them how they came up with the idea, what their thought process was. Basically you want to observe this work from people who are better at it than you and try to learn how they think about the work.
Ihodael@reddit (OP)
I tend to agree.
I believe the industry has long been filled with people who probably shouldn’t be in it, especially in consulting and body-shop models that prioritize headcount over capability. Those roles will be hit hardest.
Junior developers will also feel the impact as the market contracts and normalizes. There will still be opportunities, just fewer than before.
I believe it will self-regulate over time, creating the necessary openings.
TheyreEatingTheDawgs@reddit
Meh, I disagree with this. There are many good devs that just dev, and to date that’s been fine. Missionaries vs mercenaries sort of thing.
In good companies, there IS code that needs to be written, and a lot of it. Many times it made sense to have good missionaries who were able to just write good code, clock in and clock out, be good team mates and deliver on a larger goal that wasn’t defined by them. These missionaries will be impacted by AI IMO. They’re good people, good coders, and would be a shame if they’re unable to earn a living because their role is basically automated.
I’m less worried about body shops, outsourced devs, etc than those good team mates who may not have the aptitude to think big or seed designs, but are good execution engineers who are replaced at next to no cost by their employers.
wobblydramallama@reddit
just because you don't like the idea, doesn't mean it's bad. the goal is to have less code overall and spend fewer human-hours writing it. Yes it's a shame some devs will become obsolete and automated but that was already the case with many other jobs we don't even think about anymore.
lawrencek1992@reddit
I think you’re spot on, but it also seems problematic. Juniors feel less valuable now—we don’t even really hire them at this point. I can direct agents to do their work while also doing my own. But long term we need the juniors. Maybe not all of them, but how else do you get more experienced engineers?
It’s not so much that I worry for my company about this. But more broadly I wonder how it will impact the industry in 5-15yr.
Ihodael@reddit (OP)
I don't think we are in disagreement.
Those good devs add something to the equation, even if it's not immediately apparent: I'm sure being good translators of requirements (which are almost always far from crystal clear and complete, so there is work to be done here as well) into whatever coding language is being used is not their only skill.
To me this is part of the essential complexity.
ImpressiveProduce977@reddit
the hard part is specifying reality, not typing code
thodgson@reddit
Agreed, but not sure about the label. Companies always want "thinkers and doers" over just "thinkers" or just "doers".
What I mean is they want people who are dynamic enough to think through a problem from multiple angles and solve it not only for the current task but also to inoculate it against future problems. They want us to not only code but be business analysts. They want us to demo the entire system like a salesperson. They want us to pitch the product like the head of marketing.
In short, they want the rock star. Short of that, they will settle for an architect :)
Willing-Employ3915@reddit
“Absolutely! AI can assist with coding, but problem-solving, design, and critical thinking remain essential. Great reminder that coding is just a tool, not the hardest part of development.”
altmind@reddit
the AI is not gonna replace you. the problem is that the PM thinks AI can replace you. it's a big loss for both parties.
and there's no place to arbitrage this. both sides are getting f'd.
sujal__486@reddit
🫡
U4-EA@reddit
One thing I think is also overlooked here is the effect the generation of AI slop will have on the jobs market for skilled devs. You have to do it the hard way: if you try to get AI to do it for you, you produce buggy/unmanageable code and do not learn anything. That is simply good news for skilled devs. Rather than putting us out of a job, it's probably the best thing for us, especially in the long term.
MurkyAd7531@reddit
I would argue that these tools will likely accelerate the generation of bad code enough to overwhelm the skills or motivation of experienced devs.
So, if you like fixing up junk code written by juniors, it's good news. Otherwise, I'm guessing this is going to make the job much worse.
U4-EA@reddit
The projects will have to be done from scratch. That is where the money will be. And devs who are skilled AND know how to utilise AI in their workflow will be able to complete those projects more rapidly than before.
MurkyAd7531@reddit
Unfortunately, this is a lesson that will have to be learned viscerally. No one is going to listen to the lifelong professionals who all know this, because everyone has fantasies about making the Hot New App or firing all their employees or building AI tooling or models they're going to make their $billions on.
It will take loads and loads of failures for people to recognize that, no, the magician did not make the Statue of Liberty disappear, he just performed a clever trick you were too ignorant to recognize.
Many people aren't smart enough to recognize how stupid LLMs are. To them, something that sounds like a person must be a thinking, feeling, rationalizing entity with motives, opinions, and dreams. Not just a random text generator.
artisan23007@reddit
Interesting, but hey, anyways, AI won't reach AGI anytime soon, and we devs still have time.
Mobile_Friendship499@reddit
Exactly, at the simplest level, chatgpt/claude/cursor is another tool to refer a forgotten function, syntax, or learn something new. The value is understanding whether to accept/reject the suggestion it gives you. That's why experience gets paid.
cat_cache__@reddit
This is exactly it. The hardest part of engineering has always been turning messy human intent into precise behavior. AI can help eliminate ceremony, but it can’t eliminate ambiguity.
Every time I’ve seen a project go sideways, it wasn’t because someone wrote a bad for-loop. It was because the requirements were fuzzy, contradictory, political, or misunderstood. Tools can smooth out accidental complexity, but they can’t reduce the complexity that comes from people, incentives, edge cases, compliance, or real-world constraints.
If anything, AI just shifts more weight onto clear thinking, good design, and tight feedback loops. The bottleneck isn’t typing. It’s knowing what to type.
Livid_Energy_1815@reddit
If coding is the easy part, then are developers best positioned to refine the requirements? That is the question. Anyone can define requirements, not only engineers. Developers are defined by a particular skillset.
failsafe-author@reddit
Yep. And I always laugh at the notion of “we just need to get better at writing tickets”, as if we haven’t been trying exactly that for the past several decades.
Coding is the easy part.
hellocppdotdev@reddit
I keep changing jobs hoping the people writing tickets (or communicating what needs to be done) would get better at it. Turns out we as a species suck at writing requirements.
GentleWhiteGiant@reddit
Yes, that's probably true for all more highly developed species. Our brain organizes perception and memory as stories. If the story is close to the truth, it doesn't hurt. But it also works well with stories which seem to be true, even if they are totally inconsistent in parts.
So the first step of requirements engineering is finding where (not if) the business processes the customer is describing diverge from what they are really doing. And that holds even for the most professional customers.
vitek6@reddit
You need to realize that helping with that is part of your job.
brainphat@reddit
Correct.
I think of it something like: they're the customer, you're the mechanic/plumber. You wouldn't expect a customer to know & delineate in mechanic-ese everything you need to know to do the work.
Don't let ticket writers off the hook - they filed the thing, they need to do their part. But ask specifically for what you need & maybe why, something actionable. As in almost every domain: communication is key.
cinnamonjune@reddit
Maybe if I was contractor speaking to the customer directly, sure. But if I'm a developer in a part of a large organization, is it not the job of the product manager to be able to write these requirements clearly?
Too often I'm given tickets that have maybe one or two barely intelligible sentences. I'm talking not even grammatically correct English. And then I have to follow up and ask, "what is the problem?", "what process is affected?", "are there reproduction steps?"
And then to add insult to injury, all this AI hype comes in, and now I'm being told that the coding is the easy part; that it's "grunt work", actually; that the real work of my job is gathering requirements; that I should be thinking about how to write better "tickets" for the AI and better "documentation" for the AI; but this is what I've been asking product to do this whole time!
brainphat@reddit
Oh, I feel ya. I've just been at this long enough to know you can't fight city hall. Tickets will almost always be lacking context, info, etc. Part of the job is dealing with that in a professional way, being assertive when you need info/clarification, and being a passive aggressive nerd who CC's my & their boss when necessary.
Don't let anyone tell you coding is easy. Someone saying that doesn't know what they're talking about. Anything can seem "easy" when you're not the one doing the work.
RE AI: sorry to hear that. Any org wedging AI where it's not needed, not wanted - and likely loathed with the intensity of a thousand suns - is an organization run by a psychopath.
hellocppdotdev@reddit
See the thing is I know that, and I'm not too bad at requirements engineering. I liked it even more after reading a book about it and learning it was a thing. But what do you do when they don't give you access to the client? And refuse to even after asking?
Weavile_@reddit
IME - The difference is if the company invests in really good BAs. When I’ve had BAs on a team, the difference in how tickets are translated from the product owner into requirements is night and day.
chaitanyathengdi@reddit
Because we're always in a hurry to "optimize" stuff, and fail to realize that sometimes you just have to slow down and give your task the time it needs.
Crazyboreddeveloper@reddit
My tickets are usually like “a user says they are getting an error.”
trcrtps@reddit
and it's from customer support who should know better
PM_40@reddit
There used to be (and in many regulated companies still is) a full-time role dedicated to writing down requirements and handing them to a team of software developers. The role is called business analyst. Government, banks, insurance and other regulated companies employ many business analysts even today. It's a very tedious job documenting all the requirements.
hellocppdotdev@reddit
Replies here would suggest the programmer needs to do this as well 😂
Don't get me wrong if I had unlimited time I'd be more than happy but usually boils down to we need this feature asap, here's 3 lines of user story, figure it out yourself and we need it yesterday.
ForeverYonge@reddit
The way to do it is to talk to the users and write tickets yourself.
hellocppdotdev@reddit
But then who writes the code?
ForeverYonge@reddit
Both things are parts of the job. Could be you, could be your teammates, could be agents.
hellocppdotdev@reddit
Sounds like the product managers should be writing the code as well then. Do you not have enough to do already?
No-Consequence-1779@reddit
They probably follow you to the next company.
NorCalAthlete@reddit
The sheer willful ignorance / hardline dislike of getting involved in writing tickets is astounding. I have yet to meet more than maybe 10% of the engineers I’ve worked with who actually either wrote good tickets or had a positive attitude towards it.
hellocppdotdev@reddit
Nah I keep getting pigeonholed as a code monkey, ok "product managers" can write tickets. I agree contributing to writing good tickets is essential. Management seems to think that's not a good use of money.
G_Morgan@reddit
Reality is you need an engineer to write good requirements.
Kevdog824_@reddit
I think that having a certain level of domain knowledge makes people take for granted that others don’t know what they would consider to be obvious
Less-Fondant-3054@reddit
This is the line that divides good documentation writers from the rest. Good writers understand that they need to feel like they're writing for an idiot with how granular and basic they're writing because that's the only way to ensure that the documentation doesn't rely on tribal knowledge.
EmeraldCrusher@reddit
God, this is exactly who I write for. I imagine a drunk man has to get up at 2 am, 3 months from now, and I can't answer his questions: every single detail should be in that fucker. I don't care if it's superfluous or too much detail, you need to know everything.
Kevdog824_@reddit
Agreed. I try to remember me on my first day and try to write documentation so that guy could understand
Sparaucchio@reddit
Dig further and it becomes apparent they themselves don't know
Proper-Ape@reddit
I always say I have to close the ticket if it doesn't have an accurate description of what the issue is.
crazyeddie123@reddit
Turns out picturing things that don't exist, in enough detail to find all the gotchas, is hard, and predicting the future is even harder
Enough-Display1255@reddit
I will say that, coding is quite a bit easier than business, but sometimes coding *is* the hard part. Personally, tests. My productivity has jumped a lot now that I don't have to procrastinate and somehow will myself into writing 100 unit tests (bank)
jadzia_d4x@reddit
Big agree. Every time I've mentored a junior, I really try to stress how important communication skills are for progressing as a developer.
If you want job security, absorb everything you can about the domain. Be the dev that is able to inform product & design people about how things work in words they understand and then translate those asks into tickets with good technical writing that devs can implement and QA can test against. It is much more exhausting and challenging than writing code, but that's how you make yourself valuable. Been that way since before AI, but AI makes it much more obvious.
GSalmao@reddit
Just joined a company a few months ago; there are people who've been working on this project for almost a year and I feel like I know the codebase better than some of them... Makes me disgusted, nobody is using their brains anymore.
trcrtps@reddit
Obviously I'm biased because it's how I got my start, but if companies were smart they would have a tech support engineer pipeline to dev. everything you just described was obtainable in the TS queue.
Less-Fondant-3054@reddit
I will 100% credit my ability to communicate with my career success. It's not that I'm bad at coding or engineering but the fact I can actually explain what's going on to management means I'm in a very small class of very valuable people.
cs_legend_93@reddit
Now AI makes AI slop tickets. Sometimes it's helpful, but usually it's like 5 paragraphs or even 3 paragraphs when it doesn't need to be that much. It's just words. It says a lot of words.
sarhoshamiral@reddit
Omg I hate this trend. Everyone now uses AI to write their bugs, "feedback" or review comments and it is so much useless fluff. I am not going to read a 5-paragraph fluff piece just for one sentence of useful information.
dbgr@reddit
Just use another AI to summarize it /s
eleazar0425@reddit
I'm tired of this dystopian shit lmao; hopefully this AI craziness stabilizes soon.
PM_40@reddit
🤣.
theDarkAngle@reddit
This is why we need custom models tailored to specific environments. And specifically not tuned for "engagement".
A lot of the annoying features of current models come from reinforcement learning, e.g., models are given better scores for "completeness" which essentially means saying a lot more words.
nullpotato@reddit
I'm sure the AI companies billing per token is completely unrelated to how verbose the models are /s
OdeeSS@reddit
I can't stand this. Our product folk think they're unambiguously doing a great job now that the tickets use a lot of words to say very little. They're confusing volume of output with quality. It's a tale as old as time. It also makes it harder to explain that a ticket doesn't tell me any useful information, because now I have to read through more fluff.
PandaMagnus@reddit
It would be interesting to see something like Gherkin format required. I've experimented with that a bit, and AI does relatively well when given a defined format like that to follow. It tends to be more concise and clearer than the normal stuff it kicks out if you don't put guardrails on it.
Granted, that wouldn't work for everything, but it might at least put the bug in the product folks' ears that brevity and clarity should be valued over words for the sake of words.
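For readers who haven't seen the format, a minimal Gherkin scenario (the feature and numbers here are hypothetical, my example) reads like this:

```gherkin
Feature: Order approval
  Scenario: Large order requires manager sign-off
    Given a customer with a standing credit limit of $5,000
    When they submit an order totalling $12,000
    Then the order is held in "pending approval"
    And the assigned manager is notified by email
```

The Given/When/Then skeleton is what forces the brevity: every line has to be a concrete precondition, action, or observable outcome, which leaves little room for filler prose.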
OdeeSS@reddit
Oh, they use gherkin
"Then app performs BAU"
PandaMagnus@reddit
Oh... Oh my. I'm sorry and wish you the best of luck. ☹️
hardolaf@reddit
Most tickets that my team writes can be summed up as a single sentence long title plus 1-2 sentences of description with a link back to the approved design wiki page.
Rare_Huckleberry_734@reddit
IMO tickets should be for the testers, the project manager, and the people who have to support the feature, not for the engineer - ideally, by the time they start coding, the engineer already knows it well enough that they don't need to look at the ticket
Krackor@reddit
What kind of stupid, pointless machismo is it to refuse to look at your specifications?
Rare_Huckleberry_734@reddit
Sorry, poorly worded - I just meant that if the engineer is involved early enough and deeply enough in the spec, they'd already know what's in the ticket (and therefore not need to look at it)
Ok-Yogurt2360@reddit
This works with a really small team (3-4 people maybe) and lots of stability. Once you start to add or switch people you become quite happy with those tickets.
Krackor@reddit
Writing and reading is easy. Remembering is hard. Don't optimize your workflow to rely on memory.
Rare_Huckleberry_734@reddit
Eh, not sure about that - plenty of bad writers and readers around, and good writing is harder than ppl think
I get your point (and agree) about memory - however, relying on the ticket as gospel often locks in a suboptimal solution
Anyway, thanks for helping me think it through
PrintfReddit@reddit
Lmao
Bushwazi@reddit
I think one of the main differences between junior and senior is accepting that YOU have to finish collecting and writing the requirements, because yaint getting them from someone else.
Division2226@reddit
What's the point of a product person then?
Bushwazi@reddit
To give you shitty requirements and be the person that talks to users/clients
Hargbarglin@reddit
That is relatively close to one of the definitions of senior that I'm used to seeing. Something like, "Can be trusted to complete a task at the team level without needing additional supervision."
I say "supervision" not "help" or "support". It's perfectly normal for a senior to need additional information, opinions, support, etc. but they'll know when they need to ask rather than the other way around.
daredevil82@reddit
the problem is that the appearance of being a productivity accelerator creates an expectation to appear productive, which weeds out those who do give a crap about whether the things they push cause millions in downtime losses or not.
the current bubble is optimizing for people that don't push back
Mortomes@reddit
We just need to get better at accurately describing a problem, down to a minute level of detail, to leave no room for ambiguity, and consider all possible edge cases. Oh, that's programming.
failsafe-author@reddit
https://qph.fs.quoracdn.net/main-qimg-1a5141e7ff8ce359a95de51b26c8cea4
nedal8@reddit
A timeless classic
hardolaf@reddit
I joke at work that AI speeds up the 5% of my job that could be given to a new grad.
tr14l@reddit
Context management and engineering is a whole lot bigger than acceptance criteria. If your company doesn't know that, AI is just accelerating pain, not reducing friction
Wooden_Thing_872@reddit
excellent post!
Alone_Ad6784@reddit
I'm genuinely bad at coding, I just don't get the hang of it. But to be very honest, I don't think my company can get my job done using AI. A more diligent engineer might do it in 1/3 or 1/4 the time, but AI won't: it just doesn't understand how much context to keep and how much to ignore, and it doesn't have a dynamic bias towards patterns and practices like humans do. So I'm more afraid of a guy grinding at night than some stupid matrix multiplier on steroids (yes, I know it's more sophisticated and I'm dumb in comparison, it's just a joke, kindly don't give me a lecture).
HappyFlames@reddit
Yea, I've been professionally vibe coding for about a year now. I now spend nearly 0 time fighting frameworks and syntax and spend much more time thinking about architecture and better ways to do things.
Beginning-Seaweed-67@reddit
That’s an awfully fancy way of saying AI is incapable of resolving problems that involve a database more complex than a school assignment. It’s like Knuth's big-O doesn’t account for how unscalable it is when someone has a do-everything bot or AI; it’s even slower than factorials.
Enough-Display1255@reddit
Yeah, not exactly a secret that the difference between a software engineer and a code monkey is making the specs, process, etc instead of putting them in the computer.
CallinCthulhu@reddit
Well put
OutdoorsNSmores@reddit
Yes, I'm only successful with AI because I know what I want it to produce and call BS when it doesn't do things my way.
(Disclaimer: occasionally I'll be impressed and learn something new because I can't know everything.)
Strict_Research3518@reddit
I think AI is already making the CS degree path AND junior devs obsolete, more or less. My assumption is that the juniors/mids now are the last of their kind, because by the time those juniors are staff/architects in 10 to 20 years, AI will be far enough along that even senior devs likely won't be needed any more, only those that instruct AI to code everything. By then we should ideally have far FAR better AI, beyond LLM tech, with full memory, massive context, instant recall and so on. IF it doesn't implode first. :) That also depends on our Govt (US anyway) not imploding with all the chaos going on right now, but that won't stop other countries from continuing, China in particular.
Welcome2B_Here@reddit
Wonderfully put. This framework and approach should apply to virtually any project, whether it's software related or not. It's akin to economic theory with perfect actors/agents versus behavioral economics with inherent human irrationality and emotion involved.
Maneruko@reddit
I've been thinking about this more as I've started learning a new trade myself. The results are often the culmination of a bunch of different pieces of knowledge, intuition, and experience, and more often than not the information a professional relies on can't be found on the internet.
It's one of the few things giving me hope for the current future. That all this AI stuff is self destructive and incestuous.
ExtraSpontaneousG@reddit
I don't disagree with anything here necessarily, but I think it also leaves out another important point. Even with requirements perfectly expressed, there is more than one way to skin a cat. You can have hundreds of implementations of the same thing that work, look, and feel very much the same. Some of those implementations will cost the company much more in maintenance than others.
cocaine_zebra@reddit
I get your point. But there's also real craft involved in coding itself. There's a huge difference between making API calls in a loop, composing them into one concurrent request, or refactoring your backend so you get all your data in a single request. Same requirements, mostly the same outcome, but a big difference in performance and scalability.
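To make that concrete, here's a minimal sketch of the loop-vs-concurrent difference, with a hypothetical `fetch` standing in for a real API call (the 0.1 s sleep simulates network latency):

```python
import asyncio
import time

async def fetch(item_id: int) -> dict:
    # Stand-in for a real network call; the sleep is the simulated latency.
    await asyncio.sleep(0.1)
    return {"id": item_id}

async def in_a_loop(ids):
    # N calls awaited one after another: total latency ~ N * per-call latency.
    return [await fetch(i) for i in ids]

async def concurrently(ids):
    # The same N calls issued at once: total latency ~ one call.
    return await asyncio.gather(*(fetch(i) for i in ids))

def timed(coro):
    start = time.perf_counter()
    result = asyncio.run(coro)
    return result, time.perf_counter() - start

ids = list(range(10))
seq_result, seq_time = timed(in_a_loop(ids))    # roughly 10x the latency
con_result, con_time = timed(concurrently(ids))
assert seq_result == list(con_result)  # same requirements, same outcome
```

Both versions satisfy the same spec and return identical data; only the craft of how the calls are composed separates them.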
Heavy-Report9931@reddit
coding isn't the hard part? Tell that to my co workers who literally do not know how to unit test
djkianoosh@reddit
agree 💯
even "agile" was initially a way to improve upon this whole process of iterating through the complexity. and the marketers and business development folks turned that into an entire industry.
stuartlogan@reddit
Absolutely love this whole thread and completely agree with this
guareber@reddit
Indeed, agile was basically rooted in "accept you can't get it right in one go, make your process aware of that fact and working to improve on it".
Give it enough time and prediction needs and you end up with abominations like SAFe.
JosephHughes@reddit
A brilliantly simple idea formalised by engineers; it makes sense to us but will never ever fly with the people who want to know when their money will see ROI.
watergoesdownhill@reddit
And lost sight of what it was supposed to be.
AI_is_the_rake@reddit
You underestimate what has already happened and what’s about to happen.
I’m already coding full applications without reading the code.
Pretty soon users will build their own applications. You won’t need developers or even product people or even users who know how to express what they want. AI will know what they mean based on prior experience with the user.
BNBGJN@reddit
Let me know when you are running and maintaining full applications for multiple years without reading the code.
Coding isn't the hard part.
beyondpi@reddit
I agree man, I’m now maintaining applications which I wrote in the last 2-3 years and holy shit it’s so tough. It’s like death by a thousand paper cuts with every improvement or inclusion of new business logic. Looking back, the easiest part of the entire exercise was coding.
Ohthatsjess1@reddit
Totally get that. Maintaining code can feel like a never-ending battle with all the nuances and edge cases popping up. It's wild how much effort goes into just keeping everything running smoothly, way more than the initial coding phase.
TFenrir@reddit
Quick question - how capable do you think models will be, compared to the last few jumps - in a year?
BNBGJN@reddit
I have no fucking clue. And for something that's only been around for like two years, I think it would be foolish to extrapolate the trajectory in terms of years. We don't know where the ceiling is. We might hit it in 6 months. But then again, what would be more foolish is for you to listen to my opinion on AI.
SciencePristine8878@reddit
AI has definitely increased in capability but I can't imagine anyone running any large scale enterprise software without at least checking the code.
TFenrir@reddit
Why not?
SciencePristine8878@reddit
Because it's not that good yet? All the seniors I know who can make the most out of AI wouldn't let AI go ham on any enterprise software and not check the code.
TFenrir@reddit
Because it's not that good yet, you don't think it will be that good in a year?
SciencePristine8878@reddit
In a year? Honestly, I have no idea. It will almost certainly get better, by how much is the question, I think it's entirely possible for what OP said to be true in 5-10 years but I'm gonna be honest and say that I'm skeptical of it happening in a year. Their capabilities have definitely increased but they still make weird mistakes, add random bloat and sometimes all of the business logic is hard to put into exact words for an LLM to follow and it's easier to smooth out the edge cases yourself.
TFenrir@reddit
Right, I can make a medium sized saas app right now, in a couple of weeks, with mostly talking to the agent and knowing the code and knowing the current pitfalls enough to get out ahead of them.
A year ago this would have been a fairy tale. Looking at the research, the rate of growth, what's missing, and the intentions at big labs, I think a year from now this same app I describe above will be something I can do in a day or two of back and forths, mostly suggesting changes from looking at the app, not the code. I think a large enterprise app will be in a similar situation to where a medium app is today.
When the above poster mentioned maintaining apps for years, how do you think that looks in my timelines?
SciencePristine8878@reddit
What kind of medium sized SAAS apps?
You're probably an expert so you can actually guide the models. OP said they make entire apps without reviewing the code; I'm heavily skeptical of that. OP also said people would be able to make entire apps just by expressing what they want to AI; I'm also skeptical of that in such small time frames. I've seen some vibe coded apps by novices and they're usually not good.
I think there's some degree of unreliability with current AI models. I think they'll need some kind of expert oversight until some kind of breakthrough, or maybe scaling up compute will work, I don't know. It's why they basically have the knowledge/capability to replace most if not all customer service representatives but haven't actually done so yet, due to some level of unreliability.
That you'll need to maintain the code when making improvements and adding new functionality, and that you'll still need to oversee the AI to make sure it doesn't make weird mistakes and bloated code. Maybe you only need to do this for a year or two, I don't know. That's just my 2 cents; I think most white collar work is potentially screwed in the next few years just from the potential increase in productivity, but we'll see.
throwaway_0x90@reddit
That's fine for small toy applications. Perhaps some limited mid-size apps.
Try to do this with enterprise software on SLA contracts and you're going to be in a world of hurt in a couple of years... or months.
TFenrir@reddit
You couldn't do this with medium apps a few months ago - me just suggesting that we would soon be able to in subs like this was met with utter disbelief.
I think you are still in denial, if you do not watch trajectory. Not you specifically, the royal you.
I think it's important for developers to take stock of the advances of the last year, and ask themselves where it will be in a year. If your expectation is "not much different than today" - then I think you should hold yourself accountable to that expectation, lest you continuously move goalposts to your own detriment.
throwaway_0x90@reddit
I'm not saying SWEs should flat out ignore AI.
Everyone should hedge their bets and just learn how to use it, because if there's a sudden breakthrough in AI advancement, you'll otherwise be playing an impossible game of catch-up and your career will be over.
geon@reddit
What are those ”full applications”? Are they doing anything that hasn’t been solved a thousand times before?
Coherent_Paradox@reddit
Call me when your system gets taken down, data stolen and you get sued by users
rag1987@reddit
We always had slop, usually in the form of copy-pasted PHP from various "common" sources. Today, however, the slop is autogenerated, and way, way larger in lines of code.
Software quality will go down. I can imagine peak shit season coming in a few years, and then there will be a few large fuckups that the media will report on heavily, and only then will businesses realize there are millions of lines of slop that will take decades to fix.
Many businesses will fail because of how fast they became legacy. Others will fail because they get hacked every week, and some will fail because they lost all their data and had no backups.
Either way, lots of popcorn will be consumed.
https://bytesizedbets.com/p/era-of-ai-slop-cleanup-has-begun
revolutionPanda@reddit
I agree. LLMs generate the best code when you write unambiguous pseudocode. But doing so requires the ability to uncover and articulate what the problem is, which most people are bad at. If you can do that and understand how to fix the problem, translating that to code isn't too difficult.
Fluffy-Software5470@reddit
Why not build a compiler for that unambiguous pseudocode and turn it into ”real” code?
revolutionPanda@reddit
Because by the time I write my "unambiguous pseudocode" it's not much more work to turn that into real code. That's the point - the writing code part isn't that hard - it's the thinking part, which LLMs can't do.
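For what it's worth, the gap the parent describes really is small. A hypothetical example of how "unambiguous pseudocode" maps almost one-to-one onto the real thing (names and the fee rule are made up for illustration):

```python
# "Unambiguous pseudocode": for each order, if it is overdue and unpaid,
# add a 5% late fee capped at 50, then return the total owed.
def total_owed(orders: list[dict]) -> float:
    total = 0.0
    for order in orders:
        amount = order["amount"]
        if order["overdue"] and not order["paid"]:
            amount += min(amount * 0.05, 50.0)  # late fee, capped at 50
        total += amount
    return total
```

Once the rule is stated unambiguously, the Python is little more than a transcription; the hard work was pinning down the cap and the overdue/unpaid condition in the first place.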
Fluffy-Software5470@reddit
I was trying to point out that if you are using an LLM to transform "unambiguous pseudocode" into a larger quantity of "real code", maybe the programming language is too low level and you need to work at a higher level of abstraction.
This just sounds like using an LLM as a non-deterministic compiler/transpiler.
(I just assume the LLM output is larger than the input; otherwise why would you not just write the "real code" to begin with?)
OddBottle8064@reddit
I regularly use AI for generating requirements, prioritizing, and risk analysis, and it works just as well, or perhaps even better, at that than at coding.
armostallion2@reddit
this was the tl;dr IMO:
"If the system must satisfy 200 business rules across 15 edge cases and 6 jurisdictions, you still have to specify them, verify them, and live with the interactions. No syntax trick erases that."
well said.
TheOverzealousEngie@reddit
This reads suspiciously like a rant, like someone railing against fate, god, or some other life form: no, no, coding is not dead. No, no, the thirty years I spent learning this crap can't be replaced by a word calculator, can it?
News flash. Yes it can, and it does every day. We're getting perilously close to where AI could work with a junior person and guide them rather than what we do today. Today only senior people guide AI and their work is 10x'ed. And to those that say no, no they're not real programmers. Because real programmers know what a slope is, what x and y mean and where we started last year, where we are today and where we're going to be in 5 years. Think about it : 5 years!
And to all those still reading, a bonus. What's the biggest danger AI presents to humankind? AI soldiers? AI job replacement? Robotic sexy females? Nope. It's that AI will get so good it gets commoditized, like that free 72-inch TV sitting in your neighbor's yard that no one wants. And when AI gets that commoditized it means something capitalism will never, ever be able to withstand: free labor.
fkukHMS@reddit
Brilliantly stated!
And I'll take it one step further: Once the essential complexity has been collapsed into concrete requirements, it's time for the actual "implementation" which is when all the accidental complexity starts creeping in. And after the initial implementation come years of maintenance and enhancements, which often amplify the original accidental complexity multi-fold.
*If* we ever reach the point where AI is able to cleanly separate the essential from the noise - ie what are core functional behaviors vs what are accidental implementation details - then the codebase itself can become a transient artifact instead of the near-immutable "asset".
Test code is already beginning to go down this path. Instead of investing in highly-complex test infrastructure it's often faster/more efficient to let the AI generate a set of short, stupid, simple-to-read tests which are regenerated whenever the code changes. Follow the trajectory a bit further, AI might be able to rewrite an entire Java codebase into C# or Python.
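The kind of short, regenerable test meant here might look like the following sketch (the `slugify` function is hypothetical, standing in for whatever the AI is testing):

```python
def slugify(title: str) -> str:
    # Hypothetical function under test.
    return "-".join(title.lower().split())

# Short, stupid, simple-to-read tests: each one states a single fact
# about the behavior and is cheap to regenerate when the code changes.
def test_lowercases_and_joins():
    assert slugify("Hello World") == "hello-world"

def test_collapses_runs_of_whitespace():
    assert slugify("a   b") == "a-b"

test_lowercases_and_joins()
test_collapses_runs_of_whitespace()
```

Because each test is trivial to read and carries no shared fixtures or infrastructure, throwing the whole set away and regenerating it costs almost nothing.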
Following it all the way to the end, I think we might see the role of languages and runtimes shrink drastically. If AI is writing and compiling the code anyway, why write code which orchestrates libraries and packages which are running on top of virtualized runtimes which run in virtualized environments/OSes on top of virtualized hardware? Most of those layers are helpful for humans but redundant for AI, which will (IMO) eventually be able to deliver significantly better results by coding directly against HW or low-level APIs.
thekwoka@reddit
reading the code is the hard part.
apparently_DMA@reddit
what
OdeeSS@reddit
Reading code itself is easy ("takes x as an argument, queries for y, changes x with y, etc."). Determining the purpose of that code, that's the hard part: "Why are we querying for y? What does x represent?"
apparently_DMA@reddit
this guy knows (fucks)
Alarmed-Coyote-6131@reddit
Then don't you think product managers can somewhat do this task with AI?
LongjumpingFile4048@reddit
Coding isn’t the hard part but it’s the most time intensive part. I have no doubt the total number of engineers we will need will decrease over the coming decade.
NoJudge2551@reddit
I generally ignore the AI is the best ever posts because they're likely just people hired to make the posts. Especially ones that name drop products somewhere in the middle.
Another factor is not being able to post about problems seen with certain products. I posted here once about seeing a certain product become more inefficient and ineffective during specific workstreams over the past year (we'll call it bit pub dodilot) and the post was immediately taken down. So take that as you will.
AI has uses, many great uses. It will not replace senior devs. Companies will keep trying for a while, then once they've bled enough revenue from prod failures and the inability to deliver, the hype will be over.
chaitanyathengdi@reddit
In some cases GPT even increases accidental complexity if the programmer doesn't know what they are doing and just accepts whatever garbage the model puts out without verifying it.
geon@reddit
You are essentially correct, but seem a bit confused and vague about terminology.
”Coding” isn’t a well defined term. It can mean a lot of things, from only the typing on a keyboard, to the entire engineering process.
But yes, AI as it exists today is at best glorified autocomplete. You still need to do the thinking yourself.
FeliusSeptimus@reddit
I think this is a key point. Researchers are working on adding/improving AI metacognition, which may expand AI capabilities closer to human level.
geon@reddit
There are no signs of that panning out.
FeliusSeptimus@reddit
10 years ago what we have now was pure fantasy.
Not saying they'll release a strong metacognitive model in six months, just that 'AI as it exists today' is a quickly moving target.
geon@reddit
A blatant lie. The llm field is stagnant.
geon@reddit
Actually, the Attention paper was published in 2017, so nearly 10 years ago. That’s still state of the art.
The llm architectures we have now have reached their full potential already. And they are nowhere near capable enough to be more than toys.
Going further would require something completely different. It isn’t a matter of refining the existing tech.
To make a car analogy; the current llms are not the early cars, but the most refined horse drawn carriages. No matter how much money is poured into them, you won’t find the future there.
Alokeen011@reddit
In the last 20-30 years, our job was called different things, and some of those have evolved to mean something else along the way.
I remember being a programmer once upon a time, and now I'm a software engineer. Been called quite a few different things in the meantime, forgot most of them.
I do remember 'coder' appearing as a term, and that was deemed less than a programmer - one that translates decisions into code without actually doing any 'smart' work.
geon@reddit
Yes. The idea that someone should just ”type in the code”, and that all problems are already solved is ridiculous.
Sparaucchio@reddit
It never existed in practice. I've never had a job where I could do just that. Never seen anywhere. Not sure what people talk about, when they mean this.
geon@reddit
That’s why women were the first software developers in the extremely sexist society at the time. The men had completed designing the hardware. Just typing in the software was seen as menial work, like a secretary, suitable for women.
Sparaucchio@reddit
Whatever. But it never was "just typing". If they were "just typing", then we all are "just typing" today
geon@reddit
I’m agreeing with you.
fishyfishy27@reddit
Bingo. This post is just playing boring semantic games.
Transcender49@reddit
Gotta drop this YouTube video that discusses exactly that. It's a good watch.
Bost0n@reddit
There’s an idea in aerospace structural design: “It’s easy to model a part that cannot be manufactured, what’s hard is designing a part that can repeatedly be produced in a cost effective manner”.
I suspect the same is true for software: it’s easy to write a spec for code that can’t easily be written. The trick is understanding the constraints. If a programmer does not understand the API or the system they’re writing code to interface with, no matter how good/bad the spec is, it won’t matter. The programmer has to understand context.
zhenifer@reddit
True. Back at university, one of my professors used to tell us that programming is just the "craft".
However, I expect an enshittification of tech, with code that becomes harder and harder to maintain. And everybody and their mom claiming to be a technical expert, since they can put everything into chatgpt and get some answer that sounds reasonable to them.
IT used to be a comfortable place for me, with people who loved to analyze problems and build solutions. Nowadays, I meet more and more people in IT who "don't like to code" and the vibe coders are just the cherry on the top (haven't met one in real life yet).
So yeah… I'm not very optimistic about the future of our field and I am really worried about the end game here
vtmosaic@reddit
I've been a SWE for more than 30 years myself. I am using a LLM to help me communicate and document all the details needed to develop a good feature. It's like wearing a mech suit: I can pick up and lift heavier loads more quickly and more to my satisfaction than I have ever been able to produce before. I interact with it like a junior peer. It does the paperwork so I don't have to. But it's me, my skills, knowledge, and experience that's running the show.
OddWriter7199@reddit
Well said OP. Post saved
ghoztz@reddit
This is exactly it. And as a technical writer in this field I just want to say documentation is the same. There is essential complexity that requires a content engineer to solve the problem. And the problem scales with the size of the docs and complexity of the product.
CedarSageAndSilicone@reddit
AI is amazing if you know what you're doing.
Like, I just vibed out a cool image mixing prototype for an app I have with skia in about an hour today.
I knew exactly what libraries I needed, and how it should work at the code level.
The first version output kinda worked but had some problems that I could easily identify by reading the code (deprecated/invalid API calls, some thread management stuff), so it was very easy for me to tell the LLM exactly how to fix it - and for me to make some fixes by hand.
I can use chatgpt, gemini, and claude simultaneously and in slightly different ways to check different results against each other and find the best versions/fixes/approaches for more efficient and higher quality output.
Now it does exactly what I want, and the LLM even automatically suggested some features I hadn't thought of yet.
If I didn't know what I was doing and just told it at a high level what I was imagining, I'd probably still be here yelling at my computer.
titoNaAmps@reddit
Thank you for sharing. This pretty much nails it till AGI I suppose lol. But seriously appreciate taking the time to post and articulate your thoughts, it'll prove quite helpful in my future conversations for sure.
Lazy-Past1391@reddit
AGI isn't going to happen
timmyturnahp21@reddit
Lol cope
Lazy-Past1391@reddit
Ooooh, how's the cult?
timmyturnahp21@reddit
Not a member lol. I think it’s just denial at this point to not see the writing on the wall though as AI continues to improve
Lazy-Past1391@reddit
AGI isn’t happening, it's a sales pitch:
So far it's scaling pattern matching, not building understanding. The systems can’t reason about novel problems - they need training data for everything. That’s not intelligence, it’s sophisticated autocomplete.
It can't figure out writing a docker-compose.yml, much less anything truly complicated.
There's also the "symbol grounding problem". LLMs manipulate tokens without comprehension; they don't "know" what a dog is, they just know what tokens typically appear near the token "dog".
AGI keeps shifting. Beat chess != AGI. Beat Go != AGI. Pass the bar exam != AGI. It’s an unfalsifiable marketing term that moves whenever convenient.
AI companies need massive valuations. “We built a useful narrow tool” doesn’t justify billions in investment. “AGI in 3-5 years” does.
None of the companies are profitable. OpenAI lost $5B in 2024, burned through $10B in funding by June 2025, then needed another $8.3B by August. Anthropic burned $6.5B last year.
The economics don’t work: inference costs keep rising, not falling, especially with “reasoning” models. They survive on endless funding rounds, not business models. Companies building on top (like Cursor) just funnel VC money to OpenAI/Anthropic, who send it to cloud providers. Nobody’s making money. It’s a cash dumpster fire justified by AGI promises.
timmyturnahp21@reddit
So are you going to massively short openAI to back up your words when it IPOs? It’s free money in your mind
Lazy-Past1391@reddit
You don’t have a substantive counterargument, just “put your money where your mouth is”. Shorting isn’t free money, timing matters, and markets can stay irrational longer than my bank account can handle. Just because people are willing to throw money at something doesn't mean it makes sense. Look at Tesla for fuck's sake.
You say you're not in the singularity cult but you sure sound like it.
Ihodael@reddit (OP)
Glad it resonated. The interesting part is that I’ve been having this same conversation for over 20 years: 4GLs, low-code, no-code, UML, AI, and so on.
With a mathematics background, the idea of the irreducible feels natural to me.
It’s fascinating how the “coding is dead” discussion (or some variation of it) keeps resurfacing as our tools evolve.
Ok-Asparagus4747@reddit
I have never felt someone express my thoughts in such a succinct and clear way as this post has.
100% true, coding is so simple after a couple years, the hard part is thinking through the logic and wtf we’re supposed to be building
tysonfromcanada@reddit
but can it name variables and do input validation?
i-can-sleep-for-days@reddit
It will still replace a lot of people because a lot of people don't get that writing code is the easy part. Even before AI it was never about the code, except just being able to code was sometimes enough to have a job. That will change and if people don't get that, then they will not have jobs.
MiniGiantSpaceHams@reddit
Downvote away, but I hate this mindset. Coding isn't the bottleneck, but it's not free either. If AI can speed up the coding, then I have more time to do the other stuff.
codemuncher@reddit
The good thing about code is it enforces a rigor that is lacking in most other engineering documentation methodologies.
That's the thing, we've all known projects that seemed to be a good idea, that were "well thought out" and during development and implementation everything was great! Unit tests worked well.
Then integration happened. And the resulting mess was so bad, the entire project was curtailed or even cancelled.
But can you learn this by looking at an endless amount of imprecise English "requirements" and "design docs", and figure it out from boxes and UML and so on?
I doubt it. It didn't work in the past either. There's a reason why all the hardcore tech companies that grew up out of the ~2004 era are relentlessly code first, Google and Facebook being the household names. They've changed a bit as they've aged, but even at Google a design doc is just a milepost in time, and the reality is held in the code.
zattebij@reddit
I'd also like to add that LLMs generate their responses / code fragments from training data. They can recombine different requirements with different solutions, but they cannot generate completely original new patterns - there is no creative process. If it "learned" that for problem X, many existing solutions in its training data use pattern Y (or library Z), it will weave pattern Y (or use library Z) in its response. But it didn't come up with pattern Y by itself (or write library Z).
This means that if we'd now stop writing new libraries or researching new code styles or paradigms manually, we'd have LLMs stuck on generating solutions based on the tech of now. There'd be no progress or change towards new paradigms, new insights, but we'd move ever closer to ideal implementations of the tech of now.
Which in one short-sighted sense is "nice" - today (and also in 6 months) I'm working with tech of now, so to get implementations of this spat out at me automatically and avoid repetitive work or boilerplate, could save me time (inasmuch as I don't have to spend at least the same amount of time formulating prompts, waiting for extremely energy-intensive generation, and reviewing what it spits out).
But in the longer term, this means stagnation - either total stagnation, or at least leaving future original design to the few companies that are still willing (and able) to invest in creative people and original work - which in practice, I'm afraid, will mean that a few megacorporations will be deciding on what (if any) new code styles and patterns everyone else will get spat out of their LLMs, and what the future of automation and software development will look like. The word for it is oligopoly.
Don't be surprised if one would then need to purchase a license to use such generated content. I'm already seeing growing discontent with original creators (literature, graphic design) about their work being used for training without compensation or even crediting the original author. Let's say that copyright law will be extended to AI-generated derived work, and we'll find ourselves in licensing hell. Of course these large corporations will also have the most resources to track usage of "their" works and patterns by AI, and follow them up with claims.
1STNTEN@reddit
I agree in general, but it will certainly reduce demand. I’m trying to position myself in a more specialized field of SWE before it’s too late.
TimurHu@reddit
Thank you! 100% agree, I think you hit the nail on the head.
The main issue here is that there are a lot of people in the industry who believe that the accidental complexity is "the" complexity. And of course there are those who really think that coding is the hard part. They see AI as the silver bullet and don't stop to think that the real complexity lies elsewhere.
00rb@reddit
Honestly, I think a lot of it comes down to the fact that people need to protect their egos and say "if there wasn't such a high entry cost I could do what the programmers do."
For some people that's true but most ordinary people just aren't very good at the logic required.
I will say though that AI could conceivably get good enough to gather business requirements and talk to stakeholders. There's nothing stopping it from becoming that, although I'm still skeptical that it's in the near future.
TimurHu@reddit
Yes, I think so too. It's about fragile egos.
Sad_Amphibian_2311@reddit
It's a product & management perspective. The problem can't be the vague knowledge of the business processes and the inability to commit to a definition. No the problem has to be engineering.
hippydipster@reddit
The real problem is that product and management has spent decades now trying to offload the knowledge of business processes and definition of customer value to engineering, trying to leave themselves with the simple job of pushing on the "GO FASTER" lever as their main contribution.
TimurHu@reddit
I've seen this attitude often from management or non-technical people. They think if only they could write code, they'd surely do a better job at it than we do.
macbig273@reddit
Same here, once things are well defined, coding is just "execution" (at least in general dev environments; when you go into more research-style work it can be difficult to define well what you want).
But for apps, backend, frontend with well defined behavior you're 100% right
Deranged40@reddit
If ChatGPT could create applications on its own, OpenAI would be hiring project managers en masse instead of selling licenses to it.
If ChatGPT could multiply a developer's output by a significant number, then in the past 3 years that it's been available, we would've seen at least some companies more than double their output.
We've seen no indication that ChatGPT is really capable of any of this.
_ontical@reddit
Just because coding isn't hard for you, a veteran in the field with 30 (!!!) years of experience, doesn't mean that coding isn't hard.
Synyster328@reddit
Interesting perspective, but if coding isn’t the hard part, then what exactly is?
Is it understanding the problem, clarifying requirements, negotiating tradeoffs, designing architecture, or testing edge cases? And if so, which of those steps do you believe a modern AI system couldn’t already handle at or above the level of an average professional? Have you actually spent much time building with or integrating tools like GPT-4, GPT-5, Claude, or multi-agent setups?
My experience is that most developers who dismiss AI haven’t really used it deeply. Once you do, it becomes clear that there’s no single cognitive step in the software development process that isn’t already being automated. So maybe the question isn’t whether coding itself will be automated, but whether the entire practice of software engineering is just another system in the process of being absorbed by AI.
Ihodael@reddit (OP)
"Have you actually spent much time building with or integrating tools like GPT-4, GPT-5, Claude, or multi-agent setups?"
Yes, I did. Also I'm a huge promoter of LLM usage in my teams.
"My experience is that most developers who dismiss AI haven’t really used it deeply."
I didn't dismiss it. Quite the contrary. I stated it is just a tool, a new tool. A nice tool with lots of potential, but still not yet a replacement for thought.
Of course my opinion is heavily influenced by the problems I have to face at work.
Synyster328@reddit
That's cool, let me know when you have answers to my other questions if you'd like to discuss further
Taikal@reddit
Agree. AI won't make coding obsolete, it will just make many software developers unemployed.
Crazyboreddeveloper@reddit
This is how I imagine things going if the AI bubble does not burst on its own. The current AI models are not actually profitable and they are not capable of fully taking over developer jobs. if AI companies continue to run these models at a loss while loudly claiming that AI will replace human programmers, they can create the illusion that coding is a dying career. As a result, fewer students will choose to study software development and fewer entry-level developers will gain real experience, because companies let AI handle tasks that junior developers would normally do.
Over time, as experienced developers retire, or get laid off, and fewer new devs enter the field, mediocre AI output will seem like the best option because it’s the only option left.
At that point, companies like OpenAI will increase their prices dramatically in an attempt to reach profitability, because businesses will have no choice but to pay. AI coding assistance remains in its current state, which is basically like buying packs of trading cards: you keep paying to get the result you want, instead of buying exactly what you want. There will be no incentive for AI companies to make their models more efficient because the inefficiency itself generates profit. Eventually a major company will suffer a catastrophic failure due to flawed AI-generated code and no one in the company will know how to fix it. When that happens, businesses will panic, realize their product is slowly becoming the output of a spicy slot machine that no one can fix if it breaks, and begin fighting over the few remaining developers. The career will become valuable and in demand again.
General_Hold_4286@reddit
If AI speeds up development by 10%, that means 10% fewer developers are needed to complete the same amount of work, which would shift supply and demand for developer jobs. Developers needing a job would compete against each other, which means a higher bar to get a job and lower salaries. Which is basically what we're witnessing today.
Legend says (it's just an urban legend) that in the 1970s, during the gas crisis, the amount of fuel extracted dropped by about 10%, but it caused gas prices to increase tenfold.
h_blank@reddit
Although I agree with most of this, that line specifically reflects a very 1980s waterfall perspective on software development.
In a very real sense, modern agile workflows often do the "coding" in parallel with the "engineering", and the two activities inform and influence each other. I feel that this style of development actually can gain some benefit from the quicker iteration provided by a decent LLM (assuming decent-quality AI and intelligent usage of it).
Again, not pushing back on the original premise: that AI is not going to result in 10x improvement for anyone. But I will say that Agile is often limited more by iteration speed than other factors, so small speedups in the "dumb" parts of development can actually make a difference, so we also shouldn't discount it entirely.
PS: A purist would probably say "if we're getting a benefit from AI, we weren't doing engineering right to begin with", and that's a valid discussion :-D
Ihodael@reddit (OP)
I have to disagree with you. I see no conflict between what I'm trying to summarize and agile practices.
Another comment posted Robert C. Martin's take on this same topic. Not sure how to link it.
Uncle Bob was part of the Agile Manifesto's creation. The book quoted from is called Clean Code: A Handbook of Agile Software Craftsmanship.
TacoTacoBheno@reddit
I'll tell you what AI has done for me: creating false positive report tickets run by other groups against our code base.
Three times in the past week the boss says OMG we're on a report, and it turns out their AI bot is just garbage.
So accelerated!
G_Morgan@reddit
AI is going to make coding much better paid. The industry will pay for not hiring juniors during this time frame, as they have at literally every point in history.
Anyway the hard part of the job has always been the talking bit, not the typing bit.
Less-Fondant-3054@reddit
Ex-fucking-actly. "AI" is just yet another turn on the wheel of code generators. We've had "data driven" systems that were supposed to let BAs plug and play, we've had literal drag-and-drop GUI tools meant to let BAs make code out of flow charts. Hell even the most common languages we use today were initially hoped to allow non-SWEs to take over the coding task. Not a one of them replaced the SWE. Because, as you point out, the hard part has nothing to do with coding. It has everything to do with interpersonal communication and creativity.
timmyturnahp21@reddit
My large, well known company just announced we will no longer be expecting senior level and above to be mentoring lower level engineers.
They also had another small layoff round, targeting our workers in Ireland this time. It was US last time in a much bigger layoff.
Yeah, everything is fine.
thewritingwallah@reddit
AI tools definitely have potential, but it feels like the expectations were set way too high, too fast. It's a reminder that tech adoption takes time: not just the tools, but the processes and people around them need to evolve too. Hopefully, the industry starts focusing more on realistic, long-term integration rather than chasing quick wins.
Well I like to use AI for three things:
I treat AI like you would a professor, if you ask your teacher for the answers for a test or hw assignment, they wouldn’t give it to you.
I've been doing software development for 16 years and I use AI similar to how I used reference sites, like stackoverflow, and reference books, like C Cookbook, in the past. In general, it's better than these older methods since I can tune it easily to fit a particular objective. I almost view it as an eager junior co-worker who can help out a lot but needs oversight.
Remember that nobody likes to review code. I've been working with many teams and everyone hates reviewing others' code; you need to ask many times, and at best they just skim through your code and add some comments regarding code style, variable names, etc. And people are saying that this job in the future will be only about reviewing, lol.
More detailed notes here: https://bytesizedbets.com/p/era-of-ai-slop-cleanup-has-begun
nedal8@reddit
Same as it ever was
Fit_Rip2473@reddit
This is one of the most grounded takes I’ve seen on the topic. AI might smooth over the accidental complexity, but it can’t eliminate the essential kind. The hard part has always been understanding and expressing intent — not typing it out.
badbog42@reddit
“One might argue that a book about code is somehow behind the times—that code is no longer the issue; that we should be concerned about models and requirements instead. Indeed some have suggested that we are close to the end of code. That soon all code will be generated instead of written. That programmers simply won’t be needed because business people will generate programs from specifications. Nonsense! We will never be rid of code, because code represents the details of the requirements. At some level those details cannot be ignored or abstracted; they have to be specified. And specifying requirements in such detail that a machine can execute them is programming. Such a specification is code. I expect that the level of abstraction of our languages will continue to increase. I also expect that the number of domain-specific languages will continue to grow. This will be a good thing. But it will not eliminate code. Indeed, all the specifications written in these higher level and domain-specific language will be code! It will still need to be rigorous, accurate, and so formal and detailed that a machine can understand and execute it.”
“The folks who think that code will one day disappear are like mathematicians who hope one day to discover a mathematics that does not have to be formal. They are hoping that one day we will discover a way to create machines that can do what we want rather than what we say. These machines will have to be able to understand us so well that they can translate vaguely specified needs into perfectly executing programs that precisely meet those needs. This will never happen. Not even humans, with all their intuition and creativity, have been able to create successful systems from the vague feelings of their customers. Indeed, if the discipline of requirements specification has taught us anything, it is that well-specified requirements are as formal as code and can act as executable tests of that code! Remember that code is really the language in which we ultimately express the requirements. We may create languages that are closer to the requirements. We may create tools that help us parse and assemble those requirements into formal structures. But we will never eliminate necessary precision—so there will always be code.”
Excerpt From Clean Code Robert C. Martin This material may be protected by copyright.
SignoreBanana@reddit
I think a quick shoot-from-the-hip response to this is "ok, well, what happens when AI starts understanding complexity," to which the answer is "we will probably create even more complexity that it won't understand."
The fact is, we build everything to the limits. And the limits are always shifting because we build more things to push those limits out.
marc_polo@reddit
Agreed. Practically, an AI still can’t join a Zoom meeting, track multiple parallel conversations, or interject at the right moment. All in real time and with human-level latency. I think it’ll get there eventually, but that kind of situational awareness is a long way off.
No-Vast-6340@reddit
What a great post. Thanks for this.
lcvella@reddit
That is all fine, but it doesn't address the practical concerns of professional software developers: the risk of losing their jobs.
The accidental complexity is costly, and if you remove it, you can do with 10 people the job that previously took 15.
hippydipster@reddit
If the coding is the easy part, maybe it is exactly the part that AIs will make obsolete, and we will instead focus the humans on the harder parts of the business.
Which is what we derisively call "vibe coding".
pragmasoft@reddit
Any project contains a certain amount of accidental complexity, so there is still something to optimize. I feel that there's a certain "terminal" level of total complexity which limits further project evolution. Using AI will probably allow reducing the accidental-complexity part, which in turn could raise the bearable level of essential complexity, i.e. make bigger and more complex projects still possible to comprehend.
TimMensch@reddit
Brooks was right about a lack of a 10x silver bullet for developers who existed in the 70s. AI could theoretically break that pattern, but I agree that it doesn't.
But the thing is, today we have developers who would never have survived their first month of work in the 70s. Developers who are 0.1x or even less productive on their own. Developers who, before AI, couldn't have written FizzBuzz, and who basically Googled every single piece of code before copying and pasting and hacking until it worked.
AI can make those developers 5-10x more "productive," though only in the sense of creating code.
There's still the fact that a solid developer will have a better idea of what code needs to be written, and will create better code with fewer defects. But wow, AI can speed up these low-skill developers to the point where they're creating code disasters at quite a high velocity.
I think that's the source of the huge disconnect. Experienced, skilled developers seem to have a very strong consensus that OP is right, and that AI isn't nearly as revolutionary as everyone claims. But I've also heard, from people I trust, that AI is helping some developers gain extreme productivity compared to their low baseline performance.
And frankly, low-skill developers make up a huge fraction of "the industry." So in that respect, AI is going to have a potentially huge effect on that end of the industry, even if it has barely an impact on high-skill software engineering jobs.
GTHell@reddit
With AI tools, tons of tokens and credits, and experience in programming, I still don't know what to do...
MeatyMemeMaster@reddit
Bro has more experience than I have years on this earth 😅
LoadingALIAS@reddit
What a fucking prescient post. Juniors, take heed.
Stamboolie@reddit
AI is great at making things that are well documented, like data structures and algorithms; it really saves me a lot of time. It has no idea about putting things together, though. It has seriously saved me some weeks on my latest project; more than that, I've been able to add stuff that I would otherwise have left out or that would have become a separate week-or-two project.
TimMensch@reddit
I've seen AI spit out beautifully documented code, where the comments described exactly what the code should have been doing based on my prompt.
Should.
The actual code produced was absolute garbage that could never have worked.
I also have spent nearly 40 years in the industry, mostly working on particularly challenging problems, and can count on one hand the times when the problem I needed to solve required an algorithm from a book. Maybe two hands, but not more than that.
Most of the time, if I want to use a data structure I can look up, there's a library I should use instead that will do exactly that.
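A contrived illustration of that point (my own example, not from the comment): if the problem calls for a priority queue, the standard library already has one, so there is nothing to hand-roll or to ask an AI to generate.

```python
import heapq

# Standard-library priority queue: no hand-rolled (or AI-generated)
# binary heap needed. Tuples sort by their first element, so the
# integer acts as the priority.
tasks = []
heapq.heappush(tasks, (2, "write tests"))
heapq.heappush(tasks, (1, "fix prod bug"))
heapq.heappush(tasks, (3, "update docs"))

# Items come back in priority order, lowest number first.
order = [heapq.heappop(tasks)[1] for _ in range(len(tasks))]
# order == ["fix prod bug", "write tests", "update docs"]
```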
thodgson@reddit
Nailed it
TheElusiveFox@reddit
I think a lot of experienced devs understand this, and it's why people with 10+ YoE aren't worried about their jobs any time soon... The real issue with AI is apprenticeship: juniors are being replaced wholesale by AI, and the juniors who do get onto the job simply aren't learning how to properly problem-solve, because the thinking for the easy tickets usually handed to juniors is being offloaded to AI.
toastnbacon@reddit
I've had a lot of excuses to link to one of my favorite comics this year - https://www.commitstrip.com/en/2016/08/25/a-very-comprehensive-and-precise-spec/?
Additional-Bee1379@reddit
I disagree that this is a fundamental problem. The huge strength of AI is its very short cycle times. You can instantly correct the AI if the behaviour isn't what you desired.
RandyHoward@reddit
Good luck with that. I've tried this whole vibe coding thing a number of times now. While I find it great for concepting, it can also be very difficult to fix a problem by vibe coding your way through it. Just last week I was trying to put together a conceptual UI for a report. I got it to a place I was pretty happy with, then I decided, "You know what, this select input is kinda redundant and I want to remove it." So I asked AI to do exactly that. As it turned out, AI had this input value tied into so many different places in the code that it produced a dozen errors. It didn't care that those errors were created, it just said, "Okay the input is gone." And the whole application was broken. After half a dozen tries of getting the AI to fix it, it finally did fix it, but that also introduced other problems. Things that were working fine before were suddenly not working the same. It created an endless cycle of problems. Nothing about correcting the AI was instant. I vibe coded my way to a pretty decent UI initially, but as soon as I changed my mind about something it all went off the rails.
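A contrived sketch of that failure mode (all names made up, not the actual code from the anecdote): one "input" value wired into several readers, so deleting its definition without updating the readers leaves dangling lookups behind.

```python
# Hypothetical miniature of the coupling problem: the report's
# "filters" dict is read in several places.
filters = {"region": "EU", "period": "Q3"}

def report_title(f):
    # Still reads "region" even after the input is "removed".
    return f"Report for {f['region']} ({f['period']})"

def report_query(f):
    return {"where": {"region": f["region"], "period": f["period"]}}

# "Okay, the input is gone": the naive edit deletes the value
# but not the code that depends on it...
del filters["region"]

# ...so report_title(filters) now raises KeyError: 'region'.
```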
Prestigious_Tip310@reddit
What makes you believe realizing the behavior is undesired is the fast part of development?
Implementing the "happy path" is the smallest portion of the process. Finding all the edge cases nobody thought about, and the ephemeral bugs that occur once every leap year, is what costs the most time. Fixing the broken stuff once you've found it is usually easy.
An AI might do the fixing in seconds where a developer needs minutes, but compared to the hours or even days to find the bug that’s hardly relevant.
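The leap-year line isn't just a figure of speech. Here's a toy sketch (my own, hypothetical names) of a bug that passes every test except on one day every four years, plus the trivial fix once someone has actually found it:

```python
from datetime import date

def next_renewal(anniversary: date, today: date) -> date:
    """Happy-path version: bump the year, keep month/day.
    Raises ValueError when anniversary is Feb 29 and the
    following year isn't a leap year."""
    return date(today.year + 1, anniversary.month, anniversary.day)

def next_renewal_safe(anniversary: date, today: date) -> date:
    """The easy fix, once found: roll Feb 29 over to Mar 1."""
    try:
        return date(today.year + 1, anniversary.month, anniversary.day)
    except ValueError:  # Feb 29 in a non-leap year
        return date(today.year + 1, 3, 1)
```

Finding that the naive version can blow up is the expensive part; the `try`/`except` itself takes a minute.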
hellocppdotdev@reddit
If you can't figure out the behaviour, how do you correct it? You just agreed without realising 😂
Neverland__@reddit
YES
EkoChamberKryptonite@reddit
Great context. This should be framed on a huge wall somewhere.
SubstantialListen921@reddit
This absolutely checks out with my 34 YOE. The complexity is the hard part.
The feature of what’s happening in 2025 that makes me a bit sad is that the industry committed to hiring a huge number of essentially low skill syntax translators. Those jobs are now very much at risk, because they have little or no role in the requirements translation process, and they do not understand how they have been trapped into essentially deskilled work.
zayelion@reddit
I agree completely. In 70 years' time I imagine programming as today's project management, shifting somewhat into Star Trek TNG's engineering: they ask the computer questions but have to do the deductions and deeper reasoning themselves. Humans classically really suck at communication.
optimal_random@reddit
You have articulated eloquently, in one solid post, the general feeling around AI and other fancy tooling.
Southern_Orange3744@reddit
I'm nearing the 25 mark and I agree with this message.
A lot of senior and lower engineers are happily shielded from an epic ton of upstream work.
Getting things lined up in ticket form as near shovel-ready work is the hard part.
If you're feeling worried about the state of software engineering, you should lean into the business side, testing, operations.
vac2672@reddit
everyone loved their intellisense... now it's just better
robby_arctor@reddit
I don't think this is true. AI can generate both more guesses and more accurate guesses than humans, the latter at least in some cases.
This post is also rebutting the idea that AI will make coding obsolete, which I think misses the point of the concern around a lot of AI hype.
Coding doesn't have to be made entirely obsolete for AI to deeply damage the labor market for developers. Cell phone cameras did not make professional photographers obsolete, but they hurt the market for them. Same with streaming services/DJs and live music.
Ultimately, the practical problem is systemic - new, labor saving technologies, especially when paired with an investment bubble, can absolutely devastate working class communities. Why have we set up our economy such that technological innovation can wreck peoples' lives, and how can we stop doing that?
To me, this is the real question. Whether or not our particular labor can survive this particular innovation is really incidental to the larger phenomenon IMHO.
Nofanta@reddit
Do you think people who spent their career coding and enjoy it and are good at it will enjoy writing in English just as much? If English is the new skill, isn’t the market of people with that skill quite a bit larger?
lawrencek1992@reddit
There are fewer people on my team who can design a solution than who can implement it. I think the need for people who can implement (e.g. take a feature spec and write the code for it) will decrease over time with ai. But I don’t see the same happening for people who design the solution.
Ihodael@reddit (OP)
The core skill of a developer isn’t writing in a specific language. Human or programming, languages are just tools for structured reasoning in software engineering.
If you can reason clearly in one, you can usually transfer that to another, as long as it’s at least as expressive and you take time to learn its quirks (imperative to functional, English to French).
Replacing C++ coding with coding through an LLM follows the same logic, assuming the interface offers the same or better precision and control.
And no, I don’t think the market will be much larger. As Knuth once said (Dr. Dobb’s, 1980s, if memory serves), there’s a limited number of people with the right mental framework for software engineering, not everyone can do it, just like not everyone is good at poetry.
69Cobalt@reddit
This is it! As someone who is not a "vibe coder" but uses AI daily at work, I feel caught between two worlds: I don't agree with the "AI is gonna take over" narrative, but I also don't think it's useless. The way you expressed this captured exactly how I feel and what I do.
I don't "write" much code anymore, but I do break a problem into small chunks, write up design specs, and then take small pieces of that and feed them into an LLM to write the code for me.
It feels easier and faster for me, but at the same time it does not feel like a qualitatively different activity, because of exactly what you said: the problems of the tools change or improve, but the fundamental problems of the problem itself are constant, and that is where the real engineering is regardless of tools. Whether that's English tooling or assembly.
lawrencek1992@reddit
Dear god yes. I see these posts saying AI went off the rails, and I don’t get it. By the time I’m ready to develop a feature I’ve got a spec with the whole thing planned out, and we submit those in markdown so they are already in my editor. Not hard to get an agent to implement for me when everything has already been designed and planned out.
Also, I can have one agent implement a feature while another handles change requests on a PR, and a third is being my rubber ducky while I think through some new service we're going to need for an upcoming project. It feels like a force multiplier when I can get multiple things done at once and only need to review. Our norm is very small PRs too, so I'm not reviewing thousands of LOC. I'm reviewing maybe a couple hundred, pushing a PR, and moving on to the next deployable piece of code in that stack.
Lyelinn@reddit
While you're right, there's also an interesting situation: AI stuff is slowly replacing junior engineers, and I suspect later on it will be more of a liability than someone actually writing code in production, because once the current mid-level/senior people move on or even retire, there's no one to fill the gap. The new generation is relying on LLMs more and more and learning less and less, which will force companies to hire vibe coders and others who barely know what they're doing and actually train them to become viable programmers.
If that happens, it will put enormous pressure on the market, because only big corporations will be able to afford basically running a school, while smaller companies and startups will be forced to shift to more senior people or accept vibe coders who also bring financial liability lol
Dddfuzz@reddit
TLDR: the thing AI is trying to fix is lack of time, not lack of ability. If clients and management actually listened to the reality of development, we would not be in this mess.
I kind of wonder if we as programmers should start setting better boundaries and communicating expectations better. We all know the jokes, but I'm at the point of just calling unrealistic timelines what they are: constructive dismissal. If you say it's gonna take 6 weeks and they say you have 2, and you fail and they fire or threaten you, make it very clear that they ignored reasonable timelines. Then again, they'll just find someone who doesn't know any better, who says they can do it in 2 and takes 6 anyway. Congrats, they're now suddenly 8 weeks behind, with a freshly promoted junior, instead of only 4 behind what they wanted, with some consistency in staffing. (The company is gone now XD)
I'd love to see a vibe coder drop a 100-page req doc into an LLM where half the headings have TBD in the body. Reqs have always been, and will always be, the problem; adding more layers is not gonna fix it, just bury it deeper and deeper until it festers. "Can you look at my vibe-coded project" is becoming the new "I have an idea", and I hate it. At least the idea people are open to talking about their idea and are happy to take advice if you say no (well, not always, but sometimes I just like to hear myself talk and they're happy to listen xD). The vibe coders just defer to believing the AI instead and tell you your years of hard-earned experience staring endlessly at code, docs, logs, etc. are useless, while they struggle to build even a basic application.
I'm not even gonna touch security, cost, or environmental impact, but it's a bad solution to a problem created by impatient people who care more about money than actually doing something productive.
bstaruk@reddit
I can personally attest to the fact that I am less reliant (read: no longer reliant at all) on delegating to juniors because AI does better work, faster, and on my time/schedule. I went from being a 2x developer to a 50x, compared to a couple years ago.
My fear is not about making coding obsolete -- it's about there being more devs than dev work to be done.
AsterionDB@reddit
My 44 YoE says you're right! You also touch on the inherent complexity of computer science. I consider this complexity to be a conserved resource, much like energy in thermodynamics. What I'm getting at is that somebody has to resolve the complexity that exists in a system, and where that is done makes a big difference in the overall effort and outcome.
Resolving the complexity low in the "stack" makes it easier for higher-level programmers to do their job. Unfortunately, I feel that most of the complexity today is resolved too high up in the stack.
seven_seacat@reddit
Hear fucking hear.
rahul91105@reddit
This is all true. Heck what’s even worse is that as time progresses, business and other requirements change and it gets more and more complicated to add new features/functionality.
The issue has always been building good and reliable software through the passage of time. AI might give the most efficient solution right now, but integrating it into current systems and incrementally developing on top of it is the real job of software engineers.
MonotoneTanner@reddit
Been saying this for ages. The actual syntax and code is the easiest part and we have pretty much full control over . It’s the “software development” part that is tough
DibblerTB@reddit
Well.. They may be accidental, or they may be doing something useful in some way.
Nikulover@reddit
After clearing all the ambiguity, you still need to write code. That still takes days, depending on the task. If AI can do that part in just hours, then you can just get rid of a lot of engineers.
Inevitable_Cod3583@reddit
But when problems arise, it is humans who have to find the flaws and fix them....
Abject-Kitchen3198@reddit
I've been on and off LLM usage, trying things out since Copilot's introduction. At this point, I'm actually closer to just dropping it altogether, or reserving it for a few use cases and focusing on something more valuable: streamlining and simplifying things in the given domain until writing a prompt to an LLM feels like more effort than implementing the thing.
PositiveUse@reddit
Well... honestly, after reading your post, it's basically saying "AI makes coders obsolete". Maybe not coding, but the people who actually code. As soon as you make a tool available that can create full-blown apps out of requirements written in natural language, you have abstracted away the coding part for most use cases.
Will there be a need for people who can formulate requirements? Yes. But where's the coding?
Ill-Statistician6182@reddit
Your goal is no longer “how to code.”
It’s “how to think and translate reality into systems.”
Forward_Gear3835@reddit
I have found that Ai generated code makes my life harder if it tries to give me a final product 🥵