Kukaac@reddit
This is actually good for us. Worse developers, more developers needed, less supply for the same demand, higher need for good developers, higher salaries, and more opportunities.
Infamous_Ruin6848@reddit
Well, if only companies would understand and react fast.
Unfortunately it will be slow enough to impact lots of developers, and the managers who made the firing/hiring decisions will hold on to them so they aren't the ones held accountable for the drop in product quality, the cost increases, the bad customer feedback, and the drop in sales.
So everything will be slow. People will be fired. Salaries will go down. Opportunities will go down. Only once a shared understanding emerges, including on social media, will we see a rebound.
Kukaac@reddit
I work in tech consulting, and there is an important tactic. If the client has an idiot manager/director/VP who is about to fuck up something major, you stay out of it and just let them do it. You never stop them.
Once the tragedy happens and it needs fixing, you go in and advise one level higher. You act like they have no clue what they are doing, you push the right buttons (which is often relatively simple), and charge a shitload of money. They will be the happiest people ever.
You can do this as a developer as well, and many developers will do it. Work might become a bit more seasonal, but trust me, once their teams stop delivering, those "AI-only" tech leaders will disappear fast.
Kevdog824_@reddit
The more likely reality is that the guy who uses 4 AI models simultaneously to produce 10x more utter garbage code than you will get promoted, and then promote their buddies, and you’ll be the first to go when layoffs come around.
Under ideal conditions you might be right, but meritocracy isn't real. If it were, we wouldn't be having this conversation.
Ddog78@reddit
I mean, that's the game, right? It's not like the vibe coders don't have weaknesses. It'll demand a bit of politics now, but that's life.
And anyways, software development is not even 50% coding.
Kukaac@reddit
And then that company will go down. Or at least their development will suck.
Engineering culture is one of the biggest contributors to low velocity. There are many reasons a culture can cause problems; vibe-coding hype is one of them.
It's a problem at the individual level, and it can cause personal dramas (you being laid off), but it rarely works in large numbers. Good developers actively avoid bad culture, so those companies end up with a department full of idiots.
So if they fire you, they are doing you a favor.
who_am_i_to_say_so@reddit
I think I’m good, but am not. Is there hope for me?
chaitanyathengdi@reddit
Given that the new devs also don't fall into the same "vibe coding" trap.
flatfisher@reddit
Tools are a force multiplier. If you're bad at math, a calculator will not get you far. Same with power tools if you don't know what you're doing. Developers who become religious about tools are bad developers, either because they rely on them too much or because they have skill issues and don't know how to work with them.
flavius-as@reddit
Your response is a very good theoretical explanation of everything.
What it doesn't explain: AI is out there. What will happen?
Kukaac@reddit
It does, you just have to read between the lines.
Bad developers with AI are still bad developers. The skillset will shift, you have to adapt to it, but without strong elementary knowledge you won't get far.
lllama@reddit
There will be a lot more work restructuring fucked up codebases.
There will also be a lot more fake work restructuring fucked up codebases.
Kukaac@reddit
Working on legacy code is already a higher-paid job. COBOL developers maintaining banking systems are making serious money.
We have a project with a massive legacy code base; it has eaten up its 4th lead in the company in 2 years. I've advised a colleague not to take the lead role, and we had candidates who said no to this project. Refactoring that code will be an expensive hell. People who can use AI to research, write code, and refactor at the same time will be in even higher demand.
DorphinPack@reddit
Sounds like a lot of overwork or underpay under the current system. Easily as likely as what you’re describing.
Pulling for your vision though!
pineapplecodepen@reddit
Yep, we're going to see AI cycles just like outsourcing cycles.
2 years of outsourcing/AI where standards get forgotten and slop gets pushed through, then 2 years of mass hiring desperate developers on low salaries to clean up the slop, then mass layoffs and reinstituting outsourcing/AI again.
VariousRefuse9381@reddit
Yes fu*k those cursor users
Spider_pig448@reddit
Bad take. Either AI does not affect developer skills, because it's not useful, or it does make developers worse, because it's adequately replacing things they used to do themselves.
Helkafen1@reddit
A counter example. Recently a junior teammate asked me to review a test file that was generated by AI and then refactored by him. The test file was 1000 lines long, full of repetition, and deeply coupled to implementation details, which made refactoring painful.
What's worse: the code under test, which the same dev had written by hand, had design issues. It mixed concerns, it was poorly structured, etc. The AI tool let him gloss over these problems, making him feel productive instead of feeling the pain of testing poorly structured code.
The AI tool gave him the wrong feedback, and it made the code worse.
This PR didn't pass my code review. With a bit of refactoring, and an effort to test the right thing, we went down to ~100 lines of tests.
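To make the coupling concrete, here is a small hypothetical sketch (not the actual code from that PR): the first test pins an internal detail and breaks on any refactor, while the second only exercises the public behaviour.

```python
# Hypothetical illustration only, not the code from the PR discussed above.

class Cart:
    """Toy example: a cart that applies a 10% discount on subtotals over 100."""
    def __init__(self):
        self._items = []  # internal detail, subject to change

    def add(self, price):
        self._items.append(price)

    def total(self):
        subtotal = sum(self._items)
        return subtotal * 0.9 if subtotal > 100 else subtotal


# Implementation-coupled: breaks as soon as _items is renamed or restructured,
# and says nothing about whether the discount rule is actually correct.
def test_add_appends_to_internal_list():
    cart = Cart()
    cart.add(50)
    assert cart._items == [50]


# Behaviour-focused: survives refactoring because it only uses the public API.
def test_discount_applied_over_100():
    cart = Cart()
    cart.add(60)
    cart.add(60)
    assert cart.total() == 108  # 120 with 10% off
```

Tests like the first one are what I mean by "deeply coupled to implementation details."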
Spider_pig448@reddit
I don't see the counter example here. AI doesn't turn junior engineers into senior engineers, any more than using a saw makes me a carpenter. Someone who's inexperienced will produce code that needs thorough review either way, and they will make poor decisions, like being unable to adequately evaluate feedback from their AI tool. People have to be trained to use their tools.
GronklyTheSnerd@reddit
If the tool isn’t able to do the job, and the developer relying on it becomes unable as well, it does more harm than good.
Spider_pig448@reddit
If the tool isn't able to do the job, then that's scenario one. Developers will rely on tools only once they've proven they do the job.
circularDependency-@reddit
Insufferable post, cluttering the sub.
gjionergqwebrlkbjg@reddit
And the op is an Indian student, who would have thought.
ExperiencedDevs-ModTeam@reddit
Rule 2: No Disrespectful Language or Conduct
Don’t be a jerk. Act maturely. No racism, unnecessarily foul language, ad hominem charges, sexism - none of these are tolerated here. This includes posts that could be interpreted as trolling, such as complaining about DEI (Diversity) initiatives or people of a specific sex or background at your company.
Do not submit posts or comments that break, or promote breaking the Reddit Terms and Conditions or Content Policy or any other Reddit policy.
Violations = Warning, 7-Day Ban, Permanent Ban.
wobblydramallama@reddit
no worries, shitposting transcends race and cultures
NoCardio_@reddit
But not all of them are responsible for killing the developer market in the past. That's some good irony.
wobblydramallama@reddit
if after 25 YOE you're still that insecure idk what to tell you
chaitanyathengdi@reddit
GTFO with your racist attitude
gjionergqwebrlkbjg@reddit
Half the people who post shoot here without reading the rules are Indian students.
circularDependency-@reddit
You made me check his post history and now I've seen his penis. I think I'll go back to work and stay off Reddit for today.
Snoo_28140@reddit
Omfg....
maria_la_guerta@reddit
I for one love 50 posts daily about how "experienced" developers are burying their head in the sand about AI.
NoCardio_@reddit
This sub has gone from /r/cscareerquestions to /r/iHateAI.
sneakpeekbot@reddit
Here's a sneak peek of /r/cscareerquestions using the top posts of the year!
#1: I attended a screening with HR shirtless
#2: They fired 80% of the developers at my company
#3: Friendly reminder for everyone on this subreddit
I'm a bot, beep boop | Downvote to remove | Contact | Info | Opt-out | GitHub
ExperiencedDevs-ModTeam@reddit
Rule 9: No Low Effort Posts, Excessive Venting, or Bragging.
Using this subreddit to crowd source answers to something that isn't really contributing to the spirit of this subreddit is forbidden at moderator's discretion. This includes posts that are mostly focused around venting or bragging; both of these types of posts are difficult to moderate and don't contribute much to the subreddit.
Confident-Ant-9567@reddit
What a fucking stupid post.
freekayZekey@reddit
it’ll kill the market because people spent too much money on it. when people realize the roi isn’t great, the economy will likely go to shit as the magnificent seven stocks tank
DigitalAquarius@reddit
How does a tool make you worse? This makes no sense. This is like saying a calculator will make a mathematician worse.
sanityjanity@reddit
Also because employers will refuse to hire developers who don't rely on AI
gjionergqwebrlkbjg@reddit
No low effort posts/venting/bragging
local-person-nc@reddit
Might as well throw that rule out in this sub. Mods don't care.
gjionergqwebrlkbjg@reddit
Most of the rules to be honest.
3rdPoliceman@reddit
Hello I am 3 months into reading Automate the Boring Stuff, how get job?
pySerialKiller@reddit
AI will kill the developers who did a horrible job, those who did the bare minimum.
Good software engineers will just add genai to the toolbox, just like they did with IDEs, modern compilers, containerization, etc
NoCardio_@reddit
This sub has gone from /r/cscareerquestions to /r/iHateAI.
cbusmatty@reddit
It makes *bad* developers worse at coding. Good developers understand it's another tool in their belt and use it appropriately for significant gains.
Master-Variety3841@reddit
I just started working at a bank, I was so excited for an environment with no AI usage.
My first project?
Evaluate Cursor, Windsurf or Copilot usage and establish engineering standards in the org.
Kill me.
thewritingwallah@reddit
The real problem is lack of accountability. An LLM is just a tool. At the end of the day, everyone must still be accountable for the code they commit.
Slop is slop whether they wrote it themselves or prompted an AI to generate it. You deal with it by making them accountable for the code. If they say some shit like "that's what ChatGPT generated," it's a red flag imo.
I think a weird side effect of AI is that it increases people's code throughput (bad and good code alike), but because of the higher throughput, the burden of code review is even higher, so people pushing a lot of code cause resentment in the people who have to review it.
paapanna@reddit
Genuinely curious, how does it make developers worse? Because it makes them too lazy to actually learn coding?
Affectionate-Mail612@reddit
You outsource your thinking to something else. One might say, "oh, but you still think, just on a higher level," and I can't agree with that - I think the hardest when I'm writing code.
basonjourne98@reddit
With AI, developers are able to focus on things beyond routine coding. You sound like the people who used to say anyone who didn't know assembly couldn't be a good dev.
Affectionate-Mail612@reddit
There is a fine line between "too little abstraction" and "too much abstraction". When we ditched assembly, the most you could say is that the code got slower. When you ditch higher-level programming languages, it's not even about speed: the intent of the code itself becomes muddy. That's too much abstraction to be useful in the long run.
Pad-Thai-Enjoyer@reddit
Are you being forced to use a tool or something? The horror of having more tools at your disposal…
Only-Cheetah-9579@reddit
Companies are making AI tools mandatory, and the response from devs is often to prompt, not give a damn about the output, and just push it - the reviewer will read it.
xDannyS_@reddit
A lot of people are, yes
Only-Cheetah-9579@reddit
There's some truth to it. New people will not enter the market so easily, and all the vibe coders think they are masters of their craft, but when I review their code it's still at a junior level. When you have no experience, you don't know how to prompt an LLM to write good code.
donatj@reddit
The issue I have is that it will solve problems and I don't understand the solution. Having not worked through the problem solving process myself I'm usually not anywhere near a place to understand it. Is it good to use a solution you do not understand? Traditionally the answer to that is obviously no, you need someone on the team who can maintain it and understand the issues. Now that person becomes the AI assistant. I'm not super comfortable or confident in this solution though, and it certainly doesn't make me important to the process.
throwaway_0x90@reddit
Nah, I'm not worried - well, not worried for the long term. AI slop is definitely creating a bunch of messy codebases out there that will eventually fail, and then they'll actually need real devs to fix them.
Of course at that point I'll be an independent contractor charging an arm & leg to do so.
sureyouknowurself@reddit
Man some of the code reviews from developers that really should not be prompting AI.
PermissionNo4771@reddit
A truthful point buried in an exaggerated, inaccurate perspective.
ii-___-ii@reddit
Developers become worse at coding, which somehow makes the jobs disappear. Huh?
60days@reddit
To respond in kind to such a low quality post:
skill issue.
IAmMonke2@reddit
I think if your work can be replaced by AI, corporations will replace it with AI; they won't give a fuck whether you are a good dev or not.