Junior devs who learned to code with AI assistants are entering the job market en masse. How is your team handling it?
Posted by Ambitious-Garbage-73@reddit | ExperiencedDevs | 489 comments
We hired two junior devs in the last quarter. Both passed the interview fine. Both can produce working code reasonably fast. But something is off in a way I have not seen before.
When something breaks, they do not debug it. They paste the error into ChatGPT and apply whatever it suggests. If that does not work, they paste the new error. I watched one of them go through four rounds of this before I stepped in and showed them how to read the stack trace. They had never done that before.
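For anyone who has never been walked through it, that lesson tends to look something like this. A hypothetical Python example (the function names are made up): read the last line first, then walk up the frames.

```python
# Hypothetical walkthrough of reading a stack trace bottom-up: the last
# line names the exception, and the "File ..." frames above it are the
# call path. parse_config/load_settings are invented for illustration.
import traceback

def parse_config(raw):
    # Innermost frame: this is where the error actually occurs
    return dict(pair.split("=") for pair in raw.split(","))

def load_settings():
    # Outer frame: it passed bad input ("retries" has no '=')
    return parse_config("timeout=30,retries")

try:
    load_settings()
except ValueError:
    frames = traceback.format_exc().strip().splitlines()
    error_line = frames[-1]   # start here: what actually went wrong
    call_path = [f for f in frames if f.lstrip().startswith("File")]
    print(error_line)         # the exception type and its message
    print(call_path)          # then walk up: who called whom
```

The habit being taught is the order of reading, not the tool: exception first, then the innermost frame in your own code.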
Code reviews are also different. When I ask "why did you structure it this way?" I often get a blank look. The code works, it looks reasonable, but they cannot explain the reasoning because there was no reasoning. They described what they wanted and the AI produced it.
I am not blaming them. They learned to code in an environment where AI tools were available from day one. Of course they use them. But the gap between "can produce working code" and "understands what the code is doing" seems wider than it used to be.
The mentoring challenge is real. You cannot teach someone to debug if their instinct is to ask the AI before they think. You cannot teach architecture if they have never had to hold a system in their head. The foundational skills that senior devs built the hard way are just not there.
How are other teams handling this? Are you adjusting your interview process? Changing how you onboard juniors? Or just accepting this as the new normal?
lordbrocktree1@reddit
I slammed my laptop shut today after seeing the PR from one of my Junior devs (3yoe but he hasn’t been promoted to the next level).
It’s all AI slop, barely runs, completely unmaintainable. It’s worse code than he was writing himself 2.5 years ago.
I’ve already had 2 team wide meetings in the last 3 months about them relying on AI to write all their code and the quality is absolutely trash and it’s unmaintainable slop.
I don’t mind AI, in fact we are working on AI platforms, but the code quality they are producing keeps getting worse and worse and they keep understanding less and less.
single_plum_floating@reddit
Auto reject, and a meeting today or within the hour if it made you that mad. All further PRs have to be explained face to face until you trust him again.
lordbrocktree1@reddit
That’s pretty much what I’ve done. And put one team member who was particularly egregious on a PIP. But it’s becoming prevalent across teams
single_plum_floating@reddit
I genuinely can't imagine pushing that sort of slop through with a straight face. Honestly, I have always had the exact opposite problem with AI development, where I keep finding improvements to make.
RegretNo6554@reddit
does your org mandate AI usage?
lordbrocktree1@reddit
Nope!
And we don’t track lines of code, points at the individual level, number of PRs, or any other pointless metrics either. I also very clearly tell them to communicate with me if they will need more time with a ticket when they start working, or when they are touching a new technology they haven’t used before.
I’ve done mentorship, paid for trainings, done book clubs for anyone interested in them, found ways to get them to learn tech they are interested in on the company dime in R&D, and even threatened to quit to protect their jobs during the layoffs 2 years ago, where I had leverage on the execs to force them to accept that my team wasn’t going anywhere.
psyyduck@reddit
Just set up automatic scanning for each PR. It's super easy, just a prompt like
When the guy gets 4/10 in each category he'll be embarrassed to ping you. Tell them you only want to see 8s, and maybe one 5 if he has a very good reason (tradeoffs).
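A minimal sketch of what that per-PR scan could look like, wired to whatever model and CI hook your org already has. The categories, thresholds, and score parsing below are all placeholders, and the actual LLM call is left out:

```python
# Sketch of an automated PR quality scan: build a scoring prompt from
# the diff, then parse "category: score" lines out of the model's reply.
# CATEGORIES and the pass/fail rule are placeholders; the LLM call
# itself (not shown) is whatever your org uses.
import re

CATEGORIES = ["readability", "maintainability", "test coverage", "error handling"]

def build_prompt(diff: str) -> str:
    cats = ", ".join(CATEGORIES)
    return (
        f"Score this diff 1-10 on each of: {cats}.\n"
        "Reply with one 'category: score' line per category, then one "
        "sentence per score below 8 explaining the tradeoff.\n\n" + diff
    )

def parse_scores(reply: str) -> dict:
    # Pull "category: N" lines out of the model's free-text reply
    scores = {}
    for cat in CATEGORIES:
        m = re.search(rf"{re.escape(cat)}\s*:\s*(\d+)", reply, re.IGNORECASE)
        if m:
            scores[cat] = int(m.group(1))
    return scores

def passes(scores: dict, floor: int = 8, allowed_exceptions: int = 1) -> bool:
    # "only want to see 8s, and maybe one 5 with a very good reason"
    low = [s for s in scores.values() if s < floor]
    return len(low) <= allowed_exceptions and all(s >= 5 for s in low)
```

The gate logic mirrors the "8s, maybe one 5" rule from the comment; tune the categories and floor to taste.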
gius-italy@reddit
sounds like they just don’t care then
fork_yuu@reddit
I'm seeing them paste like 300-word essays into Slack when large debugging efforts go off the rails. I'm like, I ain't reading that shit, and just ignore them. Sometimes it's clear they don't even understand what they pasted, cause any follow-up question is met with silence or confusion.
Like, are seniors supposed to be their AI validation layer while they just paste whatever the seniors say back into the AI?
Izkata@reddit
Yeah, I noticed something similar from our junior: his work got worse over the past year. The last merge request he opened before layoffs did a third of what was needed for the case, and didn't even call the new function he'd added for the third he did implement. Also, the function was named wrong, for a different third of the case.
FxAnd@reddit
Why does this sound like a fake post? It's copypasta from another question about hiring.
throwaway_0x90@reddit
Why do you need to "handle it" at all?
Make sure the job actually requires these things and you're not just forcing it on them because that's how you had to do it back in the day. My parents didn't have scientific graphing calculators, but I do. I don't care if they had to calculate the area under a curve by hand; I don't need to do that today. If AI can debug and fix their code after a couple of prompts, you might have to accept that this is the new world.
What I'm trying to say here is, be careful of creating a hostile work environment. If they're getting their job done and you cannot point to any particular metric to say they're under-performing then you probably shouldn't say anything.
kalashej@reddit
I generally agree with that. I don’t know a lot of details about what happens in the hardware when it executes compiled instructions, other than the very general "how does a CPU work?" course in university 20 years ago. Same thing with some libraries. And I don’t require others to. To me that’s not the same as AI writing code, though. The previous examples are somewhat well-defined domains where it’s possible to (more or less) guarantee a certain behavior, which frees up your brain for other problems. To get to the same point with AI, it needs to be way more deterministic than today, and also basically 100% correct in what it does. Otherwise it’s like using a calculator that will give you random incorrect answers. Maybe that’s fine, but it is a fundamentally different change compared to any point before. Only time will tell if it works or not.
throwaway_0x90@reddit
I have a prediction here. One day AI agents will support extensions/plugins. You will be able to tell AI that for certain reasoning paths, execute this plugin code and trust its results. This will bring that deterministic behavior for those critical paths.
BetterWhereas3245@reddit
Right now we do that with automated testing, don't we?
Unless you let the AI generate the tests start to finish and don't even look at them.
I've found that doing TDD and spending more time on writing and reviewing the tests allows a much better result with AI agents.
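A toy example of that split, with hypothetical names: the human writes and reviews the test first, and whatever the agent drafts for the implementation has to pass it before human review even starts.

```python
# TDD flow with an AI agent: the human writes and reviews this test
# first; the agent's implementation of slugify() must pass it.
import re

def test_slugify():
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  spaces   everywhere ") == "spaces-everywhere"
    assert slugify("") == ""

# The part below is what you'd let the agent draft, then read against
# the test rather than line by line.
def slugify(text: str) -> str:
    # Lowercase, collapse runs of non-alphanumerics to '-', trim ends
    return re.sub(r"[^a-zA-Z0-9]+", "-", text.lower()).strip("-")

test_slugify()
```

The point is where the reviewing effort goes: the test encodes the intent, so checking the generated code reduces to checking the test.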
throwaway_0x90@reddit
Indeed, I'm betting my whole career on that general idea
BetterWhereas3245@reddit
I read through your comments and I agree completely. I have 10YOE and in the past six months I've done a 180 on AI.
I had a personal project on the backburner for over a year, a few months ago I decided to give it a try using claude code.
Wrote the whole project architecture down, every single feature with every detail, code style guidelines, data structures already predefined, everything was laid down exactly as I wanted it to be. Then I split that into "sprints" of features to develop iteratively, just like we do in the real world.
And I was amazed that the result was about as good as I could have written it myself, but it took a couple of weekends to write instead of months.
I doubt someone without the technical knowledge and the experience dealing with stakeholders could have achieved it (with the speed, quality and reliability I achieved). Maybe someone could have vibe coded their way to a similar product, but my codebase is very lean, easy to maintain, has testing coverage over 90%, type safety everywhere, etc.
It's clearly here to stay and the future of how to develop things. I never wanted to be a manager and wanted to remain an IC for the rest of my career, and I'm now seriously rethinking that.
HowTheStoryEnds@reddit
Once they do, then they're on their own. The probabilistic word generators won't save them from themselves.
livsjollyranchers@reddit
I suspect the code is constantly breaking. That would suggest something needs to change.
throwaway_0x90@reddit
Cool, so if that's true, it'll manifest in an objectively clear graph: since the jr. SWEs joined, the number of bugs has sharply increased. Show that report to management and see if they actually care or not.
In particular, if there are more bugs, but those bugs are getting fixed and closed rapidly and the overall velocity of delivering features doesn't appear to have slowed down, then increased breakage might not matter.
But if it's true that code is breaking, resulting in outages like what happened with Amazon then we have a concrete issue that can be discussed.
Toohotz@reddit
I agree with many of your points, having been in FAANG for a number of years. The fix in itself can create a regression in other areas (this happened at Amazon recently and took down AWS).
Levels aside, explaining to another engineer that they cannot just shoehorn their solutions into critical performance areas was a challenging discussion I had a while back with fellow senior and staff engineers. It’s not unique to juniors; it’s getting the work done in the best interest of time.
I don’t fight other engineers. I’m always here to help, and I do so without a lot of friction. It’s not like we have much accountability these days if a SEV is created while delivering a new feature.
To the point of hostile environments, I can say many functions of Meta thrive off of hostility and toxicity. This is unique to Meta and is not a culture I’d want other companies to adopt.
th3gr8catsby@reddit
At this point many studies have shown that over-reliance on AI tools is detrimental to developing new skills. If they’re relying on these tools as juniors, they’re only going to stay juniors longer.
Call me old school, but I think knowing how to read, understand, and debug code is an integral part of the job. When it no longer is, I’ll take that as my cue to find a new one.
throwaway_0x90@reddit
Can you convince management of this view? Because I'm pretty sure high velocity jr SWEs will get promoted by management.
355_over_113@reddit
I had to teach 2-3 new devs how to debug their stack traces. That was before our team went all-in on vibe-coding. The newest one was too arrogant for me to bother teaching. At this point, everyone up and down the management hierarchy is vibe-coding. Everyone's workflow is like what you described above. I still try to do it the old-school way in areas that I understand, but I'm wondering if I'm doing something futile. They (whether jokingly or not, it's not clear) have no qualms about what they are doing, and have tried to convince me to just give up trying to check every detail of the coding agent's work.
BetterWhereas3245@reddit
At the end of the day, it's just a job you do for a paycheck. At some workplaces I've been I would have 100% let the AI take the wheel and do everything no fucks given.
Perhaps one ought to ask why people are not invested in the code they write. Maybe they're not paid enough to care, or management manages to make them not care, or they are just lazy. At the end of the day, it's the business that makes the choices.
If they pay me to refactor the same thing for a whole month, who am I to say no? It's their money, it's my paycheck.
samu-dra@reddit
I'm genuinely asking, as a cs student, how do I learn this? My courses weren't really structured well and I genuinely want to know how to approach stuff instead of using AI. Any tips or guidance helps
BetterWhereas3245@reddit
Build things yourself, try to do everything by hand, read documentation yourself, or ask AI to help you find the right documentation pages that you then read yourself.
That's really the only way to learn. It takes years of experience, and there is no shortcut.
HiSimpy@reddit
Most teams are hitting a capacity mismatch here. AI-native juniors can produce more output, but review and coaching systems were not designed for that pace. Defining explicit quality gates and ownership for onboarding feedback prevents review queues from becoming the bottleneck.
Turd_King@reddit
I truly believe this is a critical issue and unsolvable with AI coding tools.
Fundamentally there will never be a situation where we entrust fully autonomous ai agents to write our software without a human in the loop. If the humans cannot understand what the AI is doing we are doomed.
There will be some crazy incidents that occur due to this gap in understanding and we will see a shift backwards to more rigorous human in the loop methods
Whole_Cricket_2092@reddit
I’m still in school and I love AI, because the competition is slim. I know my classmates aren’t getting any jobs. Working in our school's research lab, I ask them to explain what the script they want to review does. It’s just silence. Even the people above me in our school's graduate program are clueless. AI is great as a helper, but far from as intelligent as people like to believe. Its answer to debugging? Add 100 more lines of code to fix a one-character error.
Only-Fisherman5788@reddit
watched a senior do the same thing last month. used claude to fix a production bug, the fix looked right, tests passed, code review approved. introduced a subtle state management issue that only showed up under load three days later. the AI was confidently wrong and nobody caught it because the output looked correct. this isn't a junior problem, it's a verification problem. the question isn't whether someone can write code, it's whether anything catches the code that looks right but isn't.
TheEvilBlight@reddit
Feels like an increment from copy paste stack overflow without reading and internalizing the reasoning
Chocolate_Pickle@reddit
My grad (not really mine per-say, but the one in our team) is kinda' like that. Thankfully he seems quick on the uptake and learned how to use the debugger pretty quickly.
So far, we've just been demonstrating the preferred way to do things and letting him copy it.
sciences_bitch@reddit
Per se.
Idk13008@reddit
Perchance.
Chocolate_Pickle@reddit
In my defence, I had just woken up and commented before caffeine.
compsciasaur@reddit
Sounds like juniors before AI, too. They're juniors. They don't know half of why they do anything. They're code monkeys with a bit of algorithm and data structure knowledge.
bupkizz@reddit
This is the big risk to AI. As an experienced dev, I’m set. There is NO plan for building a pipeline for making more folks like me.
Pyran@reddit
I was talking to a friend of mine today about this, and I see it going one of three ways:
1. AI gets good enough to serve as seniors, which means our industry pretty much vanishes as a human-powered endeavor. (Next up: Principals replaced by AI, so we no longer need people for Juniors, Seniors, or Principals.)
2. Seniors get paid outrageous sums of money to hold the fort instead of retiring. Our jobs would be to keep the software running until a generation of Juniors could be hired and turned into Seniors. Think "COBOL developers in 1999" here.
3. Lots of software collapses altogether once Seniors start retiring because there's no one to replace them. We revert to the state of the industry as it was in the 60s (when no one could really have been Senior yet, because the industry wasn't old enough to have them), only with better tools.
My suspicion is #2. I'm 49; that... kind of puts me in a remarkably good position, which I appreciate, because my career basically started with the .com bust. It'd be nice to end on a high note.
bupkizz@reddit
I'm 42 and yeah, been building stuff on computers since around 7th grade. We’re part of 2 transitions that will never happen again: analog childhood -> digital adulthood -> wherever this is going. I’m not pessimistic, but it’ll be wild.
Here’s what I'm expecting: Seniors get paid outrageous sums, and it looks a good bit like it does today for me. I use AI as a tool like a carpenter uses a nail gun: fast and powerful, and dangerous if you don’t know wtf you’re doing, and you can slap some crap together that looks ok from the outside but falls apart moments later and isn’t safe.
There will be a whole lot of high-profile AI whoopsies where folks vibe-code their way into exposing every credit card in existence, unencrypted, on the front page of the NYT, as well as things getting popular and then collapsing because they were built on soggy-noodle-quality code. Folks get brought in and paid like The Wolf in Pulp Fiction.
Offshoring disappears because writing the code isn’t the expensive part; it’s the higher-minded decisions. The loop between product decisions and execution has to stay super tight, and the level of trust you have to have with your AI-assisted engineering team has to be very high.
However, the savvy overseas folks are now jobless and have tokens to burn, so India and China and Ukraine end up with net-new huge software companies.
Nobody cares about CS degrees in certain areas because the valuable part becomes pure problem solving, but what you need to know and get out of a cs degree changes quite a bit. I never went to school for this stuff but my liberal arts degree helps me solve tech problems daily.
Some C-suite folks use all of this to restructure orgs however they want and blame it on AI.
Others cut huge swaths of folks, profits balloon for a sec, they get huge bonuses, then they parachute out, leaving gutted orgs and no pipeline. Eventually experienced devs are paid so much that companies have to figure out how to train problem solvers in the age of AI.
Dangerous_Ad_707@reddit
Just curious, how does your liberal arts degree help you solve tech problems?
bupkizz@reddit
Lots of ways. For me, a liberal arts education helped me realize that creative problem solving, curiosity, communicating with others effectively about complex things, and that spending time thinking hard about hard problems is fun and I'm good at it.
But beyond that, after 20 years doing this, I've found that most tech problems are actually people problems. Working with a team, communicating effectively, actively listening, and building relationships with those around me... That's core to being a good dev. The actual code is just a small part of it, and I got a whole lot better at all of those things during my time in college.
Not that I couldn't learn those things in other ways, or wouldn't have eventually, but to sum it up, I feel like I learned how to learn in college and I wouldn't trade that for the world.
computer_porblem@reddit
this is weirdly similar to my experience tbh. all those essays on post-structuralism were worth it
bupkizz@reddit
My career has never been altered significantly by a single piece of code.
My career has been altered significantly by a single email or slack message on many occasions.
So if all I learned in college was to communicate effectively in writing and solve problems creatively, then yeah it was indispensable.
galacticother@reddit
Just saw this subreddit for the first time... I hope it's not the kind where I get flamed for bringing up that eventually it's going to be #1 (unless the world ends), with most industries being replaced by AI.
This has finally been accepted in most circles; I wonder how it hits here.
AntDracula@reddit
ok doomer 👍
-Knockabout@reddit
Quick, what is your opinion on the nature of technology? 🎤
If your answer is anything along the lines of "a specific technology will always get better forever and ever", bzzzzt you are wrong. We are not just sending better telegrams today, and cars are not just better carriages. LLMs will contribute to new technologies, sure, but the AI we see today is not even remotely close to doing any kind of informed decision-making, not least because it does not actually "know" anything.
bupkizz@reddit
Are you an experienced dev? Because yeah absolutely not. “Most industries” wtf are you even talking about.
SLiV9@reddit
AI replacing programmers has been accepted in most circles except programmers who see the kind of shit AI outputs. AI replacing artists has been accepted by everyone except artists and people that enjoy art. AI replacing fiction authors has been accepted by everyone except people that actually read more than one book a year.
It's not about people not wanting to lose their jobs. Professionals understand their craft in ways laypeople and managers can't. Things that may seem cheaper can turn up worse and more expensive in the long run. Sometimes engineers and artists know what is best for their customers.
Possible-Werewolf791@reddit
Misspelled something, Sliv! It's "manglers", not "managers"!
Perfect-Campaign9551@reddit
I don't see how it's going to replace everyone when it still needs prompting
randylush@reddit
“AI, figure out something useful to do and make AI prompts.” /s
AntDracula@reddit
Man I hope you're right. I'm a few years away from being able to retire from FTE, but probably keeping some smaller contracts or part time work to keep sharp and busy. It would be lovely to have 2-3 days a week of highly paid on-call work fixing slop.
laviksa@reddit
I'll suggest a 4th way: junior hiring will pick up together with the economy as a whole, but the software dev job will have been 'augmented' by daily wading through AI slop code. The slop will be everywhere: in code, in commit messages, in requirements, in QA reports... It will be in the paper you read every morning. Uptime and quality will suffer, but cycle time will be seriously sped up, so revenue as a whole is up. Those that expect or require real quality and uptime will go through moderately expensive Indian QA-testing farms. The depressing question for us devs is: will we enjoy reading through the slop as a job?
randylush@reddit
I’m 36 and I can retire very early. I am with you, ending on a high note sounds really nice.
KickAssWilson@reddit
I’ve thought it would be option 2 for a while now.
The people I work with that are really scared are the ones that picked up programming through a non-CS major. They’re employed as programmers, but never get to a senior level of coding.
aeroverra@reddit
I don’t know what to think, but I’m almost 30 and have been in it professionally for about 10 years, longer if you count my projects when I was 13.
I’m happy to be in a good position but I know it’s going to be plenty of painful years in the near future.
roynoise@reddit
They plan to replace us with cheap offshore warm bodies after the expensive US developers who care about quality have retired.
That has always been the plan. Look at Oracle: they just filed thousands of H-1B visas after that 30,000-person layoff.
FrostingHopeful7642@reddit
Just because a developer is from the US does not make them automatically better than anyone else in the world. Nor do they have better standards, and they are not the only ones who care about quality.
Get off the high horse.
Coffee_and_horror937@reddit
You are seriously delusional if you think American developers in general don't have better standards than developers hired from India where the main goal is to pocket as much money as possible. And I say this as an immigrant who migrated from a third world country.
FrostingHopeful7642@reddit
he didn't say India tho. he said offshore warm bodies.
what was the saying?
"Get in and close the door"
roynoise@reddit
Could you be any more tone deaf? Let alone wrong? (Spoiler, no, you couldn't. )
roynoise@reddit
I mean, their plan is to use cheap indians after the expensive Americans die out. That has always been the purpose of this farce of an industry.
roynoise@reddit
Reddit, you're a sad joke.
kaladin_stormchest@reddit
It's because you're a unique individual and there can't be anyone like you 🤗🥰
DutyStrategist1969@reddit
This is a training-norm problem, not an AI problem. Your team's expectation around debugging is implicit, and implicit expectations always lose to the path of least resistance.
Three things we have started writing down for new hires, junior or not:
1. Define what "understood the bug" means in your shop. In ours: you can name the failing assumption, the smallest reproduction, and the test that will catch a regression. If you can name those, you may use whatever tool you want. If you cannot, you may not paste yet.
2. Pair the first ten bugs, mandatory. Not to teach syntax, but to teach the inner monologue. The thing AI cannot give a junior is the question "what would have to be true for this to fail this way?" They learn that by hearing a senior say it out loud, in front of the actual stack trace.
3. Make the rule visible. One line on the team page: "we read errors before we route them." That sentence has done more for our debugging culture than any tool ban.
If you ban the AI, you have just hidden the gap. The work is to make "I understand this" a verifiable artifact, then let people choose their tools.
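One sketch of what that verifiable artifact could look like in code (all names hypothetical): the bug writeup is a named assumption, a smallest reproduction, and a regression test, committed together with the fix.

```python
# Sketch: the "understood the bug" artifact as code, not prose.
# Failing assumption (named explicitly): raw.split("=") was assumed to
# always yield exactly two parts.

def parse_pair(raw: str):
    # The fix: partition() tolerates a missing '=' instead of crashing
    key, _, value = raw.partition("=")
    return key, value if value else None

def test_regression_missing_value():
    # Smallest reproduction: input with no '=' at all
    assert parse_pair("retries") == ("retries", None)
    # And the happy path still holds
    assert parse_pair("timeout=30") == ("timeout", "30")

test_regression_missing_value()
```

Whether the junior used an AI to find the fix then stops mattering; the artifact proves the three things the team actually cares about.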
LongjumpingWheel11@reddit
Very surprising to read some of the replies. Working in big tech, this is not an issue, and that’s because it’s the target. Mandated AI usage and explicit orders to no longer code by hand are a reality. I used to think it would never happen, and that IF it happened, it would be years from now. That was until a few weeks ago, when I was told to stop opening IDEs and simply use Claude Code as my main interface into the code.
All FAANG and big tech companies are mandating their employees to use AI almost exclusively. I’m guessing the rest of the industry will follow at some point. Dark times ahead.
EternalBefuddlement@reddit
4 YOE, swapped into a new place, and I was unaware how heavy Claude usage is at all levels. My team (and company) is handling it simply by promoting it, and it is driving me insane. Basic errors, absurd PR comments, meetings for AI usage and skill generation. It feels like nobody really thinks anymore.
I had added a tiny new method which I deliberately kept different from similar ones, as the entire codebase is flaky and the unit tests suck. The reviewer (someone prompting Claude) requested a full refactor of all existing related methods, ignored the purpose of the method, requested specific validations (which weren't even possible or related), and all round hallucinated nonsense.
s1renetta@reddit
I am a junior with 2YOE and heavy AI usage is actually what both seniors on my team are doing and promoting. It's not always juniors who are to blame, sometimes the team/company is promoting it for the sake of speed.
I did stuff the traditional way, but with light help from chat-format AI when debugging (we can't use Cursor, as we are a financial institution with sensitive data). This is obviously hella slow compared to our seniors producing a working POC for a new use case in under 2 hours. At some point I asked one, "But then what am I still doing if the AI is supposed to do all the thinking... like, I feel kinda useless working that way." And we just sort of ended up in a discussion about how it was maybe more stupid to attempt by hand certain things that a computer can do 9999999x faster, or to read through pages of documentation to write functions, if I can just learn to ask the AI correctly for what I need my function to do.
Yes... maybe, if you're a senior who can tell when the AI is making stupid decisions in its output. Someone at my level has only two options: write junior-level code and do slow but valuable debugging during development, or copy-paste code directly from AI and never improve.
GaladrielStar@reddit
I happen to have just read this essay this morning on why AI use by experts is a whole other ballgame than AI use by juniors. It’s written about physics but the point applies. Goes along well with this comment.
https://ergosphere.blog/posts/the-machines-are-fine/
SecretChimp2024@reddit
Great read. A treatise on the Tragedy of the Commons - https://en.wikipedia.org/wiki/Tragedy_of_the_commons. Truths are universal and timeless.
s1renetta@reddit
I just read it too, good read. I am not sure where we're heading... I've been "Alice" all my life but it's going to cost me a lot of my own time and energy to stay "Alice" while navigating my own vs our team's strategy. I imagine it's the same for people in all fields of science and tech right now.
Zetus@reddit
The loss of long-term understanding will lead to what I refer to as epistemological collapse/unmooring of many systems. The essay was great, btw.
Loose-Potential-3597@reddit
I have found it works a lot better if you interrogate the AI and try to either learn from it or have it explain everything it's doing (and potentially identify mistakes in the process), instead of just telling it to do something all at once and then pushing all the code. Ideally you should have it make a step-by-step implementation plan for a large task and follow/learn from everything it's doing along the way. You don't have to be able to identify mistakes on your own; just asking it to explain things you don't understand sometimes helps it correct its own mistakes.
Also this workflow is a good way to learn and unblock yourself if your teammates are clueless or just dicks and don't help you at all, like mine. It's almost like pair programming with a senior engineer.
mogamibo@reddit
I am so glad I reached senior levels before AI (7YOE now). I've been using ai for a while, and my strategy is using it to boost me. I still spend time coding to hone and uphold my skill, I review ai code thoroughly, and when learning new stuff, I'll treat the ai as a discussion partner that doesn't necessarily know more than me, but has access to docs.
As a junior, or when you're learning completely new things (e.g. a new language), you've gotta make sure you do this as well. Make sure you actually get coding experience (the alternative is that two years later you might not be able to write code in the language by yourself), ask it control questions and questions about alternative solutions, and don't stop reading documentation yourself. When something fails due to an error, don't ask the agent to "fix it"; ask "what does this error mean?" And don't necessarily take the first answer as the true answer (e.g. an agent would likely say that using a hash map yields better performance than an array, and fail to mention that for smaller datasets the overhead of a hash map makes the performance gain negligible).
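That last claim is easy to check yourself. A quick sketch in Python (crossover sizes are machine- and language-dependent, so treat the timings as illustrative, not authoritative):

```python
# Quick check of the hashmap-vs-array claim: for tiny collections, a
# linear scan of a list can be competitive with a set lookup, because
# hashing has constant overhead. Measure; don't take either answer on faith.
import timeit

small = list(range(5))
small_set = set(small)

list_time = timeit.timeit(lambda: 3 in small, number=100_000)
set_time = timeit.timeit(lambda: 3 in small_set, number=100_000)
print(f"list scan: {list_time:.4f}s, set lookup: {set_time:.4f}s")

# Both structures agree on the answer; only the constant factors differ.
assert (3 in small) == (3 in small_set)
```

The agent's answer ("hash map is faster") is the asymptotic one; the control question is about the constants, which only a measurement reveals.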
Playful_Pianist815@reddit
This is my experience. Seniors can look at the code and know if it's good. Juniors are lost. They either ship code they don't understand or they fall behind. Worst part is that the senior can spot issues and correct them. The junior just ships trash. That used to be true before, but at least the junior used to learn while writing the bad code. Now he generates it with zero understanding and learns nothing.
VerbumGames@reddit
Sounds like you were using Haiku. I've never had a problem quite like that. Stick to Sonnet or Opus.
qzkrm@reddit
"yay, no one writes code anymore" 🙃
U4-EA@reddit
If you look at how Anthropic and OpenAI are throttling AI now, they had better learn how to code, and fast, because I think AI will most likely become too financially prohibitive to use shortly.
DevMadness@reddit
AI is a tool, and the purpose of that tool is to help us do our jobs. AI should not be used as an excuse not to think. When code reaches the review stage, it is the sole responsibility of the author to fully understand and defend it, before it is merged in.
Our organization fully embraces AI-assisted coding, but you can really see the holes starting to form. I’m starting to feel my own skills slip from it. It’s a challenging situation.
speedisntfree@reddit
Wait until they just put all your code review comments back into the LLM.
I'm wondering whether code reviews need to be done live now and if the person submitting the PR cannot explain the code, the PR is rejected.
mend0k@reddit
How did devs adapt when high level languages became popular?
Bulky-Condition-3490@reddit
Perhaps the world and industry are adapting? Maybe the old way is old for a reason? Teach what you can and adapt too.
xvelez08@reddit
By continuing to give them the tools that they had when they were learning.
Just like an engineer before you did with autocomplete, understanding that while they did not learn with it… it’s a tool that exists and it’s time to adapt.
dgmib@reddit
A jr dev is only valuable as an investment. If they aren’t eager to learn they should be fired.
No business should hire "vibe coders" whose only value is to prompt AI. We don't need that anymore; we have AI agents now that will take a ticket, write code for it, recursively feed the error messages back into the AI, and generate a PR. The one thing juniors were arguably still bringing to the table at the start of their career (being someone seniors could offload easy, well-defined tickets to) is now replaceable with AI agents.
The AI agent integration we've seen in all the major ticketing systems in the last year has made juniors useless.
Even before we had AI, one sr dev would outperform (at least) 2 jr devs for the same price. It was never worth it to hire JRs unless they grew their value delivery to cost ratio over time.
As an industry, we never needed jrs that didn’t want to grow. AI hasn’t changed that. It’s only created a bunch of people that think they can be a good dev without putting in the effort.
I don’t believe AI will ever be able to outperform a senior dev at coding, but it’s already better than a junior at every aspect of a junior’s role. A junior dev needs to demonstrate they want to become a senior, otherwise they should be let go.
BusinessBandicoot@reddit
Honestly I keep wondering how I struggled so long to get hired, when the level of skill that seems to be expected from juniors is "do this basic task and try not to break anything". I basically started my career well beyond that skill level.
max123246@reddit
The typical interview of leetcode puts a lot more emphasis on memorizing solved problems instead of the on the job thinking you have to do when designing something novel. It tests for the wrong skill.
drahgon@reddit
Agreed. Though I would say an eager junior brings a lot of energy that seniors can lack. I think that is worth quite a bit and is a unique reason to hire juniors. They can bring the business in new directions.
dgmib@reddit
100% agree.
You take a young kid that’s eager to learn and grow and create for them an environment where they’re supported and mentored not only do they quickly become valuable, their enthusiasm brings the whole team up.
Singularity-42@reddit
Don't hire anyone who started in 2023 or later. Harsh, but it will solve your problem.
Toohotz@reddit
When my staff title fades away one day, who will become the seniors and staff of tomorrow? We as humans have a finite lifespan.
Singularity-42@reddit
That's a problem for future us.
Coffee_and_horror937@reddit
It's gonna be hilarious when your bosses decide to replace you with AI
Electronic_Back1502@reddit
That’s arguably the worst possible solution, what a brain dead approach
kri5@reddit
As true as it is, it's also the solution taken by most companies for any problem. As long as this quarter/year goes up, who cares? It's the culture created by society
Electronic_Back1502@reddit
And for every company that follows this approach, it just makes the problem worse. My company still actively hires juniors, just as many as they did pre-AI. Trains them up. Most of the people end up staying at this company for decades
Singularity-42@reddit
Thank you for your service, I wouldn't
Electronic_Back1502@reddit
Bum
kri5@reddit
Good on them. Hope they continue to succeed
kevin7254@reddit
Jesus what a shit take. Do you feel the same about the climate or?
FirefighterAntique70@reddit
Holy shit, I'm glad I don't work with you, what a tool...
crazyeddie123@reddit
yeah but "not pushing out everyone over 45" should get us by for a good long while
AnimaLepton@reddit
I'm not a doomer, but IDK that there's an infinite amount of growth and need for more people to do the work in 20 years. How much 'real' work is there to do? And if there's no 'real' work, do you need a growing number of junior developers, or is a smaller number enough to feed the machine? Seems realistic enough to shrink without disappearing entirely, people learn on the job, transition in from related fields, etc.
I'm not a believer in Graeber's "Bullshit Jobs" specifically. But I'd believe you if you told me 10-20% of people consider their own jobs to fall in that category, or that 10-20% of jobs exist as net negatives or with marginal value, having neither financial value/returns commensurate with spending nor having any positive societal impact. I just don't think companies making these layoff decisions can actually identify those people in a meaningful way.
The whole idea behind software jobs compensating as much as they do is that the work is fairly scalable. To some extent, a lot of software has already been written. There's a lot of foundational work and technology that already exists. Many problems have been solved to some degree or in a specific vertical. It takes plenty of people to keep the machine running, or expand what's already been done at one place to some other niche. But there's an argument that iteration or rebuilding things in a different context is easier than it's ever been, and there are fewer truly novel problems. The work still matters. But it's not going to grow forever.
Automation was already chipping away at things pre-AI (not just for software devs), and AI has just expanded the potential scope + areas where the labor could be compressed. It's not going to be an overnight thing by any means, but there are many companies that can probably afford to 'tighten their belts'/ruin some lives with layoffs today without actually affecting their products and services, even ignoring the AI angle. I don't think there's an easy answer for it.
https://illinisuccess.illinois.edu/24-25-annual-report - 62% of UIUC CS class of 2025 BS grads landed a job, and 34% went on to grad school. Add a bunch more from Computer Engineering and the like. The pipeline is huge and the funnel is smaller than before, but new grads are still at least getting hired.
_dekoorc@reddit
Maybe there just won't be as much title inflation.
Repulsive-Hurry8172@reddit
I had an application for a backend role that required me to go to their office for a coding exam (it was a remote position). They supplied a laptop with access to an online code editor and we solved leetcode-like problems for an hour. After that there was a tech interview via video call in our local language. So far it's the most AI-proof interview I've had.
Longjumping_Feed3270@reddit
What a nightmare.
So that person can do leetcode. Congratulations.
Doesn't mean they can read a stack trace.
drahgon@reddit
We're about to see juniors showing up to interviews with fake beards and bleached white hair lol.
semicolondenier@reddit
Many of those people are in need of guidance. They were given an extremely powerful tool, that works, and are faced with demands that are based on the assumption that this tool will be used. Pair that with the imposter syndrome some of them may be facing, and you have a very complex situation that they probably do not know how to navigate.
To combat that, I developed some rules for myself (I started out 3.5 years ago and had access to ChatGPT):
- If prod breaks and I have to use AI, I will do so. This is the only time I will allow it to develop for me, while I try to understand the issue. I will always review the fix before shipping it.
- In any other case, I never give promises on development time based on the assumption AI will help.
- I ask AI for little bits of help, when time allows it, and ALWAYS ask for resources from the docs.
- I type the code myself, unless it's something extremely easy and I am bored.
- I always make sure I can explain each line before committing it. If something is weird, I make sure its complexity makes sense.
My advice would be, approach those people, and help them set a career path where in a few months / years they will not be just better prompters, but engineers as well
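The "what does this error mean" habit above can be practiced on any failure. A minimal sketch (the `parse_port` function and missing-key scenario are invented for the example):

```python
# Practicing "what does this error mean" on a deliberately broken call,
# instead of pasting the error into a chatbot and applying whatever comes back.
def parse_port(cfg):
    # Raises KeyError if "port" is absent, ValueError if it's not a number.
    return int(cfg["port"])

try:
    parse_port({})  # the "port" key is missing on purpose
except KeyError as exc:
    # The exception type (KeyError) and its value name the real problem:
    # the config lacks a key, so "fixing" the int() call would miss the point.
    print(f"config is missing key: {exc}")
    # → config is missing key: 'port'
```

Reading the exception type and the last frame of the traceback usually tells you where to look before any tool needs to be involved.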
LindsayListens1@reddit
Yeah, that last part is the whole thing for me, because the moment AI stops being a scaffold and starts being a substitute, you get a lot of very confident people shipping code they do not actually understand.
MundaneValuable7@reddit
We haven't hired a new junior in three years and I don't see that changing any time soon.
Pozeidan@reddit
We've been firing more than hiring, absolutely dry for us at the junior level. I don't foresee that changing anytime soon.
Fidodo@reddit
My acceptance criteria for juniors has boiled down to basically one thing. Curiosity. In the interview do they go out of their way to understand something? If something goes wrong do they show genuine interest in not just fixing it but actually understanding it? A good thing is that it's very hard to fake curiosity and learning.
With AI I think that is by far the number one indicator of success. AI can either be a tool to offload your thinking and be lazy, or it can be an infinite well to satisfy your curiosity. If a developer is curious they won't accept an AI answer and move on, they will want to understand it. At this point that's all I care about.
cheezzy4ever@reddit
> In the interview do they go out of their way to understand something? If something goes wrong do they show genuine interest in not just fixing it but actually understanding it?
In your experience, how often does this come through during interviews? I stopped doing interviews a few years ago, but when I did, there was really only ever time for a coding question. Never a whole lot of discussion beyond anything surface-level.
Fidodo@reddit
I find it comes out every time, but I optimize my interviews to surface it. My opener question is just "tell me about any project you've worked on that you found the most interesting" and then ask them questions until I actually understand how it works, not just buzz words. That question alone gives me a great signal into how they will do at live coding.
For the live coding I don't do leetcode, I don't care about what algorithms they've memorized. I give them learning and discovery tasks. I'll give them an API I don't expect them to have much familiarity in, and it's open book. They can research the docs all they want, but I point them at specific pages they'll need to speed things up so they can complete in time. Just watching how they learn and apply that learning and their process of diving deeper is extremely informative.
Then after they complete the first requirement I give them more requirements progressively. This is on purpose to see how they refactor and adapt. So far it has been incredibly predictive of on the job success.
Even with AI I think it's incredibly hard to fake that you're learning. If they're just instantly understanding the task with no discovery process, it's incredibly obvious they're running AI in the background. Plus AI isn't fast or great at using less popular APIs; it can one-shot leetcode questions from training data, but it can't for random APIs. And a candidate can't naturally parrot the multi-turn process of an AI debugging its own code. You'd just go quiet for minutes doing nothing, then have a bunch of code appear out of nowhere.
skywalkerze@reddit
Can I work for you?
I mean, isn't it obvious this is the right approach for interviews? It's not, lots of people at lots of companies do stupid things like leetcode, but I don't understand why.
Fidodo@reddit
I think the reason it's not standard is that it requires better interviewers. Leetcode scales more easily because it's closer to pass/fail. You can apply a rubric to the more subjective method I prefer, but it's still more subjective and requires more careful training and consideration from the interviewer.
If you're a massive corporation you will want to try to standardize it, but I think the subjective qualities are unavoidable so it's better to provide a more rigorous methodology on the subjective side than to pretend it's objective.
I try to instill some objectivity into the process. For example, for the technical component I progressively introduce more requirements. How far they get into the requirements is a good data point to compare candidates. I still prefer candidates that didn't get as far who demonstrated clearer thinking, but all good candidates got to a deeper requirement than the ones that clearly didn't know what they were doing.
LillyTS@reddit
I think I'd really love to be interviewed by you some day
throwaway30127@reddit
I wish more companies would adopt this approach and more interviewers would use similar criteria. As a junior developer, I recently interviewed at a tech company and specifically went through their engineering blogs to understand what they're currently working on, and I asked questions about it, but the interviewer wasn't interested in discussing any of that beyond a surface level. He just asked me to write the code ASAP and ended the interview. I got rejected the next day, probably because I needed some hints at the end.
Fidodo@reddit
Interviews are a two way street. Don't think of it as a rejection, it's a red flag. When you interview someone you're looking for someone to work with. If they're disengaged and just trying to hit a quota then I doubt they care. At the end of the day it's just a job, but if you want a job that doesn't suck, the other employees need to care enough to make it not suck.
guareber@reddit
100%. Curiosity on literally anything in our discipline is a huge positive signal right now. Even if that anything is understanding the AI itself
Fidodo@reddit
One of my questions is just tell me about a project you worked on you found especially interesting. It can be any project. I just want to see how deep they went and how curious they are and also if they can explain it.
GoTheFuckToBed@reddit
don't leak it, now all the hiring tutorials are gonna include „how to fake curiosity"
coredalae@reddit
This has been my main acceptance criteria for basically since I started. Tech and knowledge changes, personality is a lot harder to train.
2nd one is do I expect them to be able to take responsibility
natashag1ggles9655@reddit
how do you teach debugging now
MelAlton@reddit
(Pastes error message into prompt) pls gippity, make code work.
DisheveledJesus@reddit
No mistakes
Playful_Pianist815@reddit
And make it secure.
People always forget to make it secure
caboosetp@reddit
I still make my students code on a whiteboard. I'm never going to let that go, and I'm standing by it harder now. Following logic without running the program is becoming a much more needed skill as people just look at generated code.
I've been doing it more at work too when helping people. AI is not at the point you can just generate code and yeet it out there hoping it works. That's how we get serious bugs in prod. I just left a job that was both pushing AI super hard and getting infuriated at us for letting bugs through. It was a very demoralizing and psychologically unsafe culture.
Nottabird_Nottaplane@reddit
And the plan for developing the talent pipeline that will let you all have capable seniors to build future teams and products is…what?
MundaneValuable7@reddit
The seniors we've hired are doing pretty well.
Nottabird_Nottaplane@reddit
That’s not the question. If you’re not willing to hire juniors, and no one else is, then where do the competent seniors of the future come from?
Additional_City6635@reddit
if/when seniors become scarce then the pendulum will shift back to hiring juniors
Material_Policy6327@reddit
Sadly im not so sure. The corporate world is really getting on ai making every knowledge worker replaceable and if it doesn’t happen they will just offshore more
MathmoKiwi@reddit
Basically that. When they run out of local Seniors to hire, in a few years from now, they'll grab all the offshore Seniors that exist.
Maybe once that resource is finally tapped out, then just maybe, they'll look into hiring local Juniors again.
tcpWalker@reddit
You also have plenty of seniors who have not really adopted AI yet who will need to either do so or leave the market, which opens a few spaces.
AntDracula@reddit
lol
Additional_City6635@reddit
Someone's gotta run the AIs, and kids are a lot better at learning new tech than old people are
hurley_chisholm@reddit
Sadly, they aren’t. You are confusing technological literacy with being impressionable and familiarity from growing up consuming technology. Young people aren’t any better and are in some ways worse than older people at learning and understanding how technology fundamentally works^1 ^2 . “Digital natives” are frequently not taught how computers work and so they don’t understand how computers work^3 .
1: https://www.edweek.org/technology/u-s-students-computer-literacy-performance-drops/2024/12
2: https://world.edu/digital-illiteracy-the-difficulties-young-people-face-with-digital-technology/
3: https://pmc.ncbi.nlm.nih.gov/articles/PMC10123718/ - A 2023 meta-analysis of digital literacy research in nursing and nursing education. Spoiler: exposure ≠ literacy.
Additional_City6635@reddit
That may be true, but young people still have much more elastic brains. Anyways, I don't really understand the point of this whole thread. Do you guys think the industry is just gonna throw its hands up and cease to exist in 30 years? No. Someone, somewhere, is going to teach young people to build software.
hurley_chisholm@reddit
None of the sources I linked are talking about compilers. It’s more fundamental than that. Young people are graduating college without understanding what a file and file system are.
And I don’t think the industry will disappear, but rather that most organizations will optimize for not training early career folks until they can’t avoid it and it isn’t clear how long that will take given that AI tools are genuinely useful. It will definitely take longer than most young people can wait to settle on a career.
im_a_sam@reddit
I haven't seen anyone deny there will be a massive shortage of seniors. But even companies accepting this aren't incentivized to develop juniors, because they can hop as soon as they hit senior, and the next company can offer more because they aren't spending resources developing juniors.
Sparaucchio@reddit
There won't be a massive shortage of seniors
Market will just shrink
seattlecyclone@reddit
Aging seniors who demand escalating amounts of money to stay out of retirement.
MathmoKiwi@reddit
Bingo, offer someone $1M/yr and suddenly you won't have a shortage any longer.
koreth@reddit
My hunch is that a lot of the seniors who retire in the coming years will do so because they're burned out from spending ever-increasing amounts of their time fending off AI-generated gobbledygook from their coworkers. More money will definitely motivate some of them, but others may feel like no level of pay is worth sacrificing their mental health.
MathmoKiwi@reddit
Maybe they'll negotiate harder for flexitime/remote/part-time hours.
jakesboy2@reddit
That’s no one companies priority or problem. It’s a shared future concern which realistically is nobody’s concern. Maybe a FAANG can operate with enough foresight to set up their own pipeline but ye olde insurance company or startup is not concerned with the problem, despite it being real.
MundaneValuable7@reddit
Other countries I guess, like half my team is.
bicx@reddit
Hire senior from other companies
AntDracula@reddit
I could live with a future bidding war.
MI-ght@reddit
Aren't you the dumbest senior alive?
Krackor@reddit
Distributed benefits across the industry but with concentrated costs for the employer who chooses to train them. It's a losing proposition for most companies to consider doing this.
humanquester@reddit
In a very short-term way yeah, but long term (I know, that's not something the shareholders care much about so is it even worth discussing?) if there are no seniors because each company expected the other companies to train them the value of seniors will rise to astronomical levels and it will end up being way more expensive.
MathmoKiwi@reddit
Tragedy of the commons
Lucho_199@reddit
Would I be wrong in thinking that's corporation's problem? Seriously asking for some perspective
Nottabird_Nottaplane@reddit
At some point, the corporation is people making decisions. And there’s another commenter talking about how they’re in startup land operating this way; at that level the business is often A person, forget people.
So if no one is choosing to hire or train juniors, and techies are encouraging each other to arrange the talent pipeline, then what?
PeachScary413@reddit
Infinite job security 🤑💰
recycled_ideas@reddit
Then either AI takes over and we're all fucked anyway or there's way less competition and I make more money.
Either way I don't care.
MelAlton@reddit
In 2072 there is one senior developer left alive in the world. He makes $800 million per year.
03263@reddit
There's no plan. That's somebody else's problem.
Saittama@reddit
Sounds like that’s a problem that companies or juniors need to solve.
tuckfrump69@reddit
Same, frankly, I would only hire juniors I know personally at this point
otw@reddit
I know it's not going to be good long term, but this is the most productive and peaceful I've ever felt in the industry in my entire career. We basically only have fully onboarded devs and busy work goes to AI.
I know one day it'll come for me, but dang not having to onboard and interview people is so nice.
wakeofchaos@reddit
Must be nice… meanwhile I’m graduating into a market that’s suddenly decided I’m not needed :/
MundaneValuable7@reddit
I would start by reading rule number 1 which says don't participate if you have less than 3 years experience.
Worldly-Standard6660@reddit
Degenerate
otw@reddit
Yeah sorry dog wish I had any advice to give you. I think if you really love developing and have a passion someone will pick you up, but if you were just trying to get a job (which is valid) it does feel like a very different world now where I think that’s gonna be a pretty big grind. My advice would be to lean more full stack if you can, with AI people are looking more for people with wide breadth in a lot of areas rather than deep depth in a few. Especially large scale architecture which AI struggles with right now.
Sparaucchio@reddit
It's only nice until we get laid off and need to find another job
floghdraki@reddit
The industry is betting that by the time more seniors are needed AI has become so good you can automate the whole profession away.
At this point I'm not even sure they are wrong.
TranquilMarmot@reddit
Yeah, I haven't worked with a junior dev in ~5 years now. We don't even post any non-senior jobs anymore.
bmain1345@reddit
Just realized, I haven’t heard of my org bringing on a new junior at all this past year. All I can remember is new seniors being added to teams hmm
kristyc0okie9521@reddit
how are you addressing the debugging skills gap?
MinimumArmadillo2394@reddit
Training is how my team is handling it.
We train people.
Ad3763_Throwaway@reddit
It sounds off that you need to train people in rudimentary skills. Like having to teach a cook how to boil an egg.
tinycockatoo@reddit
I'm sure there are optimal and suboptimal ways to boil eggs.
alienangel2@reddit
This has always been the case. I remember being frustrated having to explain "If A and B imply C, and you're seeing C, have you checked A and B? no? Ok go check them." to juniors a decade ago. Some of those juniors are very accomplished engineers now.
MinimumArmadillo2394@reddit
How else will someone know how to debug a system more complicated than fizz buzz? Thats not something they teach in schools.
Meta_Machine_00@reddit
Using punch cards with computers used to be a rudimentary skill. Same for writing in Assembly. Things are evolving faster than ever before.
MissinqLink@reddit
Thank you
WildRookie@reddit
Based on the progress coding agents have made in the last 6 months, in another 6 months debugging is probably not going to be problematic.
---solace2k@reddit
I suggest you actually learn a bit about how AI works. And I mean that with respect. It's eye-opening.
Disastrous_Crew_9260@reddit
We hired 2 new juniors 2 years ago and they are basically super human.
They just received promos to mid level along with me (I entered 2 years before them but was stuck in a legacy project receiving trivial tasks with no room to grow until a year ago).
They do however go above and beyond with engineering tasks and code review so maybe we were just lucky.
bmain1345@reddit
Well since we seem to be only hiring seniors there isn’t one
MathmoKiwi@reddit
That's one way to address it!
zebbadee@reddit
I don’t hire juniors any more either, I don’t understand the economics for those that do
pirateNarwhal@reddit
I'm the second most junior on my team... just hit 14 years.
HiddenStoat@reddit
Get off r/ExperiencedDevs you imposter!!
MathmoKiwi@reddit
How does it feel to be the young'un?
wuteverman@reddit
The closest we get is an L2
bicx@reddit
Yeah. I’m in startups and no teams I’ve been on had below senior in several years.
Nottabird_Nottaplane@reddit
And the plan for the talent pipeline that will let you all have capable seniors to build future teams and products is…what?
po-handz3@reddit
Are they using Claude or Claude Code with a max subscription? Then it's fine.
Are they using something else? Then hopefully we're not paying them much
scoot2006@reddit
This is the problem: we're creating a skills gap AI won't be able to fill. Once we literally run out of senior devs, what will they do?
We’re creating a serious potential for a gap which can’t be filled. After which there will be no mentorship, no coaching, and no human to human interaction unless we force it.
There has to be paths for people to actually learn things vs AI just taking over everything entry level. It’s actually scary.
At this point, I just hope I can retire before too many executives think they can honestly replace us.
crazyeddie123@reddit
Given that we were aging out senior devs at like 50, we could just stop doing that and keep things going for 20 years. After that who knows what computers will be doing on their own?
scoopydidit@reddit
I mean it's clear that tech companies have put all of their eggs in the "hope AI gets good enough that it can replace seniors" basket. Right now, they're completely okay with breaking the junior talent pipeline. And it seems they are hell bent on breaking senior talent pipeline also.
scoot2006@reddit
I hear you. Just not sure they understand the implications of it. They seem to have this utopian view of how AI will change society without an idea of the actual path to get there.
scoopydidit@reddit
Frankly I don't believe any CEO truly knows AI capabilities. They're being sold snake oil by their c suite. Plus it's a recession proof way to do layoffs which is awesome for them.
e_ccentricity@reddit
This just seems like a failure in the interview and hiring process?
Like, you didn't have them code and explain their code? You didn't give them a simple debugging problem? You didn't simulate a real working day and ask them to work through an existing codebase? Maybe walk through how they would add a feature? You didn't have them walk you through a project in their portfolio?
What was the interview process like? Am I crazy to think that these issues should have been noticed then? Aren't there a billion applicants per job listing? This is what your interview filter got you?
This post just seems sus to me.
streetbob2021@reddit
This is how the art of writing software and designing systems will slowly (quickly, in this case) fade away and machines will take control. They will fix their own bugs.
Material_Policy6327@reddit
We’ve been fairly lucky so far but last few folks I have interviewed are starting to show signs of not being able to problem solve
momobecraycray@reddit
TBF I knew devs pre-AI who also could not problem-solve or anticipate or plan requirements.
Mostly they quit being devs on their own; I think that self-realisation is going to be slower or non-existent with AI tools now.
baezizbae@reddit
I sat and watched via zoom one day as an engineer on my team struggled with an agent, trying to get it to tell him what file contained what method constructor call. This went on for about 5 minutes before I looked for it with ripgrep and unmuted to go “hey is this what you’re looking for?” all in about 12 seconds.
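That kind of lookup really is a one-liner. A sketch with invented file and class names (the comment above used ripgrep; plain `grep` does the same job anywhere):

```shell
# Recreating the situation: which file contains a given constructor call?
# The directory layout and class name here are made up for the demo.
mkdir -p /tmp/search_demo/src
printf 'handler = PaymentHandler(config)\n' > /tmp/search_demo/src/service.py

# With ripgrep (fixed-string match, since "(" is a regex metacharacter):
#   rg -nF 'PaymentHandler(' /tmp/search_demo/src
# Portable equivalent with plain grep ("(" is literal in basic regex):
grep -rn 'PaymentHandler(' /tmp/search_demo/src
```

Either command prints the file and line number directly, which is the whole twelve-second version of the story.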
anicetito@reddit
Damn, you could even use the IDE search. What's on the minds of these people?
Perfect-Campaign9551@reddit
That's not AI that's just Gen Z in general
Material_Policy6327@reddit
Id argue it’s only that cause LLMs are used heavily now by them and younger for everything
atomheartother@reddit
70% of students are using LLMs, at this point these problems are indistinguishable
katedevil@reddit
Sadly, this point is the most worrisome, both on this sub and far beyond the eng wheelhouse: a knock-on enshittification impact with a nasty blast radius.
Special-Fee-4418@reddit
My junior acts like she's the CEO. Pushes whatever. Shows up to work whenever. Leaves work early or calls in sick at least once per week. I just approve all her PRs and fix her bugs later. After a full year of teaching and zero improvement, I don't care anymore.
-no_aura-@reddit
You guys are hiring juniors?
Mad_Season9607@reddit
lost me at "mass entering the job market"
-no_aura-@reddit
For real. What job market?
Ratiocinor@reddit
Wait you guys didn't get laid off?
Deathspiral222@reddit
Wait, you guys used to have JOBS?
-no_aura-@reddit
I still do, but I used to too
ButWhatIfPotato@reddit
From my experience juniors are always hired, but the majority of the time it's because stakeholders are making a corporate power move, thinking the junior is some kind of savant who can do the job of a senior team.
DoLAN420RT@reddit
Yeah. My boss hired several juniors. They can’t manage themselves for shit (as expected) and my boss is advertising their cvs as senior levels. Already received one customer complaint from a big customer (consulting)
LazyLabMan@reddit
Wow that is brave lol
Augentee@reddit
In my bubble they offer a junior salary but demand senior skills plus responsibilities, and cry "lack of skilled workers" when they fail to find someone dumb enough.
flatjarbinks@reddit
I have seen this rant over and over again through the years. First it was junior developers not reading the documentation, then it was copy-pasting StackOverflow answers, now it's AI-driven responses.
I was in the same position and I was taught the right way, just teach them too
Business-Error-2961@reddit
I'm seeing the opposite, actually. Ramp-up for new hires (especially juniors) is usually about 6 months. Now I'm seeing juniors pulling their weight as early as 1 to 2 months; AI is bridging that gap for them. The biggest problem I see with AI is that it's lowering the barrier to entry, and this gives leadership ideas about offshoring or hiring H1B devs. They're usually pretty cheap, so leadership loves them, but it's ridiculously common for these devs to have poor communication skills. Communication is the most important skill, and it's now more important than ever to communicate clearly and effectively.
Personally, AI is overhyped and it’s the scapegoat for the poor job market.
apartment-seeker@reddit
by not hiring them :e
roynoise@reddit
I have someone like this on my team. He has no idea what I was willing to do for him - I tried doing Codewars kata with him to teach him architecture, design patterns, and testing... after like two days he lost interest and now just pastes whatever ChatGPT gives him.
I have no support from leadership. I'm supposed to be mentoring him, but leadership has Dunning-Kruger as bad as he does, and they actually listen to whatever he regurgitates from ChatGPT over what I say about technical choices - this after our director told me "you're the senior and I trust whatever you say about technology".
Effing humiliating.
To add insult to injury, it's already a very low paying, 100% onsite position in a shitty part of a HCOL city.
Trying vigorously to leave - even with referrals, even with relatively smooth interviews, the jobs just don't come.
Aneurysm is very high on the list of things that will probably kill me.
National_Tale5389@reddit
When AI is not a trend anymore, the decision about how to use it will go back to the senior devs. Eventually all these goofball leaders will have to admit they created a problem they don’t know how to fix. This bubble is going to burst hard, and it will start with banning the use of AI once the investors start pulling out. It’s crazy how uncommon this prediction is. This is just a standard hype cycle/trend - people were saying Bitcoin would be at 1M dollars by now a few years ago
positivcheg@reddit
I might be in the minority here, but when I was entering my first job in C++ I was learning to debug code too. Writing small programs is quite different: in a real codebase the call stack depth is like 10-15, compared to pet projects where it's usually up to 5.
Learning to read long C++ template errors was also something pretty new to me back then.
But I do agree with the general frustration. I learned from the very beginning to learn on my own. So when I got into some hard situation - a bug that happens 1 in 100 runs, some huge templated error - I had taught myself all the skills needed to debug it. That’s how I was taught to tackle problems: googling stuff, trying various ways to fix it, changing my Google query, learning unrelated things while googling my issue. I think it’s pretty important. When one just mindlessly copy-pastes things here and there he doesn’t learn anything, he is just an operator, extra hands for the AI.
I believe those “AI operators” don’t have future, they will be replaced pretty soon by models trained to do exactly this - copy-paste things, rerun tests and basically do it in a loop.
National_Tale5389@reddit
Thank you, this is my intuition. I keep saying the only jobs that will be replaced by AI are the jobs where people only use AI. AI will eventually do that and someone else with a job it can’t do will just use the tool
raddiwallah@reddit
Agree on the code review part. I asked my junior why he made a certain change. All I got was a blank look. I then prompted, “did you do this to achieve XYZ?” He sheepishly said yes.
Juniors are doomed.
scoopydidit@reddit
Don't ask just your juniors. Ask your mid and seniors too. I'm surprisingly seeing a brain drain across all levels on our team. A senior who I would describe as reasonably competent threw a 2k pr at me recently to add some comments to github PRs for security findings. I asked him about his decisions... couldn't explain it. Asked him about how this might cause memory leaks... couldn't explain it. These are all things he would've done fine before. He admitted to using AI to write the full thing (and I'm guessing he did so in a vibe code manner and just accepted everything at face value that AI generated).
Then we went back and redid the full pr by hand. Final change was 350 lines of code. So we cut down the code by 1700 lines of code. Way more maintainable, way more pragmatic, way easier to read, a lot less chances of things breaking. And he talked me through every damn line. It was awesome to see some "normality" in a code review again.
I am worried for the future where we have no choice but to accept vibe coded prs because management is not giving us time to thoroughly review changes. I was very fortunate to have some cycles to go over this PR thoroughly... but that isn't always going to be the case for me.
WickedProblems@reddit
So what has changed now vs back then? Or are we all just getting old?
I don't think much has changed, except the tools.
waloz1212@reddit
Yea, like they check the errors with AI? So did I when I was a junior, with Google. They copy-paste code without understanding what it does? So did I, many times lol. Junior level is junior level for a reason; people are forgetting they were just as clueless back then.
scoopydidit@reddit
If you copy and paste code, you still had to integrate it into your overall application. Or change variable names or modify it slightly to get what you needed from it. You had to have some base line understanding. This is very different to asking ai to just "fix my junk" and walking away. And there's a lot of that going on. I see it in the office all the time when I sit next to engineers, even the seasoned guys. They are just prompting -> accept -> repeat. No understanding going on. Then when shit breaks, we're all clueless why.
lasagnaman@reddit
I mean I never did. Why would I commit code that I don't understand?
_dekoorc@reddit
Committing code is not the same as copy-pasting something into Google.
lasagnaman@reddit
?? People are talking about copy/pasting code from stackoverflow into their codebase. What does that have to do with Google?
Izkata@reddit
Man, I must have been a senior developer at 11 years old then.
Really not being sarcastic. I truly don't understand this mindset.
svix_ftw@reddit
I get what you are saying but there is a big difference.
copying and pasting only got you so far, you still had to understand at a vague level what was happening.
AI agent inside your editor pretty much does everything for you.
KillerCodeMonky@reddit
Exactly. 10 years ago "AI" was copy-pasting "answers" from whatever Stack Overflow popped up in after pasting the error into Google. TBH, I'd bet that AI is actually an improvement on that front.
makemesplooge@reddit
Right. It’s like someone else said: back then people got an error and just ran to Stack Overflow to see if someone had resolved the same error. There were always people who created posts about their errors without even trying to fix them themselves.
svix_ftw@reddit
comparing stackoverflow to ai is a wild take, lol.
Most of the time stackoverflow didn't have your exact issue and you just had to figure out and fix the issue on your own.
This-Nectarine-3761@reddit
Exactly. Most of the time you found a similar solution to a similar problem, and you had to figure out how to apply it to your situation. That required much more thinking than just repeated prompting.
ComprehensiveWord201@reddit
Yup. It's not even close. You had to digest the issue enough to understand how to find the solution.
DeviantDork@reddit
Same with ai. Even if you’re using an enterprise edition you can put detailed environment info into, it’s not going to have the exact resolution unless it’s incredibly easy.
Just like with StackOverflow, you get some pretty close answers that you have to try out and see what happens.
pijuskri@reddit
Difference is people know that stack overflow is limited and if their basic copy paste doesn't work they are on their own.
With LLMs some people trust everything 100% and keep prompting until they find a "fix" (which often turns out to be a workaround that doesn't fix anything). Zero interest in stopping and thinking for yourself.
_dekoorc@reddit
IDK, I've been working on a task where I migrate tens of thousands of records from XML to like 100,000 actual database records. I've been having a lot of success with it giving me exact answers, even without seeing the individual database records.
DeviantDork@reddit
That sounds like a pretty straightforward task?
The problem with StackOverflow, which ai has only partially solved, is when you have a legacy, highly customized environment with dozens of integrations, there is no plug-and-play answer. Because these bastards are always special.
_dekoorc@reddit
I thought that too, but instead it's the same data in at least three different XML formats, while trying to make the records look the same as ones more "organically" created. On a part of the codebase I've never worked on before. It sucks.
legiraphe@reddit
I didn't have stackoverflow when I started working in IT, I honestly don't know how I debugged stuff
Taco_Enjoyer3000@reddit
>just ran to stack overflow
I mean, that was limited by how much content you wanted to sift through, and even if you found a solution, idk about anyone else but I still read through it to understand what was going on.
Stack Overflow at its most active is nothing compared to endless mindless prompting at anyone's fingertips.
midasgoldentouch@reddit
Or without posting a fix for the more unusual errors 😩
kbielefe@reddit
In my career, I've gone through all of those eras.
Juniors have always been worse than we imagine we were as juniors.
drahgon@reddit
All of these were true. Juniors before AI were hot garbage - the code at my current company attests to it. Software, I would say, has also been getting worse for the same reason: bloated and not optimized at all.
troche_y_moche@reddit
This is a false equivalence. Yes, old juniors' code could be horrendously written, but at least they had to sit down and write it, line by line, and so they ultimately understood what it did. These days they "write" thousands of lines of plausible-looking code without having the faintest idea why it works or how to fix it when it breaks. This is obviously worse for any software that you wish to maintain for more than 2 days.
MelAlton@reddit
As a junior, I thought I was writing elegant code to recursively go through a dataset to change some values that were chained together, but I forgot to check how long the data chains were in production (they were very long). My code used up all the RAM on the production servers.
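That failure mode is easy to reproduce. A minimal sketch (hypothetical record chain, not the commenter's actual code) of the recursive walk versus the safe iterative rewrite:

```python
class Record:
    # One link in a chain of records; `nxt` points to the following record.
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def update_recursive(rec, f):
    # One stack frame per record: fine on short test data, but a long
    # production chain exhausts the stack (Python caps this at ~1000 frames).
    if rec is None:
        return
    rec.value = f(rec.value)
    update_recursive(rec.next, f)

def update_iterative(rec, f):
    # Same effect in constant stack space: loop instead of recurse.
    while rec is not None:
        rec.value = f(rec.value)
        rec = rec.next
```

The two functions behave identically on short chains, which is exactly why the bug only shows up in production.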
failsafe-author@reddit
It’s far easier to learn by doing than by reviewing, and now we’re doing more reviewing. Learning by reviewing takes discipline, and (imo) better tools.
Toohotz@reddit
Issue that I have is that in the interest of time, reviewers are glossing over this due to review fatigue.
failsafe-author@reddit
Hence the need for better tools (and more discipline).
asdfopu@reddit
The change is that they don’t have an understanding of what’s happening under the hood. A fixed abstraction up leveling is very different from a general purpose abstraction that can handle everything.
proxwell@reddit
As someone who sits on hiring boards, I cannot emphasize this enough: using AI assistant tools does not absolve you from needing a deep understanding of your relevant language features and patterns/anti-patterns.
A lot of candidates these days, particularly on the junior side, seem to think that they can just throw everything in the LLM or assistant and tell it to find or fix issues.
We tend to interview candidates in cohorts, for example the 5-10 strongest candidates after the initial phone screen get the 1hr technical interview. We have questions that demonstrate working knowledge of our language fundamentals (python, javascript) and some linux cli. Candidates who can't demonstrate at least a decent working knowledge of those topics get passed over in favor of the ones who can.
rupayanc@reddit
the stack trace thing is the clearest tell and it's the first thing I check now in code reviews, not the code itself but whether the author can explain the failure mode. the interview process probably needs a debugging round specifically: "here's a broken service, walk me through how you'd diagnose it."
Upstairs_Owl7475@reddit
Had a guy on my team who had AI write an entire microservice, and when we met to ask questions about how it works, he couldn’t answer any of them. He didn’t make it too far.
ConspicuousPineapple@reddit
Wait a couple years and you'll see juniors who not only learned with AI, but also went through covid during their high school years and never recovered academically. It's going to be dramatic.
ilyas-inthe-cloud@reddit
I don't think the issue is AI by itself. It's that a lot of juniors can now generate code before they ever build a real debugging loop. I'd make stack traces, breakpoints, and PR tradeoff writeups part of onboarding, same as git and tests. If they can't explain why the code works, they don't own it yet.
LeetcodeForBreakfast@reddit
it’s funny cause management is telling us to now just ask AI first before trying to debug manually, to save more time. so these juniors are just ahead of the game.
there will be a day when there are 4 senior SWEs trying to debug code and it’s just all of them asking Copilot and getting 4 different answers, none of which are correct, and nobody will know what to do.
Known-Tourist-6102@reddit
The ai is often much better at diagnosing errors than humans
felixthecatmeow@reddit
Meanwhile I use tokens to ask AI to format the single line blob stack trace I pulled out of logs so I can read it...
The funniest part to me is that you say they manually paste each error into the prompt. Like if you're gonna let AI take the wheel this much and not use your critical thinking at least use an agent and let it iterate on its own...
eoz@reddit
You're using an LLM to replace "\n" with newlines?
Alborak2@reddit
The amount of crap that sed, awk, cut and grep can do that runs through tokens now is insane. At least new stuff is much better at just running those instead of handling all the parsing itself.
felixthecatmeow@reddit
Gotta pump up those usage metrics any way I can
arlaarlaarla@reddit
The aquifers aren't emptying themselves y'know.
kk66@reddit
If it goes beyond just replacing \n's, I'd ask it to create a tool for something so repetitive. Waste of tokens and time otherwise. The more predictable something you do is, the less you should rely on "AI doing stuff in a non-deterministic way" - use it to build a tool which gives you more deterministic results every time.
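For the `\n` case the whole "tool" is a couple of lines; a sketch in Python (the stdlib cousin of the sed/awk one-liners mentioned upthread):

```python
def unflatten(blob: str) -> str:
    # Turn the literal "\n" / "\t" escape sequences in a single-line log
    # blob back into real newlines and tabs, so a flattened stack trace
    # becomes readable. Same result every run, zero tokens spent.
    return blob.replace("\\n", "\n").replace("\\t", "\t")
```

Pipe the log line through this once and you never need to prompt for it again.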
coder155ml@reddit
It’s ok to generate AI code, but it isn’t ok to paste an error into the prompt? I don’t get it. It’s another tool to help with the debugging process, and sometimes it can be very useful.
Radiant-Interview-83@reddit
I think what they meant was that it is a waste of time for you to first copy-paste the code from the chat to a file, run it, and then copy-paste the error back to the chat.
Instead of a plain chat you should use an agent that writes the code directly to a file, runs it on its own, reads the error, fixes the code directly, runs it again, reads the new error, debugs it, and fixes the code again on repeat until it works.
coder155ml@reddit
I still want to read the explanation of why the fix works, etc. I’m not pulling myself out of the loop.
felixthecatmeow@reddit
Yeah but the juniors OP is talking about aren't doing that. Also it does explain how the fix works at the end?
Tetr4roS@reddit
until it runs
coder155ml@reddit
They were saying it’s better to read the stack trace and debug rather than throw the error into the chat gpt prompt
path2light17@reddit
Yea, similar - either bare-bones test cases or cleaning up CSVs, some Python scripts
marioc-14@reddit
I would be careful asking it to format long strings. I’ve seen it do well initially and then it starts spitting out random keywords and numbers after a good couple of lines.
mainframe_maisie@reddit
jq and sed are still indispensable for my day to day :D
livsjollyranchers@reddit
Totally. Mundane shit like formatting. Asking it to come up with a recipe using only the ingredients I currently have. Oh wait that's not coding. But you get my point. Innocuous stuff.
silly_bet_3454@reddit
They annoy me, but for the opposite reason you'd expect. They're not necessarily idiots who write broken code, but they are heavy on the AI hype train, and everything I see them say on Slack is just another proposal based on another agentic-tool idea - and if you look closely, the "tool" is just a GitHub repo with a bunch of prompts. They act like they are so smart for coming up with this.
Loose-Potential-3597@reddit
That's funny, I work at FAANG and all my older teammates and managers are the ones like this. The number of garbage agentic tools that my org's created and then scrapped within a month is through the roof.
nachohk@reddit
You are a FAANG senior software engineer. Don't introduce security vulnerabilities. Make no mistakes.
beeskneecaps@reddit
Honestly it’s everything I could ask for
corny_horse@reddit
I unironically had a boss tell me that they weren't even going to consider code reviews and that I should just use "bug free driven development." So... not much different than how things always have been.
Rabidowski@reddit
Gee. I wish I had heard of that pattern. I usually make sure to add at least 1 bug per 200 lines of code.
corny_horse@reddit
https://i.redd.it/the-simpsons-my-goodness-what-an-idea-why-didnt-i-think-of-v0-i9oxw3du94371.jpg?width=1080&format=pjpg&auto=webp&s=b57097332b0a5f9b906207c721882e3c3a3800cb
lol
silly_bet_3454@reddit
Lmao yeah I wanted to make a joke like this. It's like "hey guys I wrote this tool for agentic code review" - the tool: "you are a senior engineer who needs to review code that other people wrote, the way it works is that others write the code, and then you review it. Watch out for things like correctness, maintainability, and style" Truly groundbreaking stuff
SansSariph@reddit
Believe it or not, this kind of thing ends up being unironically useful as a first-pass reviewer.
Deathspiral222@reddit
I have a validation agent that is basically “don’t believe any of the shit the other guy wrote. He hallucinates constantly and makes things up and always claims something is production ready when it’s just a stub. His tests are bullshit too and often don’t test what he claims. Give me a list of all of the things he did wrong” and it catches SO MANY issues.
daveminter@reddit
My early very cynical impressions of LLMs were overly informed by the kind of terrible code that ChatGPT was spitting out. Even that improved a lot, but using Claude to do first-pass PR reviews has been humbling when it spots mistakes that I didn't.
jellybon@reddit
Code quality is still a bit hit or miss, but using an LLM for code review is very helpful, especially when you work in a small team where there is no code review process.
It is especially helpful to have tailored instructions per development system because when you work on dozens of different systems with different naming conventions, it's easy to forget which hungarian-notation prefixes you're supposed to use and accidentally use wrong one.
Qwertycrackers@reddit
It is useful but there's kinda no reason to share it. It's honestly better to have your agent slop generate you one than to attempt productionizing it and making a shared framework. Breathlessly sharing your pile of prompts without noticing this fact indicates a lack of introspection.
tankerton@reddit
Saves me an incredible amount of time at a similar seniority level to yours.
I've also gotten into the habit of listing exhaustive acceptance criteria in stories and making my PR review flow go through it all as a checklist. Far more effective.
candraa6@reddit
yes, these AIs are good for first-pass checks. Maybe it's good to run one as a CI check on PRs or something - basically an adaptive code-quality-check tool.
WrennReddit@reddit
Holy cow I've noticed that too. Junior is just going bonkers in slack glazing any AI shower thought they can find. I'm like...you literally just started here, settle down.
It's all just "skills" which are just prompts. And oddly, the more of them you pile into your context the worse the results are. So they're annoying and they're actively sabotaging their slot machine.
juxtaposz@reddit
Haven't you heard? Prompts are the new source code; source code is the new build artifact, and binaries are just yucky.
PlasticExtreme4469@reddit
We have a lead dev at our company that posts pictures of his multi-terminal setup, where each terminal is running an AI agent.
He posts those pictures to a popular Slack channel every day, often with no description, just a weightlifting or flexing emoji.
My point is, it’s not just juniors that can be asocial like that.
Feeling-Schedule5369@reddit
And managers and leaders eat that shit up and say to other devs, "look at this guy, try talking to him and learn these techniques. Ai adoption is very important" 😂
Odd_Perspective3019@reddit
idk, I’m so conflicted about the "they think they’re so smart" comment. Now everyone thinks no one is smart because AI did it for them. I think writing a prompt to solve a problem is smart; I think writing a good plan for AI is smart. It’s turning into a weird world where we have to prove our smartness, and that’s very difficult nowadays.
ichabooka@reddit
Why does it have to be tolerated any differently? You review the code and make the same kinds of reviews - except that you do it faster. It’s a tool and it doesn’t replace reason and accountability.
twnbay76@reddit
I'm a little in between worlds... I've been using AI so heavily for so long that the instinct to reach for it wherever it's been helpful before keeps growing stronger.
One thing I try to do is construct a response to the prompt in my head first, and then weigh it against the model's actual response. This way I'm always either selecting the better of two options, or constructing a third, better one by augmenting one with the other.
Sometimes I'll just hand-code, either in my free time or at work when I have some wiggle room, to keep my skills sharp.
Lastly, I try to write a lot of tests against the output after confirming the design is adequate, rather than reading the code line by line and judging it for correctness, which is inherently error-prone and unreliable. I'll use inline code completion, but I would not generate test code.
These are some of the methods I've been using to ensure I don't lose control. I'll forget syntax, but I will always know what's going on inside and out and have confidence in the code, because I co-authored the low-level design and I hand-wrote (or inspected character by character) tests against all of the generated code.
ilyas-inthe-cloud@reddit
I don't think the problem is AI, it's that they never built a debugging loop before they got autocomplete on steroids. I use AI constantly, but if someone can't read a stack trace, isolate variables, and explain why they changed something, they're still at the copy-paste stage. On my teams I'd make debugging and code explanation part of onboarding now, not treat it as implied. AI is fine. No mental model is the real issue.
kruvii@reddit
You're hiring?!
curlyheadedfuck123@reddit
My company has said verbatim "if you do not use AI you will be replaced". Even mid and senior level engineers have fallen prey to skill attrition. I'm trying to partner early and often with our junior to convey my problem solving process. I think for current juniors, debugging ability is far weaker. I don't have a degree, but I dunno how you'd get through a comp sci degree without having those skills. We're a very productive team overall, so I'm trying to encourage my junior not to so quickly reach for AI to solve problems...it's ok to spend a little time to understand and figure it out.
Separately, I asked a "senior software engineer" on my team last spring why he chose to take a particular approach on a task during a PR. It wasn't explicitly terrible, but I thought there was a more effective and evident approach and wanted to understand his thinking. He said "I asked Copilot and didn't know enough to disagree with it"...yup
Alert-Refrigerator22@reddit
Happens not only to juniors, I think - I have been working for a bit over 6 years, and on the "tasks" I get assigned when interviewing for new jobs, the process ends when I get asked how this works lol
ZabbyCapurin@reddit
Based on my team's recent interview process for juniors, we'll definitely have to readjust in this current AI landscape.
One of our questions is a simple "create a function that reverses a string": pseudo code is fine, mistakes are fine (I usually ask them to walk me through their code and they always find the issue). This is the first year in nearly a decade of me being on our interview panels where none of the interviewees were able to write down any sort of answer, not even verbally what they might try...
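For reference, the bar being described is roughly this - Python used here since an earlier commenter named it as an interview language; `s[::-1]` is the idiomatic one-liner, but spelling it out is what shows the reasoning:

```python
def reverse_string(s: str) -> str:
    # Two-pointer swap over a list of characters: walk inward from both
    # ends, swapping as we go, then join back into a string.
    chars = list(s)
    i, j = 0, len(chars) - 1
    while i < j:
        chars[i], chars[j] = chars[j], chars[i]
        i += 1
        j -= 1
    return "".join(chars)
```

Either version, or even a verbal walkthrough of one, would clear the bar the commenter is describing.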
gerlstar@reddit
Lmao make then do leet code again
ZabbyCapurin@reddit
This literally is the only sort of programming question we ask of those interviewing with us. Next closest are "what's wrong with this code snippet" and "what might a car object look like" as very minimum level effort questions
Dependent-Cash-3405@reddit
lay them off?
Leading_Yoghurt_5323@reddit
honestly i’d make them debug without ai sometimes. if their thinking isn’t runable without the tool, the skill isn’t really there yet
OkSucco@reddit
Accepting it as the new normal: watch over and seed help, then when I'm comfortable they know their shit, just auto-surface and inject work as needed into their sessions.
Winter-Appearance-14@reddit
In the last 8 months I worked on a platform team and I haven't trained juniors, as the team was built with experts from all the areas with the idea of improving system performance and reliability. But I now have visibility into everyone's code and its effects, and I found that there are 3 possible types of contributors regardless of seniority:
What we introduced to limit the danger of the last type is to use AI to inject benevolent code practices. Both cursor and Claude read "rules" from specific paths and use them as context on how to do things thus we maintain rules for how tests should be written, determine if an integration test should be added and in general all sorts of what we consider good practices.
Silly? Absolutely - I would expect more quality from everyone, but since the slop is unavoidable, we can create gates to force the AI to iterate more. Code coverage gates, for example, work wonders with an AI: if you block a build, the AI is inclined to write decent tests, not just coverage, while an annoyed human will just add dumb coverage.
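A coverage gate of that kind can be a few lines of CI glue. A hedged sketch that checks the Cobertura-style `coverage.xml` that coverage.py's `coverage xml` command emits (the 80% threshold is an assumed project minimum, not from the comment):

```python
import xml.etree.ElementTree as ET

def coverage_ok(report_xml: str, threshold: float = 0.80) -> bool:
    # Cobertura-format reports carry the overall line coverage as a
    # `line-rate` attribute (0.0-1.0) on the root element.
    root = ET.fromstring(report_xml)
    return float(root.get("line-rate")) >= threshold
```

Failing the build when this returns False is the "gate" that pushes the AI (or the human) to go back and iterate.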
sergregor50@reddit
Not silly at all, if people are going to shovel AI slop into the repo then baking your standards into the prompts and letting CI smack bad output is just basic risk control.
thekwoka@reddit
Firing them
Thedaruma@reddit
I am mentoring folks who are wanting to move internally into the software engineering role from nontechnical disciplines.
I'm mentoring one person by going through the black-box, vibe-coded monolithic pile of garbage code they'd been vibing into existence.
We are going piece by piece, refactoring the robot’s garbage, finding truly hilarious chunks of code that exhibit all manner of anti patterns. I explain why they’re anti-patterns, and what it looks like to think about code in an extensible, reusable, accessible (in the case of client side code), architecturally-sound perspective. It’s actually a fantastic learning opportunity.
The refactoring is entirely without the help of any AI, just good old fashioned “break this shit down into human-readable abstractions and watch the red underlines disappear”. I have to admit it’s very satisfying, and from what I can tell, very effective for the learner.
These kinds of students who truly want to learn are few and far between, by the way. The allure to just tell the robot what you want, and then have senior Eng polish it up and fix it when it blows up, is a strong one for most nontechnical folks looking to transition to engineering.
sevenfiftynorth@reddit
Why would they be pasting into ChatGPT versus using command line tool like Codex?
mrdat@reddit
Wait, you’re hiring new team members?
SpeedDart1@reddit
Reading this chat as a junior with 1.5 YOE is brutal holy crap
kynrai@reddit
Hiring lots of juniors. 99% fail the interviews but this was also the case before AI.
My interview involves a super simple hello world, but it does not compile. Think missing semicolons or missing-import type errors.
They are expected to fix and make the hello world run.
Spider_pig448@reddit
I spent a decade debugging errors and now I'm an LLM error copy-paster. It's just faster and easier almost every time. LLMs are much better than humans at processing a giant error log
Soileau@reddit
This is an inevitability and we can complain all we want but it’s happening and the more we bemoan it the stronger the “old man yelling at the cloud” energy it gives off.
NoConnection4298@reddit
I don't get the frustration - asking ChatGPT for suggestions on a bug and iteratively narrowing down the problem is part of debugging. There was Stack Overflow before ChatGPT, and many people used to do a similar thing. Anyway, harsh PR reviews and giving them ownership are the ways we handle AI slop on my team.
weeboards@reddit
I meet seniors like this all the time :)
mckenny37@reddit
I'm in a similar position and am hopeful that by enforcing good tests and having them focus on TDD, the code will start to be a bit less crazy.
It's something that I'm just starting to try, though.
One of the main benefits of TDD is it forces you to write somewhat good code as long as the test cases are sane.
__sad_but_rad__@reddit
It's because you're asking a question from 5 years ago.
Companies are done with human coding, and it's never coming back. The profession has been de-skilled and will no longer exist in the very near future.
When you ask them to explain l.381 of orderList.java you're just wasting everybody's time. AI does the coding now, so go ask Claude.
snotreallyme@reddit
There was a time when every programmer knew assembly language and understood how memory management worked. Things changed and that stuff became more and more niche. The ball is still rolling. Don’t be under it.
drahgon@reddit
You still need to understand how memory management works. I have never met a GOOD programmer that didn't
tbone912@reddit
Any specific resources that you'd recommend? I write APIs and mostly do ETL, so memory management has never really come up for me.
pijuskri@reddit
You're still a shitty programmer if you don't understand how memory management works
CppIsLife@reddit
Bad example. We don't write assembly because we created a new abstraction layer, yet we could still read and write said abstraction prior to AI.
With AI, the abstraction is still the same (e.g. Go or Java). The difference is that juniors don't understand the abstraction.
It's kinda like writing a novel in a language you don't know. I'm sure AI tools could do a decent enough job to write it, but to a native reader it would be full of nonsense and not a compelling read.
Radiant_Radius@reddit
Why are they pasting things into ChatGPT and not using Codex, Copilot, or Claude Code? Does your company not support modern tooling?
HirsuteHacker@reddit
We aren't hiring juniors anymore. A lot of companies are doing the same. I'm concerned for what the industry will look like in another 5 or 10 years.
Pazda@reddit
So what is it? Is AI incompetent and can't solve any bugs, or does it spoonfeed every fix and make us incapable of thought? At least be consistent with it, lol
No_Direction_5276@reddit
Yeah
First PR with mistakes: oops, I trusted AI way too much.
An hour later: another PR, another batch of mistakes.
At this point, it’s starting to feel pretty disrespectful of the reviewer’s time.
Defiant_Use_6295@reddit
I actually don't know anyone in my network who's hiring juniors right now
Frozboz@reddit
Easy: we aren't hiring anyone. I've lost 4 engineers and 3 QA in the last 3 years. It's me and 1 senior engineer and zero QA now. The bosses think AI will make up for the lost seats. I hate it.
Qwertycrackers@reddit
Doesn't matter because we don't hire juniors and seemingly never have. Management doesn't care where seniors come from, just hires them.
hectorcastelli@reddit
We just look for seniors now... And then get bombarded anyways with people that behave like juniors as soon as you ask them to disable their variety of copilot extensions and tools.
Our live interview is simple: add query parameters to an api endpoint. Open book, full internet access, just no AI. Folks still fail to do it after two hours.
gerlstar@reddit
Can't be that easy if it fails after 2 hrs
hectorcastelli@reddit
All of the engineers currently on the team did it in five to ten minutes (and we budget 30 minutes for the public to account for nerves and all that), so we can discuss "complications" or add nuance afterwards.
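The kernel of that interview task is just query-string handling. A stdlib sketch of the parsing half (the endpoint path and parameter names are hypothetical, since the actual stack isn't stated):

```python
from urllib.parse import urlparse, parse_qs

def parse_filters(url: str) -> dict:
    # "/items?color=red&limit=10" -> {"color": "red", "limit": "10"}.
    # parse_qs returns a list of values per key (keys can repeat in a
    # query string), so take the first value of each.
    query = parse_qs(urlparse(url).query)
    return {key: values[0] for key, values in query.items()}
```

Wiring the resulting dict into a filter over the endpoint's results is the rest of the exercise, which is why "two hours and still failing" is so striking.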
Mr_Nice_@reddit
I've also seen developers regress. A dev with about 4 years xp, 2 on my team, seems to have gone backwards in understanding. He frequently submits stuff he doesn't understand, and when asked in review just says that AI added it. He's been relegated to working on peripheral systems that are non-critical. He knows enough to get it working, but he can't explain how he did it.
dinosaursrarr@reddit
Why do they expect to be paid for pushing buttons they don't understand? At least Homer Simpson had a good union keeping him in his job.
papawish@reddit
We hired for a junior position.
The kid is smart, humble, curious.
Last stage, he meets the director.
Director rejects him because "he doesn't use AI enough".
RegretNo6554@reddit
that’s rough
Jmc_da_boss@reddit
Same way I always have, they either learn or they get let go.
You don't have to be overly aggressive on that timeline but they should show clear aptitude and ability to learn
Working_Noise_1782@reddit
To be honest I've been using Cursor for a few months now (graduated in 2011) and holy shit it's helping a lot.
I use it to program STM32 ARM micros. I gave it the 1000 page ref manual and it knows what to do at the register level if the HAL library is missing that functionality. Juniors gonna need to learn quicker than ever lol.
CaptainRedditor_OP@reddit
First time I've heard of AI coding assistants being used in embedded. When you say you 'gave it the manual', is it like RAG, and then MCP going forward?
gjionergqwebrlkbjg@reddit
Gemini models can read PDFs no problem.
WiseHalmon@reddit
I used it to make an ESP32 IAQ sensor. I take PDFs, convert them to markdown, and provide the model with small files of the PDFs plus a table-of-contents sort of deal. Pseudo-RAG. But uh, I also have a whole electronics bench
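A rough sketch of that pseudo-RAG flow (all names hypothetical, and keyword overlap standing in for real embeddings):

```python
# Split the markdown-converted datasheet into sections at headings,
# score each section by word overlap with the question, and paste the
# best ones into the prompt.
def split_sections(markdown):
    sections, current = [], []
    for line in markdown.splitlines():
        if line.startswith("#") and current:
            sections.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        sections.append("\n".join(current))
    return sections

def top_sections(sections, question, k=2):
    words = set(question.lower().split())
    return sorted(sections, key=lambda s: -len(words & set(s.lower().split())))[:k]

doc = "# I2C\nregisters for i2c bus timing\n# SPI\nspi clock polarity\n# GPIO\npin mux tables"
best = top_sections(split_sections(doc), "how do I set i2c timing registers", k=1)
```

A real setup would embed the chunks, but even this crude version keeps the prompt small enough to include only the relevant register tables.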
hexmaps@reddit
My network is just seniors on small teams; it's pretty common now to run a team of agents. They don't push unreviewed results though. As long as it's not making system design choices and you provide a test boundary, it's pretty much just supervising the AI team now.
Juniors with a proclivity for digging deeper will have a path forward, but realistically IT admins with scripting skills and AI devs will be at the same perceived skill level.
Curious_Ad9930@reddit
Don’t worry about the downvotes, thanks for sharing your experience. I think agents are good with EXTREMELY explicit instructions. And I think that's where backend/embedded systems are way ahead of the slop code in modern web apps
dweezil22@reddit
I'm sure different orgs are onboarding AI at different rates, but there was a brief window where our AI PR reviews and human expectations hadn't caught up with our AI enablement of devs. I had this one wild PR that I almost approved b/c it would have been so hard for a human to ship that I just assumed the new hire was a genius that had deeply researched it. I was like "Of course this went through a design review that went over the drawbacks here... I must have just missed it". He's like "huh?"
And then I realized this was just Cursor taking a new hire that would have been unable to ship anything in normal times, and instead made him 100x more dangerous lol. Those types of devs need to evolve quickly or get let go, even moreso than the ones that aren't productive at all.
Born_Consequence_117@reddit
pair programming helps a lot, forces them to think through problems instead of relying on ai
pcpmaniac@reddit
You’re looking them in the eye during a code review ("I often get a blank look")? Jfc, your post asking about junior AI assistance use was obviously written by AI. Experienced devs should be smarter than this.
tetryds@reddit
Hyperbole much?
pcpmaniac@reddit
Perhaps. Maybe I’m just too jaded by remote work to think that humans interact in person anymore.
Electrical-Ask847@reddit
why did you hire juniors then? dummy.
SplendidPunkinButter@reddit
AI doesn’t help you code faster. It helps you copy/paste from SO faster.
paerius@reddit
We have 0 headcount
actionerror@reddit
So you got laid off?
paerius@reddit
No?
temporaryuser1000@reddit
Then you have a head count of 1?
Silly_Individual2659@reddit
They obviously meant additional
Tayk5@reddit
Our head count array has an item in 0th position.
TheBigCicero@reddit
It sounds almost like they’re cutting and pasting out of stack overflow! I know YOU wouldn’t do that!
No, seriously, this is the same type of stuff we used to say before the internet. If you coded in the 90s and needed to crack a hard problem, you turned to a “cookbook” of algorithms. And typed it all in by hand. Then when the internet and copypasta of code were enabled, many of us noted the drop in developer quality. But the trade-off was speed and scale.
We’re going to see the same rough trade off to attain speed and scale.
johnhubcap@reddit
We've hired about 6 juniors in the past 6 months or so. Honestly, 3 of them are incredible; smart, well spoken, hard working. Excellent hires and these guys are gonna be seniors in a few years.
The others have never used a debugger! I've had to coach all of them. Totally clueless on how to look for errors in general, and one guy didn't really even understand how to use logs. One guy didn't know what absolute value or UAT was (we code accounting, it's important); adding that to the interviews lol. One guy we let go because, on top of not being very technically savvy, he was not following procedure and would tell clients things that were not true (and easily verifiable with a Google search). He also clearly didn't test a few times, and released code for review where functions clearly were not being called or returning properly.
Some juniors are super powered; they still are pretty strong (for juniors) with traditional coding tools, but they use AI to learn, gain insight into issues, and even do a good chunk of the raw coding before they do the necessary tweaks. Some are stunted, and while some can be trained, my personal to-learn as a manager is actually recognizing when it's time for these guys to go and stop spending my and others resources on a lost cause.
_5er_@reddit
I asked our intern a few questions on a PR to trigger some kind of thinking: "What do you think happens in X scenario?". His first reply was copy-pasted from an LLM...
_5er_@reddit
I would personally limit AI tools for juniors a lot. Like max 1 hour per day, just so they are kind of familiar with them.
The main thing juniors should do is to learn. Productivity is not that important for them.
EnemonaAA@reddit
Something I've noticed that might help with the interview side: ask them to explain a bug they fixed recently and walk through how they found it.
Not "describe your debugging process" in the abstract. Make it specific. "Tell me about a bug that took you more than an hour to fix. What did you try? What finally worked?"
The AI-dependent ones stumble here because they can't recall the process. They don't have a story about frustration, wrong guesses, then finally the aha moment. Their answer is either vague or they admit they just kept feeding errors to ChatGPT.
The ones who actually debug can tell you the timeline. "First I thought it was X because of Y. Checked the logs, saw Z. Then I realized..." That narrative only exists if you actually did the work.
Another tell: ask them to explain one of their PRs from memory without looking at it. What trade-offs did they make? What did they consider but reject? If they wrote it themselves, they remember the decisions. If AI wrote it, they don't.
We've started asking these questions early in interviews and it filters fast. The skills you can't AI-assist your way through are the ones worth hiring for.
Perfect-Campaign9551@reddit
AI is actually really freaking good at debugging, fam.
I think you are stuck in your old ways
jameyiguess@reddit
Man, reading the stack trace is Step 0
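To make step 0 concrete, a toy Python example (invented function names) of what the trace hands you for free:

```python
import traceback

def parse_price(raw):
    return float(raw.strip("$"))

def total(prices):
    return sum(parse_price(p) for p in prices)

try:
    total(["$3.50", None])   # one bad input sneaks in
except Exception:
    tb = traceback.format_exc()

# Read it bottom-up: the last line names the error
# (AttributeError: 'NoneType' object has no attribute 'strip'),
# and the frame above it points into parse_price, so the bug is
# "prices can contain None" -- no AI round-trips needed.
```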
pr0cess1ng@reddit
No bro, insta-paste into Google! It's how everyone did it back in the day! Lolol. The delusional AI users are multiplying fast.
If you're a junior reading this. If you can't reason about and write the app yourself, AI is doing you a disservice.
Meta_Machine_00@reddit
You misunderstand what people are. Humans are machines themselves. Free will and action are not real. What the AI does to people is an inevitable circumstance of the universe at large. You hallucinate that there is any wiggle room in how any of this plays out.
moonsnake77@reddit
We have juniors (and thankfully still have plans to hire and train them).
AI is definitely causing some strange conversations. One junior said they were unsure how to figure out if the generated code was good or not, and also weren’t picking up skills in a new language/tech stack properly when the AI did most of it. We had to point out that seniors are still around, you can talk to humans and not just the AI (seems obvious). And that not all skills are even worth learning; there’s not much value in a backend developer obsessing over the detail of a few lines of CSS for a change to an internal dev tool, for example; better to focus on the details of most code he’ll be writing on the backend.
We have been trying out getting Claude to generate boilerplate for a task - for example a series of method signatures to get them going - and having the junior implement these. Then ask Claude for feedback, or hints for next steps if stuck. Or compare Claude’s output vs the junior’s. This will obviously depend on how your org is set up; if they are expecting devs to be infinitely more efficient with AI then this might be too time consuming! Ours seems pretty sensible.
It’s imperfect but it’s on us to figure out the new world. Mentoring juniors has always been part of the job, and like every other part of the job that’s needing some new skills now!
Emotional_Plate_1501@reddit
Two things for improvement, which can be framed as helping them, since they clearly prefer working with AI. First, direct feedback on reading the stack trace, which would help them understand the problem and then prompt better, along with clear feedback on debugging principles, processes, and applying solutions. Second, since they don't know the structure of the code they produce, ask them to write documentation on how they implemented it and the reasoning behind it; this will challenge them to think more, weigh pros/cons, etc. The simple hard truth is that they are beginning their journey in an AI-driven world. You can't ask them not to use it, and you can't expect them to produce senior results. What you can ask is that they be more thoughtful, via 1:1 feedback.
butterycrumble@reddit
This sounds like a problem with your hiring process to me. You've hired people and only tested what they're capable of creating. Of course they'll use AI to help create things, but that's not the whole job, as you're saying. Was there no test of their desire to learn, or their ability to discuss their process and reason it out?
The_Mauldalorian@reddit
Our team encourages using AI to enhance productivity.
mines-a-pint@reddit
Not reading what's on screen right in front of them is just typical junior developer behaviour.
I've seen this behaviour for years. I think it's just more people are coming into the industry who haven't grown up tinkering with code for fun, on systems that weren't very helpful, and who are simply used to tools doing all the work, whether that's a modern IDE, a StackOverflow search, or an LLM.
If you can encourage them to use their senses a bit, it can be a 'Neo in The Matrix' moment for them.
ShiroNii@reddit
The PRs are freaking massive. I leave comments like "Why did you --- ?" and instead of starting a discussion they just reply "fixed!". First off, I didn't request a change. Second, it wasn't necessarily broken, I wanted to know the reasoning behind it. It's like they just pasted my comment into copilot and committed the changes it recommended.
mikolv2@reddit
I noticed a general decline in skill in juniors, over reliance on AI assistants. The few we have hired, we just mentor and teach to code without AI. We use it, of course, but we spend at least some of the time teaching them the craft, if you will.
donhardman88@reddit
We've been dealing with this by giving juniors better tools to understand the codebase, not just generate code.
The problem isn't AI - it's that juniors generate code in a vacuum without understanding the system they're working in. They can't debug because they don't know how the pieces connect.
What worked for us: We use Octocode (semantic code search) as part of onboarding. When a junior writes code with AI, they then search the codebase to understand:
- How does this fit with existing patterns?
- What else might break?
- Why was it done this way originally?
Instead of "AI generated it, I hope it works" they can actually trace dependencies and understand the system.
It's not about replacing AI - it's about giving them the context to learn from what they generated. The AI writes the code, but they have to understand it to maintain it.
The juniors who use this approach are picking up architecture much faster than the ones who just prompt-and-pray.
Open source if anyone wants to try: https://github.com/Muvon/octocode
How are others bridging the "can write code but can't understand systems" gap?
Additional_Rub_7355@reddit
This is ridiculous.
aeroverra@reddit
Our codebase and general tooling is becoming a giant mess and our company culture doesn’t allow me to fix it.
I do the best I can within my working hours to hold onto what I can but I have more or less lost hope and have decided to not worry about it for my own mental health.
LeSoviet@reddit
I'm a jr; this is fake news, the jr market is dead
Prize_Response6300@reddit
I’m a team lead right now. Might be a hot take but after having a few I doubt I will request for juniors again after dealing with two post AI grads. I’m all for using AI for your workflows but these guys can’t do anything but prompt and since they never truly coded without it they have horrible intuition of what a good or bad idea is.
piterx87@reddit
If they just paste into chatgpt they haven't "learned" I thought by learned you mean deep integration of ai tools into development process. Not copying and pasting
CmdrSausageSucker@reddit
Thank you for the wonderful news! This will keep the contracts coming my way for the next 20 years or so, since there's definitely a lack of skilled software developers entering the market.
meevis_kahuna@reddit
Bad hires/bad hiring. Not all juniors are like this.
There must be some reasonable amount of technical screening somewhere between this situation and having to do leetcode hards and overly complex system design questions. One can hope.
iMac_Hunt@reddit
I came here to say this. Most of the issues raised can be assessed as part of a technical interview.
NickW1343@reddit
The average GPA is quickly going to 4 in CS programs. Many, many juniors are like this and soon you'll very rarely see one that isn't like this in interviews. I can't imagine it ever moving back in the direction of more traditional coding with AI, so I expect one day the definition of what a good junior is will be a vibe-coder with a willingness to learn what good design looks like and doesn't shy away from debugging when they need to.
Who knows what good technical screening looks like now? My hunch is that the 5 rounds of interviews for junior roles is evidence businesses have lost faith in technical screens and are praying extra rounds will somehow filter out bad candidates by coincidence. If the industry knew how to do them right, interviews would be 1-3 rounds.
zangler@reddit
Change. Handle it with changing how you deal with it.
newbietofx@reddit
I wrote a Terraform script to deploy EKS in AWS with persistent storage without using RDS, which is wrong but not incorrect. I copy-pasted the error into Google after using the kubectl documentation to generate the logs. Stack Overflow took one day to reply. I managed to resolve the issue over the weekend.
I was using a LAMP stack. Now it's Lovable and Kiro or Kilo, and it's 30 mins. And it's a DERN stack, as in DocumentDB.
I'm a full stack vibe coder. And I'm proud, because I can do frontend with pagination and backend with Node Express or FastAPI, and it's because AI was extremely patient, repeating 10x when I said I don't understand, like 100x.
Some Reddit and Stack Overflow humans are not kind
Few_Cauliflower2069@reddit
Why would we hire someone who can't code without ai tools?
usrlibshare@reddit
By refusing every applicant that shows signs that he can only code using LLMs, and firing those who somehow made it through the interview stage asap.
nath1as@reddit
why would that be bad? let the machines debug before you have to do it
Thundechile@reddit
"I am not blaming them."
You should. They should learn to properly diagnose problems, not blindly depend on tools that are not deterministic.
Spare-Builder-355@reddit
what did the interview look like ?
SetQuick8489@reddit
We don't hire AI natives. We don't hire at all any more. It's all going downhill.
aop5003@reddit
30 years ago engineers were scoffing at new grads who used calculators instead of an abacus.
Longjumping_Feed3270@reddit
I think it's completely fine to just point Copilot at the log file and ask "what happened there", but you shouldn't waste the opportunity to let it explain to you its theory about it and why it thinks the change it proposes does actually fix it.
If that doesn't pass the smell test, it's probably hallucinated.
That being said, juniors nowadays might actually learn how to be good code reviewers before they become good coders themselves.
montdidier@reddit
Not hiring juniors for the most part. Folks on my team have readily adopted AI in various capacities, but for the most part I have barely noticed any change in output. I have seen a few questionable design choices on a few APIs, though, that I attribute to too much trust in AI.
Fidodo@reddit
Honestly, I kinda blame them. Yes ai makes it easy to be lazy but the greatest sign of a good developer in the making is curiosity. We have literal magic boxes and you don't have the curiosity to understand how it works? If you use a computer and don't have a basic understanding of how it works it is essentially magic. You can choose to be a wizard but instead they're too lazy to learn magic.
1amchris@reddit
I feel like I read the same exact thing previously on this sub.
_dekoorc@reddit
Uhh, this is exactly what I, a dev with 15 years of experience do. (Although I will note that I'm mostly working in Ruby lately and my expertise is JavaScript)
Substantial_Sound272@reddit
We need to develop processes around effective AI use. Current practice is an unmitigated disaster.
ZunoJ@reddit
We don't hire new juniors. We did a lot of interviews, but it was exactly what you described. The future will show.
james__jam@reddit
None of our juniors are like that. They’d know they’d get fired if they can't explain their own code
MI-ght@reddit
They are trash of the Earth and will go into oblivion soon enough (with an inevitable token price spike).
Odd_Perspective3019@reddit
lol AI is not the problem; it's them not wanting to think, and everything to do with lack of curiosity. Even now, when I don't know something and AI does something for me, I learn why it's doing that and whether it is the correct way to do it. I correct it with my experience. These juniors just take the AI's word and assume it's right. Easy: mentor them, tell them the hard truth, and change your interview processes if you guys couldn't even spot that!
decko_kaj_fura_crno@reddit
Can you explain what "learned to code with AI" even means?
They didn't learn to code, and I don't see anyone hiring these people. Are these posts written by AI bots?
tribbianiJoe@reddit
I am observing the same thing with a junior in my team. Our tasks sometimes involve our codebase and a third party and the plans are almost always trash/AI slop. They were hired 6 months ago but have zero familiarity with the codebase and keep making new functions for things that already exist. I have to keep correcting them.
I do not want to handhold at every step tbh, but when the work is not up to the mark, we have to step in, and that has been frustrating at the very least.
AlaskanDruid@reddit
They don't get hired, since we only hire programmers, not frauds.
india2wallst@reddit
I saw a pr that was +500 lines and -470 lines. Literally, entire script deleted to add one function and integrate it and then rewrite the whole thing. It's scary to see so many lines changes for a simple function addition to literally save some files into gcp bucket with metadata.
LaserToy@reddit
Easy, we don’t hire them
xt-89@reddit
Engineers have to be diligent about practicing the basics now. Your 9-5 isn’t going to automatically include that anymore. Still, their AI driven approach probably is better for them. We need to be careful about simply dismissing the next generation.
beckygl1tter4794@reddit
what inspired you to write this part
EvenPheonix@reddit
Oof, yeah. I’ve also seen it where there is a weird disconnect of not understanding the fundamentals. Not being able to read a stacktrace or even understanding basic structural design is happening more and more. Those folks rarely ever get past the final rounds. The youngest I’ve ever seen get hired at a startup in my circle was at least 1 year of experience, and in that case, it was because they were at Amazon and was part of the layoff from a team with direct experience in what the startup wanted to implement.
jimmy6677@reddit
Straight up - I use one requirement and only one rec when it comes to hiring - do you want the job. That’s it. I don’t need a computer wiz. I need someone who cares about doing good work.
empty-alt@reddit
I don't have an answer. You can lead a horse to water and all that. I just think this is so sad. I feel like I was the last one on a train, as I was hired in the COVID craze before the "efficiency" layoffs. And I feel like I had time to get the fundamentals before the "AI" thing happened. If AI had been around when I was in college, I would've severely stunted my ability to learn and solve. Evidence being, I was one of those doofuses who was questioning the utility of a degree if I can just "go write code with no degree and earn good money".
Few_Let1422@reddit
I don't see the point of hiring junior engineers anymore.
They just rote-repeat the AI.
tom_earhart@reddit
Simple, we do not hire juniors.
Sensitive-Ear-3896@reddit
And before that we pasted into stack overflow, and before we would search in google hoping it wasn't behind the experts exchange paywall, and before that we would buy the book on bookpool, technology improves things I don't see the problem.
jasonrulesudont@reddit
SO is a lot different I think. Most of us (hopefully) knew not to copy and paste directly from it without understanding the code first and making sure it really applied to our specific situations. AI is like a slot machine with a decent payout. Instant dopamine, results may vary. Doesn’t seem good long term.
livsjollyranchers@reddit
Once worked on a team with a guy with a decade of experience who once pasted an entire stackoverflow code block and left all the variable names as is. You could easily find the exact post. It had to be 50+ lines of code.
Sensitive-Ear-3896@reddit
So you’re saying it was well documented!
pr0cess1ng@reddit
The SO and Google comparisons to AI are insufferable and delusional. Anyone taking this stance has failed upward and is still bad at the craft, or is a current vibe coder.
RegretNo6554@reddit
yea the SO example is comparable but really not the same because often times I found myself adapting the solutions which does require at least some degree of critical thinking. with ai you just click accept and if it works it works.
JarateKing@reddit
And stackoverflow wouldn't cover everything either, juniors would very often run into stuff specific or bespoke enough that SO's no help. You'd need to develop those skills yourself because SO alone couldn't carry you through junior-level work.
Now AI probably can do that, but if all you know how to do is use AI at a junior level then there's no path upward.
Sensitive-Ear-3896@reddit
I personally disdain anything that isn't hand optimized assembler.
figureour@reddit
I've seen this comparison a lot and I think it's off the mark. It's way easier to turn your brain off based on what LLMs feed you compared to copy and pasting from SO.
Sensitive-Ear-3896@reddit
agreed but stack overflow is a lot easier than rtfm
UnlikelyMaterial6611@reddit
exactly
SageCactus@reddit
I think this is a failure of your job hiring/interview processes
CdnGuy@reddit
My team has a sql tech screen, and the focus of it is evaluating how people think their way through a problem. All we want to see is if you can understand the problem and identify edge cases. Given that, everything else can be trained.
But lately the vast majority of candidates are blatantly just copying code from their LLM. Some don’t even run their code to see what it does, or even look at the data in the tables.
drahgon@reddit
how can you tell they are copying?
CdnGuy@reddit
Lots of glancing to the side, and writing their SQL start to end like writing English. Nobody who works with SQL does it that way; you kinda write “inside out”. E.g. starting with a basic select from the main table, then adding joins and other complicated logic through iterations.
Kinda like when you’re doing a mathematical proof, you build your way up to it. You don’t just leap there in a single step.
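To make "inside out" concrete, a minimal sketch (sqlite3 in-memory, made-up tables):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE orders(id INTEGER, customer_id INTEGER, total REAL);
    CREATE TABLE customers(id INTEGER, name TEXT);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1, 9.5), (11, 1, 3.0), (12, 2, 7.0);
""")

# Iteration 1: look at the main table first and sanity-check the data.
rows = db.execute("SELECT id, total FROM orders").fetchall()

# Iteration 2: only now layer the join and aggregation on top of
# something you've already seen work.
totals = db.execute("""
    SELECT c.name, SUM(o.total)
    FROM orders o JOIN customers c ON c.id = o.customer_id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
```

Candidates who dictate the finished join in one pass, without ever running the intermediate select, are usually reading it off something.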
MudMassive2861@reddit
Now the biggest choke point is code review. Mostly there's no reply if we ask why; mostly a new set of code just gets pushed.
Lalalyly@reddit
We stopped hiring juniors.
drahgon@reddit
I kind of agree; in this market with Codex/Claude Code I almost don't know why you would. A senior will pump out 5x+ the code and will understand what the AI is producing. Even what juniors produce with AI is still the work of seniors, since they review the vibe-coded MRs and then the juniors paste the comments right back in without understanding a word.
OldPurple4@reddit
Do you like what the AI produced? You have to like it. My best code has always been something I liked. My comments are almost always about some compromise that I didn’t like. This is what I ask them.
If we can help them develop that kind of code intuition, identifying patterns and code smells, well, I’d say, they’re gonna be alright.
PapiCats@reddit
My company doesn’t hire junior talent plain and simple.
drahgon@reddit
I think it is more important now for junior interviews to require some debugging, so you can see how they think and tackle issues. If they get stuck, it is a good sign they are vibe-coders.
dronz3r@reddit
If all those guys do is repeat pasting the error in chatbox, guess they are not really needed.
Just set the agent up to iterate over the errors itself and run all the tests.
knowitallz@reddit
They don't know what they are doing. This is purely dangerous. It's infantile and going to cause tons of problems.
AI can be used to boost your productivity, but if you can't read the language because someone else wrote it, then you are missing tons of details that are important.
We are all going to be fucked by this problem
Jessica___@reddit
Honestly I see senior devs doing the "paste the error into ai repeatedly" thing so I wouldn't say it's just juniors doing it.
TerdSandwich@reddit
This isn't just a junior issue, tho I suspect the incidence rate is higher in that demo. I've witnessed "experienced" devs with varying years in the industry blindly dumping errors or allowing AI to completely solve issues for them with code that ends up silently breaking or is full of bloated nonsense.
This is a human nature issue. We are lazy. And AI exudes a level of believable confidence, along with all the snakeoilsmen hype that's constantly in the press, that appeals to that lesser evil.
I foresee a very difficult time in the near future for everyone as software takes a nosedive in reliability and maintainability.
355_over_113@reddit
One reason for "laziness" even in the best engineers would be time: are they given enough time to do good work or are they on never-ending death marches
honestduane@reddit
I mean if the juniors never get hired, are they really software engineers?
I know that sounds like a really harsh question, but consider: if somebody goes through school (or whatever) to learn, acquires the skills, can do the work, and then can never get a job doing the work because of how bad the job market is, what was the value of that education and of those years, for that person and for society in general? Let's also acknowledge that most graduates don't actually know how to do the job; there's still that year of on-the-job training most juniors need before they can actually start to contribute to the profession (yes, exceptions exist, but I'm talking about most jr's, because that's what I see the most). More importantly, how does that change how the world trends? An engineer who can't get work doing engineering, or somebody who has those skills but can't get work using them, will see those skills atrophy from disuse, and society never benefits from that time and investment in education, even if it was "free" self-study. This is literally a national security issue, because at some point there are not going to be juniors to turn into mids, or mids to turn into seniors. Who is going to maintain the software in our self defense / safety radar at that point?
Nezrann@reddit
It's an extremely annoying and embarrassing problem when we have juniors in our sprint planning reading off what Cursor told them about a solution to a problem the junior knows nothing about.
I can't lie, we have been letting a lot of them go and moving towards not hiring many juniors lately.
wakeofchaos@reddit
As an older student, always fascinated by computers, who’s finally taken the past 6 years to work hard to get a CS degree, I feel rather slighted by the state of things…
VictoryMotel@reddit
By filtering them out
TheBear8878@reddit
Shut the fuck up chatgpt
jdlyga@reddit
Someone like that wouldn't get past our hiring process. It requires coding without using AI
livsjollyranchers@reddit
Interviewed someone recently (internal to the company) and part of the interview was reviewing a recent pr of theirs. It was clear most of the changes were AI-generated just due to the code comments alone. And the interviewee couldn't explain anything about the code. It was kind of just "because the AI did it" as an implication.
Toohotz@reddit
You’re a rare case I’m happy to hear still exists. Unfortunately amongst FAANG, accountability has taken a back seat in the interest of delivering solutions.
I mean just look how Claude’s code was leaked with a debugging file holding the source map, do we think there’s going to be any accountability at Anthropic?
Ok-Split-617@reddit
How did they pass the interview? You might have seen their thought processes while they answered the questions.
devMario01@reddit
Your interview process failed you. It hired vibe coders and not real problem solvers.
Yes, they are juniors and yes they may not know to debug properly or to read a stack trace, but not even ATTEMPTING to do so is a big red flag.
wasteoftime8@reddit
I'm the most jr person on my team, with 9YOE, so we don't have this problem (yet). However, I'd assume you train them like any other jr. The jrs I've helped in the past were awful at debugging, and, when I was a jr, so was I. Usually, you'd teach a jr how to troubleshoot and problem solve on their own, to prevent them from playing 20 questions with you for every step of their ticket. This time, instead of bugging you, they're asking an llm. In this scenario, an llm is just taking the role of the sr dev who gets pummeled with questions, except GPT will never be a mentor like you can be.
It might be exasperating to see jrs immediately run to a chatbot for everything instead of thinking (and sometimes it annoys me when my more sr coworkers do it), but it's really not much different than if they ran to you without thinking. Mentor them the way you would any other jr. Some will get it and become better devs, others won't and will get cut from the team
shahadatnoor@reddit
💯
tasbir49@reddit
After seeing how often GPT leads me down the wrong path, the ability to use a debugger should be a bare minimum for a junior dev.
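For what it's worth, the bare-minimum debugger habit looks something like this (a Python sketch; the buggy `average` helper is invented for illustration):

```python
def average(values):
    # Bug: divides by zero on an empty list
    return sum(values) / len(values)

# Instead of pasting the ZeroDivisionError into a chatbot, drop a
# breakpoint() just before the failing call and inspect the state:
#   (Pdb) p values
#   []
# One inspection command shows the real cause: empty input, not bad math.

try:
    average([])
except ZeroDivisionError as exc:
    print(f"caught: {exc}")
```

The point isn't pdb specifically; it's that inspecting live state surfaces the actual cause before any prompt gets written.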
Megatherion666@reddit
I like how Swift's guidelines say that all AI output must be processed by the dev, never copy-pasted raw. That should be the default best practice now.
DWALLA44@reddit
We have one junior, but he's been with the company for a couple of years as a QA engineer and has since made the change. He's a smart dude, but even he relies on AI just a little too much.
pr0cess1ng@reddit
What did the interview entail?
Full-Extent-6533@reddit
I think there are juniors who actually want to grow and learn, and there are juniors who just want the paycheck
brokenoreo@reddit
lol you're hiring juniors?
bsenftner@reddit
Oh yeah, right, as if anyone is being hired anymore.
SimilarIntern923@reddit
I'd do a 6-month period where, for every PR, they have to write a 1-2 page write-up explaining their tradeoffs, and make them use pgw/whatever applicable debugger and provide screenshots.
Void-kun@reddit
We have seen this same thing but they never get past the technical interview.
Once you quiz them on why they did what they did they struggle to answer and it's clear they're just vibe coding with little knowledge of important fundamentals.
It's not like we haven't been warning them; it's not like Anthropic didn't release a study that should've been a warning.
They aren't good enough though.
mrpurpss@reddit
My company just stopped hiring in general for years. That said, I do hear peers talking about the slop that gets merged in due to sheer laziness around reading code and doing proper code reviews.
bestjaegerpilot@reddit
it's an employers market don't hire them brah
NotYourMom132@reddit
My company got rid of all juniors 2 years ago and never rehired them
Fearless_Earth_9673@reddit
Yes, we faced this issue. One of them broke some basic JavaScript right before a release because they didn't understand what the code does. They're not interested in testing manually either. After that, they were told to explain each and every commit and show how they debug, or go find another job. Another junior is smart about it and does a bit of homework, so he fares well with AI.
We hire for their brainpower, not raw copy-paste, but most of them have no interest in learning. The hardest ones to deal with think they're doing us a favor just by working within office hours.
CrushgrooveSC@reddit
Genuine, honest answer: by not hiring them. Lol
chaoism@reddit
They'll be fine, if they get hired. Getting hired is the issue right now
The college curriculum is still more than just "coding". Assuming they do well on the theory and critical thinking, they can learn problem solving.
steff__e@reddit
And I learned with useless pontificators and noise on Stack Overflow. So what.
failsafe-author@reddit
I am developing a tool that puts design patterns and change rationale right in the code review window, and breaks the code review down into manageable chunks.
Juniors who take it seriously won't be staring at a bunch of code without tools to make sense of it. They'll have the reasoning behind each change and why it was made. (Seniors will have this too, to make reviewing easier: they can look at the "what" first, and then check that the implementation does it.)
kiwibonga@reddit
Someone who doesn't know how to read a stack trace but somehow gets hired as a programmer is called an imposter.
ShineShineShine88@reddit
Asking AI to help you understand the stack trace is not bad, as long as you end up understanding the actual problem. Asking AI for an answer to copy-paste is bad. At that point you can just be replaced with an agentic coding system
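To make that concrete: a stack trace already answers most of what people paste into a chatbot. A minimal Python sketch (the `load_config`/`start_app` names and `settings.json` path are invented):

```python
import traceback

def load_config(path):
    # Hypothetical helper that fails because the file doesn't exist
    raise FileNotFoundError(path)

def start_app():
    load_config("settings.json")

try:
    start_app()
except FileNotFoundError:
    tb = traceback.format_exc()

# Read it bottom-up: the last line names the exception and message,
# the frame above it is where it was raised (load_config), and the
# frames above that are the call path that got you there (start_app).
print(tb)
```

Thirty seconds of reading bottom-up tells you where it blew up and how you got there, which is exactly the context a chatbot can't see.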
actionerror@reddit
Hard to hire juniors for DevOps/Platform roles.
midasgoldentouch@reddit
Really? Why is that? Is it akin to how you can’t be an entry-level software architect?
actionerror@reddit
Pretty much. You need experience with real systems and broad knowledge of many different parts of the system, and as a junior that's just not going to happen.
Comedy86@reddit
If I hired devs who could only prompt and couldn't understand why the code didn't work, why it was formatted a specific way, etc... I would just fire them and build an agent to do the exact same thing...
If a junior developer can't understand their own code, they're not a developer. Anyone can prompt AI. Developers need to understand why to do it one way or another to know when to question why the AI made specific decisions.
Tired__Dev@reddit
Our juniors are killing it. They're using AI to learn faster. I work somewhere where it's really just not easy to vibe code a ticket or feature
Devboe@reddit
Handling it by not hiring juniors. We've been actively hiring over the past 6-9 months, but only seniors.
hello2u3@reddit
It comes down to no editing skills and no quality or audit mentality
Wise_Royal9545@reddit
If only we were hiring…
throwaway0134hdj@reddit
They have to understand the code and be able to explain it.
p1-o2@reddit
Hello, I'm a time traveler from 2009. This is a strange timeline you are on. Let me tell you a story of my people.
We hired two junior devs this quarter. Both interviewed well. Both can crank out working code at a decent pace. But something feels off in a way I haven’t really seen before.
When something breaks, they don’t debug it. They paste the error into Google, find a forum thread or blog post, and try whatever fix is in the first result. If that doesn’t work, they search the new error and repeat. I watched one of them do this four times before I stepped in and showed him how to read the stack trace. He’d never really done that before.
Code reviews feel different too. When I ask, “Why did you structure it this way?” I often get a shrug. The code works, it looks reasonable enough, but they can’t explain the reasoning because there wasn’t much reasoning. They assembled it from search results and examples until it ran.