Anthropic Event has scared me as a developer
Posted by Zestyclose-Trust4434@reddit | ExperiencedDevs | View on Reddit | 81 comments
It seems OpenAI, Google, and Anthropic are much closer to deploying production-grade AI agents than I expected.
One way I try to reassure myself is by believing that developers will evolve into tech leads, playing a key role in the human feedback loop — making that a fairly secure position. Still, the total number of developers may drop significantly.
I’m curious — what are some strong arguments against the idea that AI will replace most jobs?
For example:
- Without traditional dev hierarchies, the need for project management could shrink.
- Product managers might run A/B tests in production directly, reducing the need for UX and client research teams.
- Large companies could become far more efficient and pivot faster.
What’s your take on where software development and jobs are headed in this AI-driven future?
kregopaulgue@reddit
I don’t see right now how these new announcements are threatening devs. Let’s be real, AI is already at the point where it writes correct, working code if the prompt and context are defined properly.
Current conferences just give me an impression of diminishing returns. First, benchmarks always lie; second, even if they don’t, the results differ by 5-10%, which is almost nothing.
I have yet to see a real production-grade application of AI, and I work at a big tech company. Plus, agents are not new, and they still haven’t taken our jobs somehow. I think people overall overestimate AI’s usefulness.
SporksInjected@reddit
Models are getting better but the tooling is where the really big difference is.
kregopaulgue@reddit
I don’t see breakthroughs currently. As I said, agents have been around for some time already, and the new tooling improvements for them are what, GitHub integration? I just can’t name anything significant. It’s welcome, but it’s not threatening SWE jobs.
SporksInjected@reddit
Oh yeah you’re right that it’s not threatening devs. Sorry I was responding more to the part about diminishing returns.
I agree that models are not as impressive per iteration now but things like model context protocol and the newer agent frameworks are very impressive to me.
kregopaulgue@reddit
Ah, understood! I misinterpreted your reply a bit too. Yeah, I agree about the better tooling too
SporksInjected@reddit
This is actually the very first moment on Reddit where two people have agreed and not spiraled into a hate match hahaha.
We just made history
Zestyclose-Trust4434@reddit (OP)
yeah that’s the human feedback loop i mentioned. but the number of devs will shrink, don’t you think?
poipoipoi_2016@reddit
Mathematically, we're being replaced by Indians on visa, not AI.
jrdeveloper1@reddit
Mathematically this is incorrect lol
The people replacing you are existing people in America, as in people in tech using AI to become more efficient and get more done with less.
It becomes a race to the bottom, but the people at the top benefit the most, because they can now do 2x, 5x, and 10x combined with their expertise.
micseydel@reddit
Do you have any public examples to show of "people in tech using AI to become more efficient"? This recent post is the kind of thing I mean.
jrdeveloper1@reddit
We already have plenty of tools across the board.
Cursor, copilot, perplexity,…
This is only just dev tools.
This is just the start; once you have AI agents and they can talk to each other, then you can be even more efficient.
Imagine having a swarm of this.
micseydel@reddit
Personally, I have ~100 atomic agents deployed for day-to-day use in my own project, so I don't have to imagine. I'm all for the swarm. Do you know of anyone measuring their efficiency? I want to see reproducible examples, not just vibes.
What did you think of the link in my prior comment?
SporksInjected@reddit
Isn’t the amount of code being generated increasing while the number of filled jobs stays stable? That could indicate that either existing devs are doing more with less or non-devs are making things.
micseydel@reddit
If there has been any increase, I'm extremely skeptical it's due to LLMs https://www.reddit.com/r/ExperiencedDevs/comments/1krttqo/my_new_hobby_watching_ai_slowly_drive_microsoft/
SporksInjected@reddit
My guess is it’s human in the loop AI coding not fully automated AI. Stack Overflow reported some insanely high number of open source devs using GitHub copilot.
micseydel@reddit
Sorry, I'm not sure what you're trying to say.
SporksInjected@reddit
Oh yeah no problem, I’m not super great with words lol. What I mean is the link you shared is a fully autonomous agent doing everything which is not super common.
What is super common now though is coding with GitHub copilot, using ChatGPT, stuff like that where the developer is making AI assisted decisions. It also makes people able to code that couldn’t normally.
micseydel@reddit
Yeah I want to see the data on all this, especially the limits. I would enjoy (and be patient with) chatbots more if I was confident it was worth it, if it varies by programming language, etc.
SporksInjected@reddit
I didn’t see a new survey but the 2024 survey showed that adoption rate was really high. I don’t know any other specific numbers though.
jrdeveloper1@reddit
Also to add, I never look at tech capabilities based on where they are at but rather where it’s going.
That’s the proper way to estimate, because if I had laughed at cloud in the 2000s, I’d have been crying in the 2010s and 2020s.
jrdeveloper1@reddit
Cool. Where are you deploying the agents?
I am still looking into them too.
You are not going to get quantitative data at this stage, when everyone, their dogs, and their moms are shouting “we need AI agents”.
It’s the hotness of the year.
I am glad Microsoft is spending time and money to experiment with agents and in public 🙂
We can learn from them but I personally think they are going backwards by retrofitting a new paradigm into an existing workflow rather than doing the hard work to rethink the whole thing.
I am willing to bet the best experience and efficient one will look very different than what we already have.
poipoipoi_2016@reddit
So far the industry is stable. And everyone doing layoffs has new hires.
We're just importing hundreds of thousands of (~3/4 Indian) visa holders.
When the industry starts shrinking, that's AI.
jrdeveloper1@reddit
What happened when they introduced IT and cloud infrastructure? Did you forget many people were replaced and augmented?
The existing market shrank and died out; only a very small share of the market keeps in-house cloud infrastructure or hires people for that purpose. It makes no sense cost-wise.
Any technology trend grows exponentially not linearly so I’ll see you in 5-10 years then we’ll chat again.
poipoipoi_2016@reddit
And 200,000 people work in cloud infra at the hyperscalers. Plus the new SRE position.
In a situation where the industry was dramatically expanding in part because cloud made our lives so much easier.
It wasn't 1 to 1, but also new grads didn't have double digit unemployment.
jrdeveloper1@reddit
Most cloud infra doesn’t have general intelligence; that’s the difference.
Some people had built AI models there before ChatGPT, and they essentially have smart technology operations.
You are literally replacing a human who does digital work with general intelligence; these are not dumb infrastructures.
They have a high level of reasoning, problem-solving, and decision-making ability, and it’s getting better over time.
Zestyclose-Trust4434@reddit (OP)
i am the h1b indian and i am worried lol
poipoipoi_2016@reddit
Relax, you can move to Metro Detroit and get a nepotism job at Ford.
As long as you're still on visa and don't mind paying your management chain a 20% kickback under the table.
/I have thoughts on... all of that.
//Fords are sort of shite these days and that is part of those thoughts.
ExperiencedDevs-ModTeam@reddit
Rule 9: No Low Effort Posts, Excessive Venting, or Bragging.
Using this subreddit to crowd source answers to something that isn't really contributing to the spirit of this subreddit is forbidden at moderator's discretion. This includes posts that are mostly focused around venting or bragging; both of these types of posts are difficult to moderate and don't contribute much to the subreddit.
Evinceo@reddit
Nice emdashes.
If you're using a bot to compose posts like this, yeah, I think your job may be in trouble.
Zestyclose-Trust4434@reddit (OP)
i don’t see a problem if i give GPT my thoughts and ask it to rephrase it
Traditional_Till3816@reddit
do you not see the irony of that?
Emotional_Act_461@reddit
The irony doesn’t invalidate his questions though. Care to respond to those instead of deflecting to a non sequitur?
Traditional_Till3816@reddit
Honestly, if they are not going to put forth good-faith arguments, why should I. To answer the questions tho: I think there will be fewer software engineers going forward, but I think that was already the case. Companies over-hired during covid and then learned that more devs doesn't necessarily mean things get built faster (https://en.wikipedia.org/wiki/Brooks%27s_law). Most LLMs can make standalone software projects, but they start falling apart when integrating with other environments and existing codebases. This will improve as time goes on tho, so you will see less of a need for junior developers.
Emotional_Act_461@reddit
Those are good faith questions though.
Zestyclose-Trust4434@reddit (OP)
it’s about embracing AI, not letting AI take over your jobs. chill out. it isn’t a big deal
Jadien@reddit
LLM rewriting is a lossy process. Whatever thoughts you wanted to share, the rewritten version will contain less of them.
Zestyclose-Trust4434@reddit (OP)
you’re not using the right LLMs
Jadien@reddit
Literally any rephrasing by an entity that does not contain the internal state of your brain is lossy.
Zestyclose-Trust4434@reddit (OP)
i don’t want to get into an argument with a low IQ human
metaphorm@reddit
do better than this. this is a forum where professionals discuss their work with each other. act like a professional.
Jadien@reddit
You asked a question. I answered. You don't like my answer and now you're making it personal when it wasn't before.
You know what's low IQ? Not being able to write and needing a computer to do it for you.
thetdotbearr@reddit
lmaooooo roasted
YzermanChecksOut@reddit
"in this AI-driven future"
tetryds@reddit
Ooooof
dont_take_the_405@reddit
There will be a boom in startups (not necessarily AI startups), most of which will be small teams of 10-20 individuals doing the work of 100-200 pre-AI individuals. Companies like Cursor (I think it’s <10 employees) are already there.
Larger companies will either keep trimming headcount or expand depending on whether AI automates more or unlocks more code (features) to be written.
jrdeveloper1@reddit
Yep - this is exactly where I think it’s going. Most devs and people are not willing to admit this.
Zestyclose-Trust4434@reddit (OP)
but what’s really the right side?
jrdeveloper1@reddit
The side that grows exponentially over time. The AI.
cap87_@reddit
source: trust me bro
dont_take_the_405@reddit
OP asked for my take. I presented my opinion. I could be wrong but from my work experience this is how I see things are headed.
cap87_@reddit
Fair. Having used a lot of AI tools myself, the most value I could get out of them is by using them as a "nicer auto-complete". I can also see it being very valuable for quick and dirty PoCs.
I can't see it making anyone 10x more productive, but maybe I'm holding it wrong
baconator81@reddit
As a C++ dev.. none of the AI agents have impressed me so far.. I guess C++ is so chaotic AI just can't figure out wtf is going on here :D.
ComputerPretty3565@reddit
C++ itself is not that chaotic; old codebases, usually written in C++, are the chaotic ones. I'm interested in how your C++ is so chaotic.
baconator81@reddit
Well, that’s the problem. The code was originally written in C, then moved to pre-C++11 C++, and now there are some lambdas in there.
Also, the newest C++ isn’t that much cleaner if you ask me. Once you get to class templates like std::enable_if, there’s still a lot of wtf thanks to SFINAE.
davvblack@reddit
One question is... what is the latent appetite for features from eg. SaaS? The idea of ai-based layoffs is predicated on the idea that we have to get XYZ work done per year. But the way we arrived at that amount of work is by looking at our hiring budget and the landscape around us.
What if a new startup shows up that hires the same number of devs as traditional companies had a year ago, AND leverages 10x AI on top of it (that, for the sake of argument, fully materializes). Then we'd see that startup suddenly be able to compete with long-term software on the market. As this begins to happen, you may see the "overton window" of a reasonably productive software company shift significantly, and the expected output to be way higher than it was previously.
A different way to think about it is: if every feature became 1/10th the cost that it was before, would you pay more total $$ for your features or less? The doom and gloom expects that we will suddenly be paying less for features as an industry, but I don't buy that argument. I think we'll see an explosion in demand for more and more niche software that can be provided cheaply, with deeper and deeper features.
Zestyclose-Trust4434@reddit (OP)
yeah great point. i have thought about this and gone down a rabbit hole: let’s say each feature takes a minimal amount of time. ideally that means developers would be pumping out more features, but features are somewhat gated by when the end user actually uses them and gives feedback.
so the final burden would fall on the devs, who’ll be cast out, because either management wouldn’t want to try multiple features or the client wouldn’t be responsive, and the pipeline of features would be limited.
the impact would come a little late because of the backlog of features that exists right now.
but if i have to decide today whether to change my path to engineering manager/consultant or stay a senior engineer, what do you think would be the right pick?
davvblack@reddit
sorry i have no idea. The other facet we haven't talked about is that all those other jobs are being "made more efficient" (hence redundant? maybe?) with AI as well. It all comes down to how good AI is at each of these jobs, once the dust settles and things hit a true plateau.
t0rt0ff@reddit
There is no strong argument that AI will not significantly change how engineers work.
At the same time, as with any progress, people who are not willing to adapt to new reality will become less and less needed by the industry. Don't be afraid, embrace the change and learn to use new tools effectively. How many people will stay out of job is out of your control, take care of what you can control.
tlagoth@reddit
Today I fought with Claude 3.7 and ChatGPT o4-mini-high to get them to do some pretty basic stuff: sort a list of 150 strings in alphabetical order, then create a dictionary with that list as keys and values taken from a second list of slightly different but related strings.
I kept going out of curiosity, but if I had coded it myself it would have been much, much faster.
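For scale, the whole task is a couple of lines of Python (the lists here are illustrative stand-ins for the actual 150 entries):

```python
# Stand-ins for the actual 150 entries and their related strings
keys = ["banana", "cherry", "apple"]
values = ["banana-id", "cherry-id", "apple-id"]

# Sort the first list alphabetically
sorted_keys = sorted(keys)

# Map each original key to its related value
mapping = dict(zip(keys, values))

print(sorted_keys)       # ['apple', 'banana', 'cherry']
print(mapping["apple"])  # apple-id
```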
If these production grade agents are anything like current models, the only things I fear for are the codebases, and the people having to review and fix what comes after.
I tried Cursor recently with the agent mode, in a big codebase. The “agent” would consistently and confidently make mistakes, causing a cascade of modifications throughout the code. When an error happened, it kept trying increasingly wild solutions that didn’t fix it, in a loop, until the conversation ended with an error due to the 15-queries-per-run limit. The solutions often included third-party dependencies that were helpfully installed for you.
It feels like AI is going towards the path of making good answers take more and more requests, even for super simple things. Not sure if this is by design, or technical limitation.
With autonomous agents, the number of requests has the potential to explode - and possibly without adding value, or worse, causing issues, if they are anything close to what is currently available.
For those to really be a threat, I would expect a very significant improvement over the current versions. I think just scaling current tech is not going to be enough to get there, at least not with quality results.
Zestyclose-Trust4434@reddit (OP)
have you tried codex or jules? my thought process is that these new tools are being built production-first, even claude x copilot
which basically means they trust them to produce production-ready code, which is the scary part
tlagoth@reddit
I think if they had models that good, they’d already be in production without agent mode. They wouldn’t pass up the opportunity to earn twice: once with a smart GPT and then with a smart agent.
Zestyclose-Trust4434@reddit (OP)
lol what are these allegations? these are genuine questions, as i am on the cusp of changing my job and have to decide whether to go down a consultant or a dev path
ofc i gave it to GPT to check for grammatical errors. plus whatever you sent is not at all similar
kregopaulgue@reddit
Go consultant, free up some space in SWE, lol
CalmLake999@reddit
The designer tool is so bad, I tried it a few times.
Claude Code, on the other hand, I've been using a lot. I burned $400 in credits today running multiple agents vibe coding a new product, which I finished today.
jrdeveloper1@reddit
How are you burning $400 on LLMs?
You should look into local LLMs then, even if they’re inferior, just to test things out.
CalmLake999@reddit
Nah local LLM way too slow. I'm running Claude Code in multiple terminals working on different components. I could easily go to $1000+ in one day hehe.
metaphorm@reddit
$1000/day is enough money to actually hire a human. the value proposition starts deteriorating rapidly at that level of billing.
CalmLake999@reddit
I'm a 20 year dev with over 50 platforms behind me 😎 some with millions of users.
I can get WAAAAY more done running multiple agents and myself. Last 3 weeks I had 4 people wanting different apps, I managed to do most of the work already.
2 experienced/senior developers in my experience is just way slower.
jrdeveloper1@reddit
Check out Meta’s Llama, it’s actually not that bad. There’s even DeepSeek.
hitanthrope@reddit
The technology really is very good. I do sometimes wonder if the people who deny that are seeing the same tech I am.
We had an LLM write a unit test for us today. It did a very impressive job. Caught all the various edge cases that you would typically test, covered the branches and created a test for each one with a clear descriptive name of what it was going to verify. It *did* use an assertion library we don't use, so we told it what we do use and it fixed it.
Before anybody goes raging off on me, *no* we are not in the business of pushing this stuff up without going through every part of it, but it gets a lot correct. It's very good. I can't say I am not impressed. I'd have probably told you I wouldn't see this in my lifetime 20 years ago.
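For a concrete picture of what I mean, a generated test of that shape looks roughly like this (toy function and names invented here as a Python stand-in, not our actual code):

```python
def clamp(value, low, high):
    """Toy function under test: restrict value to the range [low, high]."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

# One test per branch, each with a clear descriptive name of what it verifies
def test_value_inside_range_is_unchanged():
    assert clamp(5, 0, 10) == 5

def test_value_below_range_is_clamped_to_low():
    assert clamp(-3, 0, 10) == 0

def test_value_above_range_is_clamped_to_high():
    assert clamp(42, 0, 10) == 10

def test_inverted_bounds_raise_value_error():
    try:
        clamp(5, 10, 0)
        assert False, "expected ValueError"
    except ValueError:
        pass
```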
So with that in mind, where the fuck are we headed? I don't know. I don't want to sound overly romantic but I really feel my hard earned skill is, in large part, a way of thinking. I like to think that will remain useful as the tools and environment changes, and I think it will.
An interesting question that I have been pondering recently, is what happens if we discover that the next "leap" in LLM produced software is to have the machine produce very low level instructions. The notion of a machine producing C# (e.g) just for it to go back through the compiler again is perhaps a less than perfect way to do it. It's a bit like the idea that self-driving cars would be a simple matter if it was *only* self driving cars on the road...
I can't really approach any of this stuff with any kind of certainty either way.
Gave a guest lecture once, kid approached me after and said, "Wow, I bet you have seen a lot of changes in your time in the industry!". I was 32 at the time. I felt my heart begin to break. Then it fully snapped in half when I realised the fucking kid was probably right....
More change ahead.
U4-EA@reddit
It's not Artificial Intelligence, it's Machine Learning.
I personally have concluded that ML code is the best thing that can happen to experienced devs. In the coming years there is going to be ML slop code everywhere; it will eventually fail under real-world stress testing and need to be completely rewritten by experienced developers. And today's juniors who lean on ML code won't build skill doing it. We will end up with a bifurcation in skill, and a global shortage of the skill required to clean up the mess created by ML code.
metaphorm@reddit
I entered this prompt into chatGPT "write a post for reddit with the title Anthropic Event has scared me as a developer" and it output something that looks AWFULLY CLOSE to what you wrote
is that what you did? are these really your original thoughts or is something else going on?
Evinceo@reddit
We're in Emdash summer and OP was too lazy to strip them out.
HansDampfHaudegen@reddit
It happens the same way as a big company outsources work to small contractors. The technical people are boiled down to a skeleton crew that checks deliverables. Except that the contractor is an AI agent. I've seen it in a previous industry I worked in in the 80s. Think 10 people in a corporation of thousands. Getting a foot in the door with that skeleton crew is exceedingly tough. I got out of this altogether.
duddnddkslsep@reddit
I welcome it, sick and tired of unpredictable bugs after refactoring spaghetti code from ten years ago
cd_to_homedir@reddit
As opposed to bug-free AI-generated slop?
Winne_Pooh@reddit
I'm experiencing a bit of cognitive dissonance with this... On one hand, writing code is just a small part of what I do, and while these tools are mind-blowing, they still struggle in complex real world environmens... On the other hand, I've consistently underestimated how quickly and how far this technology would evolve, so I don't trust my own judgment.
Former_Dark_4793@reddit
that’s all BS, none of that is gonna happen. AI is good but it can’t replace the devs
poipoipoi_2016@reddit
> Without traditional dev hierarchies, the need for project management could shrink.
Already happening IMO. The continuous multi-decade move to make literally everyone full stack developers has been ongoing for decades and will be ongoing for decades more. But I've always been my own PM.
> Product managers might run A/B tests in production directly, reducing the need for UX and client research teams.
They already do this, that's WHY they have so many UX and client research teams.
> Large companies could become far more efficient and pivot faster.
Says someone who's never worked at Google. Believe me, the speed of code is not Google's problem.
fixermark@reddit
I think there's going to be a lot of people who have to look at themselves and ask whether the fun part of the work was making the computer do things / solve problems or mechanically manipulating little brain puzzles to make the computer do things / solve problems.
For the former, these technologies (if they work) will let them do those things faster.
For the latter, this is going to be a painful change akin to the camera turning painting-by-hand from a necessary skill for creating pictures of things to a neat thing someone can do because they can.
ceirbus@reddit
Still need a person to say it’s correct, the design is good, it’s bug-free, and it meets regulatory compliance per industry. I think we are fine.