Not getting dumber with company wide AI push
Posted by CocoaTrain@reddit | ExperiencedDevs | 119 comments
Hey, so I work at one of the companies where our CEO is really in love with AI. We've got a company policy to push for AI usage everywhere, in all departments. We're getting all sorts of tools. We also have dedicated people who, alongside their usual work, need to work on finding new tools and use cases and educate others on using AI more.
While I can appreciate the benefit of e.g. having someone to talk to about ideas, I sometimes get afraid that I will use AI too much and kinda forget how to code. You know how it is: if you use a tool, sooner or later you become dependent on it. And AI, when it comes to code, can actually sometimes do the thinking for you.
Do you have similar thoughts? That you'll use AI so much that you'll become dumber and just start forgetting your skills for code development, debugging, etc.?
MachineOfScreams@reddit
I find that most people in love with AI tend not to do much work to begin with. That being said, if you use it mostly as a syntax converter between languages with lots of training data, it's not the worst thing on the planet. If you're outsourcing your critical decision-making skills to it… well, then maybe don't?
codescapes@reddit
The problem I've found isn't with responsible use of AI, it's the people just shitting out dysfunctional code / analysis and then wasting everyone else's time. I've been in meetings where someone thrust their phone in my face with an AI response to some question and it wasn't even relevant. Like fuck off if you've not even read it and parsed it before speaking to me, it's insane.
There's a shockingly bad engineer on my team and they are now 10x worse because they churn out garbage so quickly and then pester for PRs that are not even slightly ready. It's far worse than what a novice would produce. Recently they were trying to pull in dependencies for a completely unrelated UI framework because their LLM went down a rabbit hole.
People who use it well can really leverage it to productive ends. But if it makes careful devs 3x more productive, it makes careless ones 10x faster at making spaghetti code. I swear there needs to be a push to performance-manage people for inappropriate / low-quality work derived from AI.
"AI did it, I just copied" is a professionally disgraceful thing for someone to say when questioned on their code, it's total negligence. Cannot comprehend how some people think it's ok.
MiniGiantSpaceHams@reddit
The people you're talking about are the ones who are about to start losing their jobs to AI. We used to need them, because even junk code might be better than no code in some cases. This is how the whole offshoring industry survives.
But now that the competent devs can produce more (good) code, the junk devs and their junk code aren't needed. There's a lot of understandable fear around AI, but I think we can probably all agree in hoping this part happens quickly.
fireblyxx@reddit
Honestly it’s a mixed bag even with skilled and experienced developers. Some people just end up going all in on the AI and just blow up scope because they think the AI will refactor things better or improve reliability.
The war I'm expecting to have is over the memoization of everything in React, because that's what the AI will do if left to its own devices. Except now we're throwing shit into memory that rarely, if ever, gets re-rendered, or that might be irrelevant because it's being rendered exclusively in a server-side context. But now you're fighting engineers who should know better, but have gotten taken over by "prompt engineering."
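To illustrate the pattern (the component, props, and numbers are invented, not from the thread): the memoized value below is trivially cheap, so useMemo only adds a cache entry and a dependency check, and it does nothing at all if the component renders exclusively on the server.

```tsx
import { useMemo } from "react";

// What an agent left to its own devices tends to write: memoizing a
// computation that costs less than the memoization bookkeeping itself.
function PriceTag({ amount }: { amount: number }) {
  const formatted = useMemo(() => `$${amount.toFixed(2)}`, [amount]);
  return <span>{formatted}</span>;
}

// The boring version is simpler and effectively just as fast.
function PriceTagPlain({ amount }: { amount: number }) {
  return <span>{`$${amount.toFixed(2)}`}</span>;
}

export { PriceTag, PriceTagPlain };
```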
nitelight7@reddit
React Compiler is coming
wwww4all@reddit
React is horrible for AI generation, because so much context and nuance are needed for building feature components.
But, since there are so many React code examples, from all the hello-world Pokémon half-finished project attempts, it probably has the largest training sets.
So, you’ll get generated garbage React code even faster and fed back into AI training sets.
ALAS_POOR_YORICK_LOL@reddit
That'd be amazing, but I'm pessimistic. The people that paid for junk code will likely continue being idiots.
Mateusz_Zak@reddit
I feel you.
I feel you.
A good rule of thumb is to limit human-LLM interaction to the PR level. Constrain your codebases. Use automatic enforcement for linters, tests, and quality metrics. Let the process deal with the sloppiness. Talk with your architects and CTOs about that. I personally see the renaissance of TDD as a net gain.
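As a sketch of what "automatic enforcement" can look like (the rules below are core ESLint rules, but the thresholds are invented; assumes a recent ESLint that accepts a TypeScript flat config):

```ts
// eslint.config.ts: quality gates run in CI, so sloppy LLM output fails
// the pipeline instead of eating a human reviewer's time.
export default [
  {
    files: ["src/**/*.ts"],
    rules: {
      complexity: ["error", 10],                        // cap cyclomatic complexity
      "max-lines-per-function": ["error", { max: 80 }], // cap function length
      "max-depth": ["error", 4],                        // cap nesting depth
      "max-params": ["error", 5],                       // cap parameter count
    },
  },
];
```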
appoloman@reddit
Yep, this is it. Managers think every developer they hire is a net contributor to at least some degree, so making everyone faster must increase value. The managerial class needs to reckon with the fact that many of their employees are net-negatives, and that making them more effective will make things worse.
You can say that these people will be fired all you like, but you need to also answer why they haven't been fired up until this point.
bizcs@reddit
You said more effective, but you meant more efficient. More efficient at producing garbage is not effective, and is probably less effective given the downstream cost of their "efficiency". Improving their effectiveness would mean putting gates on their output and ensuring their contributions actually add value.
bizcs@reddit
The sort of laziness you describe is one of my great fears in all this. It's the worst version of "move fast and break things" carried to its logical conclusion. Democratizing laziness is a bug, but sold as a feature. Implied is a tax on the non-lazy people to protect against the lazy, and I don't like it.
drumDev29@reddit
Honestly if I was a manager and I saw one of my engineers say that it'd be an instant firing
codescapes@reddit
I can always tell when it's AI slop from this particular engineer because our LLM loves to put in unnecessary comments like "this sets the counter" or whatever that they leave in. Stuff nobody would write and is bad practice.
Tbh I might bring it up with our team lead, it's super disruptive when you can't trust a colleague and have to overly scrutinise their work on the assumption it'll break things.
RandomUsernameNotBot@reddit
What do you do when it’s the tech lead churning out garbage too? Those little comments you mentioned are absolutely littered throughout the code base. I’ve been going through and adding tests to a few of them and found over half of them are not working exactly as intended. I’m tired…
uniquelyavailable@reddit
Focus on quality, not quantity. AI can be a useful tool when you work alongside it and use it to throw ideas back and forth while you write code. Maybe the AI can help type out a few small sections, but you should usually be reviewing that code and making changes to it every time, because (hopefully) you know more about the context of your codebase than the LLM does.
squeeemeister@reddit
Early, early on in my career I started interviewing people. I interviewed one guy who couldn't remember basic HTML syntax on a whiteboard because he was so used to Dreamweaver's autocomplete. Because of that interview I've mostly avoided autocomplete features in all IDEs.
In my view, LLM autocomplete will mostly make a generation of programmers who can't write a line of code without it. And after twenty years of big-tech bait-and-switches, this is all by design.
baconator81@reddit
I never felt like that, because I end up having to read through so much code anyway. And that's what usually ends up happening.
Mystical_Whoosing@reddit
Did using code completion or syntax highlighting make you forget how to code? Don't worry, keep learning the new tools.
cuntsalt@reddit
Part of why I won't incorporate AI into workflows is that I don't want my brain to degrade.
xDannyS_@reddit
I limit my use as much as possible as well for the same reasons. It's not even just AI, but a lot of tools in general. Relying too much on autocorrect made my spelling horrible. Switching from doing math with pen and paper to digital made my math abilities degrade significantly.
So, yeah, I'm trying to be as aware of the consequences as possible. A good example would be using it to get an understanding of a foreign/new codebase by having the AI explain it to you. I see this recommended here often. I do worry whether this is going to affect my ability to read code without it. At the same time, I try to ask myself whether it's worth the trade. In this case I think it is, as long as I don't abuse it too much.
A team of 6 people at work started using AI to write notes for various things. It didn't take long for everyone to notice how quickly their quality of work degraded and how much slower it was to work with them. Every time they needed information on something they had to look it up. When you write notes by yourself, the information sticks. When you let AI do it, you don't get that benefit. At the end of the quarter they were all so far behind that every one of them got fired.
cuntsalt@reddit
Yes. There is very long-standing evidence that writing aids your memory, particularly hand-writing versus typing. It is why I still take handwritten notes.
It is a similar principle to me as relying on tools that are baked into a given system (e.g., Linux) versus relying on tools that have to be installed and configured. Which is not to say it's a dogmatic "never, ever do that", but that each tool needs to be evaluated carefully for benefit versus degradation of base skill. If you ever can't access the tool you are used to for some reason, you want your skills to be available.
I've not found AI output correctness and usefulness to be worth the risk, personally: believe me, I'd be into it if so.
bizcs@reddit
How often do you try using AI for things? I've found the sweet spot to be doing something where I know what the outcome should be and it's mostly obvious how to get there (I can verify the result easily), but I don't want to spend the time on getting there.
One recent example of this for me was writing a script to parse some data out of an XML document. I sort of remember how to achieve this, but it's not something I do very often. I could've figured it out in a few minutes, but I was able to pass it off to an AI system and solve it in less than a minute.
Another was trying to do a pivot operation in SQL Server: I'd encountered, and used, that syntax many times in the past, knew it existed, and needed it for something I was trying to do, but have not spent enough time writing that sort of query to remember how to do it off the rip (I always have to open the docs). I was able to pass the labels for a pivot operation and tell an AI system what the eventual output should be, and it just nailed it. I'd call that improved recall + typing efficiency.
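The XML task might look roughly like this (the document shape and field names are invented; assumes a runtime with a global DOMParser, such as a browser or Deno):

```ts
// Throwaway extraction script of the kind described above.
const xml = `
  <orders>
    <order id="1001"><total>19.99</total></order>
    <order id="1002"><total>5.00</total></order>
  </orders>`;

const doc = new DOMParser().parseFromString(xml, "application/xml");
const totals = [...doc.querySelectorAll("order")].map((order) => ({
  id: order.getAttribute("id"),
  total: Number(order.querySelector("total")?.textContent),
}));

console.log(totals); // [{ id: "1001", total: 19.99 }, { id: "1002", total: 5 }]
```

Easy to verify by eye, tedious to type: exactly the sweet spot described.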
cuntsalt@reddit
Specifics:
Trying to make it give me an HTML webpage made out of an image, with both Gemini and GPT. Both of them first embedded the image within the page, then gave me a grayscale HTML page that couldn't follow the design, then ultimately both ended up like this with GPT and like this with Gemini.
Trying to make it answer a poker question, GPT and Gemini. Objectively correct/incorrect answers. Fun note, for the last round, Gemini thought for almost two minutes and offered me two responses at the end of that transcript. The one I didn't choose was continuing down the path of providing incorrect hands where a straight was still possible.
I will readily admit a lot of my resistance is ideological and in defiance of the hype surrounding AI. When the dust settles and it becomes less of a marketing cudgel to bash us over the head with "AGI soon bro" and "replace all the devs bro" and more correctly viewed as the very fancy but limited autocomplete it is, I will be less resistant and use it more for the small, well-scoped, yes/no correctness answers that it is good for (I tend to be a late adopter in general).
Assuming it remains financially accessible, that is -- I'm also not convinced the current subsidized lose-money pricing models will remain (i.e., when the companies start charging what a subscription should actually be worth to make money, it might become astronomically expensive and thus die off). That part really scares the crap out of me with the cognitive offloading thing: if we offload thinking into these things, forget/degrade our thinking, and they suddenly get a lot more expensive, what then?
AccomplishedLeave506@reddit
If you want to build muscle you have to lift weights. You can't employ someone else to lift the weights while you watch. The next decade is going to be dire as we see people enter the workforce who never learn to do the job.
the_pwnererXx@reddit
Sorry, this paper is bullshit
Yes, if you use ai for a task you won't think as much during that task. No shit, if I ask an intern to do something for me, I'm not the one doing it.
ill_never_GET_REAL@reddit
Alright bud
Ciff_@reddit
You should not apply less critical thinking when using AI.
cuntsalt@reddit
Do you think there are no cumulative, long-term effects from offloading critical thinking tasks?
the_pwnererXx@reddit
If that were true, you would expect to find evidence that critical thinking increases your intelligence over time. Unfortunately, that's not really the case.
Depending on the person, I'd expect you would get lazier. But on the other hand, some people will just use the extra time/energy they save to do other things
bizcs@reddit
Code isn't the end goal of thought, it's a product of it. The end goal in software (in commercial settings) is identification of a solution to a problem that is economically viable and useful. I'm not willing to wait a year for the result of a sophisticated analytical process if the time horizon it applies to is the next 12 weeks, for example, but I would be willing to wait a week if I thought I could make use of it over the course of the next 11 to impact my operation.
I'm not a fan of AI everywhere as a policy. I don't think the capability is there yet, and folks are buying into a sales pitch that has not yet been realized. That said, there's sparks of Claude and other tools being good enough that I could delegate significant chunks of thought to the tools for a feature and get the shit done a lot faster. Practically speaking, it could mean that I get to think about the product more and implementation less. The effect on the workforce is dynamic and I have no idea how to predict it (there are many possible futures). The problem is that increasing productivity at shipping features isn't going to be the fundamental constraint... It's going to be identifying which features to build. But being able to deliver quickly so I can go identify and ship the next feature is still very valuable if it means I can increase features per year, or decrease bugs incurred per feature.
Keep in mind, a feature may be infrastructure stability: if you create a marginal improvement in reliability that decreases support cost or eliminates a failure mode, and that reliability improvement allows incremental profit (profit, not revenue!), it's a good idea.
As a professional, working for a company, it's a good idea to seek to maximize shareholder value. There's a ton of stuff in software that I bucket as "maintenance" that is valuable only because it ensures that software assets remain assets and don't become liabilities. This includes things like integrating new versions of packages in the stack to address vulnerabilities, or patching the software because it didn't address some corner case in initial development (that could have been easily predicted), or any of the other ways that we have to waste time on nonsense where I'd prefer to allocate time more usefully but can't because I need a smart person (myself or one of my colleagues) to go work on a menial task that was generated by a choice we made days, weeks, or years ago (including adoption of a package or framework, or provisioning of a service we now own). If I can leverage any kind of automation, AI or otherwise, I want to do that.
Is there risk of skill atrophy? Yes, 100%. It's still important to understand things deeply for yourself, though the set of things you need to understand deeply should decrease and become more coupled to theory and less coupled to implementation over time. The idea of information retrieval (aka "recall") will be enduring, but how it's achieved may not.
Competitive-Vast2510@reddit
Some of my colleagues have started to experience what you described.
One of them says: "I've become a mouse-oriented engineer. Even if the AI spits out a wrong answer, I just try again with a different prompt and try to copy-paste my way to victory. I've become too lazy to just write the damn thing."
Another one: "At first, I really liked having Claude guide me to decide how to proceed, but later on I've realized that I have become too dependent on AI to the point where I feel like I can't think much without it".
The CEO of the company I currently work for tries to push for AI as well. The issue is, I do not work in this field to satisfy companies' needs.
I just refuse to listen to any engineering advice from business-oriented people or to waste time/energy on concepts like "vibe coding".
Have done so since 2023, and have been doing just fine.
ALAS_POOR_YORICK_LOL@reddit
This is wild to me.
If anything I've felt the opposite. Sometimes interacting with the LLM gives me a boost of energy on a slow day.
PoopsCodeAllTheTime@reddit
Well, that's because the LLM is entertaining, not because it's great at solving...
https://youtu.be/QWngvJnoPJA
ALAS_POOR_YORICK_LOL@reddit
If they're entertaining, then I'm using it wrong.
PoopsCodeAllTheTime@reddit
Okkk
tetryds@reddit
AI is an enthusiastic junior developer who wants to look good and is not afraid to read some documentation. For my use case it is often useless: since I will review and refactor the thing anyway, I might as well just write it.
Also, always know 100% what each piece of the code you write does. Read the docs and try out different approaches.
akdulj@reddit
It's funny, because I also double- or triple-check the AI output. And then I have a diff tool open to compare exactly what it changed versus the previous output it created.
MinimumArmadillo2394@reddit
AI does really well at the things you do not want to handle yourself, e.g. unifying data models between frontend and backend, or making a GET endpoint that returns a single piece of data that you define.
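Something on this scale, say (Express, the route, and the payload are all invented for illustration):

```ts
import express from "express";

// Trivial single-value GET endpoint: the kind of chore worth delegating.
const app = express();

app.get("/api/settings/max-upload-size", (_req, res) => {
  // In real code this value would come from config or a database.
  res.json({ maxUploadSizeMb: 25 });
});

app.listen(3000, () => console.log("listening on :3000"));
```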
It does poorly at anything more complex, but I guess that's not what it's meant to do.
elnelsonperez@reddit
You guys must be using the absolute bottom-of-the-barrel AI, or have uncommon use cases. Claude Code is a core part of my workflow today, and as experienced developers, one must see it as just another tool that will obviously not work 100% of the time like you want it to. But that's why it's a tool you learn to use to be more productive.
PoopsCodeAllTheTime@reddit
If it doesn't work 100% of the time then it doesn't make me more productive 😂
ALAS_POOR_YORICK_LOL@reddit
Yeah, it's definitely more useful than just boilerplating a simple GET endpoint.
It's just another tool to learn and embrace.
11thDimensi0n@reddit
What baffles me is that in a sub called ExperiencedDevs there are countless so-called experienced devs hammering AI for inane reasons. You wouldn't hand your entire codebase to a fresh-out-of-uni grad / junior dev alongside a list of requirements for full-on feature work and tell them "do this for me from scratch, start to finish", but somehow that's not only acceptable but actually expected of tools that have barely seen the crack of dawn.
And when it invariably doesn't do everything, it's because it's absolute crap and completely not worth the time of day.
A dev is many things, but any half-decent one, and even more so a so-called experienced one, should be adaptable.
I see countless of these people full-on reject "AI", and I'll stand by the fact that these are the same people who, if we rolled back the years to 2015, would be the ones refusing to use Stack Overflow because real devs can do everything by themselves, or, were it 2005, no way they'd be using Google or anything other than programming binders written in the 90s by a dozen authors.
It’s truly baffling that people will die on hills like this purely out of fear of hypothetically being perceived as less than their peers.
Use the tools that you have at your disposal to your advantage, people. If they're good at autocompleting, don't use them to generate high-quality code, the same way you wouldn't use Word tables to do Excel work. If their knowledge cutoff is 2023, don't expect them to be aware of the latest frontend JS flavour-of-the-month library. And so on and so forth.
Jiuholar@reddit
100% this. I was pretty anti-AI up until quite recently; my work got us an AI IDE plugin and I just committed to integrating it into my workflow.
It's absolutely not perfect, but damn if it doesn't save me shitloads of time. Writing unit tests completely by hand is a thing of the past for me. Ask AI to generate unit tests > review and tweak > add missing edge cases > done.
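That loop might look like this (vitest, and the applyDiscount function under test, are invented examples):

```ts
import { describe, it, expect } from "vitest";
import { applyDiscount } from "./pricing"; // hypothetical module under test

describe("applyDiscount", () => {
  // Happy-path cases of the kind the AI drafts on its first pass:
  it("applies a percentage discount", () => {
    expect(applyDiscount(100, 0.25)).toBe(75);
  });

  it("leaves the price alone at zero discount", () => {
    expect(applyDiscount(100, 0)).toBe(100);
  });

  // The missing edge case added by hand in the review-and-tweak step:
  it("rejects discounts above 100%", () => {
    expect(() => applyDiscount(100, 1.5)).toThrow();
  });
});
```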
It's absolutely amazing for putting together quick shell scripts, helm charts, db queries, and reviewing code.
Writing feature code? Not quite there yet for me - particularly in legacy, pattern + abstraction heavy code bases. But it's incredibly good for reviewing + improving small snippets. I often write quick and dirty code that meets the functional criteria, and then get AI to clean it up for me.
People that don't integrate it into their workflow will 100% get left behind. I can move so much faster with it.
TastyToad@reddit
100%. The best thing about it is removing friction - I know (or used to know) how to do all these things by hand but I don't do them daily. So I ask AI, review, tweak a bit and go back to my main stuff in no time.
As for reviewing code, not so much. Maybe because we've always been heavy on automated code quality checks and have relatively few juniors, but I don't find it adding much value there. YMMV.
appoloman@reddit
No, I wouldn't, but my leadership absolutely would if I didn't stop them. Now they can bypass me and do it themselves.
Organic_Ice6436@reddit
I bet these same people would have been against refactoring tooling when it came out.
tetryds@reddit
Nobody was ever against Stack Overflow or code snippets or anything like that, but senior devs have always warned about and felt the pain of copy-pasting code. AI just automates this process.
Incompl@reddit
Adapt or die
MathematicianSome289@reddit
Compound Gen AI systems are far past early career devs.
rajohns08@reddit
Out of curiosity, what agent and model do you use?
Humble-Persimmon2471@reddit
This, so much... I hate it when one of my colleagues just says "oh, GPT wrote that", seemingly proud but having no clue why...
SynthRogue@reddit
This^. Especially the part where you need to know exactly what every piece of code you use does.
TheCauthon@reddit
Instead of technical debt - AI users are creating knowledge debt.
Taking the efficiency now while pushing understanding down the road.
This doesn’t apply if you are using AI to learn.
wwww4all@reddit
It’s turtles all the way down.
LLoyderino@reddit
quite sure that knowledge debt leads to technical debt as well
MasSunarto@reddit
Brother, I think I am the only one in my department who doesn't actively incorporate LLMs into my workflow. My bossman and his bossman know it too and they're fine with it. The reasoning is that my tickets are easy enough for me (to the point I ask my bossman to increase my workload) and I actually quite enjoy throwing feces at the wall and seeing what sticks in the end, brother. As for the acceptance of LLMs, I think engineering managers and above are using them quite frequently, as I think their time is more valuable than mine. Hahaha.
1he_introvert_glass@reddit
It already happened to me, so I'm doing personal projects to upskill, outside of office hours.
gowithflow192@reddit
Everyone can multiply their output or reduce delivery time with AI; that's why leadership loves it. Even if it's only a 5% difference.
Many in this sub are in denial about that. They just want to write off AI altogether. You'll get left behind.
Helen_K_Chambless@reddit
Yeah, this fear is totally legit and you're not overthinking it. I work at a firm that helps companies implement AI strategies, and we see this exact concern from developers constantly.
The "use it or lose it" thing is real with coding skills. I've watched our clients go from writing clean, thoughtful code to just copy-pasting AI outputs without understanding what the hell they're actually doing.
But here's what we've seen work for developers who want to stay sharp:
The real danger isn't becoming "dumber." It's becoming lazy. AI can make you incredibly productive if you use it right, but it can also turn you into someone who just glues together code they don't understand.
Your CEO's AI push probably isn't going away, so figure out how to use these tools without letting them use you. The developers who survive this transition will be the ones who stay curious and keep their core skills sharp.
g1ldedsteel@reddit
Probably in the minority here but I feel like my ability to learn has increased with AI. Not really using it to write code for me, but definitely using it as an advanced rubber ducky for working through higher level design things.
isurujn@reddit
That would be the ideal way to use AI. But most people just rely on AI to write code completely, using tools like Windsurf and Cursor. That's the problem.
syntaxfire@reddit
Definitely not disagreeing, but I think people like me and whoever else said they feel like it has made them smarter were also probably the people who never stopped asking questions in college and we now feel like we have unlimited free questions to the rubber ducky without needing to harass our coworkers :P
syntaxfire@reddit
Same, I have learned 2 new programming languages and am working on learning a new spoken language over the span of a year because of AI. I never ask it to solve problems for me, I treat it like the advanced rubber ducky for design paradigms and I also feed it GitHub issues when I'm evaluating tooling and ask it to produce "pros and cons" charts and add any limitations so I don't have to spend hours digging through issues before picking an open source technology. For language learning I ask it to prepare verb tables, study guides, and conversation exercises. For programming languages I solve leetcode problems and then ask it to critique them and compare my solutions against languages I already know. Saying "AI will make me dumber" is like saying "if I use the calculator instead of the slide rule I'll forget how to do math". I mean I definitely learned how to use both in college but when I need to solve a differential equation I definitely break out my TI-89 and not my slide rule, just saying...
ALAS_POOR_YORICK_LOL@reddit
Yeah it's pretty decent as a rubber ducky
ReadyReputation239@reddit
I haven't Google-searched any code issues in a while. AI has been pretty helpful.
NatoBoram@reddit
Well… AI does make you dumber, that's a fact, but not everywhere and not equally.
AI has the unique capability to position itself in areas of work where we should be using our brain instead. If we offload the critical "thinking" steps to AI, we become less good at it.
There's also the natural human tendency to forget skills when they're conveniently handled by something else. To pick a different domain, imagine scripting in a video game. Say you're playing League of Legends and you have a script that can tell you when the sum of your spells is going to one-shot the enemy. Calculating your damage vs your opponent's health vs their armor vs your armor penetration is a real skill in the game that the highest-skilled players are exceptionally good at. But if you have a cheat to calculate that for you, you become bad at it.
It's the same principle when applied to finding solutions to code problems. If you struggle with satisfying the TypeScript compiler, AI is often going to cheat it using `as` or some other unmaintainable bullshit, and if you accept that shit code, you become a shit programmer.
When using GitHub Copilot, the "copilot pause" becomes real, because you learn what it can conveniently do for you, and it can steal some amount of skill from you. To use it effectively while minimizing brain drain, I think you have to make yourself a plan before it generates code and only accept suggestions that fit your plan.
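Concretely, the compiler-cheating looks like this (the User shape is invented):

```ts
type User = { id: number; name: string };

// The shortcut an LLM reaches for: assert instead of validate.
const data: unknown = JSON.parse('{"id": "42", "name": "Ada"}');
const user = data as User;  // compiles, but lies: id is really a string
console.log(user.id + 1);   // prints "421" at runtime, not 43

// The maintainable alternative: narrow with a type guard.
function isUser(value: unknown): value is User {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Partial<Record<keyof User, unknown>>;
  return typeof v.id === "number" && typeof v.name === "string";
}
```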
CarelessPackage1982@reddit
People are learning that devs are replaceable. Businesses exist to make money, not to pay dev salaries. I love coding personally, but if as a business you can turn out AI-generated code and make money... that's exactly what they will do. And if it works, you won't have a job.
All that being said, if you're a senior engineer you won't have much to worry about. Those skills will never leave you. You've built enough pathways through your brain. A few repetitions and those pathways will return. Junior devs however, that's a different matter altogether.....
appoloman@reddit
A lot of people seem to be arguing about whether AI is a capable tool or not, but I don't think this is the issue, this is about trust between humans. Many are afraid of the net-negative contributors in their organization being given force multipliers. I am, especially as an AI tool will dramatically increase the apparent output of a net negative contributor, but will only be a minor to moderate aid to a net-positive contributor.
We all know organisations can barely evaluate engineers at all, so the idea that AI is a tool that's great if used well is moot. We have the same argument about agile all the time: it doesn't matter if it's great in theory if in practice it produces undesirable effects at scale.
appoloman@reddit
I also have this fear, I'm trying to not use AI at all outside of work but it becomes this disgusting habit.
mildgaybro@reddit
I think the point is to forget how to code, why think when you don’t have to. That is the trend throughout history from industrialization, mass production, automation.
Strus@reddit
Using AI is a skill like all other skills, and if you won't use it you will be left behind by skilled people who do.
That being said, using AI for coding, e.g. with Cursor, is similar to painting by numbers, except that you are the one drawing the lines and numbers, and AI is a dumb tool that fills the fields with colors.
You are still "the architect" of your code. You need to know what you want to do.
Apart from that, AI is great at tedious tasks like formatting things, changing formats, generating test data, implementing stubs, fixing linter issues, generating boilerplate etc.
I too was very skeptical, before I spent a week intentionally "vibe coding" (I hate that term) everything and learned what Cursor/LLMs are good at and how to work with them to achieve what I want.
I also recommend reading this: https://fly.io/blog/youre-all-nuts/
NoleMercy05@reddit
Assembly developers said the same when K&R C came out.
seven_seacat@reddit
I've gone from being 100% anti-LLM, to "actually maybe it's not so bad when you get used to it", to actually trying to use it in my work and having to rewrite the vast majority of code it gives me. Or stopping it because it's gone down some rabbit hole of random issues, because it writes 500 lines of code and declares it production-ready when it doesn't even compile, despite your rules file saying that code must always compile without warnings and tests must always pass before finishing a task.
Yeah I'm not a fan.
jimjkelly@reddit
I think it's wild how much companies are "investing" in this given that the benefits are theoretical, and things like the latest DORA report show there may actually be a net-negative impact on product delivery metrics. Like, it'd be one thing to say "hey, we spent a little money per engineer, let's see what happens", but places are allocating serious time of their staff to try and figure out how to get any benefit…
menckenjr@reddit
Word. There are companies who have established whole teams to try and burrow this stuff into everybody's workflow (whether or not we need it) and it's just as obnoxious as you'd expect it to be.
Pleasant-Direction-4@reddit
I personally don’t offload a logical problem I haven’t solved before. Plus if I get help from it, I make sure I understand the why’s, so I can recreate the how’s easily
silentjet@reddit
Have you ever tried to code without access to the Internet? Without googling? Without IntelliJ or any other code completion tools?
About 15-20 years ago, that's how the technical part of recruitment happened. I remember I got a FreeBSD PC with gcc+make on it and vi as the text editor, and I was asked to write a custom memory allocator with some extra requirements. Fortunately libc is very reasonable in its naming, and argument types and their order in the API are predictable, so that was an easy task.
Doing the active tech part of hiring these days, I think maybe up to 5% of candidates would be able to write such code with no internet access... especially in more advanced programming languages.
silentjet@reddit
I wanted to say that what was previously known as StackOverflow coding would now be LLM coding. And software would still be developed by actively practicing software engineers...
Equivalent_Case_7049@reddit
47M here.
A good example is map apps (Google Maps etc).
Been using it for about 15 years in the city where I live, and my direction/navigation skills have definitely eroded.
Now that I have offloaded this task to the mapping app and have some brain space free, am I actually doing something productive with the extra "space"? Like mentally preparing for the meeting that I am driving to, or am I just going to listen to a podcast or music and just drive (nothing work-related in that)? But this boils down to the individual.
I am a software engineer by profession. Earlier, when I faced an issue I had to wade through Google search, which would direct me to forums where people discussed solutions. I ended up picking up some pearls of wisdom and gaining a wider understanding of why the problem was occurring, not just the solution. Nowadays I use ChatGPT a lot and it points me directly to the solution (blazingly fast, I must admit), and I miss out on the "wading through forums and reading up" bit. Yes, it's definitely faster now, but alarmingly I am finding that I am turning to ChatGPT for every hiccup I face in my day-to-day work.
So yeah, to answer your question - definite degradation. The wider consequences of this - it’s too early to tell.
Correct-Anything-959@reddit
You aren't going to forget how to code any more than a calculator will make you forget math.
isurujn@reddit
This is such a bad comparison. Are there calculators that you can feed an entire math problem and it spits out the answer?
You use the calculator as an assistance. You still have the knowledge how to solve the problem. Even if you didn't have a calculator, you can solve the problem, it just takes a little more time. That's not how these AI tools are being used.
howtocodethat@reddit
I learned development without SO, and now that AI exists it's a godsend for those code snippets I found once in a Stack Overflow comment and then could never find again.
Stack Overflow didn't make us dumber, and neither will AI. You need to understand how all the pieces of a program fit together and what the output of the AI is, and be able to defend it to your coworkers. If the AI is using even one piece of code you don't understand, ask it what it's doing. Then you'll turn it into a teachable moment. At least now we can do that, whereas you couldn't exactly ask the Stack Overflow user from 2 years ago why their solution works.
isurujn@reddit
Maybe AI isn't making any half decent programmers dumb. But I know first hand that it's not helping anyone actually learn.
This is an anecdote but we just got an intern a while back who's totally relying on AI to code shit up and it's making everyone else's life hell. More often than not, none of it works. He once literally changed a table structure of a web app without understanding why just because the AI prompted him to do so hours before a demo and broke the entire thing. There's zero learning going on.
And it's not comparable to using StackOverflow. I'm a self-taught developer and I owe my career to that site. Sure, there were times people would just hand me the complete solution. But a lot of the time, I had to take the solution given and modify it on my own to best suit my codebase. Besides people generally resorted to SO when they had exhausted all other options. With AI, you make zero effort.
There are ways to use AI to actually learn. Writing the code yourself first and asking for suggestions for improvement is one way. But the vast majority of people just plug their entire codebase into an AI using these AI-powered IDEs and call it a day.
Fair_Atmosphere_5185@reddit
When people offload learning to AI - it absolutely will make them dumber. It's just going to happen much younger
howtocodethat@reddit
Again, did you get dumber by reading stack overflow?
It has to do with how it's used, at the end of the day. If you copy-paste and don't make an attempt to understand the code, sure. If you spend the time to understand the response or ask the AI how to do something, you can learn SO MUCH.
sampsonxd@reddit
I think the difference is that with Stack Overflow, it provides a solution but you still need to work it in. By the time you've changed it to meet your needs, renamed variables, and thrown some comments in, you understand how it works.
Using AI you can just hit "gimme code" and it'll work out of the box. I mean, I hit build and it works, right… 3 months later things break, though, and no one knows what's going on with half the project.
howtocodethat@reddit
Plenty of people copy stack overflow answers and change nothing. Sometimes you have to change it to work in your project, but not always.
I think the real difference here is that it’s easier for the lazy person to get farther, and people mistake that for people getting dumber. It’s more like the “dumber” people are getting further in their career before they are caught for not really doing their job.
RedTheRobot@reddit
I think the problem you are running into is you are trying to convince people that absolutely have no interest in the benefits of AI. It is like trying to convince my grandma to use a computer. She sees no need to use one even though there have been multiple times I have had to do something for her.
There will be two types of devs: ones that learn to use AI in their workflow and ones that don't. Then in 2-5 years the ones that don't will wish they had. When the internet became big, so many devs said if you have to google it you aren't learning, Google is just telling you what to do. Now googling is the norm, and the same will happen with AI. These devs are the same ones that made SO so toxic.
howtocodethat@reddit
Very true.
Honestly, I don't care if someone else uses the tool. I'm not out here telling other people to use it; I'm just annoyed at the insistence that it's making you dumber. I don't care what studies say about it: whether you get smarter or dumber using a tool is all in how you use it.
When I was in school the teachers always said “don’t believe everything you read on the internet” and that still holds true. If you’re the kind of person who would have taken the first result on google or the ai summary at face value, you’re not going to learn or have good answers. But if you use it as a tool it can be invaluable.
Fair_Atmosphere_5185@reddit
It's like using a calculator. You don't hand it to a 2nd grader for them to use for arithmetic. You expect them to learn how to do it, and then apply it forward. Give them the calculator too early and they just learn to punch buttons.
Using AI is similar if you are relatively new with a set of technology. I've been coding for 20 years. I probably can use AI and not be harmed by it.
People using AI too early in their careers harms their learning and development process. By making everything happen at the snap of a finger, they never push through the mental load and actually master the underlying material. It's not supposed to be pleasant or easy.
Correct-Anything-959@reddit
My parents actually gave me a calculator right away and I had no trouble learning math.
The more I played with them, the more curious I became about the other functions and because the calculator was cool looking it felt like a toy and I played with math.
howtocodethat@reddit
I think that’s fair, though when you are learning I think it doesn’t hurt to use a calculator to check your answers at the very least
oromis95@reddit
I've seen it first hand: when you delegate your homework to AI without reading it, yes, people do get dumber. And he's talking about a learning environment.
Fair_Atmosphere_5185@reddit
Most devs under 10-15 years of experience are in a learning environment. Anyone using a new language, stack, service, etc is in a learning environment.
On a long enough timeline - offloading things to AI will make you dumber.
oromis95@reddit
I don't think asking chatGPT to write a JSON containing 75 home addresses, first and last names will make you any dumber. If anything you get dumber by doing it by hand when your time could have been better spent. Moderation is generally a good rule for everything.
AnimaLepton@reddit
It's definitely something to think about. You need to push yourself to do new things, get feedback, and do the actual rote work to learn and retain your skills. And while you'll likely keep some core skills, your broader skillset can absolutely atrophy within just a few months of disuse, let alone years.
On the flipside, AI is just a flashier aspect of the tradeoffs we already make with regard to knowing when something is "good enough" and moving on. You don't need to drink the Kool-Aid to find at least some shortcuts that save time. Those let me get my work done faster and do other things. Some of that is work-related: there's a lot of work that comes down to communication and consistency, not (just) the coding. But some of that is giving me extra time to take a few walks, get some small workouts in during the day, relax and eat lunch, or just get off work early.
I don't know about longevity in this industry firsthand, but I saw my parents struggle to find work even when they were in their 40s, and I don't want to have to work into my 40s and 50s and stay on the treadmill of constantly learning new tech. There are so many things I touch, even things I'm interested in, that I don't need to or have time to truly develop expertise with. My plan is to be able to retire by the time I'm 35 or 40, although of course plans change. As long as I can stay employed and employable throughout that timeframe, I'll be happy. I'll probably still work on projects that require some related or lateral skills. But I have a ton of other skills and hobbies I want to continue developing that I don't necessarily want to have to monetize. In the meantime, if AI makes my life easier in the moment, that's good enough for me.
al2o3cr@reddit
This reminded me of an article I saw recently that ended with the suggestion "try reading 'cocaine' in place of 'AI'" and it makes this post HILARIOUS 😂
beaverusiv@reddit
I recently took an online coding interview where the instructions were "setup your project however you want with whatever framework you like" so I set up a React project with MUI which is what I've built all my projects in for the last 3 years.
Interview starts, they give the prompt - which I immediately question because MUI basically has a UI component that does what they want me to implement and they say "yeah, you can't use that, also you can't use any React hooks which handle XYZ either". So now I have to remember how to do things without React and MUI and it was a lot harder than I thought it would be.
You really do get comfortable/used to the tools/frameworks you use, and it's not just AI
splicer13@reddit
Boss makes a dollar, I make a dime, that's why I [use the bathroom] on company time.
You've got an opportunity to, in fact are forced to learn about it. So do that. Nobody really knows for sure how this is going to affect them but business types are very optimistic it's gonna make you a cog. I think at least the top 10% will adjust and be fine. Who knows? So find out.
The faster you learn that you should become an electrician or whatever, the sooner you can start. And for union electricians, hiring is based on seniority. Also, you want to get the apprentice work in while your joints still work.
I don't know if AI is going to turn this industry into a burning pile of shit (kinda leaning that way) but you don't get to opt out unless you're a snowflake 10x indie dev.
Comprehensive-Pea812@reddit
Some say you get dumber with Google.
I did have a knee-jerk reflex to ask Google every time I needed to do something. Now it's AI.
The good thing is you can challenge AI and practice your critical thinking. If you don't bother to check and just swallow whatever AI gives you, then it's true, you become dumber.
SiG_-@reddit
Don’t raw dog copy and paste things without understanding how and why the code works?
nonades@reddit
Joke's on you, my colleagues struggled with that before AI.
RenTheDev@reddit
Reminds me of an interview process of a household-name company. They told me 1 of 5 rounds would be assessing my ability to use an AI powered IDE like Cursor. I couldn’t say “no” fast enough.
uuqstrings@reddit
I use AI as a proxy to advocate for test-driven development. Write your tests, have your AI write just enough code to pass the tests. Gives a human control over the outcome, encourages black-box systems thinking.
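For illustration, the human-written contract could be a spec like this (vitest and the slugify example are invented), with the agent prompted only to make it pass:

```ts
import { describe, it, expect } from "vitest";
import { slugify } from "./slugify"; // implementation delegated to the AI

// Human-authored tests, written first: they pin down the outcome
// black-box style, before any code is generated.
describe("slugify", () => {
  it("lowercases and hyphenates words", () => {
    expect(slugify("Hello World")).toBe("hello-world");
  });

  it("strips punctuation and extra whitespace", () => {
    expect(slugify("  Hello,   World!  ")).toBe("hello-world");
  });

  it("returns an empty string for empty input", () => {
    expect(slugify("")).toBe("");
  });
});
```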
MeTrollingYouHating@reddit
I mainly use AI for things I do infrequently enough that even if I did learn it, I'd forget it before the next time I needed it.
I'm never going to be a Powershell developer so twice a year, when I need to script something on Windows, I just blindly let Copilot write the whole thing.
I don't care that I'm not learning anything. I don't care if I'm writing modern, idiomatic Powershell. I do care that it took 5 minutes to write something that would normally take 2 hours.
Maiden_666@reddit
Wow do we work for the same company lol literally same thing is happening
powdertaker@reddit
Fun fact: AI is also non-deterministic, meaning you could (and probably will) get a different answer given the same prompt and input files in the future.
ListenLady58@reddit
At the end of the day, there will be devs who jump on the AI train and they will flourish because it will help them get things done faster. They know what they are looking for and therefore they will be able to make corrections to what AI gives as an output. They will, in fact, be faster and deliver more, therefore will be in higher demand for their work. People who resist using AI need to get faster than those who do or else don’t expect to stick around.
The fun part will be watching those who think they can just join tech and be a prompt engineer without any background in coding or engineering at all. That will be a massive disaster and it will be highly noticeable.
Double_A_92@reddit
AI barely helps me to actually solve problems. It's more like a very smart template engine.
E.g., recently I had to create a new mouse-click tool for our CAD software, and I told the AI to create a new tool that does this and that, based on an existing one.
The logic code it wrote was absolute nonsense, but it created the whole skeleton of the class, with sensibly named methods and comments in our company's style.
Beginning_Occasion@reddit
We recently had a discussion about how we think of creative solutions to problems. One person said he always asks ChatGPT for ideas. If you offload your thinking to ChatGPT, you will get dumber.
Another insidious effect I've noticed is that devs are seemingly less able to have coherent discussions about anything technical. Also, devs don't want to pair with other devs and share knowledge, as this may just involve one person watching the other prompt.
As an aside, it's interesting how all of the nirvana-like experiences I read about using AI seem to be in a solo-dev setting, and how too much AI in a team setting leads to no one understanding the code and a lot of frustration.
I'm not saying AI tools are a net negative, but rather that all tools have things they're good for and their dangers: one tool might cut off your hand if you're not careful, while another could give you RSI without proper form. I wish we could have these discussions concerning AI.
SoggyGrayDuck@reddit
I absolutely despise this shit. Why use a technology just because it exists? It was the same shit with data analytics for the past 10 years where everything was green lit but had absolutely no details or clarification of what the company wanted to do. The executives go to some conferences and get a 30 min rundown of what some startup did and they want the same results but don't know how to get there so they pass it onto the director, who passes it to the manager, who passes it to the engineers who have no idea what the business needs because the business doesn't even know. Then in 10 years they look back on what they spent and go "oh shit" realizing nothing actually panned out other than the standardized reports that could have been generated directly from the ERP system. I fully expect AI to follow this same path.
I just got done with a failed project that tried to create delivery teams that would operate like a startup. Unfortunately they didn't think about why that works and why people put in so much extra time to work through major problems: receiving a percentage of the profits or of the company! That's why those teams work; it doesn't work when everyone is on straight salary and simply wants clear job requirements and responsibilities.
gimmeslack12@reddit
I'd be excited if all these tools helped, but I can definitively answer that they make things move slower. The tech debt from autocomplete AI entries is gonna mount up fast.
fireblyxx@reddit
We recently had a big effort that required building brand-new components and scaffolding things so that they could be used more broadly later on. The LLM agent was invaluable for it, but I found it most effective when I had already strictly defined the requirements of the work and scoped the prompts in ways that I knew wouldn't blow things up if it got something wrong. So I still coded it, effectively, but in the way that a bunch of developers in a scoping meeting would do prior to actually writing tickets.
However, now that we're off the big-effort project doing typical sprint work, I find myself having less use for the agent.
andymaclean19@reddit
I think when AI starts to get it right the experience will be a bit like when you become a senior dev/architect/ whatever and start ‘programming with people’ - speccing things out and supervising instead of writing everything yourself. I didn’t find that particular transition meant I spent less time coding and I don’t think AI will really stop that either. Sometimes you will still code but not all the tedious bits.
a_lovelylight@reddit
I mainly use it for rubber-ducking and grunt work.
For example, I had a list of 100 hard-coded static fields I needed converted to an enum class. Java, so nothing complicated. I popped those fields into ChatGPT and got my enum class.
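The TypeScript analogue of that chore (the original was Java; the names here are invented) is a purely mechanical before/after, which is exactly what an LLM handles well:

```ts
// Before: a pile of hard-coded static fields (3 of the ~100 shown).
//   const STATUS_ACTIVE = "ACTIVE";
//   const STATUS_SUSPENDED = "SUSPENDED";
//   const STATUS_CLOSED = "CLOSED";

// After the one-pass conversion:
enum AccountStatus {
  Active = "ACTIVE",
  Suspended = "SUSPENDED",
  Closed = "CLOSED",
}

// Call sites then take the enum instead of loose strings.
function canClose(status: AccountStatus): boolean {
  return status !== AccountStatus.Closed;
}

console.log(canClose(AccountStatus.Active)); // true
```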
Then I ping-ponged some ideas with ChatGPT on how to best refactor all the classes that needed to use that enum because my manager was fucking insane and considered any form of refactoring or even reformatting to be a functionality change. Sucks to be fired but goddamn, that place was worse than the place that laid me off. 🤦♀️
I think AI encourages us to get lazy with critical thinking skills, so I don't rely on it for anything that isn't just to speed up my work, or for anything that really matters (ex: I don't care if ChatGPT is hallucinating facts when I ask about how we might stream updates to androids stationed on Mars).
I also get a little cute sometimes and turn off autocomplete in IntelliJ, but only for personal projects.
Constant-Listen834@reddit
Not really, I just use AI where it works and then I don’t use it where it doesn’t
ImaginaryButton2308@reddit
Syntax, concept explanations, and code snippets are where AI excels. It just can't do anything wide-scale really; it's too far from that.
BiscuitOnFire@reddit
I recently started using Cursor at work and it really helps with boring stuff like autocompletion; it does write code for me, though. Sometimes if I get distracted by what it's trying to propose, I simply turn off autocompletion.
I get your point, but I feel like it's just another tool, like Google or some lib's documentation. Nothing more.