The trend of developers on LinkedIn declaring themselves useless post-AI is hilarious.
Posted by VindoViper@reddit | ExperiencedDevs | View on Reddit | 326 comments
I keep seeing popular posts from people with impressive titles claiming 'AI can do anything now, engineers are obsolete'. And then I look at the miserable suggestions from Copilot or ChatGPT and can't help but laugh.
Surely being handed some ok-ish looking code that doesn't work, and deciding your career is over because of it, shows you never understood what you were doing. I mean sure, if your understanding of the job is writing random snippets of code for a tiny scope without understanding what they do, what they're for, or how they interact with the overall project, then ok, maybe you are obsolete. But what in the hell were you ever contributing to begin with?
These declarations are the most stunning self-own; it's not impostor syndrome if you're really 3 kids in a trenchcoat.
Jackdaw34@reddit
Perhaps they are an avid contributor at /r/singularity.
Comprehensive-Pin667@reddit
God I hate this subreddit. I started following it to stay on top of what's going on with AI but it's not really good for that. All they ever do is wish for everyone to lose their jobs so that they can get UBI.
Jackdaw34@reddit
Exactly the same for me. I joined it to have some specialized AI takes in my feed other than the general /r/technology posts, and damn is that sub off the deep end. They take everything that comes out of SamA or OpenAI as gospel with zero room for skepticism.
Yet to find a sub with good, educated takes on whatever's going on.
Firearms_N_Freedom@reddit
Also the vast majority of that sub doesn't understand how LLMs work. Many of them genuinely think it's close to being AGI/sentient
Jackdaw34@reddit
Close to? They are already declaring an unreleased model AGI because it’s scoring high on Arc AGI.
hachface@reddit
There is no accepted definition of general AI so people can just say whatever.
Ok-Yogurt2360@reddit
Changing the goal post! Changing the goal post! /s
Noblesseux@reddit
Say it again for the people in the back. There is straight up a guy in another thread that seemingly doesn't understand the concept that there is not a standardized test that can evaluate general intelligence, partially because in a lot of ways we don't really understand it.
A lot of the evaluations people are using are basically "we found something that the existing LLMs aren't that good at", and then when someone creates one that scores well on that largely arbitrary test, people unironically think it means the thing is an AGI.
Noblesseux@reddit
The vast majority of the entire internet doesn't understand how LLMs/SLMs/etc. work. There was a guy who got salty at me the other day because I pointed out in an article about PUBG adding in an AI powered companion that the SLM they're using is mainly just kind of a user interface on top of the NPC logic and is thus going to be much dumber than they're thinking.
The guy genuinely thought the SLM was controlling the character and thus it would be near-human in proficiency, so I made the joke that the L in SLM stands for Language not Let's Play, and then he got mad and blocked me.
JonnyRocks@reddit
r/openai might be good for you. Despite the name, it seems to be a very general AI subreddit. They aren't super pro-OpenAI or pro-Sam either.
Ok_Parsley9031@reddit
Totally. Every update from Sam Altman is treated as an admission of AGI.
Slight-Ad-9029@reddit
I am there often honestly and it's mostly just NEETs
Ok_Parsley9031@reddit
I was reading over there today and got the same vibe. Everyone is so excited but they have a very naive and optimistic outlook where the reality is probably much, much worse.
UBI? It’s far more likely that there will be mass job loss and economic collapse.
drumDev29@reddit
The owner class would much rather starve everyone off than pay UBI. They are delusional.
Noblesseux@reddit
Yeah, this is always a funny thing to me. The richest country in the world right now can't even be bothered to ensure that people working full time can afford homes, because we refuse to treat housing as shelter first rather than as an investment vehicle.
What moon rocks do you have to be snorting for you to think that country (also the country that thinks giving kids free breakfast is unacceptable because it makes them "lazy") is going to suddenly vote in a UBI? That's never happening.
iwsw38xs@reddit
I think that's where the phrase "eat the rich" comes from. It's a conundrum; they better have bunkers.
Sufficient_Nutrients@reddit
Given the COVID checks, I think if we hit 25% unemployment there would be a similar response. Especially if it were the lawyers, developers, and doctors getting laid off.
steveoc64@reddit
Just had a read - fascinating stuff!
These people have no memory
I find the whole belief in AGI thing to be one giant exercise in extrapolation. It's mostly based on the misconception that AI has gone from zero to ChatGPT in the space of a year or two, and is therefore on some massive upward curve, and we are almost there now.
ELIZA, for example, came out in the mid-1960s, and LLMs now are more or less the same level of intelligence… just with bigger data sets behind them.
So it's taken 60 years to take ELIZA and improve it to the point where its data set is a 100% snapshot of everything recorded on the internet, and yet the ability to reason and adapt context has made minimal progress over those same 60 years
Another example is Google. When Google search came out, it was a stunning improvement over other search engines. It was uncannily accurate, and appeared intelligent. Years later, the quality of the results has dramatically declined for various reasons
By extrapolation, every year going forward for the next million years, we are going to be “almost there” with achieving AGI
Alainx277@reddit
Claiming ELIZA is remotely like modern AI shows you have no idea where the deep learning field is currently or what ELIZA was.
The Google search analogy is also completely unrelated. It got worse because website developers started gaming the algorithm to be the first result (SEO). The technology itself didn't get any worse.
steveoc64@reddit
I think you missed the point of the comment
Modern LLMs have exactly the same impact as Eliza did 60 years ago
Or 4GLs did 40 years ago
Or google search did 30 years ago
Quantum computing
Blockchain
A clever application of data + processing power gives an initial impression of vast progress towards machine intelligence and a bright new future for civilisation
Followed by predictions that the machine would soon take over the role of people, based on extrapolation
Of course you are 100% right that the mechanisms are completely different in all cases, but the perception of what it all means is identical
All of these great leaps of progress climb upwards, plateau, then follow a long downward descent into total enshittification
It’s more than likely that in 10 years time, AI will be remembered as the thing that gave us synthetic OF models, and artificial friends on Faceworld, rather than the thing that made mathematicians and programmers obsolete
iwsw38xs@reddit
Can I pin this comment on my mirror? I shall read it with delight every day.
WolfNo680@reddit
Well, if the data that the technology uses gets worse, then by extension the results AI is going to give us are... also worse? I feel like we're back where we started. AI needs human input to start with; if that human input is garbage, it's not going to just magically "know" that it's garbage and suddenly give us the right answer, is it?
Alainx277@reddit
The newest models are trained on filtered and synthetic data, exactly because this gives better returns compared to raw internet data. The results from o3 indicate that smarter models get better at creating datasets, so it actually improves over time.
It's also why AIs are best at things like math or coding where data can be easily generated and verified. Not to say that other domains can't produce synthetic data, it's just harder.
steveoc64@reddit
Depends what you define as coding.
It's not bad at generating React frontends, given a decent description of the end result, i.e. translating information from one format (design spec) into another (structured code)
Translating a data problem statement into valid SQL, or a JSON schema is also pretty exceptional
It's worse than useless in plenty of other domains that come under the same umbrella term of "coding", though
If it's not a straight conversion of supplied information, or if the task requires the ability to ask questions to adjust and refine context, it's not much help at all
Ashken@reddit
And then the occasional FDVR circlejerk
markoNako@reddit
According to the sub, AGI is coming this year...
i_wayyy_over_think@reddit
Comes down to definitions though.
deadwisdom@reddit
Correct, and by a perverse set of circumstances the only definition that matters is Sam Altman's contract with Microsoft, which we cannot know. This is because, supposedly, Microsoft loses all control over OpenAI once they create "AGI". So I'm sure the OpenAI definition will be as loose as possible, and Microsoft's definition will be as tight as possible, and a marketing war will ensue that we will all get caught up in.
hachface@reddit
How did an industry defined by smart people become so fucking stupid
Calm-Success-5942@reddit
That sub is full of bots hyping over AI. Altman sneezes and that sub goes wild.
VisiblePlatform6704@reddit
I remember loooong ago there was a subreddit of literal bots talking to other bots.
Is there any such thing nowadays?
regjoe13@reddit
Honestly, I expect the effect of AI use on programming to be similar to the effect the introduction of CNC had on machinists.
flakeeight@reddit
If someone is too active when it comes to posting on LinkedIn, I don't really trust that person professionally.
Anyway, AI is the new cool thing for some people. Let's see what comes next.
Careful_Ad_9077@reddit
I was just reading a recruiter's post that said (amongst other things) that they consider "too much posting on LinkedIn" a red flag.
RandyHoward@reddit
I've noticed this trend from a few of my former coworkers who start posting a ton on LinkedIn as they've moved into management roles. People who have never posted much at all are now making a post at least weekly, often more frequently than that. Go manage your team instead of managing your LinkedIn post schedule.
MinimumArmadillo2394@reddit
That's the advice now.
Post at least weekly on LinkedIn, because otherwise your application/profile is considered "inactive" to recruiters. The best way to get noticed on the platform is to actually post, which oftentimes means once a week.
If you have premium, you can usually jot down some nonsense and the AI will make it look good, even if the content is pure slop
UnkleRinkus@reddit
My inbox wishes to disagree.
crazylilrikki@reddit
I’ve never created a post on LinkedIn and regularly receive messages from recruiters.
MinimumArmadillo2394@reddit
Yeah, you've been in the market for over a decade lol.
Hardly anything about the current market applies to you
DigmonsDrill@reddit
A weekly post is too often? How much can I post on reddit?
belkarbitterleaf@reddit
Twice per account.
DigmonsDrill@reddit
I'd better make my second comment count, then.
belkarbitterleaf@reddit
🎉 congratulations on using all your comments in a single thread. Any thoughts on what your new account name will be?
msamprz@reddit
They can't reply to this comment anymore :/
Freedom9er@reddit
They're angling to move to senior management elsewhere.
Thug_Nachos@reddit
Absolutely. That's why I do it.
My audience isn't my peers, it's people who don't know anything about my field who need to feel good that they're hiring someone "aligned with blah blah blah".
flakeeight@reddit
Kinda agree.
From my experience, when someone posts too much on LinkedIn it's never because they exclusively wanna share knowledge; they want attention somehow, and then when you work with some of them they act like freaking little rockstars.
LinkedIn is the OnlyFans for office people, I guess haha
RandyHoward@reddit
Yep, there's two ways people use LinkedIn... 1) To search for jobs, and 2) To stroke their ego
Eire_Banshee@reddit
I use it as a living resume and to spy on coworkers past job experience when I'm mad at them.
dieselruns@reddit
It's not even that good for searching for jobs. After all, why would LinkedIn want you to be successful at finding a job? Then you'd be done using their platform - unless you found a job as a manager who needs to validate in an echo chamber. LinkedIn is the new Facebook.
AchillesDev@reddit
When I was working for other people LinkedIn was the primary way I got leads. It's not for people who just apply blindly to jobs, though.
RandyHoward@reddit
I agree, though while I do use it in job searches, I don't think I've ever actually landed an interview through LinkedIn
teslas_love_pigeon@reddit
As a counter example, every job I've gotten for the last 10 years has been through linkedin. It's been like 60/40 for specific recruiters reaching out to me versus myself applying to jobs on the site.
rdditfilter@reddit
That's so weird, because I mostly get messages for some basic contract job at some local in-office company when my profile obviously states that I work from home for a biggish tech company.
Like, they're still offering me jobs that I wouldn't have responded to even fresh out of college. It's so weird that they spend money on that.
teslas_love_pigeon@reddit
Going to sound harsh, but I'm guessing it's mostly due to your work experience and the companies you've worked at. Everyone I know that worked at name-brand companies, not talking about Meta or Netflix here, has had no issues getting messages about other F100 companies.
When you're a recruiter you can target very specific people with certain types of experience. If you don't fall in those filter's range, you get left behind.
It's extremely unfair.
rdditfilter@reddit
That may have been true before but I work at a pretty decent sized analytics company now and I'm still getting just the bad jobs in the cold messages.
Good thing I'm happy where I'm at, I guess.
kayakyakr@reddit
I landed my current job off LinkedIn. The previous one was a referral, and the one before that was off Indeed.
AuroraFireflash@reddit
It's what you make of it. I use it as a smart address book of all the people I've personally worked with or known through the years. These are also the first people that I'd hit up if I were laid off and was looking for work.
Also works as a simple resume that I try to keep updated every 6-12 months.
The social media bits? Meh... I'm too busy for that.
_dactor_@reddit
Once people started sharing political opinions on there it was all over
Sexy_Underpants@reddit
LinkedIn makes most money from companies and recruiters paying to find employees. They want them to be successful to keep paying per user subscription fees.
Anecdotally I have found several jobs on LinkedIn as a developer.
pheonixblade9@reddit
fun fact, as a recruiter, you mostly pay when prospects don't message you back. it's $10 for unresponded messages. so I don't bother responding to the low effort BS. they can pay the "didn't read my LinkedIn resume" fee, lol
pheonixblade9@reddit
I get the vast majority of my job opportunities from LinkedIn. beating the recruiters off with a stick, sometimes, especially AWS recruiters. I have over a decade of experience, mostly at big tech, though, so YMMV
RoyDadgumWilliams@reddit
The finding jobs part for me is more about checking where friends, acquaintances, former coworkers, etc. are working so you can get the inside scoop on the company and/or a referral from them.
HL-21@reddit
I use it as a recruiter farm. Works pretty well, and over the years I've gotten a few recruiters out of it that led to pretty good roles. I don't post or engage in the other nonsense though, plus I don't understand people putting controversial political opinions under their "help get me a job" profile.
supyonamesjosh@reddit
This is a good adage for most social media but I don’t think it applies to LinkedIn because of how much money they make from companies listing and promoting their jobs.
If nobody was successful companies would stop paying them to promote their openings.
DigitalArbitrage@reddit
3) To try and sell something
warmbowski@reddit
This. Most people posting about the demise of engineers in favor of "AI" stand to gain something. Usually VC funding.
juggbot@reddit
Hey you can also use it to troll the ego strokers which is really fun
Pristine-Campaign608@reddit
Job searching on LinkedIn has enshittified.
RandyHoward@reddit
Yes very much. When I am searching for jobs I use sites like LinkedIn and Indeed just to assemble a list of companies that are hiring. I typically apply for jobs directly through the company's own website if possible.
Tuxedotux83@reddit
Unless the person posting is a social media or marketing manager and most of their posts are "role oriented", it is indeed a red flag. It shows that an IC is more focused on appearing to be someone and less on practicing their actual job well.
PrivacyOSx@reddit
I disagree. I used to post educational content on LinkedIn a lot when I wanted to get a job, and it dramatically increased my visibility & got me a lot of opportunities. I do agree that some people's content is trash & just looking for attention, but there are others that provide true value with bite-sized lessons that show others you're someone who is knowledgeable.
Grounds4TheSubstain@reddit
Visibility is helpful to your career, but some people are borderline obsessed with LinkedIn. It attracts the worst preening narcissists who want to show everybody how virtuous and wise they are. The platform would really benefit from the ability to downvote posts. Fake ass story about how you gave the shirt off your back to a downtrodden person but that they still need to pull themselves up by the bootstraps? -50 for you, maybe you'll think twice before posting that shit next time.
PrivacyOSx@reddit
Agreed. Those types of posts are incredibly annoying, and they're not the ones I posted. I mainly did bite-sized lessons like how ByteByteGo does.
thedeuceisloose@reddit
Because it shows you prefer social media to actually doing the job. One of those “your reputation precedes you” sort of things
staminaplusone@reddit
If i hire you i want you working instead of posting on linkedin or reddit or... wait a minute!
pheonixblade9@reddit
I used to be super active on StackOverflow and it was generally seen as a double edged sword by potential employers, lol.
Mornar@reddit
Best I can do is half of that.
staminaplusone@reddit
Which half. The working or the social media 😅 (or did you mean no LinkedIn and 100% reddit)
Mornar@reddit
I can definitely be working instead of posting on LinkedIn.
touristtam@reddit
I was going to ask you if you are doing that from your terminal, only to remember that Google search is still a thing ... anyway there is at least one TUI reddit client, which is impressive and completely useless.
Mornar@reddit
And I'm sure there's people claiming this is the way to interact with reddit.
Which, tbh, now that the official app is being forced and the web page is getting facebook'd hard, I'm actually starting to see the appeal of.
RandyHoward@reddit
If they ever kill old.reddit.com that will probably be the end of my days on reddit
Mornar@reddit
That's what I'm saying too, but I thought that when redditIsFun was getting the axe, so I'm not sure if I trust myself on that.
Bren-dev@reddit
Do you think there’s a middle ground? As a developer who never posts anything, I feel like I’m doing myself a massive disservice
AchillesDev@reddit
Yes, if you're not completely shortsighted it's a good way to build a network and show what you know to other professionals and recruiters. When it comes time to find a new job, or if you go independent (something people here apparently can't even conceive of), that network becomes your lifeblood.
If you're okay with having a weak network and staying where you are (and then complaining about the "weak market") follow the bad advice in this thread.
carlemur@reddit
It does seem like a lot of faang-ey types who can snap their fingers and get a job sneer at the idea of self promotion, not understanding that having a brand and being known for something is the way the rest of us maintain a pipeline of jobs.
AuroraFireflash@reddit
The only things I've posted on LinkedIn are things you'd find on my resume/CV. There's no need to go further.
It's also a nice way to keep an updated address book of all the people I've worked with over the years.
OtaK_@reddit
Next we'll probably figure out that the "strides" made by LLMs in producing code will go down significantly as the "next-gen LLMs" get trained on the horrid & broken code previous gens produced, poisoning the output and at least negating any advancements in accuracy.
I WONDER what will happen to all those people basically handing the steering wheel to LLMs for the past few years (no).
Sensitive-Ear-3896@reddit
We will be going back to doing it the old fashioned way, google and stack overflow!
OtaK_@reddit
Assuming those people didn't lose it in the meantime.
One of my friends (React front-end dev, 4 YoE, intermediate level) was using Copilot/Claude profusely and complained that they felt like they were losing touch with algorithmic thinking.
Told them to try NOT using it for 6 weeks, write everything by hand, etc., and draw their own conclusions.
First 4 weeks were an absolute miserable abyss of incompetence. Then it came back. They haven't touched LLMs for work ever since.
-_1_2_3_-@reddit
ok buddy
OtaK_@reddit
Anything more interesting to say, buddy?
-_1_2_3_-@reddit
Just that the march of progress will render those who don’t embrace new tools unproductive in comparison to those who do, the same story that has played out countless times across humanity.
Those who can't see what's on the horizon now will become increasingly entrenched curmudgeons as their well-meaning skepticism slowly turns into a personal liability.
OtaK_@reddit
Exactly *which* progress are you talking about?!
An LLM that is able to wrongly regurgitate mangled code ingested from a training corpus with extremely inconsistent quality (because human-produced code is like that) at extremely high speeds? What's the point of gaining time to produce... nothing of value?
I was there when TabNine started out before the words "AI" or "LLM" were ever uttered. I tried it, used it. It was just a shitty crutch that was less correct than I am at my job. I tried Copilot, Claude and all the others too. None are better than any fresh out of school junior dev with 0 experience.
Now, I'm gonna go out on a limb and agree with you: for devs whose useless job is creating the 90th version of "I have a project, it's going to be Facebook but *better*", then sure, they'll have to find something else to do. And it's a good thing. Same goes for anything involving reinventing the wheel for the Nth time. But actual engineering? I highly doubt we'll see anything of use in the next 10 years. I'd be happy to be proven wrong, but all signs so far point to it not happening.
-_1_2_3_-@reddit
Your inability to generate good code with LLMs says more about you than the model. Garbage in garbage out.
RemindMe! 2 years
OtaK_@reddit
It's very funny because I've seen this argument too many times. I can prompt engineer without issues. I know most of the common techniques (one/few-shot/CoT prompting and many others) and had my bit of fun with adversarial prompting techniques. Keep being delusional.
You cannot say in good faith that any LLM can help you engineer something that does not exist. It's impossible and completely against the underlying principles of pre-AGI LLMs. With no corpus to train on it's impossible to get any non-hallucinated answer. Once we have AGI (in 10? 20? 50 years?) then okay, maybe yes.
But as I said, if it's for reinventing the wheel for the Nth time, it works, yes. But I don't care about those devs. Working on such topics is a risky line to tread, as it can snap under your feet anytime. (Remember what happened to an ancient job called "Webmaster", whose task was to maintain static websites by manually writing content in HTML and styling it with CSS?)
Sunstorm84@reddit
I’m just waiting for the AI bubble to pop.. it doesn’t seem like it’s that far away from happening.
RemindMeBot@reddit
I will be messaging you in 2 years on 2027-01-08 15:03:06 UTC to remind you of this link
stevefuzz@reddit
It's good at code completion (limited), helping name variables, and maybe writing some documentation. Anyone arguing with you is about to try to sell you a ChatGPT wrapper. To use a 90s term, ignore this poser.
pheonixblade9@reddit
lol, I don't have much interest in AI and recruiters are beating at my door, primarily for roles to unfuck codebases that have been tech debted to hell by years of moving fast with LLMs and contractors.
-_1_2_3_-@reddit
TIL tech debt was invented by LLMs
pheonixblade9@reddit
it certainly doesn't help!
pheonixblade9@reddit
this has been my criticism - LLMs can be faster in the short term for some tasks, but they erode critical thinking skills and make it less likely that SWEs will be able to solve actually challenging problems (debugging, performance/optimization issues, etc.) that the AI can't figure out for you.
AchillesDev@reddit
Rawdogging a chat interface will do that. Using it as a good autocomplete won't.
antiquechrono@reddit
This also happened to a friend of mine to the point he basically can’t code anymore and is really struggling with coming back out of it. I really wonder what the societal impact of ai brain rot is going to be.
OtaK_@reddit
I think it's going to be something akin to the flavor of financial doom of all the cryptobros who bet their whole savings into some obvious shitcoin/rugpull. You get set back so far that you basically have to start from scratch again because the world/industry isn't waiting for you to keep up.
SnooPears2424@reddit
Speaking of this. There's this guy that keeps popping up on my feed, I forget his name. But it's some staff engineer from Instagram. The header reeks of influencer wannabe. He seems to have a new post every single day about generic things to be an effective developer. I looked at his profile and saw he had like a 1-year tenure at each of his previous companies, and Insta is the one he's had the longest tenure at.
I can't recall the name, but it's a guy with dark hair carrying a sweater over his shoulder. Anyone actually know if he's legit? Really strikes me as disingenuous.
flakeeight@reddit
Oh damn, I’ll look it up tomorrow cause I’m curious. I’m here for the gossip hahaha
SnooPears2424@reddit
Ryan Peterman
BosonCollider@reddit
Prolific LinkedIn posters are like Wheatley from Portal 2. They don't just say stupid things, they often say things that take an extreme amount of effort to achieve that level of stupid. Though what non-physicists confidently say there about physics is probably one step worse than what they say about programming
JaneGoodallVS@reddit
So far, for general software development, it's more useful than blockchain, less useful than the cloud.
lost60kIn2021@reddit
Most of them at some time in the past were posting about web 3.0, then NFTs, blockchain...
flakeeight@reddit
Exactly 🥲
casey-primozic@reddit
AI is amazing for generating Go structs to receive API responses. Saves me a ton of time having to type all that Go boilerplate crap.
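For example, here's a minimal hand-rolled sketch of the kind of struct it spits out; the payload, field names, and tags here are made up for illustration, not from any real API:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// UserResponse mirrors a hypothetical JSON API payload; the fields and
// json tags are illustrative, not taken from a real service.
type UserResponse struct {
	ID        int      `json:"id"`
	Name      string   `json:"name"`
	Email     string   `json:"email"`
	Tags      []string `json:"tags"`
	CreatedAt string   `json:"created_at"`
}

func main() {
	// A fake response body, standing in for what the API would return.
	raw := []byte(`{"id": 42, "name": "Ada", "email": "ada@example.com", "tags": ["admin"], "created_at": "2025-01-08T15:03:06Z"}`)

	var user UserResponse
	if err := json.Unmarshal(raw, &user); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", user)
}
```

Typing a few dozen of these out by hand for a big API is exactly the boilerplate I'm happy to hand off.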
AchillesDev@reddit
Eh, it depends on what you're doing. Yeah, if you're an employee somewhere maybe that makes sense, but if you're a founder (especially in B2B product orgs) or a consultant/contractor/freelancer, that's where you go for your marketing and lead generation, and it works really well for that. That's where I get the clients that are outside my own network.
Irish_and_idiotic@reddit
Wow… you just changed my entire perspective of LinkedIn and you’re so right.
AvidStressEnjoyer@reddit
This 👆
People who post on LinkedIn are at least one of 3 things: a psychopath, someone looking for attention, or someone deeply mentally deficient. The only exception is if you're in the market for a new role.
squishyhobo@reddit
Forget posting on LinkedIn. Even keeping it updated is kinda a red flag.
ikeif@reddit
I was trying to find an article where LinkedIn talked about the high percentage of posts that are AI, and there's a "short study" that reads like it was passed through AI.
This isn't the article (I want to say LinkedIn published the number, probably because of the first article, to show "a lot of people are doing it and seeing results").
…but I think it's also pattern recognition. People are becoming more aware of faux-engagement and rage bait. The constant immediate replies to any comment with "what would you do differently?/what great insight - what else do you think would cause/drive/etc?"
Social Media sites are going to use AI to drive engagement so they can start to cut out any "influencer" making cash from them when they could be funneling that cash back to themselves.
pheonixblade9@reddit
I love all the hot takes posted from people as if they're unassailable truths and you go look at their profile and it's just a decade of being a "CTO" at various crypto companies 🤣
thelochteedge@reddit
I used to hate on chronic LinkedIn users... then they came out with games and now Queens forces me to open the app daily. Fun game.
sonobanana33@reddit
ChatGPT is excellent for generating posts for LinkedIn! I love it! (I use it to generate parodies of LinkedIn posts)
flakeeight@reddit
hahaha true, those are fun
Swimming_Search6971@reddit
Correct, LinkedIn is to work what Facebook is to life. Except for messages from recruiters, there is nothing much worth reading there.
olssoneerz@reddit
This. From my limited experience, the more time a colleague spends posting on LinkedIn, the less effective they seem to be at their job.
iceyone444@reddit
Me neither - the biggest self-promoters are on there.
MathematicianFit891@reddit
If you really want to laugh: in the 90s, they thought business people were about to take over most software development work through the use of visual object-oriented design tools.
DogOfTheBone@reddit
Something you eventually learn after working in software long enough is that a lot of devs who are high-level/very experienced on paper have never actually done work beyond the goofy little scripting or basic system design level.
Promotions and titles don't always come from merit, and if you're a small cog in a large machine you can spend years and get a fancy senior/staff job by virtue of attrition.
I suspect some of the people who freak out about AI on social media are this type.
Slight-Ad-9029@reddit
I have learned that a lot of people that come in with authority talking about “25 years of experience here” often give some of the worst advice you have ever heard and clearly have never done any real significant work before
CpnStumpy@reddit
The number of engineers who are desperately averse to banging out code these days is persistently weird to me. Buy vs. build is a good and important discussion and decision, accounting for cost of ownership and maintenance (for both choices). I'm not seeing that though; I see more and more engineers desperately trying to figure out how not to write any code at all, or speaking of it as a Herculean endeavor. I'm agog. Coding is fun; at some point the industry seems to have decided that's untrue.
pheonixblade9@reddit
I've always found it so odd that engineers are excited that their jobs will get easier because they have a tool to write code for them. The actual writing of the code is one of the easiest parts of the job, in my experience.
theDarkAngle@reddit
True and I also believe it's lower cognitive effort than things like tracking down bugs or trying to map out vague/incomplete requirements to a general code structure (even great product analysts leave plenty of ambiguity, it's just the nature of a highly lossy language (English/human) vs a highly specific one (code)).
This is kind of why I think even productivity gains from AI will be somewhat marginal for the foreseeable future. When you think about how we work, we have a limited amount of cognitive energy, and for most of us it doesn't last 8 hours on the more taxing things like I mentioned. Maybe it lasts 3 or 6 hours, and then we spend the rest of the day on easier coding tasks or even lower-effort things like unnecessary meetings or reading emails.
So AI mostly will just cut down on that time we have to spend doing easier things, but it doesn't really change the harder part that would actually lead to productivity gains.
If anything, AI should simply lead to a shorter workday, but you know we don't have the culture to support that. We'll just do more meetings or read reddit more, most likely.
ogghead@reddit
Some portion of devs are purely in it for the money — if they’re smart, they can thrive in certain environments (FAANG), but their lack of interest in the work means they eventually devolve towards this mindset. Those of us who do have passion for coding and learning new technologies will have a longer, more fulfilling career, but because tech jobs have become so lucrative, you’ll see folks in the field who straight up hate coding and technical learning. 20-30 years ago, they might have instead become stock brokers or gone into another highly paid field for the time.
SmartassRemarks@reddit
This is so real.
pheonixblade9@reddit
agreed - it's not necessarily obvious from my resume, but I've gotten pretty deep on some technical stuff that I think most people would not be capable of. I found what is essentially a compiler error in the Spanner query optimizer when I was at Google, and I have found a couple of pretty significant performance bugs, as well. I doubt AI is going to be capable of that sort of work any time soon.
tittywagon@reddit
It's all bug fixes or little enhancements, and when it comes to actually doing something bigger, it's a big hack job unfortunately.
pneapplefruitdude@reddit
Best to just tune out and focus on building relevant skills.
VulnerableTrustLove@reddit
Including soft ones!
Sunstorm84@reddit
They’re more cuddly <3
exploradorobservador@reddit
Honestly it reminds me of college, when I'd take a chemistry class and all the review sites were spammed with posts about how terrible an actually well-run class was, because the population is 5% chicken littles
GuessNope@reddit
95% chicken littles. It is truly staggering how cowardly people are.
Acceptable-Milk-314@reddit
Lmao I love this.
Lyelinn@reddit
My job was recently severely impacted by AI and ChatGPT o1 in general... but not in the way you think. Our designer started pushing his "fixes" and "changes" to our branches, and now I spend 20% of my day fixing the GPT-puke that breaks 90% of the time lol
kronik85@reddit
Are there no tests in the CI/CD pipeline to catch his breaking code and reject it?
Lyelinn@reddit
It's a fairly big project with like 5% coverage. I usually cover logic and side tools, but not the main UI app, because it's too big and old and we never have time (startup reality)
kronik85@reddit
Gotcha. My company is in a similar situation, so I feel your pain. Code coverage isn't extensive enough to catch most new code, and new code goes into areas where things are so coupled you can't test anything without loading the whole program, which expects physical devices and craps out if they're not found.
good luck.
v3tr0x@reddit
lol why is a designer pushing code to your repos? I imagine you work in a startup or an equivalent of that?
Lyelinn@reddit
Yeah, we're a very small niche startup. I guess he has good intentions, but when we discussed not doing that, things got heated, so I just kinda roll with it and laugh from time to time when I fix stuff lol
Wide-Pop6050@reddit
Why do you approve and merge it though? He can put up PRs, doesn't mean they have to be merged
Lyelinn@reddit
It's much faster if I fix it later than to make him torture ChatGPT into fixing something it doesn't understand, plus it means much less hostile talk, etc.
Wide-Pop6050@reddit
Yeah, I get that. It's just that problems don't get solved until you make them the problem of the person who caused them. Right now it's only a problem for you.
belkh@reddit
Do you not have tests? I would simply have them fix their own code until the tests pass; they'll either get better or give up
Lyelinn@reddit
It's a startup, so we "don't have time for this, we have to move fast". Plus, trying to explain to a non-programmer how to fix the issue is usually a lot slower than just fixing it yourself. Besides, I don't even care anymore. A job is a job, code is code, and bugs are bugs; I'm paid the same amount of money regardless
GuybrushThreepwo0d@reddit
Tests help you move fast. Not having tests equating to moving faster is just a logical fallacy. I say this as someone else in an early phase start up
belkh@reddit
Eh, it's meant to shield you from having to fix their code. Let them merge the 20% of their code that isn't broken, and they deal with the other 80%. Don't help them there, unless your manager specifically tasks you to.
In the end you're responsible for your tasks, and you don't want your perceived value to management impacted by the invisible work you spend your time on fixing the designer's output.
Chances are, if management knew how much time you waste on this, they might just stop the designer from contributing altogether
Lyelinn@reddit
> don't help them there, unless your manager specifically tasks you to.
Well, you're talking from a "normal company" point of view, but I'm in a very small startup. My manager is our CEO, and he said that since the designer has "good intentions", it is my responsibility to help him. So things are different in the wild startup world LOL
When I was working in big tech, things like that were unimaginable, but so were the countless unfinished, ever-changing design pages that were updated along with the task lifecycle (and usually completed some time after the feature was actually deployed)
Smart_Whereas_9296@reddit
Having worked at both startups and larger companies, I think you really should address this. You're being made responsible for other people's mistakes and your experience is being ignored. They will likely blame you and fire you over the first major tech issue that loses the company money, whether you are responsible or not.
I work with a codebase that's decades old these days and we still have major issues due to poor implementation right at the beginning to "just get it done". It's now too costly to reimplement properly and every change costs about 5X more dev time than it would with a modern implementation.
Even just create a policy that nothing can be merged without tests passing and have a single test that automatically navigates around the system.
Lyelinn@reddit
I'll just address your comment in general by saying I'm already planning to run instead of trying to change something. I spent 6 months trying to resolve basic issues, and I believe the amount of stress is not worth it, so I don't care anymore.
Smart_Whereas_9296@reddit
Probably for the best. Any kind of significant change to company culture is really difficult without senior people buying into it from the start, and it sounds like yours just want to stick their head in the sand about things.
otakudayo@reddit
Pull requests / code reviews?
I am kind of a cowboy, and I can roll with an experienced dev pushing code without review, but even I wouldn't let a designer just run wild in the codebase, especially if it's a non-trivial project and all their code is generated by ChatGPT
Lyelinn@reddit
We have both, but I admit I'm kinda numb already and sometimes just merge because it's faster to do that and fix it later, instead of trying to explain to a non-programmer how to solve the issue lol
kronik85@reddit
What kind of designer are they?
subma-fuckin-rine@reddit
Pushing code or PR is one thing, but allowing them to be merged is another
pedatn@reddit
Sounds like you didn’t have branch protection in place, that’s on you tbh.
Lyelinn@reddit
Never worked in a startup huh?
pedatn@reddit
I have and I currently do. Why, is it common not to do the literal 2 minute effort of setting it up in projects you were in charge of?
Lyelinn@reddit
Perhaps you never worked for an actual 5-person startup where "moving fast" is above everything else and the only person actually in charge is the CEO
pedatn@reddit
I consult for one of those one day a week. I still set up branch protection because it is very easy and quick to do, and it protects me from scenarios like the one you are describing yourself. Not sure why you are digging your heels in instead of learning from your very own recent experience.
Lyelinn@reddit
Because you don't seem to understand the situation and are implying that I work in a reasonable environment instead of one where only one person has ownership over such things. If we're 5 people, obviously the designer isn't just doing it at random, but because the CEO is happy about the idea itself lol
darkkite@reddit
A simple GitHub change can add branch protection rules preventing pushes without a PR and approvals. Now might be the time.
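For what it's worth, here's a rough sketch of flipping that on through GitHub's REST "update branch protection" endpoint, requiring PRs with at least one approval on main. The org/repo names and the token env var are placeholders; normally you'd just click this in the repo settings, the API call is only to show how small the change is:

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Require pull requests with one approval; the other top-level fields
	// are required by the endpoint but may be null.
	body := []byte(`{
		"required_status_checks": null,
		"enforce_admins": true,
		"required_pull_request_reviews": {"required_approving_review_count": 1},
		"restrictions": null
	}`)

	// Placeholder org/repo; point this at the real repository.
	url := "https://api.github.com/repos/example-org/example-repo/branches/main/protection"
	req, err := http.NewRequest(http.MethodPut, url, bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Accept", "application/vnd.github+json")
	req.Header.Set("Authorization", "Bearer "+os.Getenv("GITHUB_TOKEN"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```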
bonesingyre@reddit
I watched a YTer try to use Devin AI to do a simple css change where they asked it to have text expand to fit the dimensions of the cell in a table they had. It could not do it after 3-4 tries and an hour of prompt refinement.
Intelnational@reddit
True. But who would have imagined it would get this far 5-10 years ago? And it will only get better, at an increasing pace. Who knows where it will get in the next 5-10 years.
Those who are mediocre or weak will get replaced. Those who are smart and strong will get even better with such a tool in the future. They will be able to do way more than they can now.
deathhead_68@reddit
Honestly any developer who says they can be 'replaced' by AI in 2025 is a straight up shit developer.
___Not_The_NSA___@reddit
Sometimes Imposter Syndrome isn't actually a syndrome
deathhead_68@reddit
Sometimes I think there might be as many imposters as there are those with the syndrome
read_eng_lift@reddit
The "confidently wrong virtual dumbass" is the best description I've seen for AI producing code.
pedatn@reddit
This was true 6 months ago but not anymore, really. When given enough context it is great at autocompleting big slabs of code. It's kind of a smart snippet library now that can automatically use and name variables.
iwsw38xs@reddit
Yeah, and the other 50% of the time I delete most of what it writes. I agree that the good parts are good, but they're offset by the bad parts. Oh, that and you can never really tell whether it's bullshitting or not: I spend more time going down dead-end rabbit holes than learning anything.
pedatn@reddit
I just don't press tab when I don't like the suggestion, it's no extra work compared to not having an AI assistant. Only time I ever let it generate entire files is for unit tests, which I hand check anyway, just as I double check my own work in unit tests.
marx-was-right-@reddit
Not at all lol
pedatn@reddit
Strong counterpoint, I can see you speak from personal experience.
foodeater184@reddit
It's better than that, but still very limited. It works for problems you can fit into the context, which are typically tiny. LLMs also don't have a good understanding of most APIs/SDKs, or their knowledge is at least outdated. Tools that index code, read documentation, and keep environment context in mind could be useful, but I haven't seen any that work well yet (haven't tried many; I still don't trust the base LLMs for generating code without heavy revision). I use them for getting started on projects, rubber ducking, and simple scripts.
karaposu@reddit
You are gonna get replaced.
deathhead_68@reddit
In 2025? Lol no.
By 2040, maybe, along with everyone else's job.
hippydipster@reddit
Only 15 years till a vastly different world? Damn, used to be hundreds, or unimaginable.
Still I would guess you and I have less than 10.
deathhead_68@reddit
Will be interesting to see what happens! AI is in equal parts genuinely amazing and also incredibly overhyped, so the jury is still out for me.
karaposu@reddit
2025.
Nax5@reddit
Your meme coin is gunna get replaced
karaposu@reddit
yeah after you guys
deathhead_68@reddit
Lmao this is actually making me laugh.
Peak dunning-kruger effect
karaposu@reddit
Lets just wait and see. I am pretty sure these dudes will lose their mind when o3 drops
deathhead_68@reddit
!RemindMe 365 days
Nax5@reddit
Along with every other worker lmao. If you have a point, make it.
Alainx277@reddit
What about 2026? 2030?
Xenasis@reddit
Nobody can see the future but any developer who thinks they'll be replaced by AI in 2026 or 2030 is also a shit developer and misunderstands what the role of a software developer is.
Alainx277@reddit
Or perhaps you significantly underestimate the advances in deep learning each year?
deathhead_68@reddit
I don't understand how people think this and are in this sub. AI is amazing but good lord its nowhere near doing the job of development teams
EducationalWill5465@reddit
So like.. fresh grad developers?
If software dev will only be for the top 1% then that's not cool.
deathhead_68@reddit
AI is a tool, in many ways its better than fresh grads and in many ways its worse. I think it might replace really bad offshore or boot camp developers though tbh, because they really don't do much.
tl_west@reddit
I see a lot more developers concerned that their boss’ boss’ boss is going to fire all the developers because an intern can just use AI to replace them, sort of like outsourcing panic 30 years ago.
And yes, I did see a lot of projects grind to a halt due to outsourcing. Funny part was that management was mostly okay with that. Apparently 0 productivity for 1/6 the cost was worth it. :-)
Later on, the outsourcing techniques improved and productivity rose, but the lesson was clear. Mediocre software was acceptable if it cost 1/3 the price. Customers chose cheap over quality, and the customer is always right.
We’ll see if we see history repeat itself.
WolfNo680@reddit
Did the customer choose it? Or did the shareholders choose it by virtue of "line must go up and the to the right"? I feel like MOST customers would rather the thing they pay for work and be easy to use and understand, rather than...most of whatever we're currently getting on the internet.
AnimaLepton@reddit
Many customers/companies would also often like to pay as little as possible even if they get an objectively shittier product. At a company it's not "your" money, but depending on the state of the economy and the priorities of the company, the decision-maker at the customer company often still has a directive to do the same kind of cost cutting, even if it means getting rid of the stuff that works.
tl_west@reddit
Good point. Let’s just say they eventually bought most of the company’s competitors, so they were more successful than them.
pheonixblade9@reddit
in my experience, low cost outsourcing was negative productivity, not zero. it's like their entire job is writing tech debt.
TheFaithfulStone@reddit
What's the Cory Doctorow quote? "AI can't do your job, but unfortunately it can convince your boss that it can."
pheonixblade9@reddit
my hot take is that all the jobs that can be replaced by today's LLMs were replaced by squarespace and wix 5 years ago.
deathhead_68@reddit
For straight up crud webdev, probably.
Signal_Lamp@reddit
You have to remember that a lot of the AI doomer posts are being made primarily by a few key groups
In every single one of these groups you are seeing, generally speaking, a misunderstanding of the current capabilities of the AI tools being spoken about, as well as of what those tools will realistically be able to do in the long term, specifically within this industry.
There is also an extreme lack of content from AI-neutral advocates who simply see these as tools and take a realistic look at their current limitations, what they can do for us right now, and how they can be learned without the mystical sense that you need a master's degree in AI/machine learning or some other topic that feels shrouded in mystery to the average consumer of AI products. If you look online, the stance on AI is either extremely negative or sickly positive, regardless of the context it's being brought up in.
Social media is generally fueled more by controversial posts than by neutral perspectives, so generally speaking you're going to see more people give extreme takes on AI as a whole, because that is what drives the most engagement, which is ultimately the goal of any algorithm on a social media platform. That doesn't mean these posts are popular, or that they're necessarily correct; it just means the content has been determined to be what will get you to engage the most, based on your behavior on that platform. If you engage with neutral posts on the topic, the algorithm will feed you more nuanced positions, while occasionally feeding you statements outside that norm to see if you might engage more with the topic presented through a different lens.
pheonixblade9@reddit
you missed a group - engineering "leaders" who are salivating at the prospect of laying off entire departments in favor of low paid "prompt engineers"
Decent_Perception676@reddit
Huh? I’ve never met an engineering lead or director who wanted less headcount. Reducing your team’s expenses or headcount just means you get less budget next year. Generally the mid level leadership is fighting to increase head count on their teams.
Noblesseux@reddit
Yeah the middle manager class is weirdly obsessed with AI, despite arguably being the easiest to replace with AI
lWinkk@reddit
I was just having this same conversation yesterday. I assume these people are just rage baiting.
Illustrious_Wall_449@reddit
I don't even understand it.
I am trying to do as much with AI as I can, but for anything beyond small-scale or well-established issues/questions, AI is often wrong or misses important details.
And there's this issue with it where even if it writes the code, you still have to understand the code well enough to debug it. You can't just fire an LLM at the problem and watch it melt into nothing. If I could get away with that, I would.
LLMs are a useful duck that can occasionally save you some time, but at this point they are not more than that. Maybe that day is coming, but for now we're definitely not there.
iwsw38xs@reddit
o3 compute is 186x that of o1: I think that day is further away than people care to admit (shh, there's $1tn on the line)
kenflingnor@reddit
If you dig in I’m sure that you’ll probably find that most of these people are more or less influencers that are involved with some AI tool that they’ll eventually be directly shilling
Noblesseux@reddit
That or they're like management/tech bro people who went to a conference and got excited about AI so they think it'll replace everything because they don't really understand the intricacies of other people's jobs. It's the same thing with art stuff too, most of the people who are obsessed with artists being "obsolete" have no idea what most artists and designers actually do.
A big part of my job as a Senior Engineer is taking a bunch of vague requirements from people who don't really actually know what they want and turning it into a concrete idea that can actually be practically made. Coding isn't the entirety of the job, a lot of it is having someone come to you with a genuinely stupid or half-baked idea and having to workshop it into something that makes sense.
tittywagon@reddit
I saw a guy yesterday with mid-senior experience (in years) saying he can't find a job after 6-8 months and AI can basically do his old job so what's the point.
teslas_love_pigeon@reddit
What's the point of doing anything when the heat death of the universe will happen in trillions of trillions of trillions of years from now?
hippydipster@reddit
Still waiting for an answer here...
turturtles@reddit
Sounds like he just sucks at his craft and his resume or interviews prove that.
PragmaticBoredom@reddit
In my experience, these people are often inexperienced (in skill, not necessarily YOE) developers who haven’t progressed far enough to separate themselves from LLM level output yet.
So many people, especially among the LinkedIn thoughtfluencer crowd, have operated for years in environments with low expectations and low demands. Often without realizing it. I think the jobs where you can get away with copying from StackOverflow and poking at code until it kind of works are becoming more rare and these people are waking up to that reality, although AI is just the bogeyman.
iamsooldithurts@reddit
The word you’re looking for is “talentless”, and the industry has been inundated by them since the 90s.
Snakeyb@reddit
This is my opinion too. I've said a few times that it reminds me of my time as a graphic designer/artworker. When I went into uni, it was seen as a (relatively) stable/reliable job. By the time I left, an event horizon had been crossed with the tooling available (mostly Adobe's doing), which meant that all of a sudden one good designer or artworker could absolutely motor through the undifferentiated heavy lifting of the job rather than relying on a flock of interns/juniors.
The jobs were still there but not for being "just" a pixel pusher who moved things around in InDesign/Photoshop and sent it to a printer/webpage.
PragmaticBoredom@reddit
A few jobs back they had a “Chief Design Officer” who wanted to operate this way. He had convinced the CEO to let him hire almost one designer for every two engineers, arguing that we didn’t want engineers bottlenecked waiting for designs.
It was unreal. Toward the end there were some tough conversations asking what all of these designers were really doing, with very little to show for it.
pheonixblade9@reddit
they've always got 3 or 4 roles in their job history where they were "CTO" of some random ass crypto company.
kenflingnor@reddit
lol yeah. I saw some guy on here a few days ago who said he “had some experience as a CTO” while also mentioning he was 26 in the same comment which gave me a chuckle
MinimumArmadillo2394@reddit
My favorite is people who say they have experience as a CEO, when all they did was start a company that got no revenue
obregol@reddit
Everybody is trying to use the buzzwords to attract engagement.
I got fed up with people using the word "cook", like "we are so cooked".
As other comments say, I usually find this type of posting as a natural filter.
jacobjp52285@reddit
I think we overestimate 2 years from now and underestimate 10…
Most coding tasks can be completed by AI now… but the limitation of AI is new ideas or determining what brings value. That along with the ability to leverage AI well will be what sets engineers apart in the future
Rivao@reddit
AI is just Google. It hasn't progressed; I would say it has regressed, giving bloated text, unlike at the beginning when it was very concise. To get what I need, I have to spend more and more time writing prompts. And it's so oriented towards pleasing that it often makes things up and repeats itself rather than honestly saying it cannot help. It's still a very useful tool, but it's not replacing anyone, as it lacks the "I" in AI. Nowadays, anytime I see someone saying AI can do my job, I just know that person has no understanding of what they're talking about. "AI" is overrated. It was impressive at the start, but I haven't really seen any big leaps forward if we are talking about the chat assistants.
Tuxedotux83@reddit
Those are karma farmers.
It looks identical to an incompetent company executive who signs up for one of those "AI newsletters" and just forwards each email newsletter to their employees as if they themselves had even glanced at the text (after reading the article and realizing it's a pile of nonsense, you realize they just forward these without actually reading them).
CoderMcCoderFace@reddit
LinkedIn is the most toxic of all social media, and I will die on that hill.
SizzlerWA@reddit
Worse than X?
CoderMcCoderFace@reddit
The people on X know what they are. I don’t think the drones on LI have an ounce of self awareness.
SizzlerWA@reddit
Self awareness is good. How do you experience them as drones?
hermajestyqoe@reddit
Yeah, I mean the big problem with people saying this is that, at the end of the day, the technical implementation still needs to happen, and these LLMs are not capable of actual implementation; they need a person to review and complete it. And you need a technical person for that, as even with the most simplified instructions, non-technical people get confused or overwhelmed by just about any computer-related task.
Now, it will certainly increase productivity of individual developers and lead to downward pressure on the overall number of jobs, but it isn't outright replacing positions.
clueless_IT_guy_1024@reddit
AI is never going to be able to unwrap the mess of business rules you have to reason about, especially if it's inefficient to begin with. Most of my day is doing more business-level work and figuring out what needs to be written, or debugging some legacy software
particlecore@reddit
This is because they are all fighting for the same few roles at FAANG companies.
Wtygrrr@reddit
Shows how bad they are.
CountZero2022@reddit
If at this time you are not able to get high quality, useful code from tools like GPT, Claude, or even Qwen Coder, then you are in trouble. You are walking dead and don’t know it.
10x engineers are now 100x.
1x engineers are now 0.
hippydipster@reddit
It seems crazy to say anything static about AI as one finds it today. Half of what one might say will quite likely be wrong in a year.
Militop@reddit
I have already declared myself useless post-AI, and I'm not posting on LinkedIn. However, I posted numerous best answers on Stackoverflow and created some open-source libraries with thousands, sometimes hundreds of thousands, of users.
I am not alone in these thoughts. You don't need to post on LinkedIn. I know now that the less I share, the less training there is, so there will be no more open-source contributions for me.
G_M81@reddit
I'm a developer of 20+ years; I've worked in defence, banking, and the last decade as a consultant with startups. I have fully embraced AI and LLMs, and I've seen them produce code in two hours that would have taken me two weeks. Even though as a consultant I was typically brought in to solve the challenging problems, that doesn't mask the fact that a lot of the code developers, myself included, write isn't intellectually challenging but more tedious than anything else. Just a few months ago I fed an LLM the 40-page PDF register map for an embedded camera chip and had it write the data structures and functions for the device. It just churned it out. Previously there would have been no quick way for me to have done that. At the very least, LLMs will drive up expectations in terms of developer productivity and drive down resource allocation (jobs), and subsequently pay.
There are some devs with their heads in the sand, but even those are starting to come around to the disruption about to hit our industry.
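For a sense of the kind of output being described, here is a minimal sketch in Python. The register names, addresses, and bus interface are hypothetical, not taken from the datasheet mentioned above; the point is simply what "data structures and functions for the device" tend to look like, including the byte-order detail that comes up later in this thread.

```python
# Hypothetical register definitions and accessors, the sort of thing an LLM
# might generate from a register-map PDF. Nothing here is from a real chip.
from dataclasses import dataclass
from typing import Protocol


class I2CBus(Protocol):
    """Whatever bus abstraction the surrounding project already exposes."""
    def read_reg(self, addr: int) -> int: ...
    def write_reg(self, addr: int, value: int) -> None: ...


@dataclass(frozen=True)
class Register:
    addr: int                 # register address from the datasheet
    width: int = 1            # width in bytes
    big_endian: bool = True   # the detail that is easy to get wrong


# Placeholder entries; a real map would have hundreds of these.
CHIP_ID = Register(addr=0x300A, width=2)
EXPOSURE = Register(addr=0x3500, width=3)
GAIN = Register(addr=0x3508, width=2)


def read_register(bus: I2CBus, reg: Register) -> int:
    """Read a multi-byte register, honouring its byte order."""
    raw = [bus.read_reg(reg.addr + i) for i in range(reg.width)]
    data = bytes(raw if reg.big_endian else reversed(raw))
    return int.from_bytes(data, "big")


def write_register(bus: I2CBus, reg: Register, value: int) -> None:
    """Write a multi-byte register, honouring its byte order."""
    data = value.to_bytes(reg.width, "big" if reg.big_endian else "little")
    for i, byte in enumerate(data):
        bus.write_reg(reg.addr + i, byte)
```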
otakudayo@reddit
This is the expectation a lot of people have of the LLMs when it comes to producing code. But the reality is that the code is often incomplete, overengineered, or it doesn't even solve the problem. And it usually doesn't take into account the overall system or requirements, even if you feed it the whole codebase (Usually not possible because of context windows, but even if your codebase is small enough to fit, the LLM will basically ignore a bunch of the information/code)
Yeah, it's a great tool. I'm probably more than 10x as productive as before. But part of that is being able to evaluate the LLM's output critically, which means you need to understand what the code does.
Writing a good prompt is a separate skill. You simply can't do the equivalent of "Hey chatGPT, make my app" unless it's something extremely trivial.
Synyster328@reddit
As they've said, the person taking your job will be a developer using AI.
G_M81@reddit
In the early part of my career, working on mission computer systems, the requirements were very formal and explicit: "The system shall return an error code 567 when the line voltage of the backplane drops below 120V." Having spent time with that, I find LLM prompting pretty natural in that regard. We were forced to ensure every single line of code was traceable to a requirement.
"Build me a CRM app" is pretty much a Garbage in garbage out prompt. Though even that is getting mitigated slightly with the "thinking" models o1, o3 etc.
lunacraz@reddit
the difference is... you have 20 years of experience. you can look at what it spits out and tell what's good, what's not, and adjust it accordingly
the issue is when someone without that experience does the same thing... that's where it falls apart
flck@reddit
Yeah, exactly. There is no way in hell GPT could replace my job today.. there's a huge amount of domain and cross-systems knowledge involved with what I do, but I absolutely use it for mindless tasks, Google replacement, or for exactly things like this, "Give me a node script to recursively process a directory full of CSV files, pull out fields X,Y,Z, recombine them in some way, output the results in this format, etc".
I always check what it's doing, and I could write it myself, but those requests do legitimately bring ~45 minutes down to 5 in a number of cases.
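Roughly the kind of throwaway script being described, sketched here in Python rather than Node; the column names, output format, and directory argument are all placeholders rather than anything from the comment above.

```python
# Walk a directory tree, pull a few fields out of every CSV, and write the
# combined rows to stdout. Hypothetical field names; adjust to taste.
import csv
import sys
from pathlib import Path

FIELDS = ["x", "y", "z"]  # placeholder column names


def collect_rows(root: Path):
    """Yield the selected fields from every CSV under root, recursively."""
    for csv_path in sorted(root.rglob("*.csv")):
        with csv_path.open(newline="") as f:
            for row in csv.DictReader(f):
                yield {field: row.get(field, "") for field in FIELDS}


def main() -> None:
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    writer = csv.DictWriter(sys.stdout, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(collect_rows(root))


if __name__ == "__main__":
    main()
```

It is exactly this class of small, verifiable script where "45 minutes down to 5" tends to hold up, because the author can read the whole thing and check it.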
creaturefeature16@reddit
My hot take: LLMs are power tools meant for power users. Sort of like if you get into construction and want to jump into heavy machinery and advanced power tools...uh, no. You need to first learn the fundamentals of construction before you can leverage those tools, otherwise you're going to get into a heap of trouble. You can't start with the high powered nail gun if you don't know where to actually place the nails. 😅
CVisionIsMyJam@reddit
in conclusion, you fed in a PDF register map and it got something as basic as byte endianness wrong. who knows what other bugs were present. i hope you had good test coverage. this feels like an irresponsible use of the tool.
honestly i do agree with you that developers who cram 20+ pages of a PDF into an LLM and then submit that work after a few tweaks will struggle to find work in the near future.
G_M81@reddit
I see CV in your name. It's like the desire for full self-driving: if my commute is two hours on the motorway and ten minutes on the side streets, and a car can drive on the motorway effectively despite its limitations on small streets, it should be acknowledged as a huge advantage/breakthrough. One then assumes that at some point in the future, given any level of progress, that other ten minutes of limitation will gradually trend towards zero.
G_M81@reddit
I think you are missing the point, though, which is that prior to an LLM there was no quick way of generating that code without wasting days on the most banal of churn activities. It's not that LLMs are big-bang, one-shot code machines. To expect not to have to, say, extract the pertinent information from the PDF prior to incorporation, or not to hit something like an endianness oversight, is naïve.
But any developers who fail to see the impact they are going to have are wilfully blind IMO. As Gretzky said, "Skate to where the puck is going, not where it has been." LLMs don't absolve people of all work or cognitive effort. But if I need to write ARM64 NEON assembler to speed up vector functions, I'm gonna turn to an LLM in some capacity and feed in some form of RAG data. Same goes for any complex regex that I have to write and can go decades between needing to. I've lost days of my life to regex. That becomes way less bothersome with LLMs.
steveoc64@reddit
That PDF parsing example is indeed impressive- really good use case for an LLM
That would be a huge amount of grunt work to do it manually
Conceptually that is a translation job - converting the info in the pdf from one form into another form, and you are right in saying that is 90% of what we do most times
It’s just that elusive other 10% that requires creating something novel and useful where we struggle.. and I don’t see LLMs making any progress in that area
Will be great when the hype settles down a bit, and we can focus on using AI for the grunt work, and spend more time being truly creative
I suspect it’s likely to go backwards a bit first, as people are going to mistake AI output as a substitute for real thinking, and auto-generate a pile of mess that needs time to clean up
I wish I could have more faith in human nature, but I simply don’t
pheonixblade9@reddit
agreed - my concern is that the skills to do the actual difficult work will atrophy if we aren't doing the foundational work underneath it.
ventilazer@reddit
do you use gippity or some other model for that?
G_M81@reddit
I have both Claude and Gippity. I'm pretty sure I used Claude initially, then Gippity to fix the byte endianness after the code had been generated.
I'll often use ChatGPT with a strict IDL interface as a contract and get it to develop Python with type hints and run it in its environment (so it can self-fix oversights), which allows me to develop solutions that are much larger than the context length. Once the solution is working I'll port it to Java, C++, Go or whatever I need to. If you stalk my profile you'll see a post about it.
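A minimal sketch of what that contract-first, typed-Python workflow might look like; the interface and class names below are hypothetical, not from the post being referenced. The idea is that the contract stays fixed and small while the model fills in one implementation at a time.

```python
# Pin the contract down with full type hints, hand only the contract plus one
# piece of work at a time to the model, and check its output against it.
from typing import Protocol


class RateTableLoader(Protocol):
    """Contract the generated code must satisfy (hypothetical example)."""

    def load(self, path: str) -> dict[str, float]:
        """Parse a rate table file into symbol -> rate."""
        ...


class CsvRateTableLoader:
    """The kind of small, self-contained implementation you'd ask the LLM for."""

    def load(self, path: str) -> dict[str, float]:
        rates: dict[str, float] = {}
        with open(path) as f:
            for line in f:
                symbol, value = line.strip().split(",")
                rates[symbol] = float(value)
        return rates


def check(loader: RateTableLoader) -> None:
    # A static checker (mypy/pyright) verifies the implementation against the
    # Protocol; at runtime you'd exercise the model's code against fixtures.
    pass


check(CsvRateTableLoader())
```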
ventilazer@reddit
these are the only two I use. I've never done any PDF feedings though. I believe if I paste too much code into Claude it automatically turns it into attachments or something like that. Going to experiment with it. I'll look into your profile, thanks.
G_M81@reddit
Yeah the pdf goes in as an attachment, but that is fine. I'll often prep the PDF so it's just the key data pages and not introductions and warranty disclaimers etc
G_M81@reddit
That's the one. It details how to use Python Gippity with an IDL to keep it honest
Ashken@reddit
I just don’t agree with your conclusion because I believe businesses will see this as means to hire more workers for higher output, not cutting workers to maintain output.
G_M81@reddit
Obviously I hope you are correct. But my understanding is that assuming any level of improvement in the coding abilities of large language models it will be akin to replacing horse drawn ploughs with tractors.
ventilazer@reddit
I have 8 years plough experience, am I cooked?
G_M81@reddit
Artisanal organic software developers. Non GMO software. We just need to spin that.
ventilazer@reddit
The Amish would love that
DeadPlutonium@reddit
Shh, let the self-selection process happen. If you’re worried, you probably should be worried.
TFenrir@reddit
You should look ahead and do research, figure out why some of the smartest people in the world are given pause by the latest model advances.
Looking at a model that you used last year and thinking "this is never going to take my job" is like looking at... Well basically any software and suggesting it will never get better.
I implore as many devs as possible to do real research on this topic. Look at the benchmarks being created specifically to test against harder and harder software dev challenges. Look at the trajectory of model improvement. It's staring you right in the face.
casey-primozic@reddit
Those are the same doom and gloom developers you find on cscareerquestions.
haasilein@reddit
More and more code will be generated using AI. Without a lot of skill, you will be able to pretty quickly put together some AI-generated code. Therefore, producing new code will become cheaper, leading to more code being created overall. We will see an increase in large codebases, AND...
Complexity and maintainability in large codebases are different beasts that require seniority and human intervention. Therefore, we should expect software engineers to shift more towards being platform engineers, keeping the system alive and maintainable for the AI.
So, I think software engineering will not go away; it will shift the focus area a bit, but AI will for sure have an impact to some extent
farox@reddit
Are you looking for a job right now? If so, how long?
eggZeppelin@reddit
I remember in the late 2000s when Machine Learning models were catching steam
People would use them on datasets to get insights
That you could also get with an SQL query
But then great use-cases like natural language image search arose
We're at a similar place where LLMs are doing cool things but not much better than code generation templating tools or a Google Search
I think there's gonna be a lot of grunt work that AI agents will do 1 million x better than humans
Like say you have 180 microservice repos that have a queue of dependabot PRs open
AI agents can fly through and test and apply all the critical updates
But if you ask an LLM "Build me this new feature, enabling this segment of users to perform this task"
It doesn't have the context of your infrastructure, product strategy or a way to iterate through product/UX/Scaling challenges the way real software is built
AchillesDev@reddit
There is some garbage 'advice' in this thread about using LinkedIn. If you're able to use it properly and not worry what nameless dorks here think, it's great for building and maintaining your professional network, getting leads (for jobs, customers if you're a founder esp. in B2B, or clients if you're independent). Of course, people who spend so much time posting here don't have much going on, so if that's what you want, follow their advice.
Anyone claiming that posting on LinkedIn or keeping it updated is a red flag is hoisting their own red flag that they're either super inexperienced or they (rightly) have no say in hiring.
For me, before I went independent, LI was the primary way I found jobs, kept in touch with old colleagues, and helped friends and old colleagues who lost jobs or whatever find their next spot. It's also been a great way to advertise my books, articles, and services.
soft_white_yosemite@reddit
On LinkedIn, I suspect not appearing to buy into AI is almost worse than being a bad dev.
God I hate this AI buzz right now. We can never have a good jump in technology without it turning into a circus.
I now miss the days when the crypto hype train was the only thing to roll my eyes over. It was annoying, but I could just ignore it.
nath1as@reddit
if you think your job won't be affected by AI in the next 5 years you are delusional
skidmark_zuckerberg@reddit
It will be, and the people who cannot use them will be the ones down and out. There is no question these will change the landscape of the industry, but not in the way you think. These will become commonplace tools developers will be expected to know how to harness and use to maximum effect.
B_L_A_C_K_M_A_L_E@reddit
I totally understand both extremes -- "AI will never be anything close to me!" and "AI will evolve into AGI and replace knowledge work!"
What I don't really understand is the middle position that you seem to have, which is something like "AI will remain a hammer I use to do my work." This sentiment in particular:
Why? If we think an AI system is intelligent enough to do all the implementing for you, why wouldn't you think it's smart enough to interpret what a human (client, project manager, whatever) is saying? I mean, if anything, LLMs right now are better at understanding arbitrary English requests than at writing perfectly functioning code.
If you're willing to be a bit abstract, you're like the present day "AI" for your boss. Your boss doesn't know much about programming, but he's got you to figure out the implementation. The AI maximalist would say that we will eventually cut out the middle man.
CloutVonnoghut@reddit
On a platform with fake recruiters and fake job listings, there are fake programmers too. The bots are trying to justify and enable the rigid hiring process. The real programmers are just trying to get engagement from recruiters, hiring managers, and jaded developers (oxymoron)
Main-Eagle-26@reddit
It’s tech sector hype from people who are trying to be social media influencers while cosplaying as software devs.
Loose-Potential-3597@reddit
They’re farming impressions like anyone else that posts on LinkedIn. Either it’s because they have no life or they’re selling a product
CryptosGoBrrr@reddit
Sure, AI and the overall quality of AI-driven tools are getting better. It's gotten to the point where I can point a machine at a source file and ask for a 100% code coverage unit test for said file. Great time savers and good enough for the dumb grunt work. But using AI to create entire (web) applications that are maintainable, have good/clean architecture, are scalable, etc.? Nah.
We survived RAD frameworks.
We survived low-code frameworks.
We survived no-code frameworks.
We'll survive the AI fad.
augburto@reddit
Honest question -- how many of you all take your LinkedIn seriously as a social platform? I only really use it when I am interviewing or recruiting for a team but I am seeing lots of my peers use it very actively even just to share news.
fknbtch@reddit
i personally know people in this industry using AI to write those posts and the posts are just filler to make them look active for more engagement so i roll my eyes every time.
Spare-Builder-355@reddit
For the sake of this post, I just tried this prompt on the free version of ChatGPT:
The resulting script indeed creates a project structure with the required files. It also creates src/index.css with the correct Tailwind setup, and then 30 lines later it overrides src/index.css with some plain CSS.
I'm not sure how anyone can trust anything produced by this tool.
GuessNope@reddit
Try some of the AI tools integrated into vscode.
They are getting better.
This is the best thing that's happened to coding since Intellisense.
lostmarinero@reddit
Also the agentic hype is kind of weird to see
"WHAT WILL WE DO WHEN THERE ARE NO ENTRY LEVEL JOBS??!?!? WE ARE GOING TO KILL AN ENTIRE GENERATION OF WORKERS"
And I'm like, let's hold on. First, agentic AI is far from trustworthy. AI is a great augmenter for workers right now, and it may change really quickly, but from what I've seen with AI, we are a ways off with agentic capabilities.
Secondly - humans adapt - so don't try to call something in advance so you can feel smart.
In 1930, economist John Maynard Keynes predicted that people would work 15 hours per week by 2030 - But we adapt.
https://www.npr.org/2015/08/13/432122637/keynes-predicted-we-would-be-working-15-hour-weeks-why-was-he-so-wrong
Anyways, people calling things right now are in my opinion people who want to swing in the dark and hope they hit something so they can feel smart later
exploradorobservador@reddit
https://crawshaw.io/blog/programming-with-llms
I found this to be a good summary of how I find LLMs fitting into my daily routine.
Icy-Injury5857@reddit
I think the more accurate statement is 'AI can do basic coding tasks, H1B contractors are obsolete'.
GronklyTheSnerd@reddit
As far as I can tell, they’re far closer to replacing managers.
PotentialCopy56@reddit
AI won't replace your job but it's funny when a dev says AI is useless. I'm sure devs said the same thing about IDEs when they first came out.
pedatn@reddit
Yeah that’s just engagement bait. AI assistants have improved a lot in the past year, and in the last two months to the point that you could probably lose a junior team member and not lose productivity. They’ll probably get better still, to the point where you can replace multiple or all junior devs on a team.
MangoTamer@reddit
I've been blown away by some of the autocomplete suggestions from visual studio lately because there's just no way they could have suggested that unless they understood the rest of my code. Beautiful. I love it. But I'm also not about to go out and make another one of those AI is going to replace all of us posts because, first of all, obviously. And second, what a cliche. There's a million of those posts already. We don't need one more.
ghoststrat@reddit
They have other motives.
BorderKeeper@reddit
Today I had to run a simulation of whether users of my app can get rate-limited by the GitHub API, which we use to store our installer and handle versioning.
We have normal polling, polling when rate-limited, a random interval at start to space users out, and a bunch more caveats.
The Python script worked first try, even with GraphQL, with one minor mistake that I fixed. Saved me a day of work, so kudos. Usually though, when I use it on the more niche problems my app is facing, I don't even bother asking.
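For context, a minimal sketch of the kind of polling behaviour being simulated, using the documented X-RateLimit-* headers on the REST side; the repository URL and intervals are placeholders, not the commenter's actual setup (which used GraphQL).

```python
# Poll a releases endpoint with startup jitter, and back off when the API
# reports that the rate-limit window is exhausted.
import random
import time

import requests

RELEASES_URL = "https://api.github.com/repos/example-org/example-app/releases/latest"
NORMAL_INTERVAL = 15 * 60        # seconds between normal polls
RATE_LIMITED_INTERVAL = 60 * 60  # fallback wait once we've been limited


def poll_forever() -> None:
    # Random startup delay so a fleet of clients doesn't poll in lockstep.
    time.sleep(random.uniform(0, NORMAL_INTERVAL))
    while True:
        resp = requests.get(RELEASES_URL, timeout=10)
        remaining = int(resp.headers.get("X-RateLimit-Remaining", "1"))
        if resp.status_code == 403 and remaining == 0:
            # Rate limited: wait until the window resets (plus a safety floor).
            reset = int(resp.headers.get("X-RateLimit-Reset", "0"))
            time.sleep(max(reset - time.time(), RATE_LIMITED_INTERVAL))
        else:
            resp.raise_for_status()
            print("latest tag:", resp.json().get("tag_name"))
            time.sleep(NORMAL_INTERVAL)


if __name__ == "__main__":
    poll_forever()
```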
sunny_tomato_farm@reddit
They’re just looking for social media clicks.
sozer-keyse@reddit
Classic social media victim mentality, nothing new.
Times change and technology marches on. Learn how to adapt, or better yet, get as far ahead of the curve as possible.
reddit_again_ugh_no@reddit
I've been using Copilot and ChatGPT consistently in my job, and they are a great help, but they are not a replacement for a human developer.
bloudraak@reddit
It’s not that they replace human developers with AI.
It's that AI makes certain folks way more productive, thus reducing the need to hire more folks to deliver the same value. The number of available positions declines, and as such some folks will be denied career progression due to a smaller pool of available jobs.
Ms-Architect@reddit
I agree with your post. In every stage of my career I made sure that the work I was doing could not be replaced by automation. At my very first job, when I needed to refactor code from C to C++, I wrote a script to do it. I don't see why AI is any different; it's a great tool that we can take advantage of to free up our time for more interesting work. It's not going to replace us.
loumf@reddit
To paraphrase Henry Ford: “whether you think you’ll be replaced by AI or not, you’re right”
pheonixblade9@reddit
I have over a decade of experience (mostly at big tech) and I'm looking for a job. The vast majority of roles are basically "we moved too fast and used a bunch of contractors/AI/juniors without mentorship to write our codebase and things are falling over and we need some Real Engineering (TM) muscle to come in and lead things in the right direction and pay back years of technical debt"
to be honest, unless AI has another massive generational leap in the next 5 years, I only see my career prospects improving. really sucks for the current generation of juniors, though. combination of nearsourcing/outsourcing/LLMs/H1Bs are gonna destroy the next generation of talent. Companies should be investing now.
Emergency-Noise4318@reddit
The 200 a month version of chatgpt is insane. Don’t sleep on it. It can do everything.
bloudraak@reddit
I really hate the animosity towards progress in this industry. It's always this vs that; talking down at folks. Maybe that's why they call our industry immature.
My view is that if most engineers around me had a sense of discovery and asked AI about the challenges they faced every day, my position wouldn't be needed. Much of the information is public knowledge, they just have to take initiative to ask questions; explore the answers, connect the dots and dive deeper. But alas, many don't even start. But they sure have opinions...
gowithflow192@reddit
I can't help but laugh at devs who think AI will never threaten their jobs. Hubris.
chmod-77@reddit
Imagine accounts in the 80s doing this when spreadsheets were invented.
Jdonavan@reddit
Every single time I see an "experienced" developer post this I also can't help but laugh. It tells me they're not actually experienced and are just shoving their heads in the sand. You chuckleheads look at consumer-facing products and think that's all there is.
If you actually took the time to REALLY learn how the tool works and what it's capable of, you'd know that an ACTUAL developer who has put in that time and effort is easily 2-3x as productive. You'd know that LLMs CAN write good code when given good instructions. But no, you'll keep your head in the sand and get left behind.
F1B3R0PT1C@reddit
My product owner regularly sabotages our work by running his thoughts through chatGPT and slapping the results into design documents and story descriptions. So much word vomit and inconsistencies, and when we do get our PO’s own thoughts they are usually just a fragment of a sentence rather than a complete thought… If they’re gonna replace engineers with this thing then they have a looooot of work to do still.
SituationSoap@reddit
This entire industry used to be filled with people who would proudly brag about how the vast majority of their job was copying and pasting things from Stack Overflow until it did what they wanted.
A huge percentage of developers have never built up understanding of how this stuff works. Ever.
ItsOkILoveYouMYbb@reddit
GPT and other coding-capable LLMs (that I've been able to try in depth, which is not all of them) are as useful as you are knowledgeable about what you're asking for help with. If an incompetent, cheap offshore developer is asking it everything, all it's going to do is make their incompetence more productive. I would argue that creates excessive costs over just a couple of years. They build fragile products and struggle to maintain existing ones, just faster.
Where it's useful is saving time for and enabling competent devs/engineers, as it can cut down on time spent researching and reading docs (especially if you can have it summarize). But to do that, it takes a lot of time to give enough details and specifics such that it gives you an actually useful analysis. And I've noticed the more specific and detailed your prompt is (the better your English is), the more helpful and accurate responses it gives you.
But you still need the knowledge and expertise to spot when it's giving you now very subtle hallucinations, and you need to be able to know when you haven't given it enough context. And in the case of cheap labor using it (think from a clueless MBA mindset seeing IT as a cost center rather than as infrastructure), you don't know what you don't know.
What this means is it's a productivity tool only as good as the user, same as many others.
What I fear it cuts into is what was already being cut into, which is Junior and entry level roles. What was destroying that to begin with was all companies being forced to hire at market rate, but not being forced to retain at market rate, so many orgs see Jr level roles as not worth the cost. I feel these LLMs do help cover that gap a bit, so it becomes even harder to break into this career.. If you're obsessed with being fully transparent and honest, anyway. If you're not, then LLMs can be a powerful tool for a competent and smart person who simply lacks experience.
Jaryd7@reddit
I'm just thinking about how those AI tools will inevitably get worse over time.
If everybody is using AI to generate code, there will only be such code to learn from, so the AIs end up learning from each other, which reinforces bad code in their datasets.
You generate bad code using them, then publish it, and the next AI learns from that code and generates even worse code. A vicious cycle.
I personally think these AIs have probably reached a plateau in their coding abilities and it's only downhill from there.
Developers will never be useless.
The-Ball-23@reddit
A lot of these people on my feed are “developer advocates” who don’t write real code. So yeah, it’s actually funny when I read them on LinkedIn
AfraidOfArguing@reddit
LinkedIn runs disinformation campaigns for our megacorps. They bubbled up so much goddamn "Return to office" nonsense that my blood pressure would rise if I got a LinkedIn notification.
Pristine-Campaign608@reddit
They just bought into the AI ponzi scheme.
copilot is a scam
Necessary_Reality_50@reddit
ChatGPT is a search engine. It's the new Google. I can search for how to do something, and it gives an example, just like stackoverflow does, but better. That's all it is folks.
If you aren't using it as part of your daily workflow, I dunno what to tell you. Other devs will be working faster than you.
skidmark_zuckerberg@reddit
Right? I've been saying this for some time now. It's a tool, and it's better to be comfortable with these LLMs than to be a Scrooge and detest them. It's much more efficient to look up information with GPT or whatever than it is to comb through multiple web searches. What took an afternoon of Googling and reading now takes 20 minutes. No one is less of a developer because they get their answers from StackOverflow, same with LLMs.
Eventually these things will be commonplace and the developers who spent years talking down about them or the people making use of them, will be down and out. The con however is that Junior devs do not learn as we did without them. There’s a lot of AI slop out there that less experienced people cannot tell is good or bad. This is where the bad taste comes from with more experienced people I think. Any experienced dev worth a damn can take a problem, use AI, and tell you right away what is good and bad. Juniors typically lack this ability.
Sufficient_Nutrients@reddit
ChatGPT, yes. But it remains to be seen what LLM-powered agents can do.
I'm bearish on language models, but have not declared victory yet.
AILearningMachine@reddit
The AI you see now has nothing to do with the AI that will be used three months from now.
We need to think about what our society is going to look like.
Sensitive-Ear-3896@reddit
Is it possible that this is stealth marketing?
blizzacane85@reddit
Al is best at selling shoes or scoring 4 touchdowns in a single game for Polk High during the 1966 city championship
paradite@reddit
There is a chance that you are talking about me...
I am a software engineer with 7 YOE at big tech and startups, and I'm building my own startup.
There are a lot of reasons why AI is going to replace software engineers and here are my thoughts scattered across several LinkedIn posts:
TsangChiGollum@reddit
You couldn't have tortured this information out of me
Ashken@reddit
It takes Devin 15 minutes to not push to main, I think we’re fine.
LongjumpingCollar505@reddit
After making all the repos you trusted it with public, and after running an open s3 bucket on their demo site. They aren't the most security conscious company out there.....
AloneMathematician28@reddit
Unfortunately they don’t walk the talk. I’d wish for them to actually follow through and replace their devs with language models. Glhf
Sheldor5@reddit
LinkedIn is a social media platform just like Facebook
I ignore it just like all other social media platforms
greensodacan@reddit
They're trying to sell companies, not software.
Ideally, you attract an entrepreneur who's willing to pay a salary long enough to get something copyrightable on paper, at worst a working prototype. Then they sell the company and all of its assets to someone else as quickly as possible.
It's not about actual software; by the time you start coding, you're worrying about crap like product/market fit, and that gets expensive real quick.
Regardless of if the company sells, the engineer walks away with a C level position on their resume and whatever salary they were paid for whatever amount of time they worked. Maybe stock options if you want a chuckle.
The entrepreneur (knowing full well the whole thing was a gamble) gets a line on their resume too, a copyright they can sue other companies over (aiming for settlements, really), and maybe a trademark; bonus points if it includes "AI", "Blockchain", or the letter "X".
If everything goes well, everyone gets rich.
Effectively the greater fool theory at work.
just_looking_aroun@reddit
On the bright side they’ll scare off people and we’ll maintain high salaries