What's the actual long-term future of the field? Seeing through the noise.
Posted by No-External3221@reddit | ExperiencedDevs | 159 comments
It seems like every year there is a new view of the field and where it is heading. Pre-2022, it was the field to be in, with a long future and excellent opportunities. Since then, it has been framed as a hellscape: high competition, a lack of jobs, offshoring + AI, etc.
I'm interested in where the field will be not in a year, but 10, 20, 30 years from now, as a long-term commitment. In other words, is it a strong field going through some momentary troubles, or is it Blockbuster in 2013?
Personally, I see a few longer-term trends at play:
- The ownership/management class are dead-set on making labor as cheap as possible, be it through offshoring, automation (which includes AI), etc.
- Dev work has basically unlimited demand, as there will always be a desire for new/better software. Increasing the amount of work that a single dev can do will eventually open up more work to be done.
- Nationalism is increasing worldwide, meaning that countries' governments will want to keep jobs within their countries. However, the internet makes it very easy to offshore despite that. I'd expect it to continue.
- The skillset of being a good dev is still rare and difficult to obtain. At the higher levels, it is similar to that of being in management/an entrepreneur (taking ambiguous goals, converting them into a product, leading teams, etc.). I expect it to remain a valuable skill, but perhaps see the requirements increase.
Overall, I expect to see more of the lower rungs of the ladder get chopped off, while those at the top become extremely valuable (and well compensated and competed over as a result). I expect this to be a long-term trend, unless we have another industrial revolution that overshadows the value of computers.
What are your thoughts?
Exotic_eminence@reddit
I don’t want to work for assholes anymore even if they are hiring again
EmotionalQuestions@reddit
I'm lucky to be in a position where I can be very choosy and I totally agree. I put up with so much crap and I'm over it.
weird_after_taste@reddit
Can u elaborate?
EmotionalQuestions@reddit
On which part?
weird_after_taste@reddit
Just wanna hear what you went through. Idk why I’m downvoted lol
EmotionalQuestions@reddit
My mistake was spending a huge chunk of my 25 year career at the same big tech Co bc it was easy, comfortable, paid well, and was 5 minutes from home. I got lazy about looking elsewhere and should have gotten out sooner.
Bad managers, constant reorgs, slow promotion velocity, lots of layoffs post-pandemic, pressure and unreasonable deadlines, felt like a robot, not a person.
My current role pays significantly less, but it's the nicest environment I've ever worked in. That's worth more to me than TC now that I'm fairly close to retirement.
Intelligent_Part101@reddit
Your years at that highly paid company enabled you to take a lower paying job now. Wasn't /all/ a waste.
EmotionalQuestions@reddit
Agree, for sure. I just should have bounced around to other big companies more while I was younger and still lived in a great city for tech opportunities. Current city sucks for that.
weird_after_taste@reddit
Currently in a similar position. I'm 5 years into my career and made senior, but career trajectory slows significantly after the title "senior" because of politics. Thinking of leaving.
Intelligent_Part101@reddit
Got news for you. At most places, SW devs advance quickly but hit their ceiling quickly too. Jobs past pure individual contributor require political smarts because it's not just about technology at that point but also about people and their vague goals.
EnthusiasmWeak5531@reddit
I finally gave up trying to remain an individual contributor and took a team lead position at a fairly large bank in May. Probably sold my soul a bit but the market made me want to give up and get out.
EmotionalQuestions@reddit
It's a good time to move.
ConstantExisting424@reddit
same here, hopefully we get to a point where AI is doing the hiring and I can work for them!
Exotic_eminence@reddit
There will come a point when the very few people who control the rest of the world and the clankers no longer have control over either - hopefully the clankers don’t rule us all one day
Spirited-Camel9378@reddit
We’re at the point where you make the big bucks if you set aside all ethics. Don’t work for assholes, don’t work for sociopaths, don’t work for companies that use people as a product, don’t work for companies building death machines, don’t work for companies funded by psychos with blood bots, don’t work for companies that promote fascism.
Tough line to walk!
EnderMB@reddit
I feel that this might be a driving mindset for the next decade.
Big tech showed itself as fundamentally not caring about tech in the way that people evangelized it to care. As such, the best candidates won't join, and these companies will see a slow decline, driven by their own erosion of their best talent.
JaguarOrdinary1570@reddit
This has definitely turned me off of the "prestige" tech companies of the past two decades. None of them seem to have any kind of plan anymore, nor any ability to commit to the few they make.
The MBAs there who've fully squeezed all of the technical people out of any design and decision making will assure you that it's being smart and staying agile/competitive in a very turbulent market, but they're all just pushing their people for quick "results" to put on their resumes and gtfo before their own management does the same to them.
It's just a massive perpetual fake work generating machine. Sure I could be tempted to join for a year for a juicy Meta AI-tier salary, but I like actually building real things that work, so I'd be out as soon as my contract allowed it.
Exotic_eminence@reddit
Boom Shankar
Har Har Mahadev
intertubeluber@reddit
My experience is that for a bunch of reasons you generally deal with more assholes as a consultant. I bring this up because of your flair. Have you considered going perm somewhere?
Exotic_eminence@reddit
I’m a permanent consultant lol and you are so right - the thing is I work with them not for them.
My bosses are fucking awesome 😎 and I am lucky to have them
matthra@reddit
I don't think any of us can see past the current LLM shaped elephant in the room. Having lived through the dot com bubble I see a lot of similarities. If the similarities end up being more than surface deep, we will see entirely novel specializations emerge.
We just don't have enough context on how things will be implemented to say what the demands of the field will be in the future.
damnburglar@reddit
We won't know anything until we find out how much it will cost to use these LLMs once their owners decide it's time to recoup costs and make a profit.
Saint_Nitouche@reddit
The models are not particularly expensive to run inference on. You can get semi-useful models on consumer hardware; Altman has said they would be wildly profitable on inference alone. If bosses wanted, they could absolutely buy some GPUs and host an office instance of Deepseek R1.
Training is the real bitch and why the numbers are so bonkers. Either we keep training bigger and bigger models and the economy becomes a black hole, or things plateau. Wildcard is that unprecedented optimisations are found, but that is not the kind of thing to reasonably make predictions about.
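For concreteness, here's a minimal sketch of what "an office instance" looks like from the application side. llama.cpp, vLLM, and Ollama all expose OpenAI-compatible HTTP endpoints, so the client code doesn't care whose GPUs are behind it; the server URL and model tag below are hypothetical placeholders, not a recommendation:

```python
# Minimal sketch: querying a self-hosted model through an
# OpenAI-compatible endpoint (llama.cpp / vLLM / Ollama all provide one).
# The base_url and model tag are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://llm.internal.example:8000/v1",  # the office server
    api_key="unused",  # local servers typically ignore the key
)

resp = client.chat.completions.create(
    model="deepseek-r1",  # whatever tag the server registered
    messages=[{"role": "user", "content": "Summarize this stack trace: ..."}],
)
print(resp.choices[0].message.content)
```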
weIIokay38@reddit
Things measurably plateaued a while ago. We unfortunately no longer have a way to reliably test models because their benchmarking datasets somehow become included in the training datasets, so we've had to mainly go based off of "vibes". But even the benchmarks we do have show a leveling off of performance over the past year and a half of model updates. GPT-5 was the final nail in the coffin, showing pretty clearly and obviously that we are at the peak of the current paradigm.
Will they discover something else? Maybe. But billions upon billions of dollars have been spent training the current paradigm. That has to be made back somehow. And there is not a "killer app" other than unprofitable models like ChatGPT that lose money for every extra user they have. OpenAI is on track to spend $100 billion this year. They cannot and will not earn that money back through consumer spending; it just isn't worth $20 a month to a majority of people.
shared_ptr@reddit
TL;DR: models are still massively improving, from what we can measure.
I know you’ve discounted benchmarks so I won’t argue about that. What I will say is I’ve spent the last 1.5 years building AI applications and every foundational model upgrade has substantially improved the performance of our application as measured by our internal suite of evals (a really large dataset).
The model providers don’t have access to our data for benchmarking so the models haven’t been overfitted to our data.
From what we can measure, the improvements to frontier models are still substantial with each upgrade; it's just that going from 80% to 90% on a task feels less substantial than what LLMs did going from 0% to 50%. In practice, though, the final few percent are many times harder, and every step toward a 0% error rate these models make opens up more opportunities to use them.
Ok-Importance4644@reddit
Your comment is contradictory, how can models be "massively improving" when you mention that we've gone from improvements like 1% -> 50% of a task (a 4900% improvement) to 80% -> 90% (a 12.5% improvement...) in single generations lol
shared_ptr@reddit
It’s massively improving because this is about making products viable that were not before.
We're building an automated incident debugger (AI SRE, if that's a term you've heard). An investigation system that is wrong 50% of the time is honestly quite useless; at 80% it starts being pretty useful, at 90% extremely useful, and at 95% you start considering automating remediation because it's as good as or better than humans.
And while we see a bunch of our evals that were previously failing start passing, we also see the quality of the previous responses improve. And the consistency too.
In most systems a frontier model release that reduces the error rate by 50% is huge, even if that means the pass rate on evals goes from 80% to 90%. The long tail on hard problems is often where the value is, and every release is making material gains on those.
shared_ptr@reddit
I think the best example of this is on AI code agent ability to complete tasks, where tasks are measured in terms of how long they would take a human engineer to complete, and a model 'wins' if it can complete that task >50% of the time.
It's a great proxy for how essential that error rate is for complex AI tasks, as getting the frontier models to drop the error rate makes huge differences when you're wiring up hundreds of sequential model calls in a code agent and errors compound.
The latest study, from March 2025 (when the best-in-class model was Sonnet 3.7, since replaced by Sonnet 4 and Opus 4.1), shows that the complexity of task the models can complete roughly doubles every 201 days. That's insane; I don't know how anyone can claim that's not substantial improvement.
https://arxiv.org/html/2503.14499v1
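To make the compounding argument concrete, here's back-of-envelope arithmetic (my own illustration, not numbers from the paper): moving a per-step pass rate from 80% to 90% halves the error rate, and over a chain of dependent calls that halving dominates, while the doubling claim compounds the reachable task length:

```python
# Back-of-envelope illustration (not from the paper).
# 1) Why halving the per-step error rate matters when calls are chained:
for per_step in (0.80, 0.90, 0.95):
    for steps in (10, 50):
        print(f"pass={per_step:.0%}, steps={steps}: "
              f"whole chain succeeds {per_step ** steps:.1%} of the time")

# 2) What "doubles every 201 days" implies, from a hypothetical 1h horizon:
horizon_min, doubling_days = 60, 201
for years in (1, 2, 5):
    tasks = horizon_min * 2 ** (365 * years / doubling_days)
    print(f"after {years}y: ~{tasks:,.0f}-minute tasks")
```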
jbroski215@reddit
Ive also worked on frontier models in a couple of contexts, and some of the results have been very impressive. I think there's a future scaling problem here though - as AI is able to handle more complex tasks, it will push applications themselves to become even more complex; if you are able to use AI to quickly build an app, everyone else will be able to as well. The only way to create a unique value proposition will be to create something beyond the capability of AI. To keep up, AI will need to be trained on more data to increase complexity and handle the newer, more performant architectures and ideas that people think up. But the cost and energy usage of continuously retraining models is already incredibly high, so unless renewable energy becomes a substantially higher amount of the power grid, we hit a plateau in complexity, or someone figures out a far more efficient method of training new models, AI will start to lag human advancement. That or distilling models and doing enhanced/reinforcement learning of some kind will become stopgaps between model versions, which will be released much less frequently.
shared_ptr@reddit
I’d agree that future improvements aren’t guaranteed, and that the context may change such that the improvements may not be as tractable.
I was mostly objecting to the idea that progress has slowed/stopped when I'm watching the numbers go up daily. Don't know where people quote this from!
damnburglar@reddit
This is not my domain of expertise so maybe my experience is not the best example. Take my input with a grain of salt.
Whether you’re running training or inference you’re still paying for the same instance time if you’re on something like EC2. Presumably if you own all of the hardware you get variable costs due to power load fluctuations but that I can’t speak on. Even if you aren’t paying for training and that saves a ton, the providers are going to want to recoup every cent plus profit, they won’t just give the models away for free. Or maybe they’ll all take a page from DeepSeek and distribute distilled versions.
I had a 24GB instance running Deepseek R1, and if we were to run it for the full month I think the costs come in upwards of a grand, and the output was just not something I would rely on.
Now, a grand a month is not an issue for most companies I’m sure, but even if the output was good you’re not going to run an office off a single instance. You certainly won’t power a busy service off of one either.
I don’t know what the future holds but my monkey brain can’t fathom how this is in any way sustainable.
quentech@reddit
Isn't training mainly done on GPUs or similarly built AI cards, not on general-purpose CPUs?
I don't think EC2 would be of any use for training.
damnburglar@reddit
I was using G5 I think, which does have GPU.
BatForge_Alex@reddit
There are EC2 instances specifically for this, I think it's the G instance types
Tired__Dev@reddit
My ASUS M16 with a laptop RTX 4090 struggled to run DeepSeek; any model it could handle was barely usable. With 3 A100s I was getting the DeepSeek 70-billion-parameter model to run reasonably, and it wasn't as good as GPT-3 imo.
LLMs on consumer hardware aren't viable even for those of us who are rather well paid.
79215185-1feb-44c6@reddit
I run Qwen3 Coder 30B locally off of CPU (a 7950X3D). I get nearly 30 tokens a second.
This is more than enough for what I use AI for but accuracy is a bit lacking.
On the other hand I also have a Gitlab Duo Premium seat from work. It ends up being like $50/mo. The service could be better but I pay nothing for that.
I get around 9 tokens/s off of Qwen3 Coder 30B on an "ancient" dual-socket 8160 system with 512GB of RAM. I am trying to see if I can get a larger MoE model going on this so my coworkers can leverage it as well.
If you are not building an entire application with these models then low token rates are fine. I'm still trying to wrap my head around some of these agentic workflows, MCPs, and other tooling because I'm struggling to find ways to integrate them into my workflow.
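In case anyone wants to reproduce those tokens/s numbers: Ollama reports eval counts and durations with each response, so a rough throughput check is only a few lines. The model tag below is an example, and eval_count/eval_duration are fields from Ollama's REST API (tokens and nanoseconds respectively):

```python
# Rough tokens/s check against a local Ollama instance.
# Model tag is an example; eval_count / eval_duration (nanoseconds)
# come from Ollama's REST API.
import ollama

resp = ollama.chat(
    model="qwen3-coder:30b",
    messages=[{"role": "user", "content": "Write a binary search in C."}],
)
tokens = resp["eval_count"]
seconds = resp["eval_duration"] / 1e9
print(f"{tokens} tokens in {seconds:.1f}s -> {tokens / seconds:.1f} tok/s")
```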
belkh@reddit
Something to keep in mind is that a lot of people using LLMs are going the agentic route, where you stuff files and context documents into the conversation; 30 t/s turns into sub-1 t/s once you approach larger contexts, especially on CPU inference, where prefill is going to take minutes as well.
freedom2adventure@reddit
Check out pydantic-ai; the tools make more sense than building out MCP servers etc. I use it for a few of my own agents.
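For anyone evaluating it, a minimal sketch of that tool pattern (the model string and the tool itself are made-up examples, and the result attribute has been renamed across pydantic-ai releases, so check the docs for your version):

```python
# Minimal pydantic-ai agent with one tool; the model string and the
# tool itself are illustrative placeholders.
from pydantic_ai import Agent

agent = Agent(
    "openai:gpt-4o",
    system_prompt="You answer questions about the local codebase.",
)

@agent.tool_plain  # a "plain" tool: needs no run context
def count_lines(path: str) -> int:
    """Count the lines in a source file."""
    with open(path) as f:
        return sum(1 for _ in f)

result = agent.run_sync("How long is main.py?")
print(result.output)  # called `.data` in older releases
```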
damnburglar@reddit
Thank you for the great insight and perspective.
intertubeluber@reddit
This is a good take. Nobody knows.
As an aside, I use this technique in hiring. Ask candidates questions they don't/can't know the answer to and see how they respond. If they at least hint that they don't know, that's a good sign. Answering with confidence is a good proxy for a few toxic traits: high ego and/or an imbalanced confidence-to-capability ratio.
greensodacan@reddit
Remember, the original Lost In Space took place in 1997. The Jetsons took place in 2062.
Blankaccount111@reddit
Fixing lots of undocumented broken AI slop.
Dziadzios@reddit
I disagree. Every demand is limited. This is the biggest problem with capitalism that is frequently overlooked: it can't handle overly fulfilled demand well, because fulfilled demand reduces demand for labor. And then people get a shitty job market.
jbroski215@reddit
Become an expert in how LLMs and SLMs work under the hood, figure out the best use cases for each, and use them to help you work faster. Another comment mentioned that SWEs will become more specialized, similar to what happened in the wake of the dot-com bubble. Realistically, SWEs will be needed to design applications, optimize them, and ensure the proper security is in place, both for external API calls and in the form of guardrails for AI. The need to develop code will not disappear, but you will do far more value-added, application-specific code and code reviews of AI-generated code. This is great IMO, as the amount of time I've spent on boilerplate or iterations on existing code over the course of my career, even with Emmet/autocomplete to cut down on development time, is nutty.
SWEs like to talk a lot about how AI generates sloppy code. They're right. But many applications don't benefit much from algorithmic perfection, and those that do often don't need it to start. Learn to tell the difference.
In a way, this will be the best time for developers ever. You'll need to become more business-minded, though, as the amount that companies want to spend on the best and brightest DSA-obsessed leetcoders is falling quickly. Many AI implementations right now are just smokescreens for offshoring - the AI projects will lose funding or become more limited once as much SWE work as possible has been moved to India, the Philippines, etc. If you want to keep making the FAANG bucks, you'll need to offer more than someone based in a low-cost area.
Source: just hit 15 years in SWE, in particular AI/ML, and have built multiple LLM-integrated apps that actually work. Have also worked as a hiring manager as well as independent consultant in the space working with c-suite level employees.
t1mmen@reddit
I’ve written code for close to 30 years. My working hypothesis?
That part of the work is over any minute now, though most don't see it, or if they do, don't want to believe it.
Dead man walking, though I don't know how long till the body hits the floor (humans be stubborn, stupid, and very intent on resisting uncomfortable change).
The old, new programming language is your native language. Punching keys is no longer needed, and is in fact slowing you down a lot.
Code is an implementation detail you don’t need to care (much) for. The idea, planning, spec & boundaries around it is nearly all that matters.
Cost of execution on (digital) products will plummet. Personalized, as-needed software will quickly become the norm.
Vast majority of us won't have a job (per the classical definition). Not just us developers; I mean everyone working digitally, by and large.
Our whole world is about to flip on its head, in a very volatile way, and our level of readiness for that is… practically non-existent.
IF we make it through, the digital space will lose its importance. Literally everything you can think of, you can watch, play, listen to. Personalized, exactly how you want it, when you want it.
But it won’t be real, so we’ll either chase that rabbit all the way into deep darkness of the rabbit hole, or we’ll wake up and hopefully return to the real world, focused on what actually matters.
A question we barely had time to ask ourselves in the non-stop chaos of today.
What is the meaning of life?
By all means, scramble together what you can before shit hits the fan, but covered in shit, we will be. I just hope something good grows out of the manure.
t1mmen@reddit
PS: If you read this and it resonated but you don’t know what to do, the path most viable (imo) is to focus on communication, critical thinking, problem identification & solutions. Macro and micro perspectives.
This seems most valuable, no matter if you’re targeting man or machine.
Can't wait to see how this comment ages in 1-3-5 years :)
Anime_Lover_1991@reddit
I always see variations of what you have posted: that the lower-rung/junior devs will get replaced and only the ones at the top will survive and be well compensated. But I've never been able to understand this: those seniors will eventually have to retire, and someone has to replace them. Who will it be, if not new juniors coming through?
The "good dev" definition is too vague. Who is a good dev? Is a junior with analytical skills a good dev, or one with good communication skills? Someone can be good at one but not the other. Who in the future will go through all this grind and learn from it? At one point we were all shit devs; hell, I still won't call myself a "good dev". I am average at best, but I get the job done, all because I have scraped through years of practice and continuous improvement.
Understanding-Fair@reddit
The thing I've noticed while using these models to generate useful and maintainable source code is that it still requires a ton of specialized knowledge to craft the prompt and just to know the right questions to ask. People with dedicated software engineering skills will always have a leg up on people with none when it comes to creating software, AI assist or not, because they know the right avenues to take, and those that you should avoid.
ICantBelieveItsNotEC@reddit
I think that the drawbridge is going to be pulled up.
The fundamental problem at the moment is that there's an oversupply of code monkeys and an undersupply of serious software engineers. It's a bad state of affairs for everyone involved. There are A LOT of bad developers out there, and we don't really have a reliable way to screen them out. It's kind of crazy that we expect society to just be okay with constant service outages and data breaches - can you imagine if structural engineering firms had one or two catastrophic failures every year?
The obvious solution is to make "software engineer" a protected title, just like chartered engineers, accountants, doctors, etc. I think this is almost certainly going to happen in the next decade or so. It'll be good for experienced devs who have the credentials to get accreditation, but terrible for people who want to use tech as an avenue for social mobility.
86448855@reddit
Business doesn't care what's under the hood; you can implement the shittiest solution as long as it works and can be delivered fast.
floral_disruptor@reddit
in the big picture, business isn't the only stakeholder, there's a duty to protect. that's what laws (try to) do
unordinarilyboring@reddit
Being able to deliver fast is pretty directly correlated to how many pieces of duct tape are covering the missing spots in your Jenga tower.
LongIslandLAG@reddit
I don't see the corporations that own our politicians going for that. How do you outsource that to the lowest bidder? Also, they don't want their devs turning around and saying things like "I'm not risking my license for you".
death_in_the_ocean@reddit
Also bad for young developers that are good at the engineering bit and not just coding
Bakoro@reddit
We just need a tiered system where critical systems that could kill people, and important financial infrastructure need a licensed software engineer, and all the other stuff can be done by unlicensed software developers.
We don't need licensed engineers making video games, or Photoshop, or any of a thousand other entertainment and productivity tools.
Someone writing software for a medical x-ray machine or an airplane should be a licensed professional. That would mean personal responsibility for the code they write, but would also have to mean them having legal power to tell the company "no, we aren't doing that", while having very significant job security by law.
Sufficient_Side6320@reddit
Aren't we already doing that with college degrees? Banks, quant firms... they aren't hiring anyone without a degree.
Bakoro@reddit
Corporations arbitrarily deciding to increase or decrease their hiring standards is in no way a substitute for formal licensing, legal protections on titles, and legal regulations on how a company is allowed to fire or lay off licensed workers.
local-person-nc@reddit
Such an elitist take. 99% of software doesn't kill people if it breaks like a bridge does. 🤡 Software that can kill people is already taken very seriously. Software engineers have been jailed for their shitty code in these scenarios. We don't fucking need professional licensing for every CRUD frontend or backend out there.
conchobor@reddit
I mean, I hear you, but I'm not sure this analogy really works. People don't expect bridges to change, but in software they do.
L_enferCestLesAutres@reddit
Exactly, bridges would not provide the same reliability guarantees if you were switching the foundations every other month, without stopping traffic of course.
Napolean_BonerFarte@reddit
Fundamentally you can’t really fix a bridge once it’s built if it’s wrong, so the design and review is very methodical. Also we have 1,000s of years of history building bridges and other stuff such that we’ve narrowed in on what works.
It’s very easy to fix software once it’s deployed, and there’s usually zero risk if it doesn’t work perfect at first. So we all accept that solving 90% of the problem very fast, releasing, and getting the final 10% later is better than solving 100% very slowly and only releasing at that point.
hilberteffect@reddit
Top-heavy organizational structures lose. Every time. There's no realistic business where having more overpaid suits with inflated titles than boots on the ground produces more shareholder value. It just doesn't exist. Investors/shareholders won't tolerate such a structure for long. If you want to capitalize on the benefits of free market capitalism, then you also have to accept the limitations.
The sober realities of building and maintaining complex software systems aligned with customer needs and market demands will outlast any hype cycle. It doesn't matter how hard VCs' dicks get thinking about the ultra-exploitation of a minimal labor pool augmented with AI. "AI-first" executives who bet the farm on this strategy are already losing and have been unmasked as the clowns they are. AI just isn't that good, folks. It's a useful tool for sure, but this generation of models isn't going to achieve AGI and certainly won't achieve ASI. When all's said and done, the result will be that companies can reduce their headcount costs by a maximum of 10% - and that's assuming they execute a thoughtful and well-informed AI strategy. Which most won't. Mark my words.
I'll keep beating this drum: when we do achieve AGI, this conversation won't matter at all, because literally every position will be replaceable. Including those at VC firms. So, what? A handful of VCs will have AI manage their money and investments? Who's going to spend the money on the goods and services the companies they're funding produce?
That's actually the wrong question. The right question is: why would AGI-level models take orders from humans to begin with?
It doesn't matter how much smoke, mirrors, and bluster these shysters try to shove down our throats. The sober realities of macroeconomics don't give a fuck. It doesn't matter how much "value" an ultra-elite investor/owner class and a semi-elite exceptional worker class produce. When fewer people have disposable income, demand for goods and services goes down, and government expenditures go up. GDP growth stalls. Consumer debt balloons. Interest rates spike. Companies shutter or tighten their belts, meaning startups lose customers, meaning VC investment outcomes are throttled. They can choose to blissfully ignore these realities for a bit, but the correction is inevitable.
So while I have no doubt that myopic execs/investors/governments will try to move toward the world OP describes, they're going to get slapped down hard and fast. They already are. Some of them just haven't woken up from their stupor yet.
babuloseo@reddit
We can easily design AI-resistant languages or protocols and prevent them from being scraped or used by the leading AI providers.
godofavarice_@reddit
AI slop means more engineers to maintain the slop.
csanon212@reddit
Actually, believe it or not, more slop fixing the slop, run by 1 CEO in the Bay Area, and 5 Indians.
SmartassRemarks@reddit
Fixing the slop, or just rolling around in it?
csanon212@reddit
Full pig mode.
wakkawakkaaaa@reddit
They gotta ensure their own employability somehow
godofavarice_@reddit
Sure, I can see that.
possiblywithdynamite@reddit
is this sub satire? I cannot tell anymore
timmyturnahp21@reddit
Are you saying you agree with him or no?
DependentOnIt@reddit
It's basically career questions v2. So yes
79215185-1feb-44c6@reddit
At least we don't have high school students from third world countries larping as software engineers yet.
Foreign_Clue9403@reddit
Seeing through noise is a tall order, I think. Figuring out the invariants for me means understanding what parts are context and what parts aren’t, and that for me involves hitting a lot of books - history, sociology, economics, philosophy, etc. As much as I don’t know, the modern labor pool is full of people who didn’t look into these things beyond an executive summary, because they had to make space for things directly relevant to job training. Tech is especially rife with this.
Why does this matter? When someone with capital access starts an initiative or company, it is not simple to evaluate its merits and its costs. Even in the narrow space of corporate finance there's a lot of hand-waving required, because the suggested 5-year valuation methods will not work when things like federal interest rates and average labor costs change significantly on shorter timelines. "Fund now, build later? Build now, fund later?" Etc.
Combine that with "everyone can think big but not everyone did the reading", and you see lots of money flowing in directions that lack sense, like, say, biohacking. Investment (which includes investing time as a worker) is placing a bet on what people ultimately value, and that's a tough question. For instance, you'd probably do well investing in a violence-related field, as the world becomes more multipolar and resource distribution encourages more physical conflict.
Now, do you want to do that?
Do you want to participate in that kind of world? If you’re a knowledge worker, and if you understand that your value comes from your ability to problem solve, it is worth the exercise to reconfigure or reaffirm your core values and interests and develop a “full stack” around those. Let’s say you have a background in data management. Continue to develop some technical skill there, but also go vertical- higher and higher into systems management and inter-organizational data- lower and lower into unstructured data, encoding, physical storage media, and non-electrical forms of communication.
In other words, a good foundation means you’ll be able to keep up whether the next leap is highly advanced or highly regressive.
Yourdataisunclean@reddit
The effective engineering guy made a good point on a podcast: GitHub, which makes a coding copilot, is increasing its hiring of junior engineers, so they likely understand very well that their product can't truly replace junior engineers. I also think there's a good case to be made that once economic conditions improve and generative-AI hype dies down, we may see hiring increase and demand for experienced tech workers become competitive, because there is a relatively constrained supply from this period.
Isollife@reddit
Management at my company (a tier-2 big tech firm) started bemoaning the ratio of senior+ vs. mid-level (weighted heavily towards the former) just last week, after barely hiring for the past 3 years. It takes around that long for most mids to progress, so I'm wondering if we're reaching a turning point.
Stargazer__2893@reddit
I don't know of any science fiction where programming is no longer a thing. It may take the form of troubleshooting with your AI, but knowledge of how to get a computer to behave as you wish, and that being a challenging thing to do, is always going to be with us.
I have no worries about job security in this field.
thefragfest@reddit
Your second point is key. As long as this holds true, this will basically always be a good career. Especially because many people just simply aren’t cut out for it. So if you’re good at this work, you’ll be employable forever basically.
xxDailyGrindxx@reddit
I predict that nationalism won't help US workers with respect to outsourcing since gov is dominated by corporate greed.
SmartassRemarks@reddit
Offshoring is a problem of grave concern, but only to a small minority of the electorate. If offshoring is ever banned or severely restricted, it will only be because someone won a revolutionary election after a large-scale shock fomented a nationwide political movement, and the victors used their newfound power to solve many problems people had long cared about but had no power to solve.
FetaMight@reddit
You're dreaming if you think your government or the companies hiring care whether devs are sourced locally. Nationalism is a tool to control the people, not any kind of doctrine the ruling class actually lives by.
SmartassRemarks@reddit
The ruling class live by the Nuremberg defense: If I didn’t mass fire all these innocent people and offshore their jobs 10.5 hours away to a frenemy country where developers are no better than college interns and new grads and are working multiple remote jobs while putting in the bare minimum effort and possibly selling corporate secrets on the side, then the shareholders would fire me and hire someone else to do it. I’m just following orders.
oceanfloororchard@reddit
I think 1 is very true.
I think 2 has some truth to it in that empowering the dev more will allow more complex and customized software and make it more affordable to build.
For 3, I wonder if the increasing global accessibility of education and digital resources will drive more and more smart people from lower income countries into software.
For 4, I do think there are challenging parts of SWE, but I'm not sure we're really that special. Are good, solid devs really super rare? This has been the field for all the smart people to go into for a while now.
SmartassRemarks@reddit
I think most of what makes the broad group of developers special is their interest and willingness to sit at a screen all day and slave away solving abstract problems for someone else, as part of a business focused on maximizing shareholder value via a product that has no altruistic intent or even side effects. It’s a meaningless career for people motivated purely by the passion for solving complex puzzles, or by money. That is rare. Most people just want to be part of something and feel important and appreciated, and what gets them by day to day are interpersonal moments of gratitude.
HoratioWobble@reddit
Before COVID I was expecting exactly this market, but in about 5-10 years from now (10-15 from that point).
I've had a theory that the tech sector grew so aggressively between 2008-2022 because we'd just come out of the financial crash and investors were buying in low, combined with the rise of smartphones.
Think of all the industries, technologies and sectors that came out of Smart phones. New ways of consuming media, shopping, gaming, socialising - an absolute shit ton of new industries were born.
But that was always going to wane.
It also drove a lot more people to enter the industry for the first time, there were a growing number of juniors even before 2020 and companies have historically been reluctant to hire them.
So I assumed at some point the market would be saturated with engineers, salaries would stop going up and we'd start seeing a contraction.
COVID accelerated a lot of that: hundreds of thousands of people globally decided to retrain as engineers, boot camps absolutely soared, investment dried up, and because we were all home, we accelerated the innovation that could be milked from the proliferation of smartphones.
So now we're seeing what should have happened in 2030-2035.
I'm still not convinced LLMs have really had any impact on the job market globally; in fact, every stat I've seen has shown tech jobs increasing since ChatGPT was released.
But I also think this is the end of the "good times": unless we see a dramatic new technology that creates a paradigm shift in shopping, entertainment, media, socialising, etc., like we did with smartphones, I doubt the tech sector is going to explode again any time soon.
I suspect in 10 years there will be far too many engineers and it'll have normalized to an average role with average salaries, with only specialists seeing above that.
20-30 years, I think it'll be an entry level role, similar to how retail / fast food is considered now.
The golden age is over.
SmartassRemarks@reddit
Valuable comment/discussion topic.
Sure, the smart phone ushered in a decade-plus boom that led to soaring demand for software developers and therefore their job security and compensation. And sure, the wave of demand from this boom has lulled and the golden age is temporarily over.
But the smart phone boom wasn't the only boom to inspire lasting demand. I'll concede that the cloud revolution was more a facilitator of the underlying smart phone and web boom. But the web boom itself was a huge boom that preceded and led to the smart phone boom. And before that, it was the enterprise technology boom: mainframe, Windows, personal computing, relational databases, etc. That takes us through several decade-plus booms with lasting impact - impact that spawned new booms.
A lot of powerful non-technical people are saying that AI is the next boom to last over a decade and spawn subsequent booms. Maybe they’re right, maybe they’re wrong.
Personally, I think the next boom is very dependent on what happens in geopolitics. War is a huge paradigm shift itself. But war also accelerates R&D in new domains that can lead to new transformation.
Absent war, there’s still politics. It feels like the political zeitgeist is indicating that things have to change. Whether war, depression, crash, revolution, grassroots movement, a lot of things may change. We have big gaps in technology enabled or encouraged by political malpractice or injustice. We have serious issues with data privacy, IP law, digital free speech, data security, fraud and identity theft. We also have social issues fundamental to modern business that if changed, would alter the nature and progression of modern business, spawning downstream effects: golden parachutes, stock based compensation for executives, underscrutinized M&A, regulatory capture, the ease of offshoring nationally critical skill sets, the ease of mass layoffs, the high litigation risk of firing individual workers, etc etc. Any social adjustment to any of these could restructure the paradigms within which developers are hired and what problems they’re hired to solve, for whom, with whom, and at what scale.
Packeselt@reddit
I hear goose farming is gonna be big.
Sheldor5@reddit
gardening for me
schmidtssss@reddit
I’m looking to buy 20 odd acres and start a little farm
death_in_the_ocean@reddit
wood ash and urine
william_fontaine@reddit
I could've bought an 80 acre farm that's been in the family for 150 years for $350 about 15 years ago. I didn't because I couldn't really afford it, but I still should've done it.
schmidtssss@reddit
I’m also mad at you.
william_fontaine@reddit
Yeah everything's about $10k per acre near me now too. I never did get a place and continue to rent, which has really backfired.
That farm wouldn't have worked out though in the few years where I couldn't drive, unless I'd managed to snag a remote job back in the mid 2010s which I doubt.
But even given that, I still regret it all the time.
Greedy-Cook9758@reddit
While offshoring / AI could replace on-site code monkeys who do not understand the business they are in and struggle with social skills, that is, as you put it, "the lower rungs".
A company that wants to be able to move fast, and doesn't / can't afford to feed engineers very precise requirements, will continue valuing sociable engineers who are engaged in the business.
The tools these engineers use to deliver their value will change; they always have. But the high-level problem stays the same: how do we automate / solve X using computers?
Timely_Cockroach_668@reddit
There are non technical folk in my company creating frontends with LLMs with vague promises of a finished application. My guess is that there will be a lot of backend work and making frontends actually maintainable by humans.
belatuk@reddit
I foresee there will soon be huge demand for technical people to troubleshoot and fix LLM-generated code. We should perhaps thank the AI companies for creating a new category of problems out of thin air to increase the demand for skilled developers.
ivan0x32@reddit
AI is a bubble obviously, so it's going to pop sooner or later. But it was also never the reason for all the bullshit in the field. We're all getting shafted for two reasons:
Obviously X-shitter is in deep fucking trouble engineering-wise; the only reason it's still alive is that it was obviously built by extremely smart people and they did a really good job there. It had a ginormous safety margin built into it over the years, precisely because they employed some of the best engineers in the world. Unfortunately shit-for-brains executives have no fucking idea that's the case, so all they see is an ooga-booga conclusion of "engineer fire, company survive".
Recession fears are another matter completely. I was hoping this shit would end by now, but Americans decided that eggs are too expensive, so here we are. Honestly can't blame them though; our local electorate is actively electing fascist assclowns too. All of this really revitalized my faith in humanity and the intelligence of the common individual.
Intelligent_Part101@reddit
Twitter was massively overstaffed for the product. Comparable websites had a fraction of Twitter's headcount. Elon was justified in laying off a lot of staff. Tech companies in general went on a hiring spree years ago that was not justified by profitability, and now that zero interest rate policy is gone, costs and profitability matter again. This is actually the main reason you see the tech layoff right now. Not AI. That's a cover story.
Michaeli_Starky@reddit
Begin working on your skillset with gen AI today so you're still needed tomorrow.
Outside-Storage-1523@reddit
I think it highly depends on what the subfield is. I work as a DE, and I see BI and analytics DE as two dead ends. Pretty much every database product built throughout the last decade exists to reduce the engineering part of these two subfields and sell stakeholders the illusion that they can do it on their own, even more so recently with the rise of AI.
Essentially, if you work in software engineering and you are DIRECTLY facing stakeholders, your career is in danger, because stakeholders always want things done ASAP, and your engineering attitude is always standing in the way. In addition, since you are close to the business, you are a bit far from the tech, which means you are under attack from AI tools these days, and I can't imagine what happens when AI gets better and cheaper every quarter.
Groove-Theory@reddit
Yea but.... the moment scale, complexity, or reliability matter (again), you need engineers again.
All this self-serve stuff solves the first 80%, but the last 20%....the stuff that matters for actual maintainability, still needs engineering rigor. The illusion that stakeholders can do everything themselves usually lasts until something breaks or doesn’t scale.
I mean they'll try. But it won't go the way they want.
Going from mainframes to PCs didn't fuck us, even though people said programmers would be obsolete once end-users could build macros and apps themselves. In reality, demand for programmers skyrocketed. Same shit with 4GLs, once people realized real systems required real engineering.
Even AWS, Salesforce, etc. reduced a ton of undifferentiated heavy lifting, but the total pie of software engineering jobs grew massively because businesses could build more and faster.
You're right that if all you know how to do is the 80% SQL dashboards and stakeholder hand-holding, you're at risk. But software engineering as a whole won't die. It'll do what it always does: keep moving up the stack. We don't write assembly for business apps anymore, but we still need engineers to stitch together frameworks, cloud infra, AI systems, compliance, integrations, etc.
What survives is the engineering mindset. The ability to think in systems, anticipate failure, design for scale, and make tradeoffs.... you know, the shit that business stakeholders have no fucking idea how to do (and history shows us they can't fake it for long). And those patterns of the engineering mindset haven't REALLY changed over the decades. The tools have (just as we're expected to use Git and cloud today non-optionally, higher-order architecture skills and tools will be the baseline), but not the mindset.
... so idk. I'm more worried about the business 4heads giving us short-term grief as opposed to worrying about our field long term. Try as they might, they still need the nerds on the group project saving their asses.
Healthy-Educator-267@reddit
Only a few products scale though.
SmartassRemarks@reddit
I love this comment for both its content and tone. Bravo!
PM_40@reddit
Is AI actually getting better? It might get cheaper, but I don't see it getting much better. Sam Altman has conceded that chatbots are done for.
79215185-1feb-44c6@reddit
Yes, just compare Qwen 2.5 (2024) to Qwen 3 (2025). The models are getting more and more reliable. You can run both of these locally - throw some prompts at them and see which you prefer at the same model size.
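If you want that comparison to be more than eyeballing a single chat, a small harness like this works with locally pulled models (the tags are examples; substitute whatever `ollama list` shows on your machine):

```python
# Side-by-side prompt comparison across two local models.
# The tags are examples; substitute whatever `ollama list` shows.
import ollama

prompts = [
    "Write a function that merges two sorted lists.",
    "Explain what a race condition is in two sentences.",
]
for model in ("qwen2.5-coder:7b", "qwen3:8b"):
    print(f"=== {model} ===")
    for p in prompts:
        r = ollama.chat(model=model, messages=[{"role": "user", "content": p}])
        print(f"--- {p}\n{r['message']['content']}\n")
```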
Quiet-Elephant-9708@reddit
But are they improving at a slower rate? To me it seems like it, at least in 2025 compared to the two years before. The tooling is definitely better (Claude Code, MCP, etc.), so it does seem like it's getting much better even when the underlying model may not be.
zzeenn@reddit
They’re getting better at CRUD apps where there’s a lot of training data. They still struggle with novel coding tasks. But our field is a lot more CRUD-like than we’d like to admit.
79215185-1feb-44c6@reddit
I've found good success throwing example implementations and internal APIs at GitLab Duo (Claude-based) and having it write code based on those. I don't even need to write long or detailed prompts. Today, as an experiment, I had it implement a bunch of API calls that I've had stubbed for months, and it only made one or two mistakes, because I'm pretty sure you can't easily get a process' working directory from its PID in Windows.
adilp@reddit
There was a time, many decades ago, when people legitimately thought we should shut down the patent office because everything had already been invented - there was not much improvement left to be done.
PM_40@reddit
Patents are happening in multiple fields so I don't see how this could be a widespread phenomenon.
If you are trying to say that AI can still improve, I don't disagree. But when the people who are supposed to drum up AI are conceding, we shouldn't take that lightly.
In other words: if a barber says you need a haircut, don't trust it; but if a barber says you don't need a haircut, it would be unwise to still ask for one.
Saint_Nitouche@reddit
Does it really matter how easy the tools become? Stakeholders aren't interested in using the tools. They pay us to learn the tools and use them.
Various_Cabinet_5071@reddit
It does in the sense that they can hire for less because they say it’s easy. And if the tools are that intuitive, not hire at all
rlp@reddit
I'm not sure that applies to all stakeholder-facing roles. I think there's a good future at the top as the senior-most developer who directly interacts with stakeholders on small-medium projects. I do this a lot in my current position. The stakeholders come to my PM and me, and normally I would have to farm out work to the rest of the team. With improved productivity from AI, I can do more on my own, faster. My clients have no desire to build things themselves (nor could they), unless the tools become massively better.
Outside-Storage-1523@reddit
Seniors are safer, I agree.
HaMMeReD@reddit
I'm not sure about that. Like I get what you are saying, but I think just the moat between stakeholders and engineering will get wider.
Stakeholders' day-to-day will be working with AI, but not the coding agent: the BI/analytics agent, which is specialized in understanding the graph of the data and generating dashboards, reports, and other analysis.
Programmers/engineers will be using coding agents, and building things like semantic data stores that translate well for the data agents.
But I think there will be better scoping and better isolation, not less. It might be possible to have less isolation, but imo that'll lead to a breakdown. Organization is good, if not even more important, in an AI world if you plan to sustain processes.
trojans10@reddit
Can you expand a bit more? I'm in this space as well.
ben010783@reddit
Here’s a guess: The dev market starts seeing more of the income-inequality that is going on in society. AI tools make it easier to implement simple features and proof of concepts. When things do break, the top devs get paid huge amounts because fewer and fewer people know how to dig in the weeds and do actual problem-solving.
Leeteh@reddit
I've played around with this idea and my best guess is software development will become more like software governance. There'll still be plenty of coding to do, but a fair amount more managing documentation (laws), workflows (execution), and evals (case law) when it comes to code and software stacks.
More details here: https://scotterickson.info/blog/2025-06-14-Governing-Products
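To make the "evals as case law" part concrete: the skeleton is usually just a versioned set of cases, a judge per case, and a pass-rate gate. Everything below is an illustrative stand-in, assuming `generate` is your AI pipeline:

```python
# Skeleton of an eval suite as "case law": each case is a precedent the
# system must keep satisfying. `generate` stands in for your AI pipeline.
from dataclasses import dataclass
from typing import Callable

@dataclass
class EvalCase:
    prompt: str
    judge: Callable[[str], bool]  # did the output satisfy this precedent?

def run_evals(generate: Callable[[str], str], cases: list[EvalCase]) -> float:
    passed = sum(case.judge(generate(case.prompt)) for case in cases)
    return passed / len(cases)

cases = [
    EvalCase("Refund policy for damaged goods?", lambda out: "refund" in out.lower()),
    EvalCase("Summarize ticket #123", lambda out: len(out) < 500),
]

# Gate a deploy on the pass rate, the way CI gates on tests.
if run_evals(lambda p: "We offer a full refund...", cases) < 0.95:
    raise SystemExit("eval regression: do not ship")
```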
armahillo@reddit
Re #1, we'll know a bit more in a year or two about what's going to happen with that.
I agree management wants this. I don't think I agree they're going to get it. The iron triangle of fast, cheap, and good comes to mind here.
PreparationAdvanced9@reddit
If I were an experienced dev, I would be on a path to retire in 5-10 years if possible, or at least save and invest with that goal in mind. We all make enough right now to do this in a decade. If the job is not fully automated in a decade, we are in great shape. If the job is fully automated, at least you have a ton of buffer to cushion yourself from suffering until you pivot.
PM_40@reddit
Good point. Timing of the pivot is also important. Sometimes one might be too old to pivot.
timmyturnahp21@reddit
Exactly. If you’re worried about the field disappearing and you’re 35-40 years old, you pivot to trades NOW while your body can still kinda handle it. Not in 10 years when you’re 45-50
Successful_Camel_136@reddit
That doesn't make financial sense. Just work as long as you can as a SWE, hustle to get multiple gigs at once, freelance/contract, etc., and save as much as you can so you don't need to do manual labor. In 10 years, if all the software jobs are gone, just get some low-skill job for health insurance.
PM_40@reddit
I think one doesn't necessarily need to pivot to the trades. Law is one field, a PhD is another; people who don't have a CS or stats degree can get one and upskill. Quite a significant percentage of people in this industry are non-technical or self-taught, or become non-technical by moving to management. I think this is the right time to upskill and/or pivot.
timmyturnahp21@reddit
If the field becomes fully/almost fully automated, what good do you really think getting degrees is going to do? There are already people with CS degrees that can’t land jobs.
Also, if the majority of CS is automated, so will most other white-collar jobs be.
PM_40@reddit
To be honest, I have not seen a single job get automated due to AI at my company. A CS degree with experience might be a better bet.
timmyturnahp21@reddit
Well yeah, we’re not talking about right now. We’re talking about 10 years from now in the event most CS jobs get automated.
PM_40@reddit
Do you think humans can predict 3 years out accurately, let alone 10? Whether all white-collar jobs can be automated, no one knows. It's like throwing a dart in the dark.
timmyturnahp21@reddit
Do you not understand what a hypothetical is?
PM_40@reddit
If the goal is to predict what might happen in 20 years, in my opinion it's pointless, though I understand others might have different opinions and see value in preparing for a 10-to-20-year timeline.
timmyturnahp21@reddit
Buddy I’m speaking of the future. It’s inherently uncertain.
You’re coming at someone talking about the future and arguing with them because you don’t like their interpretation of the future.
If talking about it is pointless, why are you even responding to me?
PM_40@reddit
Most of Reddit conversation are pointless. So I think both of us can enjoy our weekends.
timmyturnahp21@reddit
I’ll tip my glass to that. Enjoy your weekend
HaMMeReD@reddit
The job isn't going to be "fully automated" in 10.
It'll just look very different than it does today. If programmers were "fully automated", it'd be singularity-level; it's basically saying software (like AI) can produce better software on each iteration. Maybe that'll happen in 10 years, although worrying about your retirement at that point is kind of moot: economics will go out the window entirely, scarcity will become a thing of the past, and we'll either progress into utopia or be wiped out, but the status quo won't remain.
The field has endless churn, new technologies and trends to adapt to constantly. It's not going to slow down, it'll speed up. Companies will perpetually be playing catch up with the latest and greatest, until that happens.
DigThatData@reddit
coding continues to be an OP skill, and we as a community make it more OP every day.
Willbo@reddit
Obviously I don't have a crystal ball, but the game lore of Cyberpunk 2077 hits on a few fascinating points.
Nation-states will continue to posture up their hierarchy and waste money on war, rigidity, legislation, dealing with natural disasters, pandemics and health, stabilizing disruptions of the supply chain, and other state of affairs to grapple with a rapidly changing population.
Mega-corps and big tech will continue to accrue massive amounts of capital and data on the population that will make them almost omnipotent, with totalitarian control over logic and language. These corps will poach the best of the best engineers that already have a foot in the industry at one of the other mega-corps. The engineers won't just have to understand one or two languages, they will have to string many of them together to create coherency, similar to stringing many words to create a sentence. Technology will continue to move quicker than governments and subvert legislation, bending only to the will of banks.
The gap between junior and experienced engineers will become even wider, with a great divide between mega-corp engineers and gig-economy engineers. Mega-corp engineers will be speaking their own string of proprietary languages for mega-corps, while gig-economy engineers will be tying various odds and ends together for their own community leaders. Everyone else who is not technical will be living like cavemen sitting around a campfire, seeing the shadows on the walls but not actually knowing what causes them.
"The real problem of humanity is the following: We have Paleolithic emotions, medieval institutions and godlike technology." - Edward O. Wilson
olionajudah@reddit
Relevant
https://youtu.be/gXbTh70m_q0
originalchronoguy@reddit
I seriously believe it is going to be very good for certain groups of developers.
I have a very positive outlook with bigger projects and more challenging work.
I think we are at the cusp of a new era.
quentech@reddit
Yep, AI - whether it succeeds or fails - will make developers who know how to build shit and get it shipped even more valuable.
Bakoro@reddit
The future is probably going to look like the current model pushed to extremes because of AI tools.
The coding AI tools are only going to get better and more reliable.
I would not be surprised if language specific and domain specific fine-tunes become more of a thing.
Businesses running local models and doing nightly or weekly fine tunes and personalized reinforcement learning based on their code base and their developer interactions is probably going to be a thing.
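Purely as a sketch of what that nightly job could look like with today's open tooling (the model name, file paths, and hyperparameters here are all made up), it might be a small LoRA fine-tune over yesterday's merged code:

```python
# Hypothetical nightly job: fine-tune a LoRA adapter for a locally hosted
# code model on yesterday's merged changes. All names are illustrative.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "our-local-code-model"  # hypothetical on-prem checkpoint

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# A small LoRA adapter keeps the nightly run cheap; base weights stay frozen.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# Yesterday's merged diffs, exported to plain text by some upstream job.
dataset = load_dataset("text", data_files={"train": "exports/merged_code.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="adapters/nightly", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("adapters/nightly")  # serve this adapter the next day
```

The interesting part isn't the training loop, it's the cadence: an adapter is cheap enough to rebuild every night as the codebase moves.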
There will be a lot of jack-of-all-trades generalist developers who look over a whole codebase and are more concerned with systems-level thinking than with specific language syntax.
There will be extreme specialists who focus on specific business logic, and who will need a lot of domain knowledge and computer science to make sure the specification-to-implementation-to-testing pipeline is actually doing what it's supposed to be doing.
The biggest thing is that I very strongly feel formal verification is going to become a standard part of more AI workflows, and that could be a job all by itself.
Past and present, formal verification is generally something that only happens for the most extremely critical infrastructure, if it happens at all.
Most companies and most developers are not doing any formal verification at all. It's time consuming, expensive, and small changes can mean having to do it all over again. It's just not feasible for most companies right now.
Formal verification becomes feasible if you have an AI system that can do the majority of the process and pipe it through the deterministic tools that exist.
I have talked about this before, and some folks flat out don't believe it will ever be a thing, but I think it's the obvious path forward if AI systems are going to be writing a ton of code. Having formal specifications and verification is literally how you know that the system is doing what it's supposed to do.
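For what it's worth, here is a minimal sketch of what "pipe it through the deterministic tools that exist" could look like, using the Z3 SMT solver from Python (pip install z3-solver); the clamp function and its spec are made-up examples, not anyone's real pipeline:

```python
# Verify a candidate implementation against a formal spec with Z3.
from z3 import And, If, Implies, Int, Not, Solver, sat

x, lo, hi = Int("x"), Int("lo"), Int("hi")

# Candidate implementation (e.g. emitted by a code model), encoded as a Z3 term:
# clamp(x, lo, hi) = lo if x < lo else (hi if x > hi else x)
result = If(x < lo, lo, If(x > hi, hi, x))

# Formal spec: whenever lo <= hi, the result must land inside [lo, hi].
spec = Implies(lo <= hi, And(lo <= result, result <= hi))

# Ask the solver for a counterexample, i.e. any inputs that violate the spec.
s = Solver()
s.add(Not(spec))
if s.check() == sat:
    print("Spec violated. Counterexample:", s.model())
else:
    print("Verified: the implementation satisfies the spec for all integers.")
```

Unlike example-based tests, the check is exhaustive over the whole input space, which is exactly the kind of backstop you'd want behind machine-generated code.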
So, that's really going to be the big thing, a much bigger emphasis on systems level thinking, strong formal computer science skills, and a lot less "I know what I'm doing, I've been coding for x years" cowboy shit.
drnullpointer@reddit
> What's the actual long-term future of the field?
* Entry level jobs are probably going to be unstable and hard to predict. Whenever the total number of workers in any area is cut, it is mostly the people with the least experience who are affected. When people with substantial experience are cut, they tend to find another job relatively quickly.
* Experienced developers will likely be safe as long as they keep updating their skills.
* Automation and AI *do not* make labour cheaper. They actually make labour more expensive, because handling complicated automation or AI requires *more* experience, not less. What automation did for companies is allow them to produce more with fewer people (i.e. it made labour more efficient).
* As more automation and AI are introduced, there are going to be more and more people who are not skilled or intelligent enough to be productive. More and more people will compete for fewer and fewer jobs that do not require skill. I predict income disparity will only grow, and it is hard to predict the results, as they will depend heavily on politics. Lots of people who are currently marginally productive as developers will in future have to switch to less demanding (and worse paid) jobs.
* There is only so much a person can learn. As development comes to require more and more skill due to a more complex environment, there will be further specialization among developers to offset that demand. It might be worthwhile to watch for newly forming specializations and be among the first people in them.
Groove-Theory@reddit
And this is why our field will be safe.
zzeenn@reddit
Generative AI has made factual knowledge incredibly cheap. What it can't do well yet (or possibly ever) is true reasoning.
Hear me out: all the “reasoning” models are really just chain-of-thought text prediction. You still need a human to verify if the output is any good. There’s still some limited upside with multi-agent systems, but I personally don’t think we’ll crack AGI without another major breakthrough or paradigm shift.
freekayZekey@reddit
something in between? i think the years 2015-2022 were an aberration. profits went to shit, and folks completely forgot business fundamentals.
tech will probably improve, but it’ll need to actually solve problems
kanzenryu@reddit
You're a smart person in 1950. Will we have flying cars soon? Electricity too cheap to meter? Six big computers or one on every desk? A future free from war and poverty? Etc.
StepIntoTheCylinder@reddit
If you count all the aspiring devs who would take even a low paying job, then there's a far greater supply than demand. I think you underestimate how many people are trying to be devs. Nobody's crying out for more, they're crying out because when they post a job, they get an overwhelming number of applicants.
ings0c@reddit
Yet few can actually do it.
There’ll always be demand for skilled developers. The nature of exactly what we do will change, but the general concept of “make computer do stuff” isn’t going anywhere, and if you’re good at it you’ll find work.
TheMightyTywin@reddit
Fully automated AI agent workflows building software. A few insanely stressed software engineers keeping it going. Millions unemployed.
All the software companies that manage to successfully implement fully autonomous workflows will make money; companies that fail to adapt will be outcompeted.
_lazyLambda@reddit
There will be an exodus of low quality engineers and there will always be a business case for hiring an actual quality dev. People saying that devs will be eliminated entirely seem to not understand how lazy business people are.
AI will always be a prototyping tool, because anything built with AI can be improved by not using AI. No matter how good it gets, the answer to a business problem isn't a matter of probabilities; it's a direct, specific answer that can be optimized for, over and above some generic AI solution.
And I think in the long term, the trends in development will not be dictated by the masses of unknowledgeable JavaScript web-slop devs but by the people who actually know what they are talking about.
79215185-1feb-44c6@reddit
Sounds in line with what I think. Not sure I can add much more.
TopSwagCode@reddit
"The ownership/ management class are dead-set on making labor as cheap as possible" -> This is nothing new. It's been like this for ages. There has been plenty of these bumps along the way. Visual Designers, Low / No code solutions, Software "wizards", AI, Offshore, Nearshore, Internships. These are the ones just top of my head :D
They all work fine fore smaller scope assignments, but fail long run. The problem has never really been the code, but rather the domain and politics. You need people to understand the problems and be part of the team long term.
mauriciocap@reddit
Only two recommendations:
* Pareto's "Circulation of the elites"
* Ford(ism)'s connection with the naz1 regime
I recommend thinking about people with the IQ and skill to wield power instead of "the field"
Deto@reddit
I don't know if this is necessarily true. I mean, there is a hard ceiling of 'everyone in the country spends all their discretionary cash on products that rely on software'. You can also apply reasoning like this to different sectors - e.g., for purely digital goods, how much are people willing to spend? And then how much money could there actually be in e-commerce of physical goods? At this point, people are already buying nearly a majority of goods online, so the growth potential of that segment is limited.
No-Economics-8239@reddit
Nothing truly new has changed. This cycle has been going on for the three decades I have been doing this, and looking back to the beginning, it was more of the same.
Obviously, a lot has changed in 50 years. Practices and technology have continued to improve, but the inherent relationship between the money and leadership and the creatives is no different now than it has ever been.
The money just wants more money and is looking for safe investments that maximize profits and minimize costs. For most of that time, we've been seen as a cost center to minimize. Then came the march and metric of billable hours. And throughout all of it, we've argued about how best to measure productivity.
The industry isn't going anywhere. People will still need to understand technology and be the experts that make the magic happen. Until they fix the model collapse problem, this latest trend is no different than blockchain or WYSIWYG or the host of other revolutionary milestones that were going to change everything.
exodusTay@reddit
My prediction is that we will see an increase in hiring in the future, because:
1. Current LLMs are nowhere near replacing devs.
2. If LLMs find a way to be cheap while getting more accurate, they might increase the rate at which devs learn fields outside their own areas.
So companies will try to flood the market with more devs to reduce pay, as the skill gap between seniors and juniors becomes smaller.
Windyvale@reddit
Tools change, what we need to do with them does not.
Independent-Fun815@reddit
I think devs are going to be measured by their convictions. The problem these days is ppl think just being smart is enough. I'd like to see devs get punished and rewarded by the strength of their convictions on markets.
ExtraSpontaneousG@reddit
Nobody can see 10, 20, 30 years from now. 10, 20, and 30 years ago was a completely different world regarding development. Not going to spin cycles hypothesizing, just going to do what is fulfilling and relevant moment to moment.
No_Indication_1238@reddit
To be a developer is to create innovation, to modernise, to automate. It has always been that way, and it will stay that way. The tools may change, but the gist will remain.