Hot take: The highest-paid skill in tech and design right now is rejecting
Posted by YogurtIll4336@reddit | ExperiencedDevs | 7 comments
We’ve spent years optimizing for creation: faster builds, better tools, more output. Now you can generate code, designs, copy… almost instantly. But that didn’t remove the hard part; it just moved it.
The real challenge now is deciding what not to ship.
What looks right but breaks at scale, what solves the wrong problem, what introduces hidden complexity.
Most outputs are “good enough”; very few are actually *right.* So the leverage shifts from creating more → choosing better. And that kind of judgment is slower, harder to teach, and way more valuable.
Wdyt?
RunnyPlease@reddit
Are you suggesting that before AI tooling the devs made more money than executives? Because I can assure you the executives and department leads have always made more money than the devs. And often by an order of magnitude.
The people who choose what to invest time, labor, and materials in have always made more money than the ones who physically did the work. AI tooling hasn’t changed that.
… And uptime, security, changing legal requirements, stability, aligning technology selection with long term business objectives, maintainability, disaster recovery, scalability, localization, failover, server cost, user experience…
We have been optimizing for a lot more than just feature output for many years now.
From a business perspective the challenge has always been about balancing ROI (return on investment) while mitigating risk. That has always been, and continues to be, the challenge for go/no-go decisions for feature development. How much will it cost, what can I expect in return, and how do I minimize risk?
That was there before AI tooling.
I worked in software consulting for a decade, and I can assure you bad code was a significant problem long before AI tooling started writing it. Bad code cripples projects and strangles team production regardless of who or what is writing it.
I’ve seen code so scrambled and spaghettified that C level executives told me they considered it a threat to the business. No one could figure out why, but they had to have someone manually restart the servers every few days to keep the entire company’s inventory system from crashing, and it was getting worse. It used to be once a month. Then every week. Now it was every few days. They hired several consulting firms to try to fix it, but the code was so bad they all failed and now that code was literally threatening the existence of the company.
How did they get there? They spent years selecting the lowest bidder, pushing for shortcuts, accruing tech debt, and pushing unstable junk code to production, all in the name of “good enough.” AI didn’t create the “good enough” problem. It just added another way for decision makers to get to “good enough” with fewer steps.
You suggest that “choosing better” is more valuable. If you’re at an org that only values “good enough” then they will not value “choosing better” until it represents an existential threat to the business, or someone can make a solid case for ROI. Until then you will always have a corporate culture favoring “good enough.”
If you want to be valued you need to be able to speak to them about the things they actually value.
We are living in an age of technological development. While it has drastically changed things for boots on the ground, it hasn’t actually changed business all that much. There is a very good reason old heads like me point to the dot-com bubble when we see how the market is reacting to AI. It’s very similar. The pattern is there.
The internet promised to revolutionize business, and it did. But it didn’t change everything. The people who bet on it changing everything lost everything. The people who bet on the internet doing what it actually does well generally came out ahead.
AI promises to revolutionize business, and it will. But it won’t change everything. Right now in the market there are a lot of groups betting heavily that AI will change everything. This should sound familiar. As many business journalists have pointed out, a lot of these investments have been made by groups who don’t even have a theoretical way to get a return on their investment. This should also sound familiar.
The truth is the real “highest-paid skill in tech” right now is the ability to convince investors to throw billions of dollars at your company without any justification based on ROI. It’s one of the clearest signs we are in a bubble. If you want big money, that’s where you’ll find it, and that’s the skill that will get it for you. Get in, create hype, get as much as you can, and don’t be there when it pops.
For the rest of us, AI tools are proving to be very good at some things, not so good at others. We should be focusing on learning that distinction so we’re ready when the bubble pops… just like we did with the internet.
roleplay_oedipus_rex@reddit
Nice garbage AI post.
The highest paid skill in tech will always be sales.
If you can't sell yourself you'll be one of those people on r/recruitinghell and r/cscareerquestions crying about how tough the market is.
johnpeters42@reddit
Always be closing. ALWAYS be CLOSING!
Morazma@reddit
ChatGPT slop
scFleetFinder@reddit
"that's why I made RejectedAi, it's what they use at all of the places you've been applying to."
pydry@reddit
I miss the days when the industry looked down on hot takes rather than regurgitating them endlessly.
ze_pequeno@reddit
Yeah that's called critical thinking and it's always been valuable :)