Why the "Low-Level" stigma?
Posted by Antique_Mechanic133@reddit | ExperiencedDevs | View on Reddit | 263 comments
I’ve been seeing this a lot lately, and honestly, it’s starting to worry me. There’s this weird growing disdain in CS education and among new grads for anything that touches the metal: Assembly, C, even C++...
Whenever these topics come up, they’re usually dismissed as obsolete or unnecessarily hard. I’ve literally had new devs look at me like I’m crazy for even mentioning C, treating it like some radioactive relic that has nothing to offer a modern environment.
I spent a good chunk of my career in firmware, and I can tell you: nothing changed my perspective on software more than actually understanding what’s happening under the hood.
The problem isn't that everyone needs to be writing Assembly every day. The problem is that without those fundamentals, all these modern high-level abstractions just become magic. It’s like trying to fly a plane without having a clue how aerodynamics work.
I feel like we’re churning out devs who are great at using tools but have no idea how the engine works. Am I just getting old, or are we failing the next generation by letting them skip the foundation?
EternalBefuddlement@reddit
The push from companies is all about productivity, especially in SWE.
You don't necessarily need to know how Kafka works under the covers, or how a compiler turns your Java code into bytecode, to engineer a system.
You know that A does X, it can be combined with B to achieve Y.
Fwiw I enjoy lower level programming, but I'm not paid to do it.
Winter_Present_4185@reddit
By this logic, more of the "engineering" is at these lower levels and you're just stitching things together at the higher levels.
EternalBefuddlement@reddit
Well no, what I stated directly says that you're still engineering something. The level and scope you work at simply vary.
Lower level leads to a narrower scope.
Higher level leads to a wider scope.
recycled_ideas@reddit
Possibly more than any other field software development sits on the work of those who came before.
We can solve problems that people twenty years ago couldn't have even imagined solving because they spent all of their time fucking around allocating memory.
There is engineering at both the low and the high level, but for the most part, the work at the low level has been solved and we don't need to keep doing it.
Winter_Present_4185@reddit
There are way more products that aren't webdev than ones that are. Think of traffic lights, etc
recycled_ideas@reddit
You do realize that there's code that's not low level that's also not webdev right?
Also, yes, there are millions of traffic lights, but they're all likely running the same controller and it probably hasn't been substantially changed in the last twenty years.
Winter_Present_4185@reddit
An engineering principle is that you always try to make products cheaper and cheaper. The controller changes every year.
recycled_ideas@reddit
Do you have evidence of this, or is this just a bullshit claim, because re-developing the software every year doesn't sound cheaper to me.
Winter_Present_4185@reddit
Engineering is dealing with the physical world. Most software development isn't engineering.
In manufacturing, the cost to produce the good is always a race to the bottom. Each year cheaper and cheaper micros come out which cost a cent or two less. Economies of scale take over when you're dealing with something as widely mass distributed as traffic lights.
recycled_ideas@reddit
So you are just making shit up, ok then.
Designing and testing a new device is expensive, supporting a new product line every year is expensive, fucking no one is creating a new one every year.
Winter_Present_4185@reddit
I have a PhD in EE and spend a lot of my time in manufacturing.
recycled_ideas@reddit
Do you work on traffic lights? Do you know anyone who does? Do you have any proof of what you're saying?
Do you actually have a single product that is being re-developed on a yearly basis that you have experience with?
Because you sound pretty much like every other elitist jagoff who thinks that everything that's not embedded is WebDev and beneath them and sits around with the two other embedded guys they know imagining that there are millions of them out there.
Yes, there's a shit load of embedded microcontrollers, but there's a tiny number of embedded devs.
Winter_Present_4185@reddit
Sure I do. But it kinda should be obvious that manufacturing is always a race to the bottom.
As far as continuing this conversation, you've tried to berate me while I've tried to carry a civil conversation, so there isn't a need to continue.
recycled_ideas@reddit
You have repeated statements that aren't logical without proof.
Winter_Present_4185@reddit
And what proof have you offered?
recycled_ideas@reddit
Your argument is that companies are doing extremely expensive redesign on an annual basis, which is frankly insane.
Winter_Present_4185@reddit
And your misconception is that they are "extremely expensive" redesigns. I mentioned I have a PhD in EE to show I have some experience with the topic at hand.
Respinning a board used to be expensive in the 80s. Wave soldering has cut that down tremendously.
In addition, most micros have an ISO for the same pad layout, even going down to the same XO input with the same caps.
Perhaps you are thinking about the medical device world or aerospace where there is a bunch of red tape. Those designs rarely change.
There is a reason why China keeps coming out with cheaper and cheaper micros. And it isn't because they're trying to get into the Guinness Book of World Records.
recycled_ideas@reddit
If the software needs to be re-done you're talking a million easily, if it doesn't then the software doesn't need to be redone and it's moot.
Winter_Present_4185@reddit
Lmao, your Dunning-Kruger is showing. Do you know what a BSP is? If you don't, here is a link:
https://en.wikipedia.org/wiki/Board_support_package
recycled_ideas@reddit
My argument was that no one is actively building software for traffic lights because the software is done.
You counter-argued that they're changing the controllers all the time.
Now you're saying that that doesn't mean re-development, which means that you've contributed absolutely nothing to this conversation.
Winter_Present_4185@reddit
No.. that isn't what I'm saying.
Look, at the end of the day, there will always be new chips/drivers/bootloaders/kernels that need to be created to support physically manufactured products. Unless you are saying developments in material science, chemistry, processor efficiency, etc. will somehow slow down (but we've seen them exponentially increase as of late).
Anything that is not tethered to a physical system for deployment (can run on a generalized computer), has the unfortunate circumstance of being virtualized, and with the LLM revolution, I'd argue is also a race to the bottom, similar to manufacturing costs.
recycled_ideas@reddit
Material science changes are irrelevant here.
Even if they had to remake the software, which they don't, the problems of how to control traffic lights are solved, they've been solved for decades. Centralised management of the overall network isn't solved, but that's not done low level.
This is the whole point you don't seem to grasp.
I doubt there's a single FTE working on embedded software for traffic lights on the entire planet. It's a solved problem. This is true for a whole lot of these systems on a chip. And if you're running an embedded windows on it, it's not assembly anymore anyway.
There just aren't many low level problems to solve because we've solved them, there aren't zero, but the number of people solving brand new problems in an embedded context is tiny.
We do still have systems programming, but systems programming isn't actually particularly low level. People are looking down on C and C++ because they're just horrible. C++ memory management is just too much of a foot gun.
Winter_Present_4185@reddit
Why? I'm no longer talking about traffic lights, but in general. Material science changes drive manufacturing, which gives rise to new hardware capabilities, which require drivers to use those capabilities.
Are they? Ever wonder why when an ambulance or fire truck approaches a traffic light, all the lights turn red? There's a sensor on the light that lets emergency vehicles have priority. This wasn't around a decade ago.
recycled_ideas@reddit
And what does the code for that look like? Sensor triggered, trigger light transition. It's not like someone went and built a spectrum analyser in software to do this. Hell even if they did, that's also a solved problem.
We're talking about software here. Not the EE side of it. The EE side of it is irrelevant to this conversation unless it creates some brand new problem on the software side, which none of this does.
There's a reason that a lot of embedded jobs are poorly paid.
Winter_Present_4185@reddit
This is kinda irrelevant. What I'm trying to convey is that it is not a solved problem, because new technology is created all the time (such as the light sensor).
Most of software is a solved problem then? Have database, serve data to user. Garbage in, Garbage out.
The last 5 years of developer surveys from Stack overflow show that embedded gets paid more than front end or back end:
survey.stackoverflow.co/2024/work#salary-comp-total-years-code-pro-dev-type
TightConstruction447@reddit
i get what you're saying, but not everyone needs that deep dive.
aruisdante@reddit
It’s because most people that develop in C in particular tend to be on the Indeed pay scale, rather than the Levels.fyi pay scale.
Put differently, firmware is almost always seen in industry as a cost center to be minimized, rather than a revenue generator to be invested in. This means pay tends to be considerably lower. I would actually argue a lot of firmware engineers are significantly better engineers than “application level” developers, as they have to solve hard problems with considerable constraints on resources. But they’re never really the ones driving product experience, nor coming up with new business opportunities for their company, and thus it’s thought of as “lesser.”
diablo1128@reddit
This is the answer.
As somebody that worked on safety critical medical devices, think dialysis machines, for 15 years, the vast majority of companies don't pay like big tech companies. They are old school companies with top-down management working like it's still the 90's. The vast majority of code I've written was C-with-classes style C++.
I'm a great example of this. 15 YOE, have my name on granted patents at multiple companies, have led teams of 20 SWEs, and get paid 110K at a private non-tech company in a non-tech city. The vast majority of big tech companies don't care about my experience. While I can learn on the job, they have no reason to choose me over some candidate who already knows what they need done.
bland3rs@reddit
Well, pay is based more on the profit of the industry than on the job or location.
Consumer products can make money hand over fist. An iPhone will sell like 200 million each year and the profit margin can go up to like 60%.
Medical devices have much lower volume and lower profit margin, as with most industries. If you are looking to make money, work in finance, consumer products, or oil. You don’t even necessarily have to do software.
DenebVegaAltair@reddit
As a FAANG engineer whose team had to hire a staff engineer with low level chops, your claims do not match my experience. It was a hard role to fill. There is a shortage of folks who have more than a superficial understanding of low level concepts, and they can command a pretty penny.
Comedy86@reddit
This was also the case with COBOL when banks began to transition to Java. They needed a few maintenance programmers and would pay them a small fortune since they were rare.
That being the case, to counter OPs opinion, there was no benefit in modern developers knowing COBOL anymore at that point in time and, these days, there's very little need for people to know C or Assembly unless you're seeking a job doing that specifically. Modern processing power makes the majority of those lessons redundant for 99% of developers.
Instance9279@reddit
That's a poor analogy, because nothing (besides banking infrastructure) is running on COBOL, and everything in the world (including COBOL) runs on C and Assembly.
It's like saying "there is no benefit in modern developers knowing Drupal, also there is no benefit in knowing Java"
Comedy86@reddit
The irony here is you're right regarding both of those as well... If you don't need either of them, don't learn them. Neither are in high demand outside of their respective niche markets. You could spend decades making websites and never need either. It's not anywhere near as important as a React or Vue developer needing to understand how native JS works or someone using Tailwind when they don't understand basic CSS.
You don't need to know everything to be an expert in our line of work. A web developer doesn't need to know Assembly and a mobile developer doesn't need Drupal. The only skill that matters is how quickly you can become proficient in a new language or technology and how much you understand the principles of your own line of work.
AlmiranteCrujido@reddit
If you don't understand how the language you're using is actually implemented, you don't really understand the language you're using.
...and that requires knowing at least a bit of low level programming, because fundamentally, every one of those languages is going to run on physical RAM and a physical CPU and physical storage.
Not every dev needs to know that down to a cache line, but if you don't have the basic concept of a pointer and that RAM isn't free, you're in trouble.
alpacaMyToothbrush@reddit
Neither are in high demand outside of their respective niche markets.
I'm sorry, I could well be misreading you, but did you just call java a niche market?
Comedy86@reddit
Yes, a niche market is a market which requires a specialized focus. Many Java roles are for niche markets like Fintech, Healthcare, etc... If you're a Java developer, you learn Java. If you are any other developer, you don't need to learn Java.
The language itself isn't niche due to its versatility but the roles themselves are.
alpacaMyToothbrush@reddit
I mean, in my day I've done defense, healthcare, VOIP, you name it.
I don't regard java, one of the most popular programming languages on the planet as a 'niche' language at all. I'm struggling to think of a less niche language.
1988rx7T2@reddit
A lot of automotive stuff is run on C.
ScudsCorp@reddit
I’m building my rust experience because I’m tired of web shit, but this would be a pay cut and all the job descriptions want to replace a recently retired guy who’s been doing this since the 80’s
Ok-Leopard-9917@reddit
Transitioning to a new field you don’t have experience in often involves a pay cut. Experienced systems developers earn FAANG salaries the same as web development.
lastberserker@reddit
Doing Rust since 80s?! I don't think anyone can replace that guy.
Ok-Yogurt2360@reddit
Why the downvotes? Am i missing something?
Rust is from 2012 right? So your comment is pretty much on point.
lastberserker@reddit
Most of posters here were in middle school 12 years ago - for them Rust was always out there 😂
arcanemachined@reddit
That's around the time I started rusting...
met0xff@reddit
Yeah, that's also my impression. Especially since with Rust there are more than enough people who'd like to get out of the web world, but the jobs are rare and badly paid. I worked in embedded for a couple years but didn't see a real future there.
Also if you're not at Nvidia or whatever it's hard to have a strategic and influential position while being the low level guy.
tkyang99@reddit
Problem is most of these low level engineers haven't grinded enough leetcode to pass faang interviews...
time-always-passes@reddit
Wait are you saying that my experience with writing interrupt handling code (assembly) is still useful??
Winter_Present_4185@reddit
For the last 4 years in a row, on Stack Overflow's developer surveys, embedded earns more than front end or back end. There are fewer embedded positions in FAANG though, so they don't get reflected on levels.fyi
GoodishCoder@reddit
FAANG isn't a reasonable consideration when talking about pay as a whole for the domain. FAANG tends to pay considerably more in general.
CorrectPeanut5@reddit
That is extremely dependent on the market and vertical you are in.
Many companies don't hire that engineer on shore. They contract the Chinese hardware maker to do the work. Those kinds of engineers are far more common there and frankly often more experienced and closer to the hardware. It's been rough for the grey beards in that space since the 2000s because they've had decades of having to compete against off shore.
I think we're in a world where companies that are designing their own stuff need those kinds of people and most of them have gone the way of the COBOL programmer. While the rest of the world just gives a spec to the OEM.
SnooWoofers5193@reddit
I studied ECE and now work at Meta arguing with management and partner teams on what UI components and buttons go where and why. One thing that’s crossed my mind is if AI eats my lunch, maybe I should go back into low level stuff if that’s where the demand and the pay is.
But I struggle with low level stuff. Memory and heat and buses don’t feel satisfying for me to work on. If it comes to a knowledge epidemic of low level work, perhaps I could get back in the lab but moving boxes around and working with XFN is so much fun, it’d be so sad. Perhaps a simple way to answer OP‘s original question is that low level stuff is hard and product engineering is fun and easy and social.
I do feel like hardware attracts the personality types of introvert quiet folks who love the technical details, similar to infra work. And product eng requires you to have solid people skills to push tough alignments through. I just don’t think that matters AS MUCH the more technical the work gets, and as you get closer to the metal, I think it matters less and less.
Also, all those hours debugging memory leaks in my C code did not make me want to do that 40 hours a week for the rest of my life
tim-hilt@reddit
Fwiw I found AI to be quite helpful in debugging firmware bugs! If it can’t pinpoint the issue right away, it comes up with some ideas of where to look next. Oftentimes in areas I didn’t think about before, which ultimately led to the solution.
SnooWoofers5193@reddit
Yeah I graduated 5-10 years ago, I’d imagine problem solving and tooling is largely different and almost easier now compared to then.
I guess another way to put what I was saying above is that in low level and infra stuff you partner with other engineers to get things done whereas in product you partner with other job families to deliver a project. I keep telling myself that maybe as I get older and less social I’ll enjoy infra work more, but I just don’t know if that will be true.
tim-hilt@reddit
Yes, can confirm. Most of my collabs happen inside the team. I do have overlaps with hardware / platform / tooling teams, but the domains stay closely related
Izacus@reddit
I seriously wonder what this guy is on about - pretty much all FAANG have a significant amount of code written in C++, and the best-paid ICs I've met there were kernel/runtime/language developers neck deep in ASM and other low level programming. Folks that work on that can save millions by optimizing some library.
PoopsCodeAllTheTime@reddit
This is actually the logical result: very few roles where talent can grow, then you cannot fill staff roles.
preethamrn@reddit
I think you're both saying the same thing. You struggled to fill that role (and paid a lot for it) because many of the people you interviewed probably weren't good fits so when you did find someone acceptable, you needed to spend a lot to make them join.
An average applications developer can make FAANG money but an average low level developer will probably find it hard to find a job. The reality is that there's not as much juice to squeeze out of firmware development (a lot of the biggest/easiest innovations were decades ago at this point). Perhaps that will change with the increased focus on GPUs but even there, you'd need to be very talented to command those high salaries.
aruisdante@reddit
Yes. Again, you are pointing to my point I made at the end. There are specific, highly skilled low level engineers that are very well compensated.
Those people are less than 1% of 1% of the employees at your company, and in the industry as a whole. Anyone that’s highly specialized in any field gets paid like that.
The majority of “firmware engineers” in industry write software for lightbulbs, or thermostats, or your toaster.
The point isn’t about the ceiling, it’s about the floor and the median. Most do not genuinely believe they will be, or have any interest in being, the next Ken Thompson or Linus Torvalds. They just want a career that will pay them maximum money for minimum effort. Writing firmware for consumer electronics is not that path.
I work in an industry where this split is very real. People on the “software engineer” track get paid, quite literally, four to eight times what people on the “embedded engineer” track get paid. Despite the actual day to day job being more or less identical.
It comes down to labor leverage. Someone writing a web app can, with a single developer’s effort, produce something that can make a company millions of dollars in revenue with zero other capital expenditure. This means the risk in talent investment is much lower. Someone writing embedded firmware, no matter how good it is, is not going to drive millions of dollars in revenue without significant capital investment into the hardware that firmware goes into. This makes them a cost center that actively eats into the margins of the actual product being sold, which is the hardware.
StrawberryWaste9040@reddit
Both claims are true. C developers can't all just go to FAANG today, there's not enough demand for them. The demand is mostly elsewhere - wherever hardware is made, be it the car industry, defense, aerospace, medical, etc.
winggar@reddit
Lol at all the replies going "this isn't true at all, I work in Silicon Valley!" when (a) Silicon Valley is a small minority of CS jobs and (b) even in Silicon Valley there are far far fewer firmware roles than software roles.
AlmiranteCrujido@reddit
I mean, that's the nature of long tail distributions.
I know a guy whose resume was (little hardware startup) -> FitBit -> Google (via the acquisition) -> non-Firmware EM (hated it) -> Nvidia EM leading firmware devs (right before their stock went nuts...)
As you can imagine, he's making major bank, to the point where I'm like "why don't you just retire dude?"
But that's "results not typical" for any kind of tech, even in the Bay Area.
forbiddenknowledg3@reddit
Really? Isn't Google's codebase mostly C++? Then with all this AI coding stuff I see low-level as the more secure.
The_Northern_Light@reddit
I’m an over-the-hill C++ guy and your explanation does not ring true for me.
I recently spent a decade in the SF Bay Area working essentially exclusively around other C++ devs at big tech and unicorns, and we were definitely on the same levels.fyi pay scale as everyone else. There weren’t just a few of us, I was on a thousand engineer FAANG team working on “the jetpack”, all C++.
It’s not just a Silicon Valley thing, either. When I was a junior I got my first real job in a flyover state the week I returned from my study abroad, by them directly asking me in for an on-site interview in 2 days. No application, nothing. They later told me my newly created LinkedIn account was the only one that matched their search within 500 miles! I broke the junior pay scale then, and it’s the same story now that I’ve moved back home as a senior. Supply and demand drives prices in the labor market same as any other, and that market is clearly supply-constrained.
Maybe they merely think the opportunities and pay are less? I could easily believe that. But I don’t think that’s reality. I’d sooner attribute it to the barrier to entry and the (utterly broken) educational pipeline, as a start.
We only have so many things we can hope to master: why would a young person choose to invest in where there are old experts when they could invest in the frontier where their inexperience is much less unusual? (Uncharted wilds are more exciting anyways!)
A young coder coming up today has every opportunity and reason to focus on things other than mere implementation details… especially given that it is clear that a huge chunk of the work I’ve spent my career on (especially early on) will soon be done by AI. I don’t know how you plan an AI-resistant career, but it’s definitely not by accumulating a trove of arcane minutiae about how to code low level systems.
So unless you’re called to it… why would you go into low level even if the money is similarly good?
Instance9279@reddit
Isn't low level systems a more AI-resistant career? LLMs can generate apps more easily than they can handle high performance CPU-optimized code. Also, the lower you go, the more critical potential bugs become, so the need for human oversight and accountability grows. Also, they have much less training data on arcane C/C++ codebases compared to Python scripts.
mark_99@reddit
I'm an HFT engineer and today for a C++ ML side project I got Opus 4.6 + Sonnet & Codex reviewers to check for correctness a (naive) matrix operation, implemented in all of AVX-512, AVX2, and SSE intrinsics.
It came back clean but commented (without running any profiling tools) that instruction dependency chains were limiting ILP. I asked it to go ahead and optimise and it unrolled some loops and reordered some operations, benchmarked before and after and showed a 1.5x speedup on AVX-512 and 2x on the other paths. It offered to implement tiling for improved cache coherency but speculated (correctly) it wouldn't make a massive difference for the small sizes in this particular application.
There is no purely technical safe haven - if used correctly, which admittedly (judging by reddit) seems rare, AI models are already better than 99% of human coders even in specialist disciplines.
BTW I think OPs observations are explained by conflating low-level (well respected, well paid) and C (generally to be avoided).
MCPtz@reddit
Interesting. Do you think the loop unrolls were better than what the compiler optimizer was doing? (My whole post is predicated on you using an optimizing compiler and this beating it. And yeah, I've seen the link from the other poster just below you.)
Just out of an abundance of curiosity, I'd be wondering if either of the re-ordered operations or the unrolled loops did most of the work faster, rather than both combined.
Decades ago, I manually unrolled loops and got about a 20x speedup on a custom SIMD system, but that didn't have compiler optimizers.
More recently I confirmed the compiler optimizers were unrolling some loops on parallel ops and it was definitely speeding things up, by taking advantage of strange instruction ordering to get CPU arch specific optimizations that were beyond my immediate comprehension, unless I dug in a lot more.
The_Northern_Light@reddit
I've even had it significantly speed up real code that I had already "half optimized" (I had already done loop unrolling, SIMD, memory layout, and obvious stuff like that), sometimes in ways that I found surprising.
I've not tested this yet, but I understand that in some cases it is capable of beating the compiler at its job: https://lemire.me/blog/2026/04/05/can-your-ai-rewrite-your-code-in-assembly/
Agree on every front. Those best practices are advancing rapidly, and that transformation is going to win-out sooner rather than later. Ignorance of what's actually been happening isn't going to be sustainable for long.
Instance9279@reddit
Hey, I checked your profile, and looked at compiler explorer, your blog and bio, you are a god-tier C++ guy 😀 What's your personal opinion (besides this post) on LLMs, and software developers' job security? Are we all cooked long term?
Instance9279@reddit
I am learning C++ / systems now, to pivot from mobile development. Your post is slightly discouraging for me 😀 I will still stick to learning it though, because frankly it's super interesting for me, and the deeper technical problems compared to the somewhat mundane mobile development are quite refreshing.
The_Northern_Light@reddit
It doesn't matter if it's easier for the AI to write an app than high performance CPU-optimized code, it matters if it's cheaper for the AI to write the high performance CPU-optimized code than a human. Remember, humans are slower at writing low level code than apps too!
You mention high performance, platform optimized code... surely it is not hard to imagine an AI capable of exploring the performance surface of a piece of code by systematically applying various techniques in something akin to an autoresearch loop? It's certainly been working for me! And it's little surprise since it knows Agner Fog better than I do. So that entire part of the low level dev's job is not something I'd want to build a career on if I was to start over. Which is a pity, because I truly enjoyed that.
I understand that Mythos's recent reveal is marketing hype, but I do not believe the majority of what is in there is an outright fabrication either. If even half of their claims are real, then we're already in the realm where AIs are superhuman at security tasks. If that's true, then paying for a security audit by a Mythos-like model is going to become standard process for any truly important software in the future.
How confident are you that you could spot a bug that Mythos missed? What about its successors a decade or two from now? I certainly wouldn't want to bet my career that I'd be better at finding bugs than the best AI's the future has to offer.
Humans are going to play an important role in review and certification of the most critical things... but let's not pretend like we're infallible at writing secure code either! At some point, the bug creation rate of the "third quartile" developer is going to be higher than that of the best AI. I am certain I've written bugs that everyone has missed, which are still out there today.
Here, look at this puzzle from DEFCON, Gold Bug: Sea Shanty. Try to solve it, and time how long it takes you. When this puzzle first dropped, ChatGPT 5.4 Pro one-shot it in just a few minutes. It wasn't in training data, but it figured it out.
AI is not as good at C++ as it is at Python, and it may never be as good, but it is getting better at both and that is a trend that is not stopping tomorrow. It's personally difficult for me to imagine a world where AI's can crack DEFCON puzzles first try in a couple minutes and find thousands of zero days across virtually all important software, but can't figure out how to work in a clunky C++ codebase.
I don't know where this is all going, but it might lead towards AI's being a significant factor in language development and choice. If the AI's are better at language X instead of language Y... then maybe at some point you invest in just porting your codebase.
"Just port your codebase" is a phrase that sounds ridiculous, but I've been porting a big mess of legacy code for the last couple weeks and it's shocking how well AI's do. It has a reference implementation, so it can just write tests, and verify its work versus the reference. If it messes up it knows it and can address the issue. Especially if you set up your harness to use a separate critic model to check the generator model's work for shortcomings... you get way better results this way. I've certainly gotten better, more comprehensive test coverage this way than I would have done manually.
Maybe people actually do just "rewrite it in Rust", or move to some new language developed with AIs in mind. That's a drastic scenario, sure, but I think incremental progress towards something like that is actually very realistic.
We're already rapidly moving to a world where design decisions and architectural structure are the primary inputs a developer brings to software engineering... neither of which are things juniors are great at.
Winter_Present_4185@reddit
I think it matters more that your firmware isn't shipped with bugs than your web app - mostly because it's much more costly to fix a firmware bug in prod than a web app bug.
The_Northern_Light@reddit
Sure, the question is who is better at making sure bugs don't exist?
Even if a team of experts paid six figures a year currently have a lower bug rate than an AI, I don't think it's obvious it will remain that way for long. Besides, most people aren't experts, and few companies have the luxury of only hiring experts.
And for many, many things it will make perfect sense to knowingly accept a higher bug rate in exchange for labor costs dropping off a cliff. Even in the low level world not everything is safety critical.
Winter_Present_4185@reddit
There will always be bugs, regardless of whether AI or humans write the code.
But potentially bricking hardware, or requiring someone to physically power cycle a device, will always be riskier (and in some situations, say space, impossible) than just relaunching a website or app.
Besides, low level tends to require much more determinism in software execution flow (similar to ultra scaling at the FAANGs) than 99% of other software.
Instance9279@reddit
Thanks for this. I wonder what type of design decisions would remain for humans to perform, I guess none in the future that you describe (or maybe just for a handful of people).
slonermike@reddit
In a weird (incomplete) way, I miss the days when we were all considered dorks and paid the same as every other type of engineer. So much of the younger generation did it for the Benjamins and they don’t seem to be having much fun with the work itself.
PoopsCodeAllTheTime@reddit
Shorter version: low level is essentially a hobby that won't help anyone outside the firmware industry get paid. It's like showing up to talk about Pokémon cards with people at work: they'll think your hobby is weird, and if you insist that they learn it, it'll be weird.
Winter_Present_4185@reddit
Low level tends to deal with electrical engineering. Is that a hobby to you?
Antique_Mechanic133@reddit (OP)
I appreciate the nuanced take, especially coming from someone involved in WG21. You are absolutely right about the economic reality: money flows toward the path of least friction, and vibe-coded shovelware is indeed where the mass-market ROI is right now.
However, my concern isn't about the career path or the paycheck, it’s about the ceiling of competence. When we treat the foundation as a cost center to be abstracted away and ignored, we stop producing architects and start producing assembly-line workers. We’ve traded deep engineering for speed of delivery.
Jaded_Character_2975@reddit
Counterpoint, NVIDIA, AMD, Broadcom, Qualcomm, Apple etc all pay 200k+
Highest paying company in the world right now (Nvidia) is probably 25 percent firmware engineers.
Ya maybe we firmware engineers get paid a tad less overall, but it's not as much as you think.
PoopsCodeAllTheTime@reddit
Also they won’t hire me or you or 99.99% of the employable engineers
FalafelSnorlax@reddit
Do you mean that the number of these roles is just generally low? It kinda sounds to me like you're talking about these companies having higher standards than the average (relevant) population, but as someone who works in one of those, I can promise you this isn't the case.
Dinos_12345@reddit
You need to be the kind of person that has years and years of serious low level experience for a place like Nvidia to hire you.
max123246@reddit
This is not true. I joined Nvidia out of undergrad
FalafelSnorlax@reddit
I factually know this to not be true. I know plenty of nvidia engineers with little to no experience. Large corporations can't sustain themselves without hiring young and inexperienced engineers at all.
PoopsCodeAllTheTime@reddit
This might be true for in-office roles when they can’t find anyone with previous experience, in which case it still means that learning firmware is very unlikely to turn into profit. Might as well just apply without exp
sharpcoder29@reddit
This. Corporations love to hire jrs because they pay them less and expect the leads to bring them up to speed.
Chennsta@reddit
these are not the highest paying companies for swe if you take away stock growth
thequirkynerdy1@reddit
I work in faang on standard business applications but do low level as a hobby and one day would love to be on a low level team.
slamjam25@reddit
Low level is both extremes. You’re probably gonna make $60k writing menu navigations for a car radio, or you’re gonna make €700k writing C++ at a trading firm. Or you can make CUDA kernels for training transformer models 1% more efficient and get paid more money than god at any AI lab.
Environmental_Leg449@reddit
I loved my C/C++/Assembly classes in college and would've preferred to make a career out of it, but way more options/money in python + SaaS
alexlazar98@reddit
Came to say exactly this but in far fewer words. Basically, look at the money.
ice_dagger@reddit
Firmware is not the only thing low level though. Accelerators are hot for instance, and a lot of that work requires pretty low level understanding (down to how many registers a certain kernel would take, for instance). Those jobs pay big imho
Forward_Artist7884@reddit
I'm not sure I've experienced the same as you did... during my own CS education I went down the stack every single opportunity I had (started with general CS, then got down to mobile device dev instead of web, then got into embedded/sig proc engineering). Going down the stack like this was seen as pure suffering by most of my colleagues who followed the same courses I did; none actually went to embedded / sig proc after mobile dev.
Initially it did seem like webdev / app dev would pay better / be easier and provide more job opportunities, but with AI being an actual thing now, junior hiring got frozen by most companies in my area. Those who did get jobs before are either being put to work on way more projects at a time for less pay, or are even getting laid off.
But on the embedded side? No change. I got hired without issues, and we don't even have a webdev / frontUI post now because any dev can do this work very easily with AI tooling (at least for the GUI part, the backend is still human authored).
The closer to the metal you are, seemingly the harder you are to actually replace. Now some colleagues from the "upper" layers do say they regret not thinking more about embedded as their markets virtually crumble around them... As for your specific point, yes, I did see exactly what you mention with people hating low level. It's a thing, and most of the grads seem to prefer higher levels of abstraction. Their loss.
yad76@reddit
I've always seen this as a jealousy and fear thing.
The lower level you get, the more complicated things become and the more you have to keep in your head and sort out in order to accomplish things. This is essentially by definition given that higher level abstractions exist specifically to be simpler and more efficient for humans.
As such, the higher you go, the more you open engineering up to a broader segment of the population meaning some combination of lower average intelligence (not saying any level is stupid, just speaking relatively), lower obsession factor over the nuances, etc..
To this broader segment, the thought of the lower level aspects can be anxiety inducing. It is something they either can't or don't want to understand and deal with. It is something they might see as career threatening. What if the tech industry shifts where those lower level concepts become more in play? What if they end up in a job where they get assigned things reaching down into those areas they aren't comfortable with?
At this point, it is a classic case of cognitive dissonance kicking in. They have a CS degree, get paid a big salary, maybe work for a big name company. They can't help but think of themselves as elite but there is this giant murky foundational world beneath all of that they cannot grasp. The brain at that point decides the only thing to do to resolve this contradiction is to diminish that world, to scoff at it, to act as if it is too beneath them to ever care to consider.
You see this same attitude coming from frontend engineers towards the backend.
max123246@reddit
Web stuff is still incredibly complicated, but it's accidental complexity caused by poorly designed software lower down the stack
Winter_Present_4185@reddit
This is the abstraction debate. I think it's quite common to say that the lower down you are on the abstraction tree, the more your changes matter to how performant the system is
max123246@reddit
I disagree. Amdahl's law shows us the entire stack must be optimized, not just the lower portion or the upper portion
In the naive case you can imagine an expertly designed operating system and someone writing user space code that sorts using insertion sort only. The lower level code is great but the higher level code is killing all performance
Winter_Present_4185@reddit
Nah, Amdahl’s law does not mean every layer deserves equal effort. It shows that speedup is limited by the portions of the stack you do not optimize. Optimizations in the lower levels of the stack tend to have a wider blast radius than higher levels of the stack simply due to how abstraction works.
max123246@reddit
I agree in the sense that everyone depends upon lower levels in the stack, so your design is far reaching. But I still do think for a particular application and its dependency chain, that you have to optimize all levels of the stack to get optimal performance, and that un-optimized code at the application level will make optimized code in the dependency chain not matter much at all for perf.
eyes-are-fading-blue@reddit
Assembly is irrelevant for web dev. It's largely irrelevant even for embedded systems. Similarly, low level access, i.e., read/write to system memory, is also irrelevant for managed languages.
The same gap exists for “low-level” programmers where they don’t know how the hardware works on a fundamental level, like how L1 cache is synchronized across cores. This kind of knowledge is largely irrelevant.
Designer_Flow_8069@reddit
Your downvotes kinda suggest that you're wrong here, but you said:
Of course they do. Do you think hardware engineers are just adding features to processors and nobody is using them?
eyes-are-fading-blue@reddit
I am talking about lower level stuff. I gave a good example: how L1 cache invalidation works on the hardware. You don't need to know this to utilize cache lines.
As for downvotes, I am not wrong. And downvotes hardly mean anything.
Designer_Flow_8069@reddit
Well did you know L1 cache invalidating on an ARM Cortex A9 has a side effect of stalling the CPU pipeline for certain instructions?
You most certainly need to understand the hardware or at least read the block diagrams to understand the tradeoffs.
eyes-are-fading-blue@reddit
Not sure how your question is relevant. "Understanding hardware" is an overly broad term. You do not need to know exactly how cache invalidation is implemented in hardware to realize it has performance implications. These implications are clearly stated in the programmers' manual, which says very little about how things work under the hood. The same gap exists at every level of the SWE stack, including embedded.
Designer_Flow_8069@reddit
Haha, well, most "programming manuals" are abysmal. But the issue I mentioned isn't actually stated in the processor's programmers manual. It's stated in the processor's errata, a document listing hardware bugs found in the silicon.
As an embedded engineer, you have to work around hardware bugs and fuck-ups all the time. Manufacturing at scale is expensive and you can't just respin a board if an issue is found.
Stellariser@reddit
Yep. Most developers have no idea how anything works, and you see them jump from one new thing to another, forever chasing trends and the ‘latest’ because they have no real basis or understanding.
At some point you just give up trying to explain anything to most of them. This industry is basically one huge cargo cult.
behusbwj@reddit
Designer_Flow_8069@reddit
Stack Overflow surveys for the last 5 years show embedded pays more than front end or back end
behusbwj@reddit
Use the geographic filters. Then compare with a tool containing multitudes more compensation data (levels.fyi, glassdoor).
I don’t recommend taking surveys at face value with such a small number of respondents. In all my career, I have never responded to the survey, nor do I know anyone who bothers to respond to the survey.
Designer_Flow_8069@reddit
I respect you for acknowledging my comment and looking up the surveys.
Your comment which I was responding to was geography-agnostic, so I responded in kind without using geographic filters. Of course there are locations which pay more or less, but on average those surveys do show embedded paying more than front end or back end.
Do you have a better source of data that documents specialties within software development? If not, this seems like one of the only data points we have, which is in conflict with your:
behusbwj@reddit
It’s 20,000 respondents. That is nothing. The only country where my statement is false is Germany… respectfully, your statement is a bad generalization of the industry and your own data shows that.
Designer_Flow_8069@reddit
The respondents numbered half a million 5 years ago and it was the same result. SO has just gone downhill in recent years.
Regardless, I'll ask again: do you have a data source that is fair?
behusbwj@reddit
It’s clear that you’re not actually reading my replies. Goodbye.
Designer_Flow_8069@reddit
Oh, but I am. You said use levels.fyi, but we all know that is the pinnacle of not being biased towards FAANG, right?
behusbwj@reddit
As I said, it is clear you are not reading my replies. The conversation is over. I don’t entertain keyboard warriors. Next is a block.
Designer_Flow_8069@reddit
Smh, you have the energy of a toddler who thinks he's always right. Please block me.
Ok_Job_7203@reddit
Saying this as someone with 20+ years experience in C and C++ (telecom, enterprise grade mail servers, and hypervisors) and multiple patents. The problem is that the jobs for C/C++ are limited compared to other fields. And they also require some depth in the domain. The architecture cannot just move to cloud or modernize without major changes which the companies won't undertake. (Why fix something that isn't broken).
When the time for switching comes, we have to rely on our existing network in similar companies to get a role and salary that matches the experience. The jobs are far fewer than the number of candidates available. Even if you don't like the job, environment, or your manager, you can end up stuck in the same place. For couples where both spouses work, if one gets an opportunity in another part of the city or another city entirely, the family can't move because people like us can't make the switch easily. These jobs mostly fit the cash-cow segment of the market, where things are stable and don't need heavy investment.
But developers like web, full-stack, mobile (android/ios), etc. can fit anywhere most of the time. They have good salaries, easy switching opportunities and can cross domains. Because that's where new money and new business is.
Nothing against the languages themselves, the work is challenging (multi-threading, IPCs, semaphores, network stacks, kernel level development, gaming, device drivers, etc.), but the opportunities, pay scales and ease of switching or choosing your work don't match the work itself.
For the newer generation, I would advise them to pick up other languages while maintaining enough knowledge of C/C++ that their systems and algorithmic basics stay strong.
katikacak@reddit
Why stop at assembly? I have an electrical engineering degree, I know tons about transistors; I think you should learn about them, otherwise your assembly knowledge would look like magic.
Bro, stop the gatekeeping
Dimencia@reddit
You think pilots are also physicists?
Yes, it's like flying a plane without knowing the physics and engineering behind it, that's kinda the point. A lifetime is only enough to become a true expert in very few things. People who work with low level code are important, but they don't have the time to also learn how to do high level code (correctly), and vice versa. Many of the patterns in functional programming will ruin you for object oriented programming, and (again) vice versa
As you say, your perspective was completely changed by working at a low level, and you probably struggled working with high level code because things are different there - the patterns we use probably seem counterintuitive or even detrimental until you spend a lot of time relearning things. And they are detrimental, in some ways, but the priorities are different
And if you spent much of your career in firmware, why exactly did you move into higher level software? Was it because the pay is better while the work is easier, or just because there are more jobs available? There were certainly low level engineers that were better at the firmware work than you, and now that you've moved away from it, you'll never be one of them - and you'll probably retire before you truly master high level engineering. Would you really recommend that new devs take the same path you did?
So I think you know firsthand why we try to steer new devs away from low level land, better than most
OldPurple4@reddit
As a front end, getting people to understand the platform, the languages used and how they behave together, has the same vibe. No one writing websites will have utility getting down to byte code (please inform me if I’m missing something fun here), but the same principle you’re describing applies at our level of abstraction.
I get the same reaction when I ask folks how their libraries work. Or why they’re using axios over fetch.
You’re not getting old, but I think these are common frustrations that are only going to get worse as we move further from the metal. We are rapidly being pulled from it.
empty-alt@reddit
I'm split on this. I have a CS degree where we learned assembly and C++ (oddly enough, we skipped C). My day job is being a web dev but my after-hours hobby projects often are in C because I think it's cool.
This was already a conversation when I was in school; everyone wanted to be high-level software devs, not in firmware, but the only option is a CS degree. Where you go off and learn all sorts of weird stuff that doesn't apply to writing web apps. I think it's a valid argument to an extent. If you want to write web apps, we'd be better off focusing on the HTTP RFCs, touching on languages like C for the purpose of a basic mental model, and that's it. It's kinda weird that we have one educational track to cover firmware, web devs, computability, networking, db, algorithms. So nobody gets to specialize in their domain. Everyone gets to be equally dissatisfied. The web-dev crowd are usually just the loudest since they often feel a little taken off guard. They went in to learn how to respond to a request and instead, they are writing linked lists in C and solving integrals. The AI boom has just given them another reason to complain.
EmmitSan@reddit
Yes, we were talking about this in the early dotcom era, when many companies were going all in on Java, and the C/C++ devs hated it.
WhenSummerIsGone@reddit
People don't understand what "computer science" actually is. It's not engineering and it's not vocational training.
Consistent_Photo5064@reddit
It’s not a failure, it’s by design.
The industry demands more people who start working fast and cheap by only leveraging current technologies. Most software isn’t designed for efficiency and only needs to last a decade.
Now, what we're failing to make clear is that there's a choice to be made and bare metal isn't ever going away. On the contrary, it seems it's gonna get more hype now with mainstream software turning into slop.
QuietSea@reddit
Y'all are writing software that lasts a decade? Hell, I've seen re-writes as quick as 3-5 years. Might be getting worse with AI re-implementations too.
caprisunkraftfoods@reddit
My first job was working on the terminal interface ERP system of a company that was a hundred years old. There were dated change notes in file headers that were older than me. In the modern web world we're deeply unserious about technical debt.
necheffa@reddit
I've worked in codebases dating back to the mid 60s, as recently as a month ago. Even the "good for its time" stuff gets ugly fast.
VomitC0ffin@reddit
Granted I'm in firmware/embedded development, but I work on product lines that are actively developed & productized for at least a decade, and then supported in the field for at least another after that...
quentech@reddit
The primary .sln file I work with has 17 years of commit history.
It's also a .Net 10 app with Vue 3, Angular 20-whatever, etc.
recycled_ideas@reddit
So there are two separate issues here.
One is a general issue with low level programming and that other one is specific issues with C and C++.
In general, if you are fucking around in assembly, you are reinventing the wheel. That's fine, it can be fun and educational to reinvent the wheel, but solving already-solved problems is not a productive use of your time, and unless you're going to work in an incredibly niche segment of the market you aren't solving new problems in low level programming.
On top of that, C and C++ just suck. There's been a push to bin them even at the systems programming level for decades and we have reasonable alternatives at this point. Yes, there's a lot of legacy code in C++, but given that we're seeing Rust cropping up even in the Linux kernel, I think the move to get rid of it is growing stronger.
Laicbeias@reddit
C is a great language. C++ hate is absolutely valid. just look at it
and C should be taught as everyone's first language, because otherwise it takes you decades until you understand wtf is really happening
look@reddit
C isn’t what’s “really happening” any more either. It’s basically programming on a virtual machine, too; its VM just has better hardware accelerator support.
cockdewine@reddit
The C abstract machine is still the target architecture for high-level programming languages, though, so it gives a deeper understanding for when you find yourself reasoning through stack vs heap references, passing by value or by reference, etc.
look@reddit
Oh, I agree with the conclusion: learning C is useful to understanding the abstract architecture that the vast majority of all software today targets, but that architecture is effectively a virtual machine closer in design to a PDP-11 than it is to what’s “really happening” on modern CPU and memory caches.
https://queue.acm.org/detail.cfm?id=3212479
PeachScary413@reddit
Wut? C is compiled to machine code and that is executed on the CPU.. you mean because there is a whole other layer of microcode optimization going on in the CPU or what?
TribeWars@reddit
https://dl.acm.org/doi/epdf/10.1145/3209212
This letter ("C is not a low-level language") nicely makes the case for this opinion.
PeachScary413@reddit
I don't really get it tbh.. are they proposing an entirely new language mapped to the specific inner workings of a specific processor architecture?
Like yeah good luck making your language aware of things like instruction reordering and speculative execution and other stuff 💀
TribeWars@reddit
A low-level language that has been designed from the start to support multi-threaded semantics, give control over cache locality, exposes instruction-level parallelism and makes SIMD part of the language would be good start.
> Like yeah good luck making your language aware of things like instruction reordering and speculative execution and other stuff 💀
This stuff is not more complicated than other cutting edge programming language research. The reason it won't happen any time soon is mainly due to inertia and because our computing systems have grown to such immense complexity that it's too much effort to make a big backwards-incompatible change like this.
Right now CPU makers optimize their processors for programs that run on a simulated computer from the 70s. There's a good historical reason for this, but it's not what you would do if you had the opportunity to design a new computer, free from any expectations that it can run existing software.
Probably we'd make CPUs with hundreds of cores and the system programming language for it would look something like a low-level erlang.
max123246@reddit
They're just saying, learning C doesn't make you learn how a modern CPU works. You still need to dig deeper and it's bad advice to just say "Learn C to learn hardware"
max123246@reddit
https://queue.acm.org/detail.cfm?id=3212479
Web version for easier reading
PoopsCodeAllTheTime@reddit
C is not what’s happening. What’s happening is garbage collection, and it doesn’t care about pointers
Laicbeias@reddit
yeah, it's not what C has, but what it lacks, that makes it educational. if your brain starts coding within that framework, you will fundamentally understand what's going on, and then you get that gc allocations are not free, and that most languages do so much stuff in the background that it's a minefield. there is a reason most software was so much faster in the past.
PoopsCodeAllTheTime@reddit
Uhhh I like C, and I get that people think it’s this mind altering pedagogical method. But it’s just nice, you don’t need to use C in a classroom to understand the concepts, that’s silly
Laicbeias@reddit
You won't get hooked on assembly. C has the right level of abstraction to be educational and easy, for people to get dopamine hits from solving issues while still being able to follow the flow. C++ as a first language would turn a potential coder into... heck, idk, a serial killer.
PoopsCodeAllTheTime@reddit
That’s like, your experience/opinion. people will feel very differently
Laicbeias@reddit
Basically. Imagine you start with visual coding early, like at 8, next to math. And then you jump up to C at 12. It's not even that hard. But idk what everyone's doing. Like functions, math, all that stuff you could teach 12-year-olds.
Couldn't teach em C++ or assembly.
PoopsCodeAllTheTime@reddit
I mean, yes that would be fun for anyone that is interested in the topic to begin with. It’s terrible for anyone that doesn’t want to participate in the exercise
Laicbeias@reddit
If they have to do math, it's basically that with practical application. Hell, if you get em programmable Lego robots they will love it. It's really just an educational problem. But coding should be taught alongside math
PoopsCodeAllTheTime@reddit
Coding shouldn’t be taught without the desire to learn it, some people know they just don’t care for it and won’t be ever relevant to their lives
Laicbeias@reddit
Obviously it's my opinion. I taught for a few years and schools just make it hard not to hate coding. C hits a sweet spot. C++ and Java are over-engineered and abstracted. No algorithmic linear flow. You learn concepts that don't make sense for problems that you don't face.
Coding needs joy in experimenting and building. That's how you learn. Schools don't do that. They do the opposite. And yes, sure, the 140-IQ kid will enjoy sorting through assembly. They will do so in their free time. But many that could love coding won't.
From a general point of view, C hits all the major points as an ideal starter language. It sits perfectly between all the concepts while still being readable, without fully hiding what happens. If you know C, you learn what all the others did and why they did it.
Smallpaul@reddit
C is not really a great language. There are much better modern choices. Zig and Rust for example.
C was an amazing innovation for the 1970s, but we've had 50 years to learn where it went wrong: macros and pointer syntax, for example.
bushidocodes@reddit
Isn’t it accurate to say Assembly, C, and C++ is “unnecessarily hard” for nearly all programming domains? Wouldn’t it also be accurate to say that these technologies are “obsolete” in domains like general-purpose application development, CRUD app, web services, Enterprise applications, etc.? Software in those domains used to be written in Assembly, C and C++ in the 80s and 90s. Now a Windows desktop app in C++ is often “legacy.”
That does not imply that Assembly, C, and C++ aren’t important in niche domains or core infrastructure. It just means the languages are irrelevant for most employers.
I have professionally written assembly, C, and C++, and it’s extremely niche. IMO, devs should learn the immediate abstraction below where they run their code. For a lot of folks, that’s familiarity with the JVM or the browser, not syscalls and ISAs.
Southern_Detective27@reddit
Most things people learn with CS degrees don't have anything to do with what you actually do on a job. Maybe there's that.
HQMorganstern@reddit
Do you have a lot of low-level job opportunities or colleges that teach low level programming around you? If there's no opportunities around you it's unlikely the field will be treated with a lot of respect.
I'd say I have the opposite experience, people tend to harp on a little bit too much about low level programming, referring to it as fundamentals of engineering, and claiming knowledge of it is AI immune, or makes one a true thinker, or whatever the current fad is.
Winter_Present_4185@reddit
Software is all abstraction. Low-level is just the bottom of the abstraction tree
ilyas-inthe-cloud@reddit
cloud architect here, been doing this a long time. you don't need to write assembly but if you can't reason about memory or why your api call is doing 47 DNS lookups under the hood you're gonna have a really bad time debugging prod issues at 3am. the real problem is everything becomes a black box. framework does x, library handles y, deploy button goes brr. and then something breaks and there's zero mental model to even start investigating. the devs i've worked with who understand the stack end to end are just more effective. not cause they write C everyday but because they can reason about tradeoffs at every layer.
Antique_Mechanic133@reddit (OP)
Exactly this!
03263@reddit
A lot of people still get into asm for romhacking old video games. That's a good entry point, I think. I have dabbled a bit myself.
eieiohmygad@reddit
8-bit computing is a lot of fun. It's simple enough to understand from the silicon to the software, but advanced enough to do a lot of fun stuff. I'm especially fond of the MOS 6502.
FamilyForce5ever@reddit
To me it's more "trying to fly a plane without having a clue how to build one". I'm standing on the shoulders of giants to be able to do things more efficiently. Open-source means most companies can use the same compilers / frameworks, and the ones that can't will publish theirs (Uber publishing OpenSearch plugins I use at my 200-person company; LinkedIn developing Kafka; I'll never need to write my own Linux distro; etc).
It is cool to fully understand the program you're writing, from compilation to containerization to cloud infra to raw bytes in TCP packets, but you don't need to know all of those things to do a good enough job for the majority of software roles available.
Boring_Pay_7157@reddit
Oh yes. Preaching to the choir. New grads also have the dogma of "cloud is always cheaper than metal". The amount of money I witnessed go to cloud providers for nothing is obscene.
East_Lettuce7143@reddit
The servers are cheap, it’s the logging/monitoring/load balancers etc. around the servers that’s gonna cost you.
PeachScary413@reddit
Which is expensive... why exactly? There are no magic logging/monitoring/load balancer hardware going on, you are just leasing someone elses nginx instance for 10x the price 🤌
Boring_Pay_7157@reddit
nginx is not even that good of a load balancer. It's a developer's idea of an LB, vastly inferior to haproxy in LB features.
Boring_Pay_7157@reddit
Network traffic as well.
PeachScary413@reddit
I will never understand people spending hundreds/thousands of dollars in AWS to avoid having to boot up a Linux machine on Hetzner 💀
Boring_Pay_7157@reddit
Resume driven development I say!
Unusual-Two7277@reddit
AWS in particular is a fiendishly designed money-extraction machine IME.
Boring_Pay_7157@reddit
Oh yes. I worked with metal, both in our own DC and with others hosting it, and if I were starting a company I'd rather use metal with unlimited ingress/egress than waste 20% of my time on reducing cloud costs and similar bullshit that no one ever talks about (like when they say: BuT DiD YoU TAkE tEcHs InTo AcCoUnT).
Teh_Original@reddit
Anywhere you're paying for compute time, you should be pushing for high performance.
East_Lettuce7143@reddit
As a full stack dev, C programmers are gods in my mind.
max123246@reddit
C is surprisingly simple, but it hands you all of the complexity to shoot yourself in the foot with. You could learn it in a day. If you write Python, you literally already work with references every day and have to know the difference because of mutability. And a reference is just a non-null pointer.
bighappy1970@reddit
Can you change the brakes, including rotors and calipers, on your car? I bet not, but you can use it just fine without knowing how it works. Same thing for computers. The number of people where low level knowledge is useful is decreasing every day as systems improve. It will never be Zero but for most devs there is little to no value in understanding pointers or registers. 🤷♂️
max123246@reddit
It's the complete opposite. Who do you think is improving those systems? Who do you think is working on all of the low level software needed for the current boom on GPUs?
hurley_chisholm@reddit
Remarkably, no one has pointed out something I see implicitly referenced in multiple comments in this discussion: the C and C++ communities have notoriously terrible reputations for being full of arrogant dicks, and for believing that such arrogance is warranted because "it's low-level" and difficult.
When I was early career, not knowing C/C++ made me doubt if I'd ever be a "real" engineer. Then I got into embedded and scientific computing and quickly learned that it wasn't difficult because I lacked "first principles of programming"; it was difficult because tools like Clang had a garbage UX and seemed to unnecessarily obfuscate what the hell they were actually doing.
Conversely, I see statements like these:
Both of these statements are indicative of a general perception of C/C++ programming. It’s silly. A thing being hard to learn and do well doesn’t make someone who has done it an inherently better person. I’m also not convinced being able to write Assembly makes you a better software engineer than someone that doesn’t. Software engineering is so much more than programming and the upcoming generations just don’t have the patience for this grandstanding.
Maybe the C/C++/Assembly folks should get off their high horses if they want younger and early career people to be interested.
max123246@reddit
Everyone I've met at work is incredibly nice.
It's the fact that these people online have made a programming language their identity that has made the community so corrosive online.
Outside-Storage-1523@reddit
I know a lot of people actually want to do low level work but didn’t get the chance.
Square-Fix3700@reddit
We can’t do high level stuff if there’s no one doing low level stuff.
Podgietaru@reddit
That is honestly crazy to me. I view it as the opposite. I have a great deal of respect for those that work in low-level languages.
On the other hand, I have had people suggest to me far too often that we move x service to rust in a web-dev context. And whilst I understand that it'd be faster in principle, I think what this often misses is ... It's fast enough, and it's readable and understood. That strikes me as unnecessary and a little overzealous.
I think an education in CompSci needs to include C/C++. I think it's good to understand these things even without needing to actually use them. It's why I am not totally opposed to having someone explain the Big-O of their implementation or even whiteboarding out how some data structures works. While not necessarily needing those things in the day to day, a structural understanding of them - i think - leads to better software overall.
Shehzman@reddit
Getting tired of everything needing to be rewritten in Rust and Go. There are some things that need to be blazing fast that justify writing in those languages. Most of the time though, it’s better to just stay in higher level languages (though Go is also high level) like Java, C#, and Python and just optimize the code unless the core architecture principles of the project were bad/outdated.
Your end user will not notice the milliseconds of improvement in those languages unless you’re big tech scale but the business will notice the significant time lost on the rewrite.
curious_corn@reddit
Well, with the prices of RAM going the way they’re going, migrating to Rust might be a more credible position than before
Shehzman@reddit
Or get application developers to step up and reduce their RAM footprint. Even with RAM hikes (which are hopefully temporary but may not be), I highly doubt most businesses would agree to Rust rewrites unless it’s a small part of the overall application or they’re bleeding money because they chose a language slower than Rust. That can happen, but it’s likely more rare in the F500/enterprise space.
curious_corn@reddit
I’ve seen companies deploying “microservices” to execute 15 LoC message handlers. A whole JVM instance, with Spring Boot and all that. PHP would be better
sintrastes@reddit
Counterpoint though, and controversial opinion, but: Rust is a much better language for application development than Java, C#, and Python.
I could go on.
Yes there's a steeper learning curve, but you get so many benefits even if you are not writing particularly performance critical software.
Shehzman@reddit
While I don’t disagree (don’t have enough Rust experience to refute any of this), I’ve been using .NET Core at work for the past couple of months and it’s honestly been great.
The build tools don’t really get in my way, also has a great ecosystem (especially around web based apps), multithreading has been fairly trivialized with Task, async, and concurrent versions of data structures like hash maps (though you still have to make calls to the async version of methods in many libraries), and compiler warnings are thrown if you attempt to write unsafe code that can access something null. You can definitely make claims about why the Rust version for each of these is better, but I really don’t have the feeling when using .NET Core that I desperately need another language.
DigThatData@reddit
I think what's actually going on here is that the amount of interest in CS topics has exploded, in part because conversational interfaces have lowered the barrier for entry for a lot of techniques and technologies that used to only be accessible to people with specialized training.
Instead of trying to gauge what fraction of the field is interested in lower level work, I'd reframe this to ask how many people today are doing low level work. I strongly suspect the size of this community is continuing to grow, even if the fraction it comprises of the broader community training for work that involves code is shrinking.
MediocreDot3@reddit
My school taught us Java as opposed to C or C++ and tbh those foundations are 100% there in modern languages and abstractions aren't that tricky to understand. I don't like C++ not because it's bad but because it's just unnecessary for 99% of web development at this point which is all most of us do and seek to do
testeraway@reddit
Interesting, there was no low level at all? We started with Java in intro classes. Second semester of the first year we did assembly and C. Assembly was only a small portion, but C stuck around for the next three years.
alpacaMyToothbrush@reddit
My MIPS asm class was the most fun I've had coding. It was neat watching my pointer walk out of bounds and start reading garbage. Really drove home what an arrayIndexOutOfBounds was.
MediocreDot3@reddit
We did assembly but not C, our C curriculum was rewritten in Java before I started
karmiccloud@reddit
You had systems / operating systems class without C?
Scooby359@reddit
My uni taught us C++ and Java. Think it was partly that they weren't bothering to update the syllabus, but also that if we could code in those, we could code in anything!
valence_engineer@reddit
GCC has 15 million lines of code. You telling me you know every single one of those lines, and how they all come together to convert your code into what runs on the machine?
There's always magic. Some people accept it. Others lie to themselves that it doesn't exist.
ChrisLew@reddit
I’ve been a SWE for only 6 years, but I resonate with your claim that there is a “stigma”. I’d rephrase it, though: it’s really just a lack of knowledge and understanding of what software exists out there.
I worked at one of the most popular robotics companies in the world, and one of the reasons I got hired was because I was a software engineer who understood alllll that low level shit.
If you understand multithreading, modern C++ or rust, memory management and just basic OS fundamentals, there are tons of high paying roles at MANY companies big and small. I’ve personally interviewed or worked at; Facebook, Google , Boston Dynamics, too many high paying finance companies, Pixar, Qualcomm, Nvidia , and I could go on.
The work is hard but well paying and I do feel like I use my brain often but I wouldn’t say that people avoid that work because it’s not interesting but the barrier to entry is just higher
official_business@reddit
I've been working in C and C++ for probably 15 of the last 20 years. I'm by no means a god-tier programmer, but I'm effective enough.
I've found that a lot of people's minds just snap when it comes to pointers. They don't grasp that a pointer expression can be used in two modes (L-value & R-value) and brush it off as arcane. There's also the manual memory management that some people struggle with. I think that's a big barrier to a lot of people learning. I wonder if these people would have been filtered out of the programming space had they gone and studied in 1985.
I've also seen a lot of C & C++ shops that just pay straight shit. Firmware dev pay was pretty lackluster at a lot of places. I've had phone screens where we went over pay before the in-person interview and it was just like, "Uhhh, guys...". No wonder you have a vacancy.
I don't think there's any less C & C++ work around. It's just that web dev has exploded and desktop software has died in the ass. There's still plenty of C++ in dedicated domains.
You don't need to know how to twiddle bits and whatnot if you're writing some CRUD app. No one in the business cares if it burns some more CPU. It's web scale, just add another host. They just want to bash the feature out and sell it to the customer. You just don't need C++ for these domains and maybe its the right trade off to sacrifice performance and ram for faster feature development.
Working_Noise_1782@reddit
It's cuz people are getting soft. Like a Twinkie. I'm an electrical engineer that does firmware on microcontrollers. C/C++/Rust. I feel like CS engineers don't play around enough with real hardware. Always want to use OSs for everything. For them, memory allocation and management in C and C++ is akin to riding dirty with a gun in your waistband, a bullet chambered and cocked, ready to blow your wee wee off any moment.
DeterminedQuokka@reddit
So as someone who has only ever been paid to write high level languages but does know some low level stuff.
I think a lot of it comes out of defensiveness and fear. I’m currently running into a bunch of problems at work around garbage collection and interpreter locks in python. I’ve been trying to engage other people (at a reasonably large company). And basically everyone has just panicked because they don’t know what to actually do.
The only person who has actively helped was an ML guy. And I feel like that was more about solidarity for super annoying problems.
Basically, no one wants to acknowledge that the problem might be something low level because they don’t know how to actually fix it.
VictoryMotel@reddit
People always want to warp reality to protect their ego. Someone who only knows scripting knows that a low-level programmer can do what they do, but they can't say the same in reverse.
It's outrageous to try to claim superiority because you don't know something but people are maniacs, especially kids.
olzk@reddit
No, you aren’t getting old. There’s nothing that explains how computers work better than programming low-level. All the haters should be ignored, or sent far far away to RTFM ASM for the greater good of humanity.
PressureHumble3604@reddit
The standards for education have dropped considerably and modern SWE are usually scared by complex topics.
C/C++ should be the standard languages taught to beginners.
Assembly is not necessary if you don't specialise, but it's good to know how it works.
At the same time, beware of people who make the opposite mistake: I have seen systems designed by low-level people that performed horribly despite having efficient code, because the code was all they focused on.
w3woody@reddit
I think there are two things going on--one reasonable, the other not so reasonable.
I think the reasonable complaint revolves around having the right tool for the job. Higher level languages like Java or Kotlin or Swift are the right tools to use in their respective spheres, and each (as well as others; C#, Javascript, Python, etc., etc., etc) provide built-in run-time functionality that is either not present in C or C++, or require third party libraries (like Boost) or require a lot of heavy lifting.
The upside of the heavy lifting is knowing where all your bits and bytes are. There's value in implementing a doubly-linked list, if that's what you need, in C++. But a lot of the time it's excessive verbiage, and it's easier to simply use LinkedList instead.
The unreasonable complaint (in my opinion) is that so many people are so married to either "the latest thing" or have this absurd notion that code needs to be compact in order to be reasonable. That is, the very idea that you should even have to think about linked lists seems completely broken to them. And, at some level, I think there are a lot of 'code plumbers' out there who honestly don't know how any of this shit works--who seem to be compensating for that by complaining how 'unnecessary' it all is.
That is, I think there are a lot of people who couldn't implement a linked list to save their fucking lives--who then complain about the necessity of doing it to cover for the fact that they have no idea how any of this shit works.
WhenSummerIsGone@reddit
code monkeys vs engineers
technicians vs professionals
"why do i have to learn this??" vs curiosity and lifelong learning
lambda-lord-2026@reddit
I haven't seen any of this. Sounds pretty ignorant.
Careful_Lab_3411@reddit
reminds me of when people thought nobody needed to learn cursive anymore
lambda-lord-2026@reddit
Huh? Not even close to the same. Cursive is genuinely useless.
WhenSummerIsGone@reddit
as we go back to handwritten classroom exams, lol
slamjam25@reddit
A phrase I heard once that I liked is that “Wall Street is where people go when they’re desperately ambitious to do nothing in particular”. Silicon Valley took that throne over the last fifteen years or so.
A modern CS degree isn’t about computers, it’s about money. Don’t be surprised that people aren’t that interested in the guts of how computers work.
derpdelurk@reddit
While low level programming definitely has its uses (OS, browsers, compilers, video game engines, firmware, etc), the vast majority of software doesn’t require it and the tradeoff of using a higher level language for productivity will point away from low level.
Now for the controversial part (let the downvoting begin)… while C is not going away in our lifetimes, it should stop being taught as a go forward language. It’s old, crusty and unsafe. Unless you are maintaining an existing code base or are working on a domain that only supports C (some firmware maybe) there is no reason to start developing with a language that is older than many of us and the cause of many security issues. You don’t have to rewrite in Rust. But you certainly shouldn’t perpetuate a superseded language.
slamjam25@reddit
The reason C is taught isn’t because people should be writing C. It’s because C is still the best high level abstraction of how computers actually work.
mxldevs@reddit
But what's wrong with magic? A dev doesn't need to know how to build their own kernel if all they're going to do is serve up cat pictures over the internet.
Is your analogy that not understanding aerodynamics will lead to plane crashes?
And not understanding how code interacts with hardware will lead to computer crashes?
slamjam25@reddit
Pilots don’t need to know a great deal about aerodynamics, but I’ve never met a pilot who thinks the field is pointless.
teerre@reddit
I think the general perception is actually the opposite. Usually "webdevs", specially front-end, are considered "not real programmers" as opposed to the low level "real" programmers
Possible-Werewolf791@reddit
Yep! Can we spell "script kiddies"?
MoreHuman_ThanHuman@reddit
it's not like universities are filling the gap with useful instruction relevant to the modern industry, so yeah I'd say any program that skips all that isn't a good CS program.
Own-Student7991@reddit
Weird... in industry (while laid off atm) and there is currently a lot of disdain towards anyone who doesn't understand low level code and maths (proofs/conjectures).
hooli-ceo@reddit
I think it's simple. New grads and inexperienced developers are probably learning relatively new languages like Go, Rust, Zig, etc., that are abstractions over the more bare-metal languages. Their assumption is that it's a 1-to-1 comparison: if these newer languages have better security, memory management, and generally "better" syntax without much loss in speed (some even claim to be faster than C++), then anything else must be "obsolete".
But I don't think that's true. There are reasons you might want to use C or C++ or even Assembly these days for very specific situations, but as always in modern culture it's either the best or it's "trash" (a very destructive mindset, I believe). So, as has actually been the case all along, the real answer is "Is it the best use case for THIS specific application?" That's what needs to be considered, not whether it's actually obsolete or not.
BusinessBandicoot@reddit
I might be biased, but I definitely see there being situations where you'd want to use c or (inline) assembly, but not so much c++, outside of the scenario where you are required to, or you work in a domain where the required functionality isn't trivial to implement and there isn't an existing set of libraries for rust that meet your needs.
Like, the performance of Rust is on par with C++, with a substantially better developer experience: better tooling, an out-of-the-box build system that works in most cases, and a language with a much smoother skill curve, in part because it isn't four languages in a trenchcoat.
hooli-ceo@reddit
Fair assessment. But that's kind of the point. None of these languages, despite their age, are necessarily irrelevant or obsolete, especially considering how ubiquitous C++ is.
szank@reddit
There were always people like this. Let them be.
For what it's worth, one can build a good career not knowing what a pointer is. Having said that, I would not be friends with such people.
JaguarOrdinary1570@reddit
Always people like this, and I've found they really can't be convinced to care. I never write C or ASM but I can often take a colleague's slow code and make it 100-1000x faster just applying a pretty basic understanding of hardware and memory layouts. Which you'd think would be somewhat motivating. But of those who asked how I knew to do what I did, none have actually gone to learn from any of the resources I've shared with them. Eventually I figured hey, great way for me to differentiate myself.
guacguacgoose@reddit
Your comment reminded me of this video: https://youtu.be/t992ul_IKtc?si=6l-1MNONqtz3HQuL
tl;dr We have the fastest hardware in history paired with the slowest software in history because of widespread complacency in both industry and consumer groups.
met0xff@reddit
I rather felt there's a new trend now that people want to do this more, especially with Rust and Zig and so on. But perhaps that's just reddit bubble
chandeeland@reddit
I’m staff level working in web apps with over 20 YoE. I ended up here by chasing the money, and possibly because my low-level chops aren’t the best. I have huge respect for low-level skills, and I have hobbies and side projects where I get exposure to them.
Here are a few examples of how I would mentor an IC who brings up C or asm at work:
You’re overthinking it. Optimizing a JavaScript or Python script like you would a C program probably doesn’t do what you think it does. The interpreters are smart; they’re open source, with millions of developer hours in C behind them. Trying to second-guess how the processor works in the case of a high-level language is probably a waste of time.
Optimizations exist, of course, and approaches get scrutinized, but nothing that involves a conversation about C.
In web app land we’re very, very abstracted away from any hardware. Your Python runs through an interpreter on an OS running in a container hosted on a virtualized cloud fabric. Your JS runs inside a browser. Let’s not spend time thinking about registers.
You don’t have to worry about getting it perfect. If you ship installed software or firmware, releasing a patch or upgrade is a big deal for the customer. In web app land we deploy 2x a day. Fail fast, as the kids say.
Web app teams are focused on different constraints and problems.
Velocity. How quickly can any member of your team fix bugs in this code, or add features. A great webdev team can totally invert their product in a sprint or three. I imagine that’s not true for firmware
Scale. Squeezing more performance out of the hardware is generally not the problem. Hardware is almost infinite: just configure a larger instance or more instances in parallel. I know, blasphemy. Trust me, the cloud cost rarely matters. Big web-scale problems aren’t solved by asm-level optimization; they’re solved by accepting less efficiency per GB in exchange for horizontal scale. Map-reduce, for example.
From a pure CS view, web app teams are maybe overlooking some optimizations, and maybe we’ll get to them next sprint. But we’re dismissive of low level because it’s far down on the priority list.
We’re trying to solve LA traffic, you’re suggesting we consider changing a spark plug.
rover_G@reddit
There are more opportunities in higher level programming roles. That being said low level and systems engineers make bank
honestduane@reddit
These are ignorant people trying to make it seem cooler that they have skill issues.
Don't follow the imposters and posers who tell you not to go to the metal. It's perfectly OK for you to learn all that stuff; it's not obsolete and it's not too hard.
talldean@reddit
I've been in industry for almost 30 years. Assembly, C, and C++ are shitburgers of languages if you have to read other people's code.
Knowing the foundations used to be useful, but I'm not even sure now; modern compilers do such extreme optimizations that you're better off knowing *a* language well. Unless you're working on firmware or drivers, I don't think assembly, C, or C++ are a good place to spend much time.
The other exception would be if you want to write a programming language, or work on some of those optimizations, then yeah, usually C. And/or if I was writing firmware or drivers from scratch these days, maybe Rust.
Going back to it, if you want engineers to understand the nuts and bolts, Verilog feels just as acceptable as the ones you mentioned, and no one's worried that new grads don't know Verilog.
zayelion@reddit
I'm glad for a coming degree of separation. Writing firmware isn't like making desktop software, phone apps, or cloud software. It's all different in key ways. What you must know to an obsessive degree differs across the subfields. We haven't reached the level of specialization that doctors have, but it might happen one day.
You can't know everything in this field. Even if you know the "fundamentals", they don't prepare you for the emergent effects which, en masse, define a new layer of "fundamentals". Writing firmware, you don't need to know the event loop, or ring security, or database compaction, or how to hyper-optimize data storage across timezones in your daily work.
We all have our domain, the age of generalization is pretty much over. Business have different needs from each other and it shows in our field.
EmberQuill@reddit
Where are you seeing this, exactly? I've had the opposite experience. Low-level enthusiasts being absolutely ruthless in their criticism of higher-abstraction languages and anyone who uses them, to the point of even calling them "not real programmers."
nero_djin@reddit
This feels a lot like we're churning out mechanics who know how to change parts but not how to prevent the problem or find the root cause. I'm not sure whether it's better or worse, but the phenomenon is there.
PeachScary413@reddit
Yeah.. or electricians having no idea how electricity works, but "blue wire connects to green wire and it kinda works".
Smallpaul@reddit
Assembly, C and C++ are three very different issues.
New programmers should certainly learn Rust instead of C++ unless they are being recruited onto a specific C++ project.
It’s debatable whether Rust also replaces C as an educational language.
Assembly is certainly very relevant for small corners of the industry. No need for anyone to express disdain because they are in a different part.
farzad_meow@reddit
to drive a car you need very little understanding of how the engine works. to maintain the engine you can do the basics without understanding how the engine works or why you change the oil. ask anyone why we need to change oil and most people can’t explain it.
there is a reward-to-investment ratio; if that ratio is reasonable, people do it. these days there are a lot of high level language jobs, so learning the fundamentals is not necessary to land a decent paying job.
JitaKyoei@reddit
In my experience it's primarily web devs getting denigrated by low-level enthusiasts. Of course, I may be biased as a web dev. I wish the various SWE disciplines could learn to respect and appreciate each other more in general though.
ATXblazer@reddit
Where are you seeing this? I finished my degree in 2016 and it was heavy on C with a good amount of assembly projects.
Brutus5000@reddit
It's not a stigma, it's an entirely different field. I have high respect for people going low level, such as for firmware development, but your specialised skills are entirely irrelevant if we are integrating a 3rd party service into our business processes or need to find the bottleneck in a data processing pipeline written in Java.
Obviously you could learn, but i'd rather give the job to someone that already knows that and had the experience. And there are a lot of these people looking for a job with matching experience.
And it's the other way round too, nobody would hire me to build mission critical firmware (and I wouldn't even have the confidence to apply for such a job).
So tl;dr it's not a stigma, it's a set of specialized skills that only applies to a small subset of jobs.
PeachScary413@reddit
That's great ☺️ less slop and less future competition in the field for the future
Frequent_Macaron9595@reddit
When I started self-teaching 4y ago, I went on teachyourselfcs.com and followed the recommendations, which led me to start with C and Lisp. It gave me the strong foundations I was lacking (I have been doing web stuff on & off for over a decade).
When I talked with a former boss, CTO at one of the companies I worked for, he was baffled that I dived into C and Lisp instead of sticking to more modern stuff. So it's not only newcomers who think like that; some of the folks who have moved up the abstraction ladder as their careers progressed sometimes think the same…
Glum_Worldliness4904@reddit
I myself am a web developer and mostly spend time on markdown engineering nowadays, but I also have patches accepted into the Linux kernel upstream, and I enjoy low-level hardware optimisations at the CPU pipeline level: analysing perf events, top-down analysis, etc…
To be honest I almost never needed that at my daily work for 14 years. Maybe that’s the reason, it’s simply not that useful
bighappy1970@reddit
Markdown engineering? Where is the “engineering” part?
Glum_Worldliness4904@reddit
Basically nowhere. But this is how top management at our company wants us doing our engineering job
waffleseggs@reddit
Nothing but respect for low level people. Ignore the haters.
ContraryConman@reddit
As someone who does embedded, I think it's a combination of the following:
Due to the learn-to-code stuff, a CS degree has become the degree people get when they just want a clear path to an office job that pays well. It's the business degree of the 2010s and onwards.
If you're just in this to make money, then, objectively, 90% of software engineering jobs are either making a website for a company or making a mobile app for a company.
Also objectively, most companies are not solving novel computer science problems as part of the core business. Most businesses are not dealing with Facebook or Amazon level data with a 5 9s availability requirement. Most companies are doing CRUD operations on a database.
To be successful at the average software engineering job, you don't need any low level fundamentals. You just plug different existing web frameworks together until something works
A lot of Twitter discourse and influencers will heavily default to the web world,
Therefore, if you are a CS student who is only in the major to get a job as efficiently as possible, you actually don't need to know how a compiler works, how an operating system works, what the OSI layers are, what the stack is, what the heap is, how to read assembly language, what a register is, what the ALU is, what an MMU does, what a TLB does, or any amount of C, let alone C++. And in fact, you barely need to understand data structures and algorithms beyond passing a potential LeetCode interview.
If you're one of these kids and you just want a CS job as quickly as possible, all you really need is to just learn one of React.js or Vue.js, and then learn one backend language out of Node.js, Python, Golang, or Rust, and then just cram for LeetCode. Or, as an alternative, pick up one of Kotlin or Swift and make mobile apps.
We're getting to a point even where Javascript is just the default programming language, period. All desktop apps are now React Native. The mobile apps are React. The console apps are NodeJs. If your application is slow, it's because you need to switch to Bun, not because maybe Javascript is a terrible language for your application.
Anyway, the disdain comes from the fact that you are asking them to do more than the bare minimum needed to just land some software job somewhere.
Some kids, like me, came into CS from the Arduino side of things. In my experience we have a totally different set of likes and dislikes than the majority. And some kids want a job but also genuinely want to be good at CS and take the time to learn. I'd say they're not the majority, though.
The_Northern_Light@reddit
I’m an over-the-hill C++ guy and your explanation does not ring true for me.
I recently spent a decade in the SF Bay Area working essentially exclusively around other C++ devs at big tech and unicorns, and we were definitely on the same levels.fyi pay scale as everyone else.
It’s not just a Silicon Valley thing, either. When I was a junior I got my first real job in a flyover state the week I returned from my study abroad, and they later told me my newly created LinkedIn account was the only one that matched within 500 miles. I broke the junior pay scale then, and it’s the same story now that I’ve moved home as a senior. There’s just not that much talent, and supply and demand applies to the labor market as well.
Maybe they merely think the opportunities and pay are less? I could easily believe that. But I don’t think that’s reality. “People follow money and prestige” is undeniable, but it misses the reality vs perception mismatch, the barrier to entry, the (utterly broken) educational pipeline, etc.
Also we only have so many things we can hope to master: why would a young person choose to invest in where there are old experts when they could invest in the frontier where their inexperience is much less unusual? (And it’s clearly more exciting.)
A young coder coming up today has every opportunity and reason to focus on things other than mere implementation details… especially given that it’s clear a huge chunk of the work I’ve spent my career on (especially early on) will soon be done by AI.
Exact-Row8792@reddit
I had something like this at work.
kevinossia@reddit
Again, the average software engineer is pretty awful at what they do.
Have fun doing the interesting work and don’t worry about what others are doing.
brewfox@reddit
I’m an EE and absolutely think understanding under the hood helps all development.
That said, every job I’ve seen that works low-level has worse pay, more competition, and more stigma. Part of that is because most low level stuff happens in the factories in China. Part of it is just “tradition” I think.
Low-level work imo is much more difficult than modern abstracted work. So why not do easier things for more pay? That's what I gravitated towards as well. And then into management for the same reason: if you have people skills, managing is way easier, has a better work environment, and pays more in most cases.
Low level programming is similar. They have enough people to do it so nothing needs to change from a corporate perspective.
drumzalot_guitar@reddit
The younger devs that have had the exposure (as OP said, not everyone/everything requires or should be done in C/C++/assembly/etc.) can troubleshoot because they have a lower-level understanding. Those using AI, mostly automated tools, etc. have no real idea what's going on under the hood - they only know the basics of how to glue components together. This means it now falls fully on the more senior devs to mentor juniors on what to do when something fails, and to explain those lower-level bits to start filling in those knowledge/experience gaps.
Talking to many others, they see and express these same concerns and challenges with juniors.
demosthenesss@reddit
How many pilots actually have a deep understanding of aerodynamics? How many of them have much more understanding of how aerodynamics works than "air moves faster over top of wing than lower wing = lift."
What % of them could do (or have at any point in their lives done) actual engineering math around the fluid dynamics involved, such that they could even name more than a few relevant formulas?
And let's not even talk about how many could describe in detail how all the underlying systems on most aircraft work at more than a high level. How many have calculated hydraulic manifold pressures or done FEA stress analysis? What about engine design? What about understanding how GPS and radio work at a deep level?
So, using your example: how many pilots could actually describe at a deep level the underlying components and physics of how aircraft fly, equivalent to what you're expecting of, say, fullstack SWEs for "under the hood"?
A tiny fraction of pilots could detail all of those to a meaningfully deep level.
There's a common mistake many low-level SWE folks make in assuming everyone in other industries has some deep fundamental understanding and awareness of every single piece of "those fundamentals." But that's almost never the case.
Pilots and fullstack SWEs probably have similar depths of understandings for how things work "under the hood."
This might not be a popular opinion on this sub, but for me, personally, regarding the stigma you mention: the feeling I get somewhat frequently from low-level SWE folks is an inflated sense of how important everything at a low level is for actually building useful software. And that's quite off-putting in general.
The same vibe comes through in this post to me. While I've personally worked at all levels of the stack, from webapps all the way down to firmware/embedded, I really don't think working knowledge of how things work "under the hood" is as important when building higher-level applications as most people on the firmware/embedded side think it is.
demosthenesss@reddit
The problem then becomes what level of systems/embedded/low-level knowledge is "sufficient" or useful at all.
Do you need entire courses in a CS curriculum? Actual experience programming in C? Or a few lectures on low-level programming?
Using the pilot analogy, how much do you actually expect a pilot to understand before you become "satisfied" with their knowledge of aerodynamics?
I'm guessing that if you don't have a mechanical/aerospace engineering background, you'll describe a significantly more basic understanding than what someone who has spent years of their life designing/building airplanes and/or dealing with fluid dynamics on a regular basis would consider meaningful.
Using the aerodynamics example, would a few sentences satisfy you for the pilot? If so, would a few sentences about how assembly or lower-level coding works satisfy you as well? Why/why not?
quentech@reddit
It's more like trying to fly a plane without knowing how to engineer the metallurgy necessary to build a jet engine.
Someone in the industry needs to specialize in that stuff, but not pilots.
HashDefTrueFalse@reddit
IME this usually comes from people who don't know much about a topic. It's often just a blend of ignorance, insecurity and inertia. They're scared of what they don't know, and this manifests in criticism so that they can rationalise: "I don't know X, but it doesn't matter. I don't need to know X. X is bad and not worth knowing. I know Y. It's much better to know Y. Phew! Thankfully I don't need to spend any time or effort on X! I could if I wanted to, of course. I won't though, because I don't need to..."
I've seen it a lot. New devs are getting worse (in terms of technical ability) each hiring round, unfortunately, IME. We've only had one hiring round since LLMs, and that was a few years ago, so I don't necessarily mean it's due to those, either. Before those it was bootcamp grads being less equipped than degree-holders (though some of ours worked out well), and before that it was those with backgrounds in native software vs. building only for the web... and so on. Every new cohort has people who are quick to conclude that anything they don't know or can't do is bad or unimportant. Mostly it's the above, but occasionally they can be right.
I've also observed that most grads these days barely consider that most of the electronics around them need software to work. Most seem to have never considered that they could write anything other than back- or front-end web application software. One of the juniors I mentor got really interested in learning about firmware for a while, and I really enjoyed showing them the basics. Shame we're trying to do less of it. The firmware in the products we buy in isn't nearly as nice as our own stuff; lots of fudging to get things working...
SawToothKernel@reddit
I didn't come through the low-level route - went straight into web design instead. And that was the root of a fair amount of impostor syndrome in my early years.
I'm now 15+ years into this career and have learned that you don't need that low-level grounding to be successful. I'm sure it would be nice to have, but not having it is definitely not a hindrance in a field where systems design and people skills are far more important. By an order of magnitude.
leo-dip@reddit
But the magic will always be there. Even if you know assembly inside out, there will be elements that you will assume will just work, and you rely on them doing their magic.
moreVCAs@reddit
There is no stigma against systems programming. As the value of physically typing in this or that JavaScript incantation rockets toward zero, actually understanding how the machine works, and how to design, test, and debug systems, has never been more valuable.
mq2thez@reddit
VCs prefer software because they want companies that hyper scale or burn out. Hardware doesn’t hyper scale much at all, because you always have a lot of money tied into manufacturing, R&D, etc.
Hardware development is also just… harder. It’s a lot less forgiving in many ways. Not surprising that folks aren’t as interested in it. Look at web development: the whole industry is full of brainrot React devs because it’s “easier” (lol).
Solrax@reddit
As others have mentioned, I think it's because most jobs are web dev now.
I spent most of my career in firmware and desktop apps. For desktop apps it is still possible to get by much of the time not knowing what's going on underneath, though you still have to understand pointers and memory management.
But there is still a point where knowing how things really work is a distinct advantage. I can think of a couple of times when we were trying to debug complex bugs in C++ and couldn't see what was wrong. So I set a breakpoint and brought up the disassembly to step through it. My coworkers (who were excellent programmers within their limits) said "I can't understand that". But it showed us where our problem was. They thought it was magic :)
roger_ducky@reddit
I suspect people are mostly trying for the “higher paying” jobs and, thinking low-level skills aren’t used there, didn’t want to bother adding yet another language.
It’s true you understand a lot more with more low level understanding.
I mean, when someone looks at a compilation error due to missing headers and is completely stuck, that’s… knowing too little.
defmacro-jam@reddit
It's because people tend to discount the value of experience they don't have. Basically The Market for Lemons applied to human capabilities. A true race to the bottom.
HumzaDeKhan@reddit
Here's the real deal. One of the major reasons you learn these concepts is that they teach you how to think.
This is one of your main jobs as a computer scientist or software engineer, whatever you want to call it.
If you don't know how to think, you can't approach even the smallest problems. By working through the difficult ones, you learn how to break them down. You learn how the very hardware you're going to build applications on runs, understands, and interprets your code. Based on this information you can make better choices.
Even if you're not going to write this assembly daily, you can still benefit from learning what's actually happening underneath the hood. If you're going to build on top of a black box which you have zero understanding of, it's not going to take you far in your career.
In your specific case you might be smart, but that's not the case for everyone. The curriculum has to accommodate students at every level who take the course.
bzbub2@reddit
live replay of OP setting up this thread:
cue video footage of a scarecrow-looking thing being made... sort of a... straw man... barely any substance to it...
alright everyone discuss
CactusOnFire@reddit
I can only comment on my experiences within the Data Science/applied (non-vision-based) ML spaces. I think to "non-technical business people" in my domains, lower-level languages just... don't exist. People will absolutely use a Python library that compiles down to C, or a tool whose bindings call into Rust. But because budget isn't allocated to developing those tools, they're completely blind to their existence.
Usually one or two of the more tech-savvy engineers will talk about things happening in Rust, but out of dogged pragmatism everyone else just deals in higher-level libraries or frameworks, because there's no buy-in to go deep on systems-level programming.
ghost-engineer@reddit
Keep in mind that the people saying these things are severely inexperienced.
OllieOnHisBike@reddit
For me, they're like mechanics at main dealerships: they can service the cars because they follow the manuals, and when new cars are introduced they're trained on the new manuals.
Could they build or rebuild a car from the ground up? No...