You are never taught how to build quality software
Posted by workflow91@reddit | programming | View on Reddit | 312 comments
rmrfv@reddit
It’s completely demoralizing when you want to build things the 'right' way, but management forces a rushed MVP out the door. Perhaps the real failure of CS education isn't skipping 'quality code'—it's failing to teach us how to communicate its financial value. If we can't translate technical debt into business risk for stakeholders, we'll always lose to the deadline.
abeltensor@reddit
Learning how to build quality software is not really a transferable skill. You can explain why certain design patterns are being used within a specific project, but it's impossible to explain where these patterns should be used in general. There is always going to be a situation where you need to "break the rules" of "good design" to achieve a workable system.
Some design philosophies attempt to subvert this fact by teaching idioms, but those are subjective based on the domain. Philosophies like CLEAN or SOLID do not work with every project, and in my experience they work as guidelines rather than strict rulesets.
Generally, if you want to learn to build quality software, you need a good grasp of the fundamentals and some experience in the domain. Looking at other code and understanding the intention behind its design is also a good way to quickly learn what might work for your use case.
Strenue@reddit
Guidelines need to be contextualized. Like frameworks.
blazarious@reddit
This is basically it. Also, don’t do the things for the sake of it. Do them because (and only if) they bring real value.
st4rdr0id@reddit
Absent from the article is the well known fact that defects are exponentially more expensive the later they are detected in the software process, vs when they are introduced. Eg: a requirements or design defect detected during construction or testing is astronomically more expensive than doing minimal requirements or design verification.
What you see in practice is just programming, programming, programming. Every other phase is neglected or outright denied. Defects are caught when there is a lot of code already written, a consequence of rewarding "completed" features in the "agile" game.
ElCthuluIncognito@reddit
This ignores the counterbalance that a feature ready now is more valuable than a feature ready later. Yes, even with more issues.
Customers are willing to put up with a surprising amount of jank if it's the only jank available to them and there is a clear intention to improve on it.
st4rdr0id@reddit
That mindset is completely unprofessional. A half-assed feature that causes data loss is a nightmare for the customers, with possible legal consequences, to say nothing of the loss of face.
ElCthuluIncognito@reddit
Professional does not always mean reliable or even ethical, assuming what's professional is what makes money. (This actually goes a long way to explaining why OSS can be shockingly more reliable than proprietary solutions).
"Move fast and break things" was the mantra of a multi million dollar corporation. It's good to champion the idea that they should be synonymous, I personally do it, but I just have to look at the gaming industry and see how much money they make pushing broken products to know I can't delude myself into thinking that prioritizing reliability is necessary to be profitable.
riplikash@reddit
Intention to improve is not the same as capability to improve. I've seen a LOT of companies and shipped products die because code entropy caught up with them too quickly.
In the end it's not as simple as "do it once and do it right" OR "feature now is better than feature later".
There is a science and art to delivering quickly. There is a science and art to knowing what corners to cut, how to cut them, and where you can't cut.
That's a major part of what good software engineering is. Being able to deliver software that is fit for purpose, as expandable and solid as it needs to be for the situation.
mpyne@reddit
This is the whole entire point to using 'agile' methods. You use them to avoid baking-in defects and forcing massive expense to fix them later, by making it much cheaper to do the minimal validation of the requirements and/or design. This is what makes it possible to justify doing these things in the first place, because you won't kill the business in the process.
Neither requirements nor design are valid for all time; they themselves can go stale so you need to have a way to implement the requirements you manage to validate in a timeframe where they are still valid.
Agile is not "just program and ship whatever". You could call that waterfall, even, if you zero-out the upfront phases that come before coding in that process.
st4rdr0id@reddit
"Agile" is often agile product development, and yes, most startups (especially lean ones) know the importance of validating a business case before building anything. But other than that, most often there is no software process in place, be it agile or not. So they validate the business part, and then they don't do any requirements engineering or design whatsoever, if it is done informally it is never validated. Even the last safety net (tests) is often rushed or skipped.
TheGRS@reddit
The thing that no one realizes until they’re closer to it: requirements are very tough to figure out, people are just making things up often, and good requirements gathering is very difficult and sometimes impossible.
s73v3r@reddit
Which is why agile, iterative development is better. Getting something in front of the stakeholders that are going to use the software sooner means you'll spend less time building something that doesn't meet the real requirements.
winkler@reddit
Don’t succeed not slowly, or something…
B0Y0@reddit
Certainly doesn't help that every company I've been to has taken programmer estimates on a task and immediately used them as some sort of performance metric, fully compromising the integrity of the process and the value of the estimates in the first place.
Obliman@reddit
"defects are exponentially more expensive the later they are detected in the software process" this is true in other industries as well. Made a mistake in your prototype plastic injection molded part design? A few scrapped parts and a relatively cheap re-machining of molds. Made a mistake in the production-level design? 7,000,000 parts now defective, massive recalls, expensive retooling, etc.
RecklesslyAbandoned@reddit
Are they? They're only exponentially more expensive if a customer finds them. Bugs are definitely more expensive the later in the process they're found, but you need to validate the specification beforehand and make sure you're building out features a customer actually wants.
paulsmithkc@reddit
Which is why you do market research and user interviews before you build the thing.
70-w02ld@reddit
But customers don't know what is possible.
How can customers even have a clue what can be done? Why didn't they just do it themselves if they knew how to do it?
I know what can be done. I just don't know how to code it. So I find people that do.
jplindstrom@reddit
You don't ask customers what they want.
You ask customers what problems they have.
Then you think about what you can do about that. Sometimes that involves writing software.
whipdancer@reddit
Customers don't need to know what's possible. They need to understand the task/job/whatever they need to accomplish.
My job is to help them understand what is possible and at what cost (time and money are both costs).
I educate them about the project triangle:
Time, money, features - pick 2.
It really is that simple.
masterots@reddit
Customers shouldn't have to know what's possible. To do this right, most of the time you shouldn't build the thing the customer says they want. You ask them what problems they have, and what have they done so far to deal with those problems, what's worked, what's failed. Once you have those answers, you take that information and, as the expert who knows what is possible, you build a tool to solve their problem(s).
70-w02ld@reddit
JavaScript - like, I dislike using JavaScript - I just do. I've found remedies using CSS (cascading style sheets) and DHTML (Dynamic HTML) where most people use JavaScript - I've found DHTML to have great abilities. But if I need it done in JavaScript, there are tons of code snippets and free scripts that you can make your own, all over the Internet - like freeware, shareware. The best is the web hosting companies - you want it, they can do it, practically for free, included in the web hosting as a free website. Plus you have the free scripts that they offer and the free CSS templates for page formatting.
But it's all just basic. The money is in developing things that don't already exist or helping people that don't want to do it themselves. Tons of options to go with. But as for housing quality software and mobile "form" applications and websites, it's all up in the air as to what it could be. It is what it is, and it's all formal and yet informal at the same time.
Like, here's a question for the OP - did you go to school, and did they teach you how to properly secure HTML web forms so your site can't be hacked? They may have, but it's not a popular area of research anymore - it was remedied by what my basic programming 101 professor taught me. Which is to make sure that each form field is properly filled out and doesn't allow for various special characters - only numbers or alphanumeric, case-sensitive or not - then passing it to the next module to make sure it is a yes or no, and then moving it to the next module. As I was taught from a basic programming college book called "Modular Structures Programming" - it's not very popular, but for learning programming, basic 101 had everything you're looking for, the way I learned. But I had a great professor who's fighting a court case because he designed the computer we all use today to allow for a locksmith to get into your PC, whereas the guy said he didn't want locksmith accessibility, he wanted something else. But the professor said he didn't mention it; the guy wanted what the professor built, which to me was a basic system to handle audio, video, and word processing - but it could be programmed to work with any device or IO (Binary|In/Out|Off/On) port. Which the professor had already built - it's weird, I may be wrong on the topic of his court case, but that's what I gathered. He also taught me how to open previously formatted volumes and use those - like a library, using MS-DOS.
Did you take basic programming 101?
paulsmithkc@reddit
You are about 10 to 20 years behind the curve based on everything you've said.
70-w02ld@reddit
You nailed it. I am. I'm not hiding that fact.
But OP wants quality software development.
Design-Cold@reddit
It's always "how long to build the thing" instead of "are we building the right thing" and "have we the resources to build it"
RememberToLogOff@reddit
But exponential is when bigger /s
Andy_B_Goode@reddit
Hence the old joke: weeks of coding can save you hours of planning
SeaManaenamah@reddit
I don't see what Agile has to do with your complaint. I'd argue that management are just as likely (or more likely) to rush features in a non-Agile environment.
hotdenimchicken@reddit
You just described my company, except we haven’t even implemented agile and jira is just a graveyard of backlogs of vague tasks with one line titles
seven_seacat@reddit
this is my biggest pet peeve
70-w02ld@reddit
But don't people just skip all the actual hard work and move straight to development?
I don't see too many people writing it all out, proofreading and editing it. It seems like they just come up with an idea and start coding it.
fire_in_the_theater@reddit
smh, trying to assess everything for value is a terrible way to write software, and is responsible for a bunch of ungodly enterprise software.
blazarious@reddit
What’s the point of commercial software development if it’s not creating value for the business, though?
fire_in_the_theater@reddit
value generation is not always obvious, that's the problem.
sometimes u do something for the sake of it and fix bugs that would have happened, but now don't have the chance to pop up.
writing software efficiently is all about constant complexity reduction, and idk how to do that besides constantly seeking refinements for the sake of it.
as far as i can tell, most enterprise software miserably fails to seek this, and manages to turn simple things into absolute undebuggable shitshows.
blazarious@reddit
You got it, except you don’t seek constant refinement and improvement for the sake of it but because it creates value down the line. Communicating that was the whole point IMO.
TheOneWhoMixes@reddit
Except communicating that is extremely difficult even if the people you're communicating with fully understand the problems at hand.
If you say it "creates value down the line", they'll want specific examples of issues that not refining something could cause because they want to determine the value.
oiimn@reddit
Oh poor you, you have to justify yourself and reason with people
blazarious@reddit
Sure, you need to spell it out and the specifics may vary in each case but one thing might be increased dev velocity or decreased time to market. Those are measurable values.
Qweesdy@reddit
The point of commercial software development is to maximize profit by sacrificing as much quality as possible (forget efficiency, don't bother testing, screw the security audits, adopt vendor lock-in tactics, pervert "Agile" into a justification of subscription fees for software that will never be finished).
blazarious@reddit
This seems more like a guide on how to set up your business for failure IMO.
Glacia@reddit
I've seen a lot of corpo programmers say this like a mantra, but it's just so ass-backwards. Do you think a janitor comes to the office and thinks "Man, how do I create value"? No, he just does his job. Your job is to create software; you're not here to run a business.
blazarious@reddit
Well, okay then. If that’s how you see it. I see it differently.
SalamanderOk6944@reddit
Tell this to my designers.
I jokingly brought up an idea, and it was probably the worst idea amongst a bunch of others, and now that's all they can see.
It's the equivalent of when your junior designer walks in and wants to build Red Light Green Light from Squid Games into your project.
agumonkey@reddit
value is meaningless
imnotbis@reddit
That isn't true. Value is how much a rich person wants to use your software. The richer they are, and the more they want to use it, the more value you produced.
agumonkey@reddit
that's the shittiest definition of value ever produced in human history
imnotbis@reddit
It's the one our society uses.
agumonkey@reddit
I know, that's why I dream of a cabin in the woods every night.
usernameqwerty005@reddit
Stakeholder value, but for which stakeholder? :) The most affected and interested stakeholder is rarely the stakeholder with the most power and influence in a project. Compare with stakeholder mapping: https://www.boreal-is.com/blog/stakeholder-mapping-identify-stakeholders/
Synor@reddit
the funny thing is, high quality software is cheaper to produce
blazarious@reddit
Yes, that’s the point
saltybandana2@reddit
where value is.
Maybe if you want quality then say quality.
TheMerovingian@reddit
Test driven development, integration testing, and pure functional code (Elm is insane) are the biggest quality focused improvements I've learned. I'm sure there are thousands of things I don't know about programming, so this is my 2¢
NostraDavid@reddit
Last time I tried Elm, the creator released a new version that broke any and all tutorials, with no clear way to upgrade/replace anything. That was quite a few years ago, so how is it nowadays?
I just checked the Github repo - is the project alive? Or is the language somewhat considered "done"?
TheMerovingian@reddit
I worry that it had lost its momentum. It felt like a one man project, and I haven't seen anything from Evan in a long while.
CallMeKik@reddit
People hate TDD because bad code keeps them in jobs
TheMerovingian@reddit
I watched a talk about Elm where this woman said that once her project was done, she had nothing else to do because the language was Elm. Elm can't have runtime errors: if you have all the specs correct and the services you use keep working, it doesn't fail. Really interesting to work on, and enlightening that this is possible.
12destroyer21@reddit
No it is because in order to write the tests before i start writing the code i need a rough idea of the architecture and datastructures im gonna use, and before i can know that i need to start writing the code to see if the approach i think im gonna take is even gonna work.
I have written code to solve a problem, where i had to throw out all the code 3 times because i ran into problems with the architecture a few weeks in. If i had written tests for that before hand it would have taken twice as long to come to that realization.
I propose that we should do Test Driven Design instead, where we make sure that the architecture we are making can be tested, i.e. SOLID principles etc.
When we have proven that the solution can fulfill the business requirements then we can start working on a testing suite for that code along with other QA practices
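The test-first loop the two commenters are debating can be pictured in a few lines. A minimal red-green sketch with a hypothetical shopping-cart example (invented for illustration, not anyone's actual project): the tests are written before the class exists, and the class is the minimum code that makes them pass.

```python
# Red-green sketch: these tests were (conceptually) written first,
# then Cart was written to satisfy them.

class Cart:
    def __init__(self):
        self._items = []

    def add(self, price, qty=1):
        self._items.append((price, qty))

    def total(self):
        # Sum of price * quantity over every line item.
        return sum(price * qty for price, qty in self._items)

def test_empty_cart_totals_zero():
    assert Cart().total() == 0

def test_total_multiplies_quantity():
    cart = Cart()
    cart.add(price=3, qty=4)
    assert cart.total() == 12

test_empty_cart_totals_zero()
test_total_multiplies_quantity()
```

Whether these tests should come before or after the architecture settles is exactly the trade-off argued above.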
CallMeKik@reddit
“i had to rewrite my app 3 times because i didn’t used TDD. but if i did use TDD then i’d have only written it once”
naringas@reddit
"BuUuT I AlReAdY KnOw pHp/pYtHoN/JaVa/sCrIpT So pAy mE To lEaRn tHaT ShIt oR FuUuUu"
Shimmeringbluorb9731@reddit
The problem with TDD is that most development is not green-field development but brown-field development, and it is impossible to use TDD with legacy systems. I tried and gave up because it took up too much time in the project schedule. We had to abandon TDD.
PM_ME_C_CODE@reddit
TDD is only impossible with a mature code-base if the legacy code wasn't written with testability in mind and management thinks "refactoring" is some kind of bad word. "Brown field development" just tells me the code base was designed poorly and probably wasn't testable in the first place.
If your codebase is actually extensible, TDD should be easy.
GregsWorld@reddit
Yes, that is what legacy means.
Can't refactor without tests, can't test without refactoring.
It's a slow, painful process to add tests to an existing system.
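One common escape from that chicken-and-egg is a characterization (golden-master) test: pin down what the code does today, however ugly, and only then refactor against that record. A minimal sketch, with a made-up `legacy_format` function standing in for real legacy code:

```python
# Characterization-test sketch: the expectations below are captured by
# *running* the code, not by reading a spec, so they guard refactoring
# without requiring the code to be right.

def legacy_format(name, balance):
    # Imagine this is tangled legacy code we dare not change yet.
    return name.upper() + ": $" + str(round(balance, 2))

def test_characterize_current_behavior():
    assert legacy_format("alice", 10.5) == "ALICE: $10.5"
    assert legacy_format("bob", 3.14159) == "BOB: $3.14"

test_characterize_current_behavior()
```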
hhpollo@reddit
That is exactly what they're trying to fix lol
PM_ME_C_CODE@reddit
Good luck. I mean it. I've got the same kind of shit-sandwich on my hands and the only way we can test it at all is E2E. I've got the E2E approach automated to the fucking nines, but that still leaves a LOT of the codebase that we still cannot test because it's actually impossible as written due to a ton of in-line configuration logic spread all over the place.
Pro-tip: Don't let your data scientists write their software. Have them write specifications for actual software engineers and software designers following good programming and design practices to implement instead.
NostraDavid@reddit
Read https://en.wikipedia.org/wiki/ISO/IEC_9126
The new standard is ISO 25010, but that doesn't have a nice Wiki page.
holyknight00@reddit
Quality, as well as security and other "secondary" matters in software, is merely desirable for most companies. They won't bat an eye if they can ship their app 2 days earlier, even if that means creating some sh1t piece of software that will haunt the whole team for 2 or 3 years. This is where the technical management (at least the CTO) should step in and negotiate what the minimum quality bar for the software is.
Be aware that techno-fanaticism also doesn't work. Having perfect software that is delivered one year later than expected can be completely useless from a business perspective. If delivering on the "X" date is not technically feasible but is still really important for the business, that means the development team must negotiate either a reduced scope or a concrete plan for how the features that were rushed will be compensated for afterward.
Just delivering sh1tty code with no plan should never be an option, and that's a management failure.
st4rdr0id@reddit
What doesn't work is putting tech-illiterate managers in charge of the software process.
Other industries and sectors have been regulated over the years. Eg: while 19th century factories in industrial London preferred to have children work 12h a day and ignored the frequent safety accidents, we can now eat tuna from cans confidently without expecting a human finger inside.
The software industry, however, apparently resists civilization and wants to eternally continue working Wild West style.
Firm_Bit@reddit
Dealing with a half-tech-literate skip-level boss right now. They know enough to be an ok junior dev but not enough to reflect on their own knowledge successfully.
So they do things like "investigating" an issue and "solving" it. In reality they just retraced steps we've already taken or would have taken very quickly, and then wrote a ticket detailing what we need to do to solve it. Of course they missed underlying issues or just totally misdiagnosed the issue. So we're left dealing with not only the actual issue but also the perception that it was nearly solved already and that we're slow to implement their solution.
holyknight00@reddit
I have yet to see any particular industry that objectively improves after regulation. Most industries just seem to "get better" in the short term and then stagnate shortly after.
For example, most of the improvements in working conditions in the automotive industry in the early 20th century came from the companies themselves, decades before regulators made them mandatory.
sonofamonster@reddit
Sometimes, an improvement in working conditions will increase efficiency and profit. The industry will naturally adopt these.
Sometimes, a degradation of working conditions will increase efficiency and profit. The industry will naturally adopt these unless there's a system in place that ensures it leads to a decrease in efficiency and profit.
Capitalism optimizes for profit extraction. That’s all it can do. Expecting it to do otherwise is like expecting water to catch fire; it’s an anomaly, not a norm.
bushwald@reddit
(completely ignores the wildly life-changing gains achieved by the labor movement in the first half of the 20th century) "Mostly the companies just decided to be better."
Zusatzzuckerl@reddit
How would he know?
Dachande663@reddit
This is the major advantage of software engineering over pure computer science degrees.
trollporr@reddit
Where? How?
I did a “pure CS” program and we had several classes on “software engineering” where we learned about different ways to gather requirements and plan and do testing.
The problem in my experience at work is not what the programmers know or don’t know, it’s what power they have over how work is done and planned.
C_Madison@reddit
Same here. The problem is that - unsurprisingly - CS degrees vary a lot between different universities. One good indicator is often whether CS started from the Math department or as a separate institute (or whatever it's called) - at least from what I've gathered talking to others. If the university has a well-known math program, CS is often seen as only an offshoot of it, so you have 90% math and almost nothing useful.
To preempt some typical comments: Is programming the only thing you need in computer science? No, of course not. That's why there's more in a CS degree, but a computer scientist who cannot program is as useful as a surgeon who cannot operate (credits to my hardware engineering prof). At least in the industry and also in many parts of computer science research.
turunambartanen@reddit
I find that analogy to be flawed. IMO a better comparison would be surgeon=programmer/Software engineer and computer scientist=medical professor. I don't expect the med prof to work in the theater, I expected them to read papers, request studies, analyze data, study the underlying biochemistry and use all of their combined knowledge to suggest new treatments. And conversely I don't expect the surgeon to know everything about the latest advances in research. They need to know the current standard of care and practice their craft.
I have sympathy with all who were tricked by the marketing department, but if you study computer science your education will most likely be tuned to the science part, not the practice part.
It probably varies around the world, but in Germany we have certified apprenticeships for software engineering or system management. There you will work in a company and get to know all that is required for software engineering without spending unnecessary time on computer science. It baffles me that this is supposed to be covered in university in other countries. How are you supposed to learn programming in a science environment?
paulsmithkc@reddit
There are three different departments that are usually over CS:
This definitely impacts what is taught and how. But I don't think the parent department is a good indicator of quality. They all end up erring too far toward their preferred proclivity.
C_Madison@reddit
I meant more independent department vs. any parent department. Didn't think of the other options though, cause Math is by far the most prevalent here, but I agree - each parent department will pull it in their specific direction, often too far.
G_Morgan@reddit
Businesses don't want to do quality software. It has never been tested because businesses just don't want to know.
ward2k@reddit
I think it really depends; CS degrees can vary massively. In some places CS degrees are pretty much just software engineering degrees with a little more time spent teaching the theory behind it (that's how my university taught it).
While in others it's nearly all theory and little to no actual programming
Shanteva@reddit
I had 1 class on SE everything else was math and big O notation
Turtvaiz@reddit
That just sounds like an extra shit curriculum
Shanteva@reddit
I did have 1 other resume padder where we did mobile apps and opengl etc, but you left having written 1 unit test and no idea what the day to day programming was like
Shanteva@reddit
Honestly though, the diversity of the student body made up for it. People that went to more technical schools around here are mostly suburban squares
Emt2softwaredev@reddit
Not to mention the value of building projects instead of leetcoding
downvotesonlypls@reddit
So you're saying me grinding for 10 hours to solve advent of code isn't the best use of my time? (╯°□°)╯︵
joelangeway@reddit
IMO that kind of practice is a valuable but under appreciated way to improve software quality. Time spent practicing reading and writing code results in fewer mistakes going forward, and clearer conversations about how things should work.
Bwob@reddit
I mean, it's not a BAD use of your time. It's just training different brain muscles.
Turns out you've got lots of brain muscles related to programming. And you need to train them all! Definitely don't skip brain-leg-day!
Me_Beben@reddit
I just train my brain abs because that's all the ladies want to look at.
Emt2softwaredev@reddit
Gotta get the girls
studentofarkad@reddit
How do you learn best software engineering practices then?
rookie-mistake@reddit
go to a better school. i don't feel like arguing with half the people ITT but my university had a pretty robust software engineering course that was mandatory for the degree, group projects with agile dev/sprints etc etc, all of which carried over pretty seamlessly to an actual job
Phreakiedude@reddit
Reading a lot of programming books. Every senior or architect I worked with had read and studied 20+ books.
elmuerte@reddit
By following the right computer science courses, and then applying the theory a lot in actual projects.
skesisfunk@reddit
You really don't need the courses - or not that many; you certainly don't need a degree. Many of the most accomplished programmers and architects I know either never went to college or have a degree in something other than CompSci/software.
It's more about problem solving skills, self-directed learning, and experience.
TheBananaKart@reddit
Personally I think that rather than applying "advanced methods", good engineering is mostly keeping to simplicity and in-depth documentation.
skesisfunk@reddit
IMO "simplicity" is too vague to be a guiding light. You could use "simplicity" to argue against dependency injection and various other abstractions that people might label "advanced methods". When writing software the path of least resistance isn't always the best one because, especially when writing a large application, that way will usually lead you to a hard-to-understand code base that you cannot effectively document yourself out of.
Many software problems are inherently complex, so employing "advanced methods" is necessary to simplify your architecture. Knowing how to employ abstractions is considered esoteric and unnecessary by many people. However, it unlocks the ability to separate logic into well-defined responsibilities, which makes refactoring, testing, and adding features much more manageable.
Writing software is a craft, you can't just ignore the finer parts and expect to get an acceptable product.
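The kind of "advanced method" being defended here can be quite small. A dependency-injection sketch (hypothetical `Notifier`/`EmailSender` names, invented for illustration): the abstraction is just passing a collaborator in, which lets tests substitute a fake without patching globals.

```python
class EmailSender:
    def send(self, to, body):
        # Stand-in for a real SMTP call.
        print(f"sending to {to}: {body}")

class Notifier:
    def __init__(self, sender):
        self.sender = sender  # injected dependency

    def notify_overdue(self, user, days):
        if days > 0:
            self.sender.send(user, f"Your account is {days} days overdue")

class FakeSender:
    """Test double that records messages instead of sending them."""
    def __init__(self):
        self.sent = []

    def send(self, to, body):
        self.sent.append((to, body))

# The logic is now testable with no network and no monkey-patching:
fake = FakeSender()
Notifier(fake).notify_overdue("alice@example.com", 3)
assert fake.sent == [("alice@example.com", "Your account is 3 days overdue")]
```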
unique_ptr@reddit
If I had a professional mantra or "guiding light" it would be: strike a balance. If your solution is too simple it will be inflexible or brittle, and if it is too complicated it will be difficult to maintain and slow to respond to required changes.
Engineering is all about trade-offs, after all, so the quality of your work is a direct reflection of the trade-offs you chose.
maikindofthai@reddit
My mantra is “do it good”
edgmnt_net@reddit
I both agree and disagree with you. Yes, you don't need that many courses. No, I don't think it's fair to chalk it up entirely to experience, much like math skill isn't just experience and self-directed learning after reading some theorems. I believe most of it is teachable, or at least it can be directed to some degree to improve chances of success. Even if some do succeed entirely on their own, we can do better.
Unfortunately, universities do not focus enough on stuff that matters in the industry. The little that does get taught (say OOP) is either superficial or dated. Just think about how little code review is actually done or how little interaction with actual codebases students typically have, unless they get into one of those really great universities.
And usually when I say this, someone comes along and says "But what are we going to teach, JS frameworks? That's trivial, university teaches you something far better!". Sure, let's also do that to math problems and ignore them completely. Not exposing students to stuff will surely end up great. The truth is that what the industry wants from software engineers is also hard to obtain, while there are plenty of people with a degree out there.
No, I'm not saying you should teach frameworks, but you do have to have some way to involve students into actual projects. That is why most companies ask for hard experience, because whatever homework and personal projects people have done was on a completely different level.
-IoI-@reddit
Degree certainly helps though, as it forces you to cover all the bases for at least 6 months.
Hrothen@reddit
Every good software engineer I've known without a computer science degree has self-studied quite a bit of actual computer science.
GayMakeAndModel@reddit
Every software engineer I've known with a computer science degree has to get a CS graduate to fix their database performance, although I'm sure there are exceptions. With database performance, you have to know hashing, sorting, loop joins, when those joins should be used, statistics and histograms, what a b-tree is, how you fix deadlocks, how you prevent blocking, and what isolation levels and lock compatibility are. Honestly, I think locking is the most difficult thing to learn about databases.
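The b-tree point can be seen even with the stdlib `sqlite3` module. A toy sketch (table and data invented for this example): the same query drops from a full table scan to an index search once an index exists; real engines layer statistics, join strategies, and locking on top of this.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, f"cust{i % 100}", i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer = 'cust7'"

# Without an index, the plan reports scanning the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before)

# With a (b-tree) index, the plan searches via idx_customer instead.
conn.execute("CREATE INDEX idx_customer ON orders (customer)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after)
```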
limeelsa@reddit
Hey it’s me! I failed out of college after studying chemistry for two years and psychology for two more after that. I took a data analytics certificate program 2 years ago and I currently run the DevOps department in a startup!
platoprime@reddit
Any tips for finding one?
Budget_Putt8393@reddit
Internships. The single most important thing you can do in college.
mnilailt@reddit
The one thing people never mention is good mentorship and experience. Taking a bunch of courses and reading books will only take you so far. Applying critical thinking on top of a solid mentorship foundation is 100% what makes you a great developer. Humans don't learn in vacuums; a good mentor is worth 100 books.
doggyStile@reddit
You learn from mistakes, yours and others'.
await_yesterday@reddit
honestly: you get lucky and read the right blogposts.
jared__@reddit
Lol in my computer science degree at a large state university they never taught me how to debug in an IDE.
grauenwolf@reddit
I have a software engineering degree. Unless the state of education has improved dramatically in the last 10 years, it's mostly just fluff and dogma.
mnilailt@reddit
If anything, most of what I learned in my Software Engineering degree has turned out to be quite bad practice, prone to over-abstraction and dogmatic processes. And this is from a very reputable university.
grauenwolf@reddit
A good example of this is design patterns.
In my dream class, we're taught how to identify and create design patterns. We learn about the higher level concept of pattern languages and why it's so important for code consistency. We're taught the difference between generic or library design patterns and application specific design patterns.
What I actually got was three classes on memorizing the example patterns out of the GoF book.
If cooking schools operated like software engineering schools, they would force feed you cake for 3 months and then declare that you're a baker.
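To make the "identify and create patterns" point concrete: the GoF Strategy pattern, for example, tends to fall out of refactoring an if/elif ladder rather than being applied from a memorized diagram. A made-up sketch (carrier names and rates are invented):

```python
# Illustrative only: the same shipping-cost logic twice. The second version is
# "Strategy" in the GoF sense, but nobody needed the diagram to arrive at it.

def ship_cost_v1(weight_kg, carrier):
    # The if/elif version: every new carrier means editing this function.
    if carrier == "ground":
        return 5.0 + 1.2 * weight_kg
    elif carrier == "air":
        return 12.0 + 3.5 * weight_kg
    raise ValueError(carrier)

# The "pattern" version: behavior is data, new carriers are just new entries.
STRATEGIES = {
    "ground": lambda kg: 5.0 + 1.2 * kg,
    "air": lambda kg: 12.0 + 3.5 * kg,
}

def ship_cost_v2(weight_kg, carrier):
    return STRATEGIES[carrier](weight_kg)

assert ship_cost_v1(2, "air") == ship_cost_v2(2, "air") == 19.0
```

The teachable part is recognizing when the second shape pays off, not memorizing its name.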
Melodic-Equivalent-2@reddit
Also, even one of the GoF authors has said that the patterns are often overused and/or misused, and that over-abstraction and forcing things into design patterns where they don't belong can be damaging.
grauenwolf@reddit
Well look at his book. The named patterns are in an appendix.
Raknarg@reddit
That sure is an opinion
st4rdr0id@reddit
Not an engineering discipline. If it were, two things would happen:
IchBinBerto@reddit
Engineering is the practice of using natural science, mathematics, and the engineering design process to solve technical problems, increase efficiency and productivity, and improve systems.
You're thinking of the Professional Engineer distinction.
86LeperMessiah@reddit
"Category theory is to programming what chemistry is to cooking. Whether you know what a monoid is or not you are still using them."
Feel like Software Engineering is the top-down approach while CS is the bottom-up. They both arrive at the same conclusions, but I prefer bottom-up because it starts with and builds on top of simple, mathematically provable ideas. Though top-down is definitely easier to digest and replicate.
Rivale@reddit
I went to school for software engineering when my uni also had a CS degree. I have years of professional experience now, and the stuff you learn for a CS degree is stuff you should eventually be able to learn on your own. The stuff from the software engineering degree was way more valuable because it's stuff that needs evaluation from people with experience, like creating and executing a software development cycle, and the instructors provide mentorship on the business side.
OrchidLeader@reddit
Three of my CS courses included a single semester-long group project that determined the entire class grade.
It was frustrating to say the least. But they were the most useful classes I took in retrospect.
This was over 15 years ago, and I had not heard about scrum. But without realizing it, by the second class, I had come up with my own scrum-like process. It helped my team have a solid project to submit, and it helped us deal with dead weight on the team (since we only ever assigned small tasks to each person, and we could take someone’s task away if they weren’t making progress).
Kuaizi_not_chop@reddit
Even in interviews, these people don't care about good coding practices. Which is hilarious, because that's one of the most important skills to have as a foundation.
vom-IT-coffin@reddit
At the end of the day it's about delivering the solution. The business could care less what your factory does.
One_Curious_Cats@reddit
They care when a proper solution will take too long to build. They care even more when the rushed solution breaks in production. In both cases it’s your fault.
vom-IT-coffin@reddit
Always be up front. Accept tech debt: tell them we can do it this quick now if you give us time to settle our debt later; if that doesn't happen, here are the consequences...
upsidedownshaggy@reddit
Eeeh it really depends on the business if the fault is or is not on you in their eyes.
notaloop@reddit
What business ever blames themselves rather than their ICs for failings?
vom-IT-coffin@reddit
Worked in a toxic environment like this. The ENTIRE development team quit in the span of 2 months, including management and the VP. They quickly learned whose fault it was.
Plank_With_A_Nail_In@reddit
Narrator: They didn't get time to settle the tech debt later, the business lied to them.
namtab00@reddit
... also, even if consequences were described upfront, it was their fault anyway...
One_Curious_Cats@reddit
The number of times I've drawn the iron triangle to executives at different companies explaining that they can't have it all...
skesisfunk@reddit
Not really true. Well architected applications are easier to maintain and add features to which reduces lead times. Businesses definitely care about this. If your team is taking forever to fix bugs and deliver features and somebody opens the hood and discovers your application is a hot mess you would be surprised how fast the higher ups will come down on you in these cases.
nanotree@reddit
Yeah, treating everything as a problem to solve on your way to product release is a good way to create an insurmountable wall of tech debt. And eventually that kills the product, sometimes long after the company has already sunk millions into it. Unless the problem is how not to create shit tons of tech debt. Green field projects are super intimidating for this very reason. Because most of what you do will become the standard (especially the stuff you say is only temporary). The number of times I've seen a "TODO" left by some long-gone dev in the comments of a legacy project tells a sad story, especially when the TODO is placed on a pivotal piece of code that is now spaghettified throughout the codebase.
Plank_With_A_Nail_In@reddit
My experience is that "tech debt" is mostly just a cover for being too lazy to learn how existing systems work, and for the desire to always work on something new.
No dev in history has ever preferred to maintain someone else's code; they demand to rewrite everything instead. Tech debt is just the latest attempt to trick management into giving lazy teams what they want.
nfojones@reddit
This is hilarious. Sounds like someone who isn't actually experienced in what they're talking about but is in or around the management half of the picture. Like a disgruntled ~~scrummaster~~ agile delivery lead type who is tired of how much the dev team has to argue their estimates over a mound of tech debt.
You have never written code in your life, my man. Please come clean. The #1 way most devs learn anything is reading code that's already been written. When that code was built by experienced developers it can often be very fulfilling and educational work. I would certainly argue it's likely harder to luck into that kind of codebase in modern times than it may have been when the industry was smaller, so it's not uncommon to have a whole mess of blind-leading-the-blind coding going on, but you are just absurdly wrong on this point. You are either extremely early in your coding journey or talking completely out of your ass.
Does not compute. Devs who regularly complain that they're never working on what they want to or on bleeding edge shit rarely last long anywhere nor does anyone miss them. They aren't the rule. Everyone else who is reasonably interested in keeping their skill sets up to date will advocate/ideate ways to bring that work their way, invest in learning new shit anyway or rightfully move on.
Rewriting everything and lazy are generally on opposite ends of a spectrum. Sometimes a rewrite means ending a painfully stupid development process and ultimately reducing the labor involved over time for the same output. That's not lazy, that's smart. But more often than that -- rewriting or deconstructing becomes a means to an increasingly necessary end. These are generally by no means simple feats in live production systems. Pitching rewrites means sticking your neck out in management circles to move the needle on practices generally maintained by the truly lazy devs. The reward is minimal in relation to the risk for your actual career in these spaces.
Lazy devs are the ones content in how things work because it doesn't ask much of them to keep stacking the mess higher. That you, my guy? Or you just their stan?
nanotree@reddit
I'm really confused at what you think tech debt is. Tech debt is pretty universally a bad thing, and was likely a term invented precisely because management kept making ridiculous demands and asking why things couldn't be done faster. The answer is tech debt. It's a really convenient way to succinctly explain to higher ups that you'd be generating costly problems for the product in the future.
asdfjaoiwnenoiaw@reddit
>No dev in history has ever preferred to maintain someone else's code
Well apparently I am the first.
Thegoodlife93@reddit
Lol okay, go try to add a new feature to a 12 year old, 1500 line SQL stored procedure that is executing business logic for a necessary business process and tell me that tech debt is just laziness.
itsjustawindmill@reddit
We did it boys, we found the Coworker From Hell
DumDum40007@reddit
My team has the practice of never leaving a TODO; instead we create a ticket to fix the issue and leave a link to the ticket in the comment. It helps to not forget about the TODOs.
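That convention can even be checked mechanically. A hedged sketch of a tiny lint rule that flags TODOs without a linked ticket; the PROJ-123 ID format and the rule itself are illustrative, not any team's real tooling:

```python
import re

# Flag any TODO/FIXME that doesn't carry a ticket reference like TODO(PROJ-123).
TICKETED = re.compile(r"(TODO|FIXME)\((PROJ-\d+)\)")
BARE = re.compile(r"(TODO|FIXME)(?!\()")

def untracked_todos(source: str):
    """Return line numbers whose TODO/FIXME lacks a linked ticket."""
    return [
        i
        for i, line in enumerate(source.splitlines(), start=1)
        if BARE.search(line) and not TICKETED.search(line)
    ]

code = "x = 1  # TODO fix\ny = 2  # TODO(PROJ-123): tracked\n"
assert untracked_todos(code) == [1]
```

Run in CI, a check like this makes "create the ticket first" the path of least resistance.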
hurenkind5@reddit
Does it help to actually do the todos though?
Plank_With_A_Nail_In@reddit
Those tickets will get lost when you upgrade the ticketing system or switch vendors. If the released system works, outstanding tickets will get forgotten.
HopefulEuclid@reddit
The importance of this to the business depends on the application. A lot of applications don't require major modifications after initial release, and in my experience probably the majority of the time important tickets take from start to finish is taken up by the product team figuring out what they actually want rather than developers implementing the technical solution.
I'd be surprised if a developer with 40 years of experience could count on more than one hand the instances they've seen of this actually happening. Nobody "opens the hood" on applications, and 99% of code is going to look like a "hot mess" to an outsider. More realistically if your team isn't delivering to the point where it becomes a serious problem either your lead developer gets replaced or the project gets transferred to another team entirely. I've been on the "another team" side of this several times, usually what we got wasn't bad but rather just over-architected, hence it running severely behind schedule. The end result was us scrambling to keep the original deadline resulting in heaps of shit code that we'd slowly document and fix up after the fact.
BundleOfJoysticks@reddit
If your code doesn't look like a hot mess, you're either not moving/creating value fast enough, or dogmatic to an exceptionally annoying level.
storizzi@reddit
One coder’s wet dream of perfection is another coder’s hot mess
skesisfunk@reddit
This recently happened at the company I work at. A high importance customer started asking what was taking so long, got permission to see the source code, and started grilling developers on that team about specific things in their code.
HopefulEuclid@reddit
If management at my workplace decided this was an acceptable course of action and even more so appropriate behavior by the client I'd be sending out resumés the same day and having a serious talk with management once I had a good offer. It violates workplace hierarchy, and shows developers you have no problem throwing them under the bus when a client gets pissy. If a client wants to be an active observer in the development process and this is agreed on beforehand that's fine, but if they want to start critiquing code they can and should get their own dev department.
If deadlines are being missed and it's becoming a significant problem you deal with it internally and deal with the client on a business level. Allowing this shit is how you annihilate morale, and do absolutely nothing to help relations with the client because it's not like they're gonna go "Oh well the code's shit and we're not getting what we're paying for but at least they let us look at the code and vent on the devs".
NotUniqueOrSpecial@reddit
I've only got 20 years of experience and I'd need my toes to count the hot messes I've had to fix.
And I don't mean "I got there and didn't get what things were so I rewrote it".
I mean "I checked the DB access information and saw there were more than 1 TRILLION rows returned in the lifetime of a product with 5 barely active customers over a year and a half."
People on this sub are part of a specifically rarefied group: we care about our field/craft/skills/what we do.
I assure you, there are far more people out there making a good wage writing shit that is more unbelievably garbage than you can possibly imagine.
s73v3r@reddit
They'll claim they do, but actions speak louder than words.
juwisan@reddit
Well-architected software has little to do with good coding practices though. Good architecture is first and foremost about good requirements, good analysis, and good description of systems, subsystems, capabilities and functions.
Plank_With_A_Nail_In@reddit
Lol none of those things exist in the real world though.
juwisan@reddit
They absolutely do. Ever written things for regulated environments? They often require you to have a proper quality management setup, and quality management often requires you to have these things.
Ever heard of functional safety? In the area I work in, the highest category of functional safety requires you to have no unused code paths. Good luck bringing evidence of that without being able to trace code back to requirements and design.
zrvwls@reddit
I think they exist, just not really in the order that would be helpful to someone wanting to build it as well as possible the first time.
KDallas_Multipass@reddit
Exactly.
"Oh you built a gorgeous cathedral! Too bad I needed a townhome"
Isak531@reddit
To be fair he said they could care less though, implying that they actually do care.
ammonium_bot@reddit
Did you mean to say "couldn't care less"?
Explanation: If you could care less, you do care, which is the opposite of what you meant to say.
Statistics
^^I'm ^^a ^^bot ^^that ^^corrects ^^grammar/spelling ^^mistakes. ^^PM ^^me ^^if ^^I'm ^^wrong ^^or ^^if ^^you ^^have ^^any ^^suggestions.
^^Github
^^Reply ^^STOP ^^to ^^this ^^comment ^^to ^^stop ^^receiving ^^corrections.
moosehq@reddit
Good bot
ammonium_bot@reddit
Thank you!
Good bot count: 434
Bad bot count: 184
CantPassReCAPTCHA@reddit
Dipshit bot
moosehq@reddit
It’s correct dingus.
goat__botherer@reddit
Not in reply to that comment, it was correct when it replied to the original comment which used it.
Iggyhopper@reddit
Not if management's plan is to ship it and kick it so it looks good on their resume.
MuxiWuxi@reddit
"Who cares about the issues tomorrow? We care about money in our pockets today."
Plank_With_A_Nail_In@reddit
Businesses keep being told to rebuild their systems with better architecture, but development time still keeps getting longer. IT teams keep asking for more tools, but development time still keeps getting longer; more testing tools, but bugs still happen with the same frequency.
You only have the architect's word that it's better anyway, and they're never going to say it's bad.
Systems are designed and built by humans and they suck at designing and building systems.
ammonium_bot@reddit
Did you mean to say "couldn't care less"?
Explanation: If you could care less, you do care, which is the opposite of what you meant to say.
BundleOfJoysticks@reddit
Bad bot
Fuck off
ammonium_bot@reddit
Hey, that hurt my feelings :(
Good bot count: 434
Bad bot count: 185
BundleOfJoysticks@reddit
Have you considered getting a job
travelsonic@reddit
/s?
moosehq@reddit
Fix your grammar mate.
vom-IT-coffin@reddit
Good bot
ammonium_bot@reddit
Thank you!
Good bot count: 434
Bad bot count: 184
agumonkey@reddit
but that's too short sighted as a definition
every too narrow 'solved first' choice will bite you in the end
that's why people build libraries and frameworks, to avoid piling up 'solved first', copy pasted, 3 pages long methods etc
the velocity has to be balanced with some amount of modeling / modularization (just like databases)
Xyzzyzzyzzy@reddit
Which is very interesting because modern "agile" development came from lean software development, which came from lean manufacturing, which came from the Toyota Production System.
The Toyota Production System emphasizes that the business very much cares about what your factory does - it's the business's top priority, and all other aspects of the business revolve around keeping the factory working efficiently and effectively, and making sure its output meets Toyota's high quality standards.
One of the key principles of the TPS is genchi genbutsu, "(go and see) the real thing in the actual location". Problems can only be understood by observing the actual place (gemba) where they originate. If you're a factory manager and you think your paint shop ought to be using 500 units of paint per day, but your measurements show 600 units being used, you go to the paint shop, observe how painting and measuring happens, and ask questions until you find sources of waste. Maybe the paint shop's painting process is wasteful. Maybe the measuring system is ineffective. Maybe everything is working fine, they really do need 600 units of paint per day, and you should go visit the office that 500-unit estimate came from to discover why the analysts are producing defective output.
TPS emphasizes having a detailed and thoughtful process to find, study, fix, and prevent wasted time, money, materials, motion, and machinery. This enables kaizen, continuous improvement of the production process. The way in which the product is made is constantly evolving as everyone from technicians to executives finds and eliminates sources of waste. The work process is standardized, but changes as improvements are discovered.
Ancillas@reddit
I wonder how many CPU cycles in AWS are wasted every day because of poorly written software? That’s a metric that very clearly translates to bottom line which is something the business cares deeply about.
The business might be paying 3x, or more, what they could be paying for the same solution.
paulsmithkc@reddit
If you can keep the annual cloud costs below 100k or less than 1% of the business revenue, that puts you into a safe territory where the costs aren't a major factor.
Ancillas@reddit
How can you say that absent revenue or margin data for the hypothetical product?
paulsmithkc@reddit
A) 1-2 devs will cost you more than 100k B) If the company can't spend 1% of their revenue on software/cloud, then they shouldn't be building software in-house.
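Encoded literally, the rule of thumb above looks like this; it's a heuristic from this thread, not an industry standard:

```python
def cloud_cost_is_minor(annual_cloud_cost, annual_revenue):
    """Heuristic from the comment above, taken literally: cloud spend is not
    a major cost factor if it's under $100k/yr OR under 1% of revenue."""
    return annual_cloud_cost < 100_000 or annual_cloud_cost < 0.01 * annual_revenue

assert cloud_cost_is_minor(80_000, 1_000_000)        # small absolute spend
assert cloud_cost_is_minor(500_000, 100_000_000)     # 0.5% of revenue
assert not cloud_cost_is_minor(500_000, 10_000_000)  # 5% of revenue
```

Whether the thresholds themselves are sound is exactly what the rest of this exchange debates.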
Ancillas@reddit
This is nonsensical.
NotUniqueOrSpecial@reddit
In what way?
Everything they said was factually true (assuming the U.S. market).
A really capable senior dev. will cost 2-3x the point they made in
A.If you can't afford 1% revenue spend on cloud, then you're already dead in the water.
And yes, your dev. team absolutely should be producing more than they cost, or why are you employing?
Ancillas@reddit
I was talking about writing wasteful software and suddenly fictional numbers are being conjured out of thin air.
1 developer. 100 developers. It doesn't matter. Wasteful software will burn cash in the cloud. Maybe the overall cost is negligible while you're growing and making piles of cash, but without benefitting from economies of scale, margins will erode as costs continue to scale linearly while growth slows.
Going back to the original point a few comments back, just solving the problem isn’t enough. You have to at least do the analysis to understand the waste before you can make an educated business decision to not deal with it until later. That feedback loop of measuring costs and picking your trade-offs is a continuous exercise.
f3xjc@reddit
Almost all of what is described as good software practice increases energy cost.
Loose coupling, abstraction, independent services with message bus, redundancy... Those are all extra cpu cycles.
Ancillas@reddit
Increasing energy cost is not necessarily the same as wasting energy.
The comment I responded to proposed that problem solving comes first. I assert that cost management is part of the initial problem solving. That means that how the software is designed and written is critically important and not an afterthought.
Stopher@reddit
For Amazon, a wasted cycle is profit. Not for the company paying for the cycle. There’s gonna be a balance between speed of output and efficiency of solution. It is an interesting problem. As hardware gets better, efficiency matters less but you scale up enough it matters more. I dunno. I’m a hack. Lol
Ancillas@reddit
As an AWS customer your cost is directly scaled with your consumption. Regardless of the speed of the hardware, if you spend N times longer performing a task you pay N times more to perform it.
If your costs aren’t aligned with the economic model of your product, you can’t effectively take advantage of economies of scale. Amazon knows this and even spoke about it in a keynote at re:Invent.
The business absolutely cares about how the solution is built.
Where I disagree with the comment above is that cost should be baked into the design of a product. It’s a non-negotiable non-functional requirement.
Plank_With_A_Nail_In@reddit
Not as much as is wasted on all the SPAM resulting from the expertly architected SMTP protocol.
ElkChance815@reddit
I think it's because what counts as good practice is kinda subjective. Where I work, people sometimes ask about SOLID, but sometimes it's more of a nice-to-have. On the other hand, performance and delivery time can be measured in numbers more reliably, which is easier for interviewers to rely on.
MrEllis@reddit
I like asking questions where the fastest solution to write blind is also the cleanest. Candidates will rush to solve them with nested loops, loose variables with coupled data, and ugly if-else statements, then get lost and be unable to test/debug.
But if they use multiple functions and maybe spin up a struct/class or two, the problem suddenly becomes simple and the bugs are all easy to diagnose.
Basically the kind of problem that I have watched explode into a mess that haunts a new hire with prod bugs for months because they didn't use a clean approach.
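A made-up example of the kind of problem being described: aggregate transactions per user and flag users over a limit. Rushed solutions tangle the aggregation and the threshold check in one nested loop; splitting them keeps each piece trivially testable:

```python
from collections import defaultdict

# Illustrative interview-style problem, invented for this sketch.
# Task: given (user, amount) transactions, list users whose total exceeds a limit.

def totals_by_user(transactions):
    """One responsibility: aggregate amounts per user."""
    totals = defaultdict(float)
    for user, amount in transactions:
        totals[user] += amount
    return totals

def over_limit(transactions, limit):
    """One responsibility: apply the threshold. Each piece is testable alone."""
    return sorted(u for u, t in totals_by_user(transactions).items() if t > limit)

txns = [("ann", 60), ("bob", 20), ("ann", 50), ("bob", 10)]
assert over_limit(txns, 100) == ["ann"]  # ann: 110, bob: 30
```

When the aggregation is a separate function, a wrong answer immediately localizes to one of two small pieces instead of one tangled loop.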
Chris714n_8@reddit
"'Just in time' upgrades, security patches and services - on sale! It doesn't cost you anything more than your soul."
3np1@reddit
Any recommendations on how to interview for "good coding practices" given the realistic time constraints of an interview?
post_static@reddit
Do a pair-programming section in the interview process and then discuss ways to improve the existing solution, which has obviously been designed with purposeful faults.
ICantWatchYouDoThis@reddit
I ask them questions along the lines of: what are some rules you follow when you code, what are your criteria for good code, or if you had to maintain somebody else's code, what would you want their code to look like? They usually answer with naming principles, or by defining some principles they've been following. Or, if they have experience, they can describe the flaws they saw at their previous job and how to prevent them from happening again.
English is not the language I use in interviews so the wording might not be exactly translated here.
kamuran1998@reddit
Writing some garbage code and asking them to fix it is a good way to go about it.
new2bay@reddit
If not fix, then doing it as a code review exercise also works well.
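A sketch of what such an exercise might look like, with the planted bugs a reviewer should catch noted in comments (the snippet is invented for illustration):

```python
# Review exercise with planted bugs: the buggy version has a mutable default
# argument and swallows all exceptions; the fixed version is what you'd hope
# a candidate produces.

def dedupe_buggy(items, seen=[]):  # bug 1: mutable default persists across calls
    out = []
    for x in items:
        try:
            if x not in seen:
                seen.append(x)
                out.append(x)
        except Exception:          # bug 2: silently hides real errors
            pass
    return out

def dedupe_fixed(items, seen=None):
    if seen is None:
        seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

dedupe_buggy([1, 2, 2])
assert dedupe_buggy([1, 3]) == [3]     # the leaked default makes 1 "already seen"
assert dedupe_fixed([1, 3]) == [1, 3]  # fixed version has no cross-call state
```

A candidate who spots both bugs and explains why they matter tells you more than one who solves a puzzle cold.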
GamingWithMyDog@reddit
First I'd say have the interviewer be someone whose coding practices you trust. If you don't have that, hire a consultant. If that person is truly qualified, they can review applicants and make a solid judgment within a normal hour-long interview of various questions. The tricky part is finding the person you trust.
bigskeeterz@reddit
I disagree that coding practices are that important. Fundamentally, I personally would like to see someone who has good intuition and problem-solving skills. Coding practices can be learned on the job from more senior members.
Searingarrow@reddit
Where is a good place to learn what good coding practices I should know?
HatesBeingThatGuy@reddit
When I see this I immediately note it as a negative and if there is any doubt about their ability to have good logic they are toast.
PM_ME_C_CODE@reddit
Interviewers who don't care about good coding practices come from teams who put delivery to stakeholders first, and have managers who don't ever allow engineers to work on tech debt or maintenance. You know the ones...where the moment the product is "released" half the team quits because while building new features is fun, dealing with legacy code isn't worth their precious time (because it's hard).
Kuaizi_not_chop@reddit
Dealing with legacy code is quite fun if they let you refactor....
ICantWatchYouDoThis@reddit
It depends on the field. In game development, good optimization and maintainability are absolutely required.
ward2k@reddit
Yeah there are a few skills that never came up when I interviewed for roles, and arguably they are some of the most important things you do in your role.
Reviewing code. I've never had to do any form of PR review or point out faults in another person's code during an interview, but assuming you're in a reasonably sized team, you'll need to be able to do this well.
Refactoring code. It's rare you'll ever write something from scratch since most teams you'll be on will be working with already established code bases, where you'll need to work around and refactor this code
git. Extremely important, though apart from a throwaway question asking if I've used it before, I've never been asked to demonstrate any knowledge of it. Honestly, I'd rather take a fairly average coder who is competent with git over a high-level programmer with zero experience with it.
Testing. For the love of god please ask about testing, it's so painful getting anyone new who has no concept of why writing tests is so important
Dull-Contact120@reddit
Job security: writing code only you can debug.
_Pho_@reddit
The problem with quality software is that ultimately, it depends.
People who work in enterprises that can afford to spend 30 developer hours to make innocuous changes often scoff at developers coming from startups where everything is made as quickly as possible. But the reality is that if you have 30 hours to change a widget then your work is probably not very high value.
There is so much variation in what constitutes "good code": developer time, business value, scalability, ability to reason about it, composability, et cetera et cetera.
My favorite example is the book "Software Engineering at Google", a relatively famous book about the best practices of SWE at Google. To its credit, there is a lot of sound advice. But when reading it, so much of it was a condescending vibe of "us Googlers do things the right way", where the right way to them means "the longest-dev-time, most abstracted and performant way".
Of course something like this is only made possible when you have the capital of a trillion dollar company. Furthermore, so little of what has been done at Google in the last decade or two has had anything to do with Google's business success. Advertising is 80% of Google's revenue and the technology to support that has been in place well over a decade.
So when a Googler is telling you about the best way to write code, you have to consider the strong possibility that the Googler has done less to drive revenue for his company than the Wordpress dev churning out templates for small businesses.
GregsWorld@reddit
Actually, quite the opposite. Big companies have more customers, so changes impact more people; that's why the decision to spend the extra time is worthwhile. The shade of a button's colour can make or lose Google millions. That's high value.
I too would scoff at anyone thinking the start-up way of moving fast and breaking things is the best way just the same as anyone from a big company moving to a start-up thinking their way is best. There's a time and a place for both approaches.
I disagree that there's a lot of variation in good code. The examples you gave are all about business value not code quality.
_Pho_@reddit
This is the standard example (the Google committee to change the Gmail button color), about which a few things must be said. First, the shade of a button's color is primarily a UX concern in the first place. Second, the entire claim of that being a multi-million-dollar impact is dubious at best. Of course, when you look at the details there are always unconsidered covariates, and arriving at any sort of analysis of those impacts is difficult in the first place, especially when the company in question drives revenue via ads instead of out-muscling its competitors with superior design.
Of course there are programming decisions which impact millions of dollars, but f.ex if we look at the spread of those decisions at Google, it's probably a tiny minority affecting that kind of change and not in the ways you think.
Not at all. "Need for scalability" determines the need for abstraction, which is a tradeoff with developer velocity.
Ability to reason about code (refactorability) is often at odds with performance and complexity.
Developer time is ultimately a judgment call about these factors. A good developer weighs these things in context of the product they're working on.
If you're writing maximally performant, maximally scaled code for every single project you work on, the velocity of your product is probably so low that nothing you write matters.
GregsWorld@reddit
Dubious, sure, but not unrealistic. Amazon says a 1s increase in page load time would cost them $1.6b a year in revenue. That means any change increasing load time by 1ms would cost them over $1 million. When you're working on an application that affects millions of users, every change matters a lot more.
Performant code avoids abstraction, but non-performant code should also avoid it. Not to the same extreme because no abstraction is bad, but also too much is bad. So at best they're only loosely connected.
Performant code is often the code with least instructions; the simplest code. Good code is easy to comprehend; the simplest code. These things work hand in hand not against each other.
But simpler code takes longer to write. As the saying goes "had I more time I would've written it shorter", so yes it is a judgement call and writing good, performant, scalable code takes the longest.
But that doesn't make it any less valuable unless you're measuring value as number of lines of code written.
If you spend 1 hour on a feature for 1,000 users, or 1,000 hours on the same feature for 1,000,000 users, your value per user is identical.
Spending more time on code doesn't reduce its value to a business.
teslas_love_pigeon@reddit
You seem incredibly smart, what books would you recommend that advocate your positions? I want to go all in.
BundleOfJoysticks@reddit
Another method is to not discuss the testing time/methods with PMs or business people because it's not an additional cost, it's how software is built and they are not qualified to have an opinion or argue about it.
We don't question the nitty gritty of their sales ops, bizops, product management tools, etc. I don't question my carpenter's expertise about how he squares up his studs (unless they're obviously messed up). We don't question the methods our surgeon uses to keep the surgery site sterile.
Testing is how software is built. It's not negotiable.
With that said, it then becomes our responsibility to deliver value from any testing methodology, because that's what we owe the business and our customers. If you build a spaceship of CI and useless, time-consuming testing infrastructure that isn't needed, then you're not working on the right thing. But choosing between unit tests, e2e tests, and integration tests, that's up to us, and other parts of the business lack the skills and the authority to even talk about it.
LiamTheHuman@reddit
How do you balance that, though? There are so many times when delivering something was so valuable that the profitable choice was quick, only partially tested code. I think you're agreeing that it's the dev's call when those times are, but I keep finding myself in situations where writing 'good' code isn't the best choice, and it's frustrating. I want to build myself a system that is easy to work in, but making my job easier is only worth the efficiency I gain in doing so.
TokenGrowNutes@reddit
You balance this by having the minimum number of tests that give you the confidence that the program works. The minimum - not all scenarios, not all testing methods in all the layers.
Right now I am driving a project where the feature is communicating with a 3rd party, with mostly integration tests. I do that because in the end, we only really care about what data is returned from the 3rd party. I could have driven it with unit tests, inspecting what is in the payload going to the 3rd party pre-send, but that's just overkill.
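A minimal sketch of that "care about what comes back" style of test; `fetch_rate`, the endpoint, and the stub standing in for the real third party are all hypothetical:

```python
import unittest
from unittest import mock

# Hypothetical wrapper around a third-party rates API (names are illustrative).
def fetch_rate(client, currency):
    return client.get(f"/rates/{currency}")["rate"]

class FetchRateTest(unittest.TestCase):
    def test_asserts_on_what_comes_back(self):
        # Assert only on the data returned, not on the exact outgoing payload.
        stub = mock.Mock()
        stub.get.return_value = {"rate": 1.08}
        self.assertEqual(fetch_rate(stub, "EUR"), 1.08)
```

In a real suite the stub would be replaced by a sandbox or recorded responses from the actual third party; the point is that the assertion targets the returned data, so the test doesn't break when the request payload is restructured.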
s73v3r@reddit
I have never seen an instance where that is actually true.
LiamTheHuman@reddit
honestly that surprises me because it comes up all the time for me. I'm sure it depends on what industry you are in and how quickly it is moving. I bet banking software devs don't ever see this whereas ones working on AI might see it all the time.
s73v3r@reddit
People say it's a thing all the time, but it almost never ends up being one. It's just product managers who are trying to get stuff quicker, cause they don't think we're serious when we say how long things will take.
-grok@reddit
The truth of the industry is that most code written is worthless and goes unused.
Furthermore if there really is a strong, unfilled need for the code being written (doubtful because most product managers just try to copy something that is already successful), the fact that it is "quick" and partially written means that when the customer tries to use it, they quickly figure out it won't fulfill their needs anyway and go back to whatever was working in the first place.
CreativeGPX@reddit
The point was that the business people shouldn't control it. And you're really just talking about when they control it implicitly rather than explicitly... by constraining resources to force you to cut the corners rather than literally saying "don't test". So, what you're saying isn't really a different case or counter example to what /u/BundleOfJoysticks suggests. It's just a case where the philosophy wasn't actually adopted even though on the surface it may look like it was.
Because devs need to be able to determine how something is done, they need to be able to determine how much work it is, therefore they need to be able to determine the timeline. While there are some cases where business people may be warranted in imposing a timeline (e.g. a law is going into effect and we must comply with it by July 1 no matter what), the correct way to contextualize that is to keep devs in control of their workload in the bigger picture to then be able to say after the deadline, "okay, we need x days of time to pay down technical debt".
LiamTheHuman@reddit
Ok so let's say you say that you need 1 month to pay down tech debt and outline the risks of not doing it. Then the business side says that X new project will provide lots of value and while the risks associated with the old codebase are present they are acceptable because the potential profit lost will be less than what is gained by working a month on new X project
s73v3r@reddit
Have them show their work. Most of the time, they don't have anything to actually back up their statement.
CreativeGPX@reddit
I think your comment requires more context to determine if it's good or bad. That conversation in isolation doesn't really say enough. It depends on things like:
Depending on how you answer these questions, I'd say your scenario could be acceptable or unacceptable. This is what I meant by the bigger picture. Rather than obsess on unsustainable purity (i.e. managers should never be able to impose), it's healthy to allow some exceptions as long as the bigger picture fits the ideals.
LiamTheHuman@reddit
I'm not saying it's good or bad. It just upsets me. As a software developer I want to make a well built product but the current business environment just doesn't support that goal. I was just venting about the frustration that it's often the right choice not to build quality.
BundleOfJoysticks@reddit
It's not even about the workload, it's about how software is created. Testing automation in particular is how software happens. It's not optional.
Thurak0@reddit
Widen the timescope. If you will still work in that environment in three years... the time saved by then will help future-you do better work. Infrastructure pays off, and so does not having to deal with all the shit that lives in your almost-untested code.
Sometimes, though, we lose too much time on that shit (as per the comment you answered), but doing nothing in that direction also costs you. Perhaps not next week, but probably much sooner than you anticipate.
Tech debts don't go away. Eventually they will be collected.
BittyTang@reddit
Maybe think about why it's harder for you to write good code than bad. IMO good code is almost always the easiest and simplest solution. It's often very stupid code, like O(N^2) stupid. But it's obvious and solves the problem today. And if you're not testing something, then you don't know that you actually did your job anyway. How would you know you're delivering something that works without testing it? I'm not saying you should unit test everything or go to the other end of the spectrum and manually test everything, but you need to figure out the balance that gets the most value and sets you up for success at least a few months in advance.
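In that spirit, a deliberately "stupid" but obvious sketch (a hypothetical example, not from the thread):

```python
def has_duplicates(items):
    # O(n^2) nested loop: slower than a set-based check, but there is
    # nothing clever here to get wrong, and it is trivially testable.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

A set-based version is faster, but until profiling says the input is large, the obvious loop solves the problem today and anyone can verify it at a glance.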
LiamTheHuman@reddit
This is more about the design or refactoring of codebases. In terms of testing, it can often be easy to test something with minimal automated tests and get it out there. I think the problem is that goals change based on success: a project that takes a long time due to proper testing and easily extensible code doesn't end up getting expanded on, while code that produced value quickly is then pivoted to a larger audience or larger requirements. So whichever choice you make, you end up having made the wrong one.
Kautsu-Gamer@reddit
When short-term profit stupidity and trickle-down connivery is the "economics", no quality is feasible, as it is profitable to make the customer pay for maintenance.
PatrickAdamsAZ@reddit
Nope! I've been the lead on billion dollar products with several massive teams of QA. I'm currently in a start-up that has less than 20 people. A need for more QA is just a symptom of a disease that more QA won't cure. A few good (actually good) developers don't need QA. Average developers always need several massive teams of QA 🤷♂️ This is literally the reason a tiny startup can run rings around big companies. On top of that QA folks are normally CS failures, if I'm ever hiring QA at my company it won't be from a pool of CS rejects, it'll be the bartenders I see busting their ass that can hold an intelligent conversation.
BundleOfJoysticks@reddit
That's funny, my best QA hire is a college dropout who was a bartender and holds a mean conversation. The others I've had were CS rejects and completely useless.
Synyster328@reddit
Absolutely. The problem comes from the weird hierarchy where SWEs are at the bottom of the chain, and report up to people like product owners, PMs, or others in that space.
And they bring people in fresh out of school with no other experience of why this is bad, how to push back, negotiate boundaries, etc. they just say "This is how it works" and everyone goes along with it.
But no. An engineer should not be a project manager's resource to be moved around like a piece on a game board.
Quite the opposite, actually.
SalamanderOk6944@reddit
Ha, I like this approach.
I feel like my development job is constantly testing. Test first, then iterate, then test, then fix or re-assess or iterate, then test, then fix or re-assess or iterate.
Yet I see my producers put deadlines on the discovery process all of the time.
And they wonder why projects get canceled.
bjarneh@reddit
Isn't this the real problem? I.e. that a good engineer has high value to an organization, so they cannot be "wasted" by giving them a position of leadership. You almost always end up with PMs that have no technical abilities, and many of the problems that happen during a software project are a direct result of people not really understanding each other.
BundleOfJoysticks@reddit
Again, the PM doesn't have a say in this. It's like the PM telling you to use tabs or spaces or 6 character indents. Not their skill set. Their opinion doesn't matter.
wizrangelord56@reddit
I have this issue in my company now: all the experienced people are leaving for better opportunities, so the ones that stay are always slammed and never have time to teach the new hires.
Management doesn't prioritize this teaching at all. I feel like the team would have so much higher output if the most experienced did coding and just taught and supported junior devs.
bjarneh@reddit
I've almost exclusively worked in places with this type of problem constantly.
There are a ton of things you can do when a team is swamped, but all those actions require a manager/PM with a good understanding of the long game. You need to take time to do a bunch of stuff that does not immediately pay off, for instance (like supporting jr. devs), but none of this will go over well with a non-technical PM.
TokenGrowNutes@reddit
You are never taught the test driven way, either. And that's key.
A brief mention of test driven development in the article, and the author admits they do not drive development with tests, which is ironic.
ImYoric@reddit
I'll draw my trump card: the best way to learn how to build quality software is to join a project working on quality software, i.e. a large open-source project.
frogking@reddit
In my experience, “quality” is.. it runs, it does what it has to do, and you can leave it for 10-15 years without worrying about it.
… downside; you are screwed when somebody wants new features..
skesisfunk@reddit
That's a pretty big downside don't you think? Most software projects are built iteratively, so you start with an MVP and then you add new feature sets. If your MVP runs but it isn't extensible then you did a horrible job.
frogking@reddit
It’s a downside, but 10-15 years later it’s somebody else's problem.. i.e. my problem, because I have to add the feature to stuff that has been running for decades..
skesisfunk@reddit
That may be true in a waterfall development model, but in agile, not planning/designing for extensibility is just shooting yourself in the foot. Agile processes tend to deliver something simple and minimal first, with the plan being to continuously add features to it. So it's not 10-15 years down the road, it's next quarter.
frogking@reddit
Yeah… last month I had to touch code that hadn’t been touched for a decade. The waterfall was all dry and there was no documentation but the code itself.
I know that we have all these nice methods and good plans.. and that the docs are always present.
There’s just WAY more systems that are left in the state that I have to deal with. Systems that are fulfilling their purpose.. which might as well be “passing the butter”.
rarted_tarp@reddit
A lot of software is built iteratively, but in expanding pieces.
You'll add features onto an existing system, and you need to trust that system is stable and works. Think of it like a foundation. If you did it well, you can build amazing things on a foundation, but you're not going to change that foundation.
If the foundation is expanding, whoever is building on it will be building on quicksand.
There's a reason that most of the Linux kernel is still the same as 15 years ago, but has new pieces added to it every year.
Acc3ssViolation@reddit
Exactly, if you can't add new features or change anything then you haven't built software, you've built hardware.
Holmlor@reddit
We can change hardware today faster than you guys can rollout a new front-end.
wademealing@reddit
How's amd doing with zenbleed 4 firmware fixes ?
Synor@reddit
Software is not Hardware. High quality software is easy to change and true to it being a soft thing.
frogking@reddit
You assume that any of the current employees were involved in developing the software that was installed and has been running for a couple of decades.. and now has to be changed a bit..
rjcarr@reddit
Quality to me is when a new feature is requested it can be fit in nicely, without having to retool and refactor everything.
Like, the quality of software is inversely proportional to how long it takes to build upon it.
frogking@reddit
It’s a choice… if new features are added at a rate of once a decade, there’s no point in fully unit-testing the code.
Ythio@reddit
So you're screwed all the time then ?
frogking@reddit
Every 10-15 years, yes.
holyknight00@reddit
That's only true if you go to an absurd extreme. Easily testable code is generally simpler, more cohesive, and modular.
Even if your intended change breaks 200 test cases, that is still much better than having no tests and then spending 6 months blindly solving issues your feature broke in 200 parts of the code you don't know.
I've seen multiple refactors or new features introducing bugs even 2 full years after they were implemented and running on production with no issues because no one really knows what the system does.
These things almost never happen on codebases with reasonable quality assurance.
If your system is complex, changes and new features will take a long time and effort no matter what. With proper testing practices, you can at least have some metrics to estimate the real effort of what you are doing.
To put it in some "numbers" you can think it in this way. All the numbers are obviously completely made up, but based on metrics I actually saw across different projects.
1) Project with no automated testing or quality:
Estimated implementation effort: 50h
Estimated testing effort: 25h
Total estimation: 75h
The time you actually spent the next 6 months after you "finished" the feature debugging and fixing the code: 175h
Real total effort: 250h
2) Project with proper quality practices:
Estimated implementation effort: 150h
Estimated testing effort: 25h
Total estimation: 175h
Real total effort: 200h
Initial estimation 75h vs 175h
Real final effort: 250h vs 200h
Yeah, it's really easy to sell the much lower estimation, but that's because you have absolutely no idea how much time you will actually spend fixing things that are broken. The problem is that this "extra" time is never taken into account, so crap development practices are really easy to sell as a "faster" way of making software, when in reality it is much, much slower.
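The comment's made-up numbers, recomputed (values copied from above; not real data):

```python
# Hours per project phase, taken from the comment's illustrative scenario.
no_qa   = {"impl": 50,  "test": 25, "post_release_fixes": 175}
with_qa = {"impl": 150, "test": 25, "post_release_fixes": 25}

# Estimates only cover impl + test; the post-release fixing tail is
# the part that never makes it into the estimate.
assert no_qa["impl"] + no_qa["test"] == 75         # estimated, no QA
assert with_qa["impl"] + with_qa["test"] == 175    # estimated, with QA
assert sum(no_qa.values()) == 250                  # real effort, no QA
assert sum(with_qa.values()) == 200                # real effort, with QA
```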
ward2k@reddit
I absolutely adore having a shit tonne of tests in a project. A good mix of Unit/IT tests, UI tests and performance tests makes me feel so comfortable knowing that it's almost guaranteed any mistake I make will be picked up by one of the tests I've written (or has been written before me)
But weirdly some people in the programming space have such a hatred for writing automated tests that I really do not understand (though it feels more like those new to programming on programminghumor that have this belief that tests are bad)
holyknight00@reddit
My wild guess is that most people who complain about this, never really worked on a project with decent quality assurance practices. You can actually move fast, without breaking things.
Nothing worse than working on a project with no tests at all, where you are basically praying that your change won't blow up production in two weeks or two months. So stressful; I still have war flashbacks about these things.
Holmlor@reddit
YAGNI
People tend to write far more generic code, which is necessarily more complex than less generic code, which takes more time to get working.
holyknight00@reddit
How "generic" you write your code is a compromise, as almost everything is. Early optimization is definitely a good way to doom your codebase into an incomprehensible mess, but this doesn't mean you should copy-paste a function 5 times all over your codebase just because it would take you 10 extra minutes to give it some thought and write it properly with basic separation of concerns.
All of this, anyway, doesn't have anything to do with proper testing and quality assurance.
Quality Assurance and testing are just the bare minimum you need to write decent software. You can still perfectly test sh1tty code, or write sh1tty tests for excellent code.
Testing just ensures the proper behavior, it says nothing about the implementation (or at least it shouldn't, because that would mean your testing is tightly coupled to your code and that means your tests are useless anyway)
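One way to read "testing ensures behavior, not implementation" (hypothetical function; a sketch, not a prescription):

```python
def normalize_email(addr):
    # Implementation detail: could be rewritten with regex, str methods, etc.
    return addr.strip().lower()

def test_normalize_email_behavior():
    # Behavior-focused: asserts on input/output only, so it survives
    # any internal rewrite that preserves the contract.
    assert normalize_email("  Bob@Example.COM ") == "bob@example.com"

# By contrast, a test that patched .strip()/.lower() and asserted they
# were called would be coupled to the implementation and break on
# harmless refactors -- exactly the "useless tests" described above.
```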
gentle_fade@reddit
My impression is that nobody reads other people's programs anymore. The obvious way to learn how to build quality software is to absorb and steal from other good programs.
Majik_Sheff@reddit
One cannot be taught what good software looks like any more than you could be taught what the wind smells like.
The only way to fully appreciate a garden of jasmine and sage is to have previously been downwind of a poultry farm.
NotSoButFarOtherwise@reddit
You are not taught how to build quality software because people don't want quality software - at least, they don't want to bear the cost of developing it. Businesses don't want to pay for it when they can ship out a half-baked "MVP" or "v1.0" for half the cost, and whatever consumers say, in practice the overwhelming majority of them will choose free-to-use or freemium software with more bugs over paid software with fewer bugs. How many times have you heard "We'll delay release until all the bugs are fixed" vs "It's okay if it still has a few bugs, the important thing is to ship/release it."
There are a few niches where this isn't true, and if you work in one of them, you definitely do get taught how to build quality software.
travelsonic@reddit
Honestly, I can't help but feel like another version of companies trying to shift the blame onto consumers.
RecognitionNo6665@reddit
heyyy
alex206@reddit
"ain't nobody got time for that"
kasperlitheater@reddit
I really hate these self-righteous, disrespectful, click-baity a-hole titles. Maybe I was taught to build quality software; maybe you don't know it yourself. Everyone using this kind of title, go f yourself.
/rant over
rfgm6@reddit
Most software doesn’t really require strong algorithm skills. Good engineering practices are far more important.
patate_volante13@reddit
Candidates are not evaluated on algorithm skills because of their usefulness in the actual job, but because they are a good proxy for intelligence.
LessonStudio@reddit
I hear people endlessly argue about this methodology or that process, but the reality is it is 100% about tech debt balanced against business requirements.
An architect needs to understand how the tech debt curve will play out over the length of the project. At a certain point the interest payments may become so high that progress grinds to a halt. The key is to make sure this Keynesian End Point (an economics term) is not reached before the product is released. A sure sign you have blown past it is a project stuck at 90% done: at that point you are just making interest payments, not progress.
A counter-intuitive one is unit/integration testing. If you don't do this from the start, the project will be slower. So very many project managers cut this corner to "speed things up'. Unit/integration testing is easily the #1 tool to prevent tech debt.
But tech debt is not necessarily an evil thing. This is where time to market can influence the choice of the entire tech stack. If the MVP is modest and the deadlines are tight, then picking a high debt tech stack which will allow you to pound out a product in time is better than picking one which will potentially kill the product before it is released. The key is to plan a route out of this tech debt pit of despair; often a modular architecture where you can start replacing chunks of the crap tech stack. Effectively this is like paying off a high interest rate CC.
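A minimal sketch of that escape route: hide the high-debt piece behind an interface so chunks can be swapped out later without a rewrite (all names are illustrative, not from the article):

```python
from abc import ABC, abstractmethod

class AuthProvider(ABC):
    @abstractmethod
    def verify(self, token: str) -> bool: ...

class LegacyAuth(AuthProvider):
    """The quick, high-debt choice that shipped with the MVP."""
    def verify(self, token: str) -> bool:
        return token == "letmein"          # placeholder logic

class ModernAuth(AuthProvider):
    """The replacement paid for once the interest bites."""
    def verify(self, token: str) -> bool:
        return len(token) >= 16            # placeholder logic

def handle_request(auth: AuthProvider, token: str) -> str:
    # Callers depend on the interface, so swapping implementations
    # is a configuration change rather than a rewrite.
    return "ok" if auth.verify(token) else "denied"
```

This is the "paying off the high-interest card" move: each module behind such a seam can be replaced independently while the rest of the crap stack keeps shipping.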
Using tech debt as the key guide, any process, architecture, or any part of the tech stack can be carefully chosen. Also, as time goes by, it can influence decisions about the existing architecture, tech, etc. Maybe some authentication tech was great in 2018, but now it is endlessly causing headaches. If there is a new tech which will not be too hard to implement and will remove the headaches, then this would be like removing high interest debt with a one time payment.
Of course, this all needs to be balanced against business needs. I've seen companies replace code with FPGAs to achieve required business goals. I couldn't have imagined a much larger lump of tech debt, but this is a case of needs must. Weirdly enough, that last tech turned out not to be so terrible to maintain; it reduced server count by enough to justify the move on its own, and then it made lots of money.
oakwoody@reddit
Well said. I would add that an architect needs to understand and articulate risks vs. rewards. PMs and bean counters prefer to operate on "x amount of time, y amount of money" and it's really hard to get them to extend their mindset to "if we aim to complete a task in x amount of time and y amount of resources, we may gain $b but the risk of failure is %a" and adjust the parameters from there. I'd lump tech debt into the risk category -- it's not inherently bad but it tends to increase risk in the future.
ElephantExisting5170@reddit
If we built quality software how are we meant to sell a support contract?
zecvm@reddit
"Neglecting QA is a shame because 90%+ of all students work in a company context." OK, but computer science is not about software development, thank you very much. It's not lacking, it wasn't missed; it's just not a part of it.
Dan13l_N@reddit
But people don't want to be scientists. They want to develop software and earn money. From my experience, very few people are interested in the science part in any field.
joninco@reddit
We are taught that building quality software takes too much time and sales has already sold some new feature that doesn't exist yet.
Fandango1968@reddit
True quality software died with the mainframes back in the 90s.
azizfcb@reddit
i agree
pjmlp@reddit
Depends on the university, my Informatics Engineering degree surely had optional lectures about writing quality software.
So the opportunity to learn was there.
TheRNGuy@reddit
WIsh someone taught google and twitch teams how to make sites.
titanicx@reddit
I don't know about nowadays, but when I was in college they taught people how to build good-quality software and good code.
Gwaptiva@reddit
You are never taught this, if you are smart enough you pick it up over time by learning lessons from things that go wrong and things that go right. It's why you pay experienced developers more than inexperienced ones.
goranlepuz@reddit
We feel this is true, but the numbers are made up, we don't have them, do we?
And this is in part titled "speak about money". One can't speak about money with a flick of a finger.
Preparation, with past performance data and forecasts, is needed first.
wang-bang@reddit
Posting to read later since save doesn't work
Massive-Computer8738@reddit
“Move fast and break things” the PM said
DreamHollow4219@reddit
I have no idea how much this affects me since I am self taught and usually self-test my own programs...
Melodic-Equivalent-2@reddit
Downvoted for offering a different perspective. Good job reddit.
norse_dog@reddit
Well, or you are taught how to build quality software, and then your lead overrides your drive for quality with a drive for deadlines and good enough quality. And once you finally become a lead yourself and want to do things right for your team, your manager in turn will ensure that you won't be able to ;)
StendallTheOne@reddit
I disagree. The information is there, but people put the focus on "if it works, it's good for me". So the first things to suffer are easy modification (loose coupling), scalability, security, and so on. Most developers just don't give a fuck as long as they can get the check, because they don't like programming in the first place.
Melodic-Equivalent-2@reddit
Now I'm not against design patterns completely but I very often have seen them applied to things that very obviously should have been more procedural and didn't need to be OOP. Then there are huge knowledge gaps and unmaintainable code when they move onto another project, because the convoluted OOP code doesn't make sense to anyone else.
Melodic-Equivalent-2@reddit
I like programming A LOT but also really don't like over abstraction. A lot of design patterns are used unnecessarily in my experience, leading to very difficult to maintain code. At that point, I'd rather have "messy" code where I have to copy paste a little bit so to say.
70-w02ld@reddit
I took a college basic programming 101 course, and the professor was the same guy that built the smallest operating system; he also states that what we use today, he designed and built. He also pointed out that the accounting software most accounting firms use was developed using QuickBASIC 4.5. I asked a lot of questions. So, I learned how to build quality software.
But you're right. Not many did. My brother went to Silicon Valley college. He learned stuff. Basic stuff. I sat at home studying, and he would explain things to me, answering questions I had. But in the end, he's basically using what I learned. I even pointed him in the direction of wix.com; he loves WYSIWYG editors, and he's a wiz at throwing together a website. They're badass. No other websites like his. Bigger form fields, larger designed thumbnails, badass.
But you're right. I don't think anyone going to school for it is learning how to build quality software. I guess if they raise their hands and ask, that would make all the difference.
spytez@reddit
The world is filled with quality software that has never seen the light of day.
Stopher@reddit
College doesn’t teach that. You have to get that from your job or elsewhere. It’s a ginormous hole in education. The author is right. This should be part of the basics.
Barn07@reddit
ideally you define what is considered good quality with your clients, your team and your manager
sybesis@reddit
Whoa, you're going to rely on people that would rather have the things they asked today to be shipped yesterday to define "quality" of software?
Holmlor@reddit
If it would take you four hundred years to build the software at your pedantic quality level, what good is it?
You must accommodate other architectural priorities and typically the most important of which is time-to-build.
sybesis@reddit
And if it takes a day to release broken code to production, only to start fixing it and repairing the damage as soon as it hits production because of your abysmal quality level, what good is that?
I've been in position where my "manager" would rather have the automated tests disabled, monitoring disabled etc..
To this day, almost 2 years after leaving that place. The monitoring server is still stuck in a boot loop.
Barn07@reddit
I don't like your strawman, and I believe continuous alignment on those things is a good thing.
sybesis@reddit
You're talking to a person who has been told that the pipeline that ran the automated tests was preventing things from getting merged, and so it had to be removed and disabled.
For some people, it seems, only releasing as fast as possible is important.
rornic@reddit
I agree. Businesses need outcomes and projects have different quality requirements. Everything is a trade-off; sometimes quality can be intentionally sacrificed to achieve a better impact. It is important to agree that with the team and customer.
It’s better to be pragmatic and iterate than follow strict dogma.
Holmlor@reddit
Everything is not a trade-off. Deming proved this in the 50's.
For example TDD reduces time-to-market.
rornic@reddit
I was too bold with the word “everything”. I was thinking more in the context of non fundamental “nice to haves”.
If it’s a question of compromising fundamentals, like a suite of unit tests, then the team should push back on it.
Truck_Stop_Sushi@reddit
Just have to get everyone to agree on what those quality standards are. Shouldn’t be too difficult.
edgmnt_net@reddit
What counts as QA, though? Is it only something along the lines of testing, or should we also consider stuff like defensive programming? What about type systems and proofs? Or abstractions that are difficult to misuse? Or keeping scope in check, since at a basic level quality means "fit for purpose"?
bummer69a@reddit
This made me feel good about the projects we've developed/work on
tjsr@reddit
WTF? Yes you are. It's called Software Engineering.
mrbojingle@reddit
Can't teach something no one knows.
franklindstallone@reddit
I've been "punished" career-wise for having taken automation roles that were deemed pleb testing roles, but I think they've made me a far better software engineer. To be honest, as far as career progression goes, a stint in testing, with at least some automation, should come before senior engineer.
Either way we need to find a way for people to be more well rounded and understand the full life cycle of software.
GoTheFuckToBed@reddit
well yeah, every tutorial leaves out error handling
EquivalentExpert6055@reddit
Judging from the software I have seen by people claiming to know these things, I am at the point of saying "okay, fuck it, I'll just break every rule, deliver your fucking feature, fuck tests and go home on time".
Articles like this are an insult more often than not. I WOULD write better code. I want to. I don’t want to spend 3 years complaining to my colleagues that our code base is shit and then another 5 doing it while actively receiving pushback. And it’s the same everywhere.
ImNotTheMonster@reddit
It's the year 2023 and there are still people that don't get these fundamentals....
Holmlor@reddit
You are taught this in Software Engineering degrees, which are master's-level programs.
antirationalist@reddit
These articles have gotten very tiring to me. People complain endlessly that universities did not "prepare" them for business, seemingly unaware that the purpose of a university education was never to do that.
There are many reasons why most developers don't make quality software, but in my view two of the biggest ones are (a) the lack of industrial standards in the software domains where the vast majority of developers work, and (b) methodologies like Agile which put business considerations in front of technical considerations such as reliability, maintainability or other quality attributes. There's a reason they don't use Agile in electrical or civil engineering; you can't put an unfinished product on the market (let alone open an unfinished bridge to the public) with hypothetical future improvements as the justification.
Warm-Carpenter-6724@reddit
I graduate with my BS in about a week, and I have to say I disagree with some of the claims. Yes, I had to take 1-2 courses on hardware and such toward the beginning. However, the focus of my CS degree was software engineering, and I had to take software security / coding-standard classes for each of the languages we learned. Does this mean I now write flawless, perfect code? Obviously not, but saying these things aren't taught is a very broad statement; it depends on the school, the focus of the degree, and the student as well. They can teach you what you should be doing, but if you aren't implementing and following those guidelines in all the classes (that technically don't require it), then you're just putting yourself at a disadvantage and potentially making life harder for your future team/coworkers.