I'm dumbfounded by the number of devs that don't test and stories of devs that don't test
Posted by WenYuGe@reddit | programming | View on Reddit | 574 comments
rollie82@reddit
Not testing what they wrote? Or not writing automated tests?
WenYuGe@reddit (OP)
Not writing automated tests
Light_Beard@reddit
Manager: "Sure as long as we release when we said without consulting you"
WenYuGe@reddit (OP)
Moment of silence to devs who work in these places.
Light_Beard@reddit
Do... do you all not?
photosandphotons@reddit
I thought I worked in a company with some bad testing practices but it’s nowhere as bad as this.
And most of my tech contacts in other companies are more sophisticated…dealing with FAANG and a tier below here.
rocketbunny77@reddit
No. Our product owners know why tests are good and specifically make sure they're part of the definition of done
Cheraldenine@reddit
Where I work the product owners are domain experts, which is nice, but with zero knowledge about good software development.
4THOT@reddit
If you're the software development expert it's your job to say "this will require testing, we will need time dedicated to it" instead of crying on reddit that management isn't reading your mind.
Cheraldenine@reddit
Where I believe tests are needed, I just write them. That's no problem. If there is a problem, it's that colleagues that don't write tests and also cut corners with other things get praise for being so quick. That's annoying.
teslas_love_pigeon@reddit
Sounds like an oxymoron.
U747@reddit
Seriously. I'm an engineering manager and my devs have ultimate control over when it's released. If they tell me it's not done because it's not tested, it's sure as hell not going out.
WenYuGe@reddit (OP)
I haven't even seen/heard of a company like this in real life...
expatjake@reddit
Not for the last 20 years, no. I wouldn’t go along with that culture for more than 10 minutes.
TrainsDontHunt@reddit
This whole place is like a prehistoric Trader Vic's.
stayoungodancing@reddit
👋
joshc22@reddit
I'm fairly certain it's all of us.
katafrakt@reddit
It's not
Maxion@reddit
This right here, we write full test suites for all projects where there is budget for it. Which is like 1/10.
If we'd say no to the 9/10 jobs, we'd all be looking for work elsewhere.
hippydipster@reddit
Budget for it? It's faster and cheaper to build with tests if the project scope is anything beyond a couple months of work.
Maxion@reddit
Many projects work in such a way that the client (or some idiot they hired) builds up a team from random consultants. The PO, which is sometimes the client and sometimes an external PO, handles the project planning. When you as the lead add tickets related to testing, or acceptance criteria related to testing, the PO/client shoots it down and requires you to remove that.
Projects IRL often have a scope of 3-6 months at a time, even if all parties really know that the project will last 2+ years.
Reality is very sad :(
hippydipster@reddit
3-6 months is beyond a couple months, and so it will be cheaper and faster to build with tests. You don't write tickets for writing tests, because it's part of normal dev work, just like I don't write tickets to write a class to inherit from some other class.
Of course, if it's not a unified team working together and agreeing on how to work, then very little is going to function well anyway :-)
Maxion@reddit
Ain't that the truth, a lot of coding advice is written with the baseline that all stakeholders are a) rational b) competent and c) present. Reality is not always like that.
hippydipster@reddit
What advice would you give the irrational and incompetent?
Maxion@reddit
Learn to live with the non-perfect. Learn how technical (or not technical) the people(s) in charge are, and take advantage of that. Either to try to convince them every now and then to sneak in some refactor or some testing. Or, make something overly complicated and claim it takes a long time, and then sneak in some tests or refactoring.
hippydipster@reddit
this sounds like advice to me - are you saying I'm irrational or incompetent?! lol :-)
Comprehensive-Pea812@reddit
writing automated tests is not foolproof though.
Had a coworker who refused to manually test and spent days going back and forth with QA over something that could have easily been discovered by a manual test.
MassiveInteraction23@reddit
What do people mean by “manual” vs “automatic” here?
Are we talking regular, fixed value tests vs property testing and fuzzing?
Or a human plays with the program and checks it out vs any kind of tests written in code?
wk_end@reddit
The latter; I don't think I've ever heard "regular" unit/integration testing referred to as "manual".
In any case, the article is pretty clear:
[etc.]
hippydipster@reddit
Nothing is foolproof.
Don't hire fools.
Comprehensive-Pea812@reddit
Funny thing is, that ex-coworker of mine was considered a top performer, yet still uses a, b, c as variable names.
I guess managers should learn to spot smooth talkers.
narnach@reddit
There are trade-offs, but in general the things that are not automatically tested can easily break undetected that one time you’re not manually verifying it. Or the person who knew how to verify it leaves or is on vacation.
kernel_task@reddit
Unit tests have almost no value. Integration tests and end-to-end tests are really difficult to write, especially correctly so they don't fail spuriously. Tests are extra burden on maintenance, so they better be of value. Lastly, even if you write the most beautiful tests in the world, bugs will still get through.
We're all trying to deliver quality software at the lowest cost and there are trade-offs to everything. I won't pretend I have it right. I'm trying to write more tests in general, but I have my job and its deadlines to think about, not just pleasing bloggers on the internet. Sometimes tests help with the goal of saving me time while maintaining quality, often they do not.
bumblejumper@reddit
In my case, it's neither.
I've worked with smaller devs, independent shops, and small teams for over 25 years. I've yet to find a single dev at any level who consistently tests what they release, and even fewer who test their "fixes" to reported problems.
Even something as simple as, "Hey guys, the form validation won't allow for last names with an apostrophe, can you make a fix allowing this?".
You get back a response - apostrophe issue fixed.
You test the form - apostrophe issue NOT fixed.
This has been driving me nuts for almost 3 decades.
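For illustration, a minimal sketch of the kind of regression test that would pin that apostrophe report down. The validator name and regex below are hypothetical (not from any real codebase), assuming a plain TypeScript setup:

```typescript
// Hypothetical last-name validator; the regex is illustrative only.
const LAST_NAME = /^[\p{L}' -]+$/u;

function isValidLastName(name: string): boolean {
  return LAST_NAME.test(name.trim());
}

// The regression test encodes the exact complaint from the bug report,
// so "apostrophe issue fixed" can't be claimed until it actually passes.
console.assert(isValidLastName("O'Brien"), "apostrophes must be accepted");
console.assert(isValidLastName("Smith"), "plain names still accepted");
console.assert(!isValidLastName(""), "empty input still rejected");
```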
Kevinw778@reddit
I think I've only seen the, "They didn't even fix the problem at-hand" scenario once. The, "I fixed the thing you wanted me to fix, but in doing so broke something else" scenario on the other hand... Rampant.
I just don't understand how one doesn't test the fix they put in... You have to fix the thing, how did you not fix the thing? LOL
bumblejumper@reddit
I don't get it either, but it's so common we built a 2-step verification system into our tools that requires a sign-off from the dev who fixed the issue and from the person who made the original report before the issue is considered resolved.
Looking through our history on a project we've been building and maintaining for just over 5 years, over 90% of bugs reported as "fixed" needed a second "fixed" before the final 2-party "resolved" tag was applied.
If that wasn't clear.
90% of the time we get a "fixed" from the dev team, it wasn't fixed.
This is a smaller team that has ranged between 3 and 7 people, all of whom I know personally.
We talk about this in person at meetings, over drinks, at the bowling alley. It never changes.
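A tiny sketch of what that two-party gate might look like in code. All names are hypothetical; this is the idea, not their actual tooling:

```typescript
// A bug only reaches "resolved" after two sign-offs: the dev's "fixed"
// and the original reporter's confirmation.
type BugStatus = "open" | "fixed-by-dev" | "resolved";

interface Bug {
  id: number;
  status: BugStatus;
  fixedBy?: string;
  confirmedBy?: string;
}

function markFixed(bug: Bug, dev: string): Bug {
  return { ...bug, status: "fixed-by-dev", fixedBy: dev };
}

function confirmResolved(bug: Bug, reporter: string): Bug {
  if (bug.status !== "fixed-by-dev") {
    throw new Error("a bug can't be resolved before the dev marks it fixed");
  }
  // Second sign-off: only the original reporter's confirmation yields "resolved".
  return { ...bug, status: "resolved", confirmedBy: reporter };
}
```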
Kevinw778@reddit
What else would you have to shoot the shit about, if not for people not properly doing their jobs?
rollie82@reddit
Thin line between confidence and hubris.
wrincewind@reddit
Nah, couldn't be me. I'm too powerful to be affected by hubris.
Perhaps... more powerful than the gods??
mi11er@reddit
"Works on my machine"
Classic-Try2484@reddit
Windows every time
foursticks@reddit
It's sad that I can tell you actually believe this, even if it's only deep down.
wrincewind@reddit
(it's a joke about the original definition of 'hubris', in the greek sense :p)
mike0358@reddit
lol
RevanchistToast@reddit
Look, humility is for the humble all right?
BlandInqusitor@reddit
This is horrifying
Xemorr@reddit
If it's their full time job, fair enough.
If they're doing it for their own enjoyment on the side, then they don't have to guarantee perfection.
bumblejumper@reddit
I mean, no one is asking for perfection.
Seems odd you think it's ok for an employer to make a specific, very small request, and get back a response that the task has been completed, when it obviously hasn't.
Xemorr@reddit
You misunderstood my response. I meant that it's fair enough to have that grievance if it's their job 😂😂 and not if they're not being paid
tomvorlostriddle@reddit
Now it shows an error message to the user saying
Bug sent for rework
next iteration with error message
...
crinkle_danus@reddit
This is the same for PRs as well. Dev said they already resolved the comment, then it turns out it's not yet resolved and the dev forgot to push their code. One minute later they pushed the code, and it's still not resolved.
wonder_grove@reddit
I feel we are missing better review tools here. For me, I get quite a few comments on any of my PRs, depending on the reviewer. Many of them are 30-second changes, nothing that requires a lot of thinking. For these, I need a way to go through them and mark them as resolved for myself. The reviewer, of course, also needs a way to confirm that my resolve is what they wanted. I think we need tools with 2 resolve check marks.
Excellent-Cat7128@reddit
This is why I have a rule that only the reviewer can resolve comments. I worked somewhere where the submitter would resolve comments when they felt they had fixed it. Often they hadn't fixed it or they hadn't pushed. Forcing the reviewer to make sure it is actually done and on the branch made a difference. Of course that requires non-lazy reviewers...
lally@reddit
Yeah that's one weakness in PRs. The author wants a checklist so they can mark off when they've done something but before they've published it to the PR. The reviewer wants their items checked when pushed.
Excellent-Cat7128@reddit
What we did was have the author respond to each item as they fix it, but not submit the responses except at the end as part of a "review" (as GH and GL call it). The same with the reviewers' responses. Or they can be done asynchronously and the author can leave a comment immediately as something is addressed, so long as it is pushed and the test environment is updated. These approaches have the side benefit of clearly documenting the issues that came up and how they were resolved. If stuff is marked resolved without commentary, then someone might not have done their job and it stands out.
Turbots@reddit
Pair programming solves most of your frustrations. Been doing it constantly for many years, it's so much more efficient than doing peer reviews after the fact.
Excellent-Cat7128@reddit
I've had good experience though I haven't done it a ton. It can be tiring so I think there needs to be a way for individual work to still be validated.
Turbots@reddit
For sure, it's really hard to do more than 6 hours of intense pair programming a day, usually about 3 before noon, 3 after...
And yes, individual work needs to be reviewed as well, but like... Don't do it like you described lol.
People basically need to become more mature and responsible in their jobs. I know, crazy right. Many devs are just lazy or careless and there are typically not a lot of implications to them when they are.
How many times has someone REALLY been reprimanded for working like you described? In any other job, this would result in getting grilled by your boss and if it kept happening, threats to get fired. And rightfully so.
Excellent-Cat7128@reddit
When I was a team lead, I did push hard on people doing good reviews, and it worked. They would take longer, but a lot of issues were found and dealt with. We also had QA do a run through before MRs could be approved and merged.
But it does take a culture that is serious about it and a lot of devs are lazy and entitled. We see it here in these types of threads where people make it sound like any sort of process is unnecessary "context switching" and a waste of time. Meanwhile, their job is churning out React components.
Turbots@reddit
Agreed 💯
Dx2TT@reddit
The truth is that the majority of devs don't know how to dev. The problem is that the majority of dev managers also don't know how to dev manage, so they can't figure out which devs are which, because they can't tell whether a problem took a long time because it's actually hard or because the dev fucking blows.
wrincewind@reddit
I don't like solving problems twice and I definitely don't like being whined at by end-users, so you can guarantee that I double-check my code does what I think it does. (....most of the time. Sometimes I forget to test every aspect of a big change, but at least I'm trying!)
jamart227@reddit
As a QA responsible for testing the offshore devs' code, this is the most painful part.
Made a bug fix request the other day to change the form's email length validation from 355 to 255. Later that day I get a message that it is fixed, I check the form, and the validation length is now 325555...
bumblejumper@reddit
That's not what you intended? /s
BasicDesignAdvice@reddit
Any advice on finding "smaller" gigs? Everything I see is big corporations. Not even as many startups these days (and it's always seemed like you need to know someone for those).
bumblejumper@reddit
Network.
I used to write my own stuff 25 to 30 years ago. 95% of my hires are people I either know from real life, or people that were recommended to me.
That said, I'm probably not a great example. I'm not looking to grow to 100 million in revenue on any project - it's just not who I am. I build, grow, and either sell, or move on to something else pretty quickly.
Brilliant-Sky2969@reddit
Also: "it works locally"
bumblejumper@reddit
You're not wrong.
As if I give a fuck it's working locally. ;)
Fennek1237@reddit
And something like this wouldn't even require a formal technical test, just opening the form yourself and entering the apostrophe. I know this well myself. Even when you explicitly tell them to please check if their code is working and go to the frontend themselves and check it - they won't do it.
bumblejumper@reddit
Right, I'm not even talking about a technical test.
I'm talking about doing the absolute, bare minimum. Blows my mind how the bare minimum isn't even being done.
Michaeli_Starky@reddit
Out of curiosity, which country are you from?
bumblejumper@reddit
The US.
Over the 25+ years I've either been in development myself, or was the manager of a team, or the guy coming up with the ideas for a startup - it has never changed.
Big firms, small firms. Guys from the bar, guys from school, guys I've hired through recruiters, or people from overseas.
It just feels like it's something in how the mind of a developer works.
They look at the code, see an error, and resolve it. In their head, it's resolved. The code "should" work.
Pious_Atheist@reddit
Sounds like a culture problem. There are tools (like husky, GitHub Actions) that allow you to prevent code from being committed or a PR from being accepted if code coverage isn't at x, or unit tests aren't created, etc.
bumblejumper@reddit
Honestly, this has happened to me when working one-on-one with guys I hang out with on a personal level, and teams I've hired overseas who I have nothing but a working relationship with.
It just feels like most devs look at the code, adjust, and assume the error has been corrected because the code looks right.
Looking right isn't enough. That's why testing exists.
Pious_Atheist@reddit
Also, no excuses in the era of AI copilots.
mccoyn@reddit
I know this is a quick example, but it isn't a good problem description. Maybe there are two apostrophe issues, the dev fixed one and you tested the other.
A good bug report has steps to reproduce, expected results and observed results. Don’t make it a guessing game.
bumblejumper@reddit
Says the guy who doesn't correct the mistake, then says "I thought you meant something else". ;)
rulnav@reddit
Well, at the opposite end of the spectrum, there are automotive and medical where even the smallest changes take weeks or even a month to get integrated, going through >85% coverage unit testing, component testing, peer review, QA and then a meeting with the various owners and integration teams to explain what you have changed and why, because Jira is not enough.
omz13@reddit
In automotive and medical, if you screw up people die. Which is why the testing regime is strict. And the development/qa costs are astronomical.
rulnav@reddit
I agree and it certainly should be that way, but you can't deny it's not very developer friendly. That's why I gave it as a contrast. What I was saying is like: "you think you want robust testing, but what you really want is somewhere in the middle".
457583927472811@reddit
It shouldn't be developer friendly. I'm tired of things becoming 'developer friendly' in spite of all the issues it causes. Moving quick means jack all when you ship broken code that causes a security breach or worse someone's death. Developers need to truly realize what they're responsible for and the implications if they fuck up.
Redleg171@reddit
And still not work right after all of that.
Particular-Key4969@reddit
This is the realest comment ever lol
Capaj@reddit
well it's hard to keep it working when it drags on for months and you need to resolve 1000s of merge conflicts along the way
Cheraldenine@reddit
I've had a tiny bit of contact with that sort of thing and then I ran away.
What I always wondered was: do they really do all that process for every change, even when a new application is written from scratch? How can anything be finished at all?
The_No_Lifer@reddit
I wrote code for manufacturing equipment for a med device company. We didn't do any validation until the initial qualification, which is needed before it can touch product that goes inside someone. Once it was qualified, ANY change needed to be validated before it was qualified again.
We would usually take the machine down, test, revert to the old, write the validation docs, then install the updates. It's horribly slow and we would often include QOL upgrades with bugfixes because they wanted the QOL for months but now have justification to take the machine down
For an initial qualification of a ~300k machine (including labor), the test validation doc was 90 pages
MrRogers4Life2@reddit
Where I worked you would have periodic test windows where you batch groups of changes. Say you release every 3 months: for a small change you might identify the tests you think are risky and just run those and merge it into a release branch; at month 2 they'll start to run the full battery of tests and you can remove/update problematic changes and they'll rerun the whole suite. Changes that are dropped will be picked up in the next cycle.
TheEndDaysAreNow@reddit
If it never reached prod, it never blows up prod.
rulnav@reddit
There might be a gradual tightening of the noose. But a couple of months before release, that process would already be 99% in place.
gareththegeek@reddit
You would go through all that process right before initial release, waterfall style
Cheraldenine@reddit
Right. I was in a healthcare startup with four founders plus me as the only dev. They were getting certifications so had a QMS (copied from another startup in the same building that was a bit further along), which required a whole process for every change. Of course the founders were in the "R&D" department where it wasn't necessary, but I was the "Development" department and needed to get every commit through the change committee (them), for a Django app I started from scratch.
They're actually successful now, but I'm so glad I ran.
f10101@reddit
Yeah, that's deranged. They must have misinterpreted "change" in the QMS they cribbed. Easily done, I guess - the documentation is very dense.
BigHowski@reddit
Same with ERP here. I don't recognise this to be the case in my industry at all. Sure, you occasionally get the odd "fixed" that turns out not to be, but mostly we have so many processes that, as long as the process is accurately described, the mod/fix does what it should.
Loves_Poetry@reddit
I have also encountered this situation way too often and it is shameful to have to address it
I see a PR with changes that claim to fix something. I see something that catches my eye that I find suspicious. I check out their branch to see if it really works. App crashes on startup.........
Yes, they just threw it over the wall without even bothering to start the app. I wasted a lot of time carefully reviewing something only because some other dev was too lazy to even start the app they are developing
no_spoon@reddit
It's not up to the dev to decide tests need to be written. It's up to the project spec and thus the PM. Clients don't give a flying fuck about your tests. They want to know whether it can be done on time within budget. Do you want to be the person to say "it's going to take me twice as long because I always write tests"?
This is all common sense.
VulgarExigencies@reddit
It is up to the dev to say that tests are non-negotiable actually. If a PM ever told me there wasn’t time to write tests I’d politely tell them no
MaleficentFig7578@reddit
It's up to the dev to NOT EVEN TALK ABOUT TESTS. Do what you need, to make it work. This includes tests. You don't talk about your IDE and font size do you?
wrincewind@reddit
of course I do, how else are we to ensure total corporate consistency? :¬)
no_spoon@reddit
Ok, we’ll consider that during our next round of layoffs
hippydipster@reddit
Writing tests is how I write code, so as the dev, it is up to me and me alone. There's no "twice as long because I write tests", there is simply how long it takes.
no_spoon@reddit
I wouldn’t hire someone with that logic.
hippydipster@reddit
No worries, wouldn't want to work there.
no_spoon@reddit
Let’s take a step back. Are we talking web dev or something else? If it’s web dev and you need to write tests for everything, I wouldn’t touch your resume with a 10 ft pole. Web devs are problem solvers, not stuck up pretentious nerds who need things done their way. You lack any insight as to the actual problems you’re solving, not just for the code base you’re working on, but also, more importantly, the people.
Also, code coverage is meaningless. I make my devs write code. Then later on in the project, when a test is failing for some stupid reason that has nothing to do with anything because the project specs have changed (which they do), instead of having my dev complain about the failing test, I evaluate whether we need it to begin with and I’ve been saying “just delete the test” more often than not.
hippydipster@reddit
Thanks but no - not with someone who makes personal attacks so easily.
no_spoon@reddit
lol. Wow. Nothing personal dude. I just see your type often. You’ll fit nicely in the corporate space where you’ll feel like you’re making a difference and your boss is happy, but really you’re just a number. You’ll have the capital to allow for your over engineering, but that could change so I’d be careful.
BornAgainBlue@reddit
I have had the same experience. It really affects trying to get a job when you have to admit that no job you've ever had implemented unit testing. I FINALLY got a job that used it, and of course they downsized and I got laid off.
Scroph@reddit
Sometimes the fix doesn't get deployed to the right env, other times it's caused by a regression. I used to include screenshots in the corresponding Jira ticket before sliding it to the next column just so that I don't lose my sanity if it gets brought up again
MasterLJ@reddit
I remember counting the "cycles" at a previous employer.
A "cycle" is where I report a bug, they claim they have fixed it, I go to verify and it wasn't fixed.
It was 18 cycles before I lost my cool. My last day at that job, it was over 50. Exact same bug. Each cycle took me about an hour.
zelphirkaltstahl@reddit
Sounds like someone just biding their time, as long as it works and they don't get found out. Or simply incapable. Even in situations where there were no tests, I at least tested manually that my changes fixed the previously wrong behavior. This is just someone being careless or lazy.
GoTheFuckToBed@reddit
hm I wonder if I should filter for this during hiring
Katut@reddit
I find it depends on the workload.
If you swamp your Devs in tickets with unrealistic deadlines, you will get that behaviour.
bluemaciz@reddit
We fired two devs this year, both above junior level and one was very senior, because of a refusal to bench test.
smallballsputin@reddit
Junior devs, sloppy devs, or "i only know php"-devs.
They do the bare minimum and have no passion for what they do. These are common no matter where you work.
Cheraldenine@reddit
I've been that programmer -- but the workload there was very high, and spending more than a few minutes on an issue like that was hard.
sillymanbilly@reddit
“But it works on my local”
zman0900@reddit
95% of code reviews I do - "please add unit tests"
testedonsheep@reddit
It's probably fixed on staging or testing environment.
tomvorlostriddle@reddit
It's often an agency conflict where the dev is incentivized to push out as fast as possible, or thinks that's the case. And where any quality issues of any kind are never the dev's fault.
I've had devs who push it to me for manual testing the second they wrote the last line of code. Bonus points if at that moment they also already message the client and/or my boss with "it's done".
Then I find that, for example, they were calling their own existing methods from their own new methods with a typo in the name, so it couldn't ever have worked in any scenario, and they get frustrated that I'm hindering the process, and my boss gets frustrated about why I'm still holding it all up...
Yeah, so, you don't want any of that.
debugging_scribe@reddit
When I joined my company they had zero automated tests and it was lucky if they tested it manually for 10 minutes. Thankfully the lead dev at the time was open to improving things. The first 2 years here were just continually putting out fires. The code base was a decade old at that point. So you can imagine it took time. I still work here and it's so chill these days. It's weeks between issues and a year since the last major one. It's amazing how much automated tests save your arse.
SippieCup@reddit
Decade old is pretty new!
jtinz@reddit
At my company, they've started more testing. Only they're literally not testing what they wrote. They're now writing unit tests for Angular with mock services providing mock observables / pipes, whose behavior is completely unlike that of the actual service. They're testing behavior that only exists within the tests.
And it gives them the confidence to implement, review and accept merge requests without actually trying out the code. Sometimes the development branch is now unable to load any page at all, which didn't happen before.
deong@reddit
That’s been my consistent experience with unit testing. People go all in on test coverage, and a bunch of shitty programmers write endless junit tests to make sure that integer addition and the built-in ArrayList methods work.
Loves_Poetry@reddit
Testing built-in methods gives zero coverage, so I have no idea what those devs are doing. But they're not creating more test coverage
deong@reddit
They're probably not directly writing a test on ArrayList.add. They've probably written a bunch of trivial things like getters and setters and methods like "addCustomer" that just delegate to a built-in call (something like the sketch below),
and then written unit tests for all that stuff. Which is equivalent to just testing the built-in, but obfuscated just enough to feel warm and fuzzy about it.
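The commenter is talking about Java/JUnit, but the shape is the same in any language; here's a hypothetical TypeScript version of that kind of test:

```typescript
// A trivial wrapper: addCustomer does nothing but delegate to Array.push.
class CustomerRepo {
  private customers: string[] = [];

  addCustomer(name: string): void {
    this.customers.push(name);
  }

  count(): number {
    return this.customers.length;
  }
}

// The corresponding "unit test" only re-verifies that Array.push works,
// dressed up just enough to look like coverage.
const repo = new CustomerRepo();
repo.addCustomer("Ada");
console.assert(repo.count() === 1, "addCustomer adds a customer");
```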
hippydipster@reddit
You need better mentors then.
deong@reddit
I don't necessarily disagree, but "you need to be better" is kind of a universal solution to any problem. It's kind of hard to put into practice consistently though.
Like if you have great engineers and great mentors and great management structures, almost any development process works really well. If you don't, you might need help from tools and processes, and I just think that a major focus on unit testing hasn't necessarily seemed all that successful in filling that role in my experience.
hippydipster@reddit
There's frankly a lot to learn, and it shouldn't be surprising that the words "unit testing" do not convey the full information necessary to understand how to write automated tests.
That's not true though. The best engineers in terrible processes and environments can't do very well either. It's like saying geniuses don't need a scientific method, empirical experiments to verify their hypotheses, double blind testing, peer review, etc. But, of course they do. An idiot can't do science with all that, but a genius can't do science effectively without it. It's all quite similar.
Can you, or anyone, write 40,000 lines of code without ever running it and deliver and have it work? Even a "great engineer"? No. You run the code to make sure what you wrote works. That's testing. Most likely, you run the code A LOT to test your work. Automating that just makes sense - speeds up the process and makes it more effective.
deong@reddit
That's kind of a dumb strawman to set up. I'm obviously not suggesting that you not test or even run your code. I'm specifically talking about the culture of unit testing and test coverage. You can be an incredibly good tester and never write a single unit test. It will almost certainly be easier to maintain correct code over time if you have some automated tests in place, but for most of the history of software, we tested stuff by just writing code and doing ad hoc functional and integration tests. I don't see where software got any better when automated testing became de rigueur in the industry.
Again, not saying there's no value in it. There is. I'm saying it isn't a replacement for just making sure your software works, and there are lots of people out there who think that just writing tons of unit tests accomplishes that, and it doesn't.
hippydipster@reddit
Well, maybe discuss it with them then.
deong@reddit
I have a job. I'm not quitting to travel the world evangelizing to other people's engineering staff.
VulgarExigencies@reddit
More and more I favor tests that do not mock anything except the external dependencies of my applications (databases and message queues using docker containers, HTTP APIs with wiremock, toxiproxy to simulate network failures), and test everything via my application’s API, rather than unit tests that calcify my codebase and make it a pain in the ass to change without actually giving me much confidence in what I’m doing
marxama@reddit
This is what I do, too. I've even built this whole thing for our system of services, where the exact same tests can be run in two different "modes" - in one mode, we start e.g. postgres and Kafka etc using Testcontainers and docker, and start up our services as actual HTTP APIs, and the tests then make actual HTTP calls, and the services make actual DB calls, etc.
Running in this mode usually takes more time than you'd want for frequent iteration though, it's more used in our master build just to make sure.
But then we have the other test mode, where we have mocks "on the edges". So a mock replacement of postgres, Kafka, etc. Still functioning, we're not mocking each individual DB call or anything, it's a "real" (but limited) in-memory DB and everything. And we don't make actual HTTP calls and so on. Still the exact same tests, and still the exact same application code, but the tests are an order of magnitude faster to run.
I've been extremely pleased with this approach, it makes me and the team very productive and actually confident in our tests. I've spent a lot of my career focused on test automation, and it's EXTREMELY challenging to get it right, my experience really resonates a lot with one of the parent posters, there are so many developers writing tests that only test the tests, all just to fill a quota. Or maybe they even believe that they are doing something useful, and can't see the holes in their approach...
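A rough sketch of that two-mode idea. All names here are hypothetical; it assumes a Node-style environment where "real" mode gets a database started elsewhere (e.g. by Testcontainers or docker) and handed in via an environment variable, while "fast" mode uses an in-memory fake on the edge:

```typescript
// Port the application code depends on.
interface UserStore {
  add(name: string): Promise<void>;
  count(): Promise<number>;
}

// Fast mode: a functioning in-memory fake "on the edge".
class InMemoryUserStore implements UserStore {
  private rows: string[] = [];
  async add(name: string): Promise<void> { this.rows.push(name); }
  async count(): Promise<number> { return this.rows.length; }
}

// Real mode would return a Postgres-backed implementation pointed at the
// container's DATABASE_URL; stubbed here to keep the sketch self-contained.
async function createStore(): Promise<UserStore> {
  if (process.env.TEST_MODE === "real" && process.env.DATABASE_URL) {
    throw new Error("wire up the real Postgres-backed UserStore here");
  }
  return new InMemoryUserStore();
}

// The same test runs unchanged in either mode.
async function testAddingAUser(): Promise<void> {
  const store = await createStore();
  await store.add("Ada");
  console.assert((await store.count()) === 1, "one user stored");
}

testAddingAUser();
```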
nukeaccounteveryweek@reddit
Amen.
cstoner@reddit
This is the way.
TestContainers are a real game changer on this front. It's worth making sure you can run them on the build server.
I still do unit tests of units that are non-trivial/where I want some insurance in the future that changes won't break expectations, but it's way too easy to spin up most external dependencies in testcontainers these days not to.
taedrin@reddit
Unit tests are fine. But they aren't a replacement for integration tests.
LiquidLight_@reddit
Can you scream that at my entire leadership chain both business and tech? Because this project is like 7 years in and we have no integration tests, no end to end tests, only unit tests and manual testers.
TangerineSorry8463@reddit
You can have some sort of SonarQube or other static linter that checks which lines of code are actually executed.
jtinz@reddit
How would that help?
Here's an example: On navigation, a NavigationService provides a value through a pipe. The "page" in the SPA subscribes to the pipe and expects a value. Without it, the page won't load. But the navigation occurs before the page is created. Therefore, the page will never receive a value and as a result won't load.
The test implements a mock NavigationService. Its mock pipe provides the value on subscription, and the page loads just fine.
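A condensed sketch of that failure mode, with simplified, hypothetical classes (not the actual code being described). The point is that of() replays its value on subscription while a plain Subject does not:

```typescript
import { Observable, Subject, of } from "rxjs";

// Real service (simplified): emits the navigation value on a plain Subject.
// Values emitted before anyone subscribes are simply lost.
class NavigationService {
  private values = new Subject<string>();
  readonly value$: Observable<string> = this.values.asObservable();
  navigate(id: string): void {
    this.values.next(id);
  }
}

// The "page": it only loads once it receives a value.
class Page {
  loaded = false;
  constructor(nav: { value$: Observable<string> }) {
    nav.value$.subscribe(() => { this.loaded = true; });
  }
}

// Unit-test mock: of("42") emits on subscription, so the page always loads...
const mockNav = { value$: of("42") };
console.log(new Page(mockNav).loaded); // true - the test passes

// ...but in the real app the navigation happened before the page was created,
// so the subscription never receives a value and the page never loads.
const realNav = new NavigationService();
realNav.navigate("42");
console.log(new Page(realNav).loaded); // false - broken in production
```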
BenAdaephonDelat@reddit
I've had 8 jobs as a programmer. I've only ever done automated testing at 1 place. I would love to have done automated testing, but most jobs have pre-existing code that wasn't built for testing and the managers are never willing to take on the tech debt of making automated testing work.
ThisIsMyCouchAccount@reddit
I was on an internal team making a system that pushes/pulls data from all our other business systems.
It was pounded into me that this *had* to be correct.
Great. Can I write tests?
Absolutely not. I guess it's just better to manually test workflows by resetting data in the database or just using one piece of manual data.
And while you're right - it would have taken some refactoring - it was still very early in the project. Plus, it was 99% API. It would have just been breaking out some of the larger data processors into testable methods.
Opposite-Piano6072@reddit
FYI implementing automated tests isn't taking on tech debt, it's paying it back. The tech debt is already there.
PayDrum@reddit
It doesn't work much better for new projects/codebases either. I'm a consultant and get to work on a whole new project every few months. The deadlines and budgets are always so tight that writing automated tests is almost never possible within the schedule. Good luck convincing the client to pay more to extend the project timeline.
sockpuppetzero@reddit
Yep, design for test is a real thing. Critically important in the realm of PCB design, as you have to literally test a unique physical object that may have unknown manufacturing defects, and you have to test these boards quickly and reliably at scale if you want to go into mass production.
falconfetus8@reddit
If you actually read the article, he's referring to automated tests.
tofagerl@reddit
Well, first you test manually - then you learn to automate. Then you learn to smoke test. Then you build CI/CD pipelines with smoke tests. Then you skip the manual tests. Then you skip the automated tests. Then you skip the smoke tests, because when's the last time they ever failed?
Then you learn to test manually...
Excellent-Cat7128@reddit
Do you actually start skipping these tests? I've never had that happen. Maybe a test that is no longer needed goes away, but otherwise every automated test that gets added stays in the pipeline. And invariably, at some point in the not-too-distant future, it fails and a prod issue becomes a staging issue.
RupertMaddenAbbott@reddit
The post already addresses this point.
This is a bit like standing up off the couch and claiming to have exercised. Whilst technically correct, it distorts the meaning of "exercise" into something that most people would find unrecognizable compared to the real thing.
Similarly, it's not just "not writing automated tests". It's watching somebody do the equivalent of standing up off the couch and claiming to have exercised. The value of this "testing" is so low that it barely qualifies as testing, and to accept it as such distorts the very meaning of the activity.
Tests (manual or automated) can provide three distinct pieces of value:
It's not that manual tests are never correct - sometimes that is the cheapest way to test something. It's just that you can't leave your test plan in your head and on your dev machine, unless you are happy that the behavior is throw away, or that it is acceptable for it to be regressed.
So when we say "this hasn't been properly tested", we don't literally mean that a developer hasn't gotten off the couch. We mean that claiming this to be "testing" says more about their understanding of that activity than it does about the value that has actually been achieved.
gelfin@reddit
Doing neither is shockingly common.
So, I don't think it's a secret to anybody in here that SWEs often have a really bad case of "smartest kid in class" syndrome. "I can see no reason this would break" is kind of the default position among many of them, to the point that "look and make sure" doesn't even occur to them. I once worked at an org that had a sort of "honor system" policy baked into the pull request template. The dev had to check boxes confirming they had done commonsense things like:
This was more effective than you might expect. People who were "too busy" to sweat the details more often than not forgot to deal with the checkboxes too. There were few cases where people outright lied, because dishonesty is not the failure mode here. Overconfidence and lack of rigor is.
A close friend is a QA lead. She’s been working for over a year trying to get the organization on track doing proper automated testing, but is constantly hamstrung by the rest of the dev organization. This is an org with a culture of testing things in prod, just tossing code over the wall, keeping low-confidence initiatives secret (individually or within the team) and making drastic changes without telling anybody. The result is an ongoing quality dumpster fire. My friend is an experienced automation engineer, but she’s reduced to playing a schoolteacher nagging students into doing their homework just so the product can limp through manual testing. I can’t tell you what product they’re producing without risking identifying details, but it’s one where failure in prod can be a pretty big deal. Like potentially “human lives” big in the worst case. I am actively repulsed by management, but I have never wanted to step in and knock an organization into shape more than this one.
And I only wish I could say organizations that operate this way were unique, or even rare, in my experience. The “cowboy” mentality is alive and well.
Log2@reddit
I wouldn't even say it's shockingly common. It's the de facto norm. The better companies have tests, everyone else doesn't. The better companies are a vanishingly small percentage of all companies writing software.
ben_sphynx@reddit
I've definitely seen evidence of merge requests where the code does not run. Strongly suggesting that no, the dev did not test it at all.
dravonk@reddit
Writing good tests is hard and unfortunately rarely taught well.
In object-oriented programming, if you are writing tests for internal classes you are effectively blocking refactoring rather than enabling it, as is often advertised. You would need to test against some sort of API that is supposed to stay constant for a very long time. Depending on how clean the architecture is, identifying what is "internal" and what is an "API" can be a challenge on its own. (A video I recently watched: TDD, Where Did It All Go Wrong)
A test should ideally only test one fact and not break when something else is changed. It is little help when you want to change one minor thing and hundreds of tests fail.
So no, I am neither dumbfounded nor surprised by the fact that many devs don't test, when their personal experience so far has been that most tests caused more trouble than they actually helped.
Fearless_Imagination@reddit
I wish I could upvote this more than once.
People really need to stop writing negative-value tests. I call them negative-value because someone spent time writing it, then when it fails I need to spend time identifying if it failed because an implementation detail changed or because actual behaviour changed, and if it was the former, either change or delete the test.
And in the end people spent probably a couple of hours on the test, and we gained precisely nothing from it, because a test for the actual behaviour on the public API also still exists anyway.
Private_Kero@reddit
Can you give a concrete example for your negative-value tests?
Is it about things like assert func() == value?
My biggest hurdle in testing is APIs, because I still don't quite understand what mocks are good for, and I have the impression that you build fake (mock) objects here that supposedly test something (feels like cheating).
My impression is also that there is little content, especially with regard to testing, that provides non-trivial examples beyond the typical add(x, y).
Fearless_Imagination@reddit
I don't have time to write up a non-trivial example right now, but:
No, it's about writing tests on implementation details. For example, you factor your code into a few private functions, then write tests on those specific functions. Then you later decide to refactor those functions into a separate class... and now all the tests you wrote on those functions are failing, despite the fact that you haven't changed how anything actually works. But now you have to check if these tests are indeed only failing because you wrote them on the wrong level of abstraction, or because you actually did accidentally change behaviour while refactoring.
Mocks are for the things that are external to your code, like another API or your database. You use dependency injection to inject some kind of adapter for those things into your code, and then mock that during your unit tests. The main benefits are that this allows your tests to be fast (no latency) and independent (don't need the external API to be up), and allows you to more easily fake error responses. The downside is that this approach doesn't quite cover everything and you will need to test with real implementations as well, as the real things might behave differently than your mocks of them, although that is usually done in a different set of tests.
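A small sketch of that second point, in plain TypeScript with hypothetical names: the external dependency sits behind an interface and is injected, so the unit test can replace it with a fake while only exercising the public behaviour.

```typescript
// Port for an external dependency (e.g. an email API).
interface EmailClient {
  send(to: string, subject: string): Promise<void>;
}

class SignupService {
  constructor(private readonly email: EmailClient) {}

  async register(address: string): Promise<boolean> {
    if (!address.includes("@")) return false;
    await this.email.send(address, "Welcome!");
    return true;
  }
}

// Unit test: the real email API is replaced by a fake, so the test is fast,
// needs no network, and can observe the side effect directly.
async function testRegister(): Promise<void> {
  const sent: string[] = [];
  const fakeEmail: EmailClient = {
    send: async (to) => { sent.push(to); },
  };
  const service = new SignupService(fakeEmail);

  console.assert(await service.register("a@example.com"), "valid address accepted");
  console.assert(sent.length === 1, "welcome mail sent exactly once");
  console.assert(!(await service.register("not-an-address")), "invalid address rejected");
}

testRegister();
```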
IAmMuffin15@reddit
You don't even have to write tests. Just use the debugger.
Alternative-Link-823@reddit
Until you get to integration testing, which is just as important and even more difficult to do well.
FoxyWheels@reddit
The #1 reason devs don’t write tests is management not giving time to do so, even after the devs heavily advise against it. Then 3 years later everything is going to hell so the devs leave and new devs are hired. Repeat until retirement.
Feeling_Employer_489@reddit
Agreed. This blog is the same-old obvious argument why testing is a good idea. I think most decent developers understand that testing is a good idea.
The problem is that testing is a public service, a benefit to the team but a harm to the individual. If you write tests, you will be slower than cowboy coders who don't and have more work on account of needing to fix or add tests to the cowboy code. A manager is not going to care about tests ("just don't write bugs"), so you need some other process or leader to enforce testing.
WenYuGe@reddit (OP)
Off topic: how do you justify testing to management and demonstrate value?
stahorn@reddit
You don't have to justify testing, or shouldn't have to at least. It's like asking a carpenter to justify measuring before cutting. If you have managers that don't understand that code must have testing, then I hope you eventually find yourself a workplace where the managers do. It's the same argument as for writing maintainable code.
Sometimes we should write quick and dirty code though, with few or no automatic tests. Usually if you want to have a small test tool for yourself or your closest colleagues, or make a quick experimental piece of software to try something out (sometimes called a "spike"). Making sure that these programs don't become real products can be tricky though...
superide@reddit
Workplaces where the managers understand that code must have testing also, unfortunately, don't understand that the other managers who don't are usually out of the control of the developers working for them. That has been my experience when being interviewed and asked about testing, anyways.
The result is managers not realizing that testing is an activity that has to be learned on the job, but only accepting people who already have this testing experience (catch-22 rears its ugly head again). Developers who test and those who don't remain segregated, because they tend to take the fall as individuals for the lack of something that requires a team effort.
You're correct here - you don't have to justify testing. But you probably will have to justify not doing it well at first, picking it up at a rate they're comfortable with, and show that you're not a cowboy messing their well-structured shit up. I'm not aware of any workplaces that "sandbox" their new hires so that risk is minimized while they acclimate to their practices.
pemungkah@reddit
You can always point out that you have to test it anyway, so writing that automation saves time, as you don’t have to spend programming time manually repeating the tests and possibly missing one.
winkler@reddit
I always quip, “will AI write unit tests when it takes our jobs?”
Honestly, while I support them, I don’t feel like they are 100% necessary. Code is documentation and all that. Takes time and the right team to get right.
As a manager you build it into the estimates, which means you have estimates, which means you have top-down support. It took my last boss 3 years to convince upper management to adopt basic agile and he had been there for 10yrs.
All this to say you build trust with non-engineering and then do what you want to deliver what they want when you want.
BasicDesignAdvice@reddit
Collect data that proves people are wasting time. Convert that time into money.
The answer is always to put it into money terms.
I once got four engineers to drop what they were doing for three weeks so we could move a Jenkins instance to the cloud. Before that, builds of the app could take hours (of waiting for the on-prem build machines to run all the jobs queued before them). After the move, builds took minutes because each build spun up new workers.
They argued until I spent half a day doing the math on what we would save by improving velocity.
cstopher89@reddit
You ask to do an experiment where you track your defects over time. As you add tests you can check to see if the number of defects goes down. I'd imagine the more testing you have, the fewer defects there will be.
Then you have something to measure against. Once you can measure it then it's much easier to prove value to business.
This is a general approach to justify anything to management and be able to demonstrate value.
deeringc@reddit
I'm very much in the pro-test camp, but one problem with this is that a lot of the benefit of the tests I write today is that in 6 months when I or someone else makes a change in the code, or fixes a bug, my 6 month old test prevents a regression. There are immediate benefits to testing too, the code I write today will also have fewer defects and have a better structure but a lot of the benefits are medium term. A sceptical management team won't want to run an experiment over years.
Cheraldenine@reddit
And a lot of the cost of tests is not in writing them initially, but in the many times afterwards where the tests needed to change because the behaviour of the code changed.
To me whether tests are worth it really depends on the kind of code we're talking about. The more we're in the domain of library-like code, well defined functions, logic -- test test test. On the other hand, frontend code that sits on top of everything and is basically concerned with how everything is shown on the screen -- people will test it by eye anyway, and it changes far too often for automatic tests to be worth it.
Maxion@reddit
I generally try to at least shove unit tests into frontend projects, even when the customer tries to keep me from writing them.
I try to be quite strict with keeping components to just drawing UI, and keeping business logic in service classes or state management stores (depending on what flavor of SPA we're working on). This way it's semi-easy to write unit tests for the business logic, but you can keep the frontend UI components untested.
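As a sketch of that split (hypothetical store and component names, framework left out): the business logic lives in a plain class that unit tests can hit directly, while the component only renders what the store exposes.

```typescript
// Business logic in a plain store class - easy to unit test, no DOM needed.
class CartStore {
  private items: { price: number; qty: number }[] = [];

  add(price: number, qty: number): void {
    this.items.push({ price, qty });
  }

  get total(): number {
    return this.items.reduce((sum, i) => sum + i.price * i.qty, 0);
  }
}

// The UI component would only render store.total; it stays untested here.

// Unit test for the logic alone:
const cart = new CartStore();
cart.add(10, 2);
cart.add(5, 1);
console.assert(cart.total === 25, "total is computed from all line items");
```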
deeringc@reddit
See, I don't think it's right to view that as a bad thing. The test isn't being "fixed", the contract of the production code is changing so it's absolutely natural that the corresponding test gets updated to match the new contract. That's a feature, not a bug.
Now, some tests are badly written where they end up testing the internal details of code rather than the external contract, but that's another matter.
I'm talking about properly designed tests - they give you the safety net that ensures that a contract change is indeed intentional, rather than accidental, and locks in the new behaviour to ensure that it doesn't get broken accidentally in future. This is exactly the kind of deliberate change you want in a system.
I wouldn't say that preventing regressions is rare - in my experience it happens often. It's just a silent part of the iterative dev workflow. Make a change, get a test failure because of some obscure requirement I was aware of 2 years ago when I wrote the code but had forgotten, amend the code with that in mind and iterate again.
Cheraldenine@reddit
We did that, and spending time on writing tests was nowhere near worth it. We had a few defects but they were all of the type where the developer misunderstood or forgot about some requirement, and thus wouldn't have written a test for it either.
(this is frontend React code that is 80% CRUD, 20% a bit more involved)
RackemFrackem@reddit
How is that off topic?
headhunglow@reddit
I'd recommend lying. If they ask "how long will this take" you give them a timeline including running tests, setting up a CI environment (or whatever needs to be done to automate tests, never done it).
justinlindh@reddit
"Are you cool with production outages and downtime? Me neither. We should make sure things work as intended before the next release. Do you want to hire a ton of people to manually press the buttons and make sure all this complexity works (which won't scale linearly as functionality is added), or allocate time to engineering to automate (or at least minimize) that?"
If that fails, unless the software you're working on is truly "simple", the company is a ticking time bomb.
I've never had to make an argument in favor of testing before, though... having to do so is a pretty big red flag.
deus_pater@reddit
Who are all these engineering managers who don't understand the value of testing, and why on earth do I not have a job? lol
Gwaptiva@reddit
It is what they pay me for: developing software includes writing tests; in fact, 80% of your time consists of thinking of ways to test the features you are coding
Drumheadjr@reddit
Show them some statistics and studies etc.
Management loves statistics
Seriously though, I have been watching this happen recently in the company I work for and the biggest way to get the higher ups to buy in is to have external publications and studies and data that goes into the stuff in a "We spent 2 years doing infrastructure and our productivity increased by 300%" kind of way, with data and proof to back it up.
There are some books floating around out there that have this sort of thing going for them. But publicized pieces like "how Google does QA" etc. are what you want to look at if you are trying to sell these ideas. Management wants to be Google, because Google makes a ton of money. If you are pitching stuff and you can say "this is how Google does ____, here is a thing you should read from Google on how much money it saved them over 3 years", that is how you get buy-in from management.
Just understand that if you can't deliver you might be out of a job. There is a risk in rocking the boat, of course.
Feeling_Employer_489@reddit
I gave up trying. I test anyways, probably about half the time, when I think it would help me develop faster and I think the task was over-or-correctly-estimated and there aren't other high priority things to fix.
If I wanted to gather proper in-org-metrics on the impact of testing, I'd need to do that on my own. And I'm not particularly interested in working for free, outside of work hours, with no guarantee that anyone will listen after it's done.
dontyougetsoupedyet@reddit
It's odd sometimes the difference in experiences and expectations people have. I use the tests of an org as a litmus test, but in my eyes a large body of tests indicates an org mostly has coders operating on hope rather than engineers that understand what their goals are. If I see more tests than product I'm convinced of two things: that the product folks have no idea what they want, and that the coders don't know either. Usually in that type of org, which I'm trying to avoid, the only actual documentation of what the product should be is the tests. I believe most code does not benefit from tests, but folks are really making up for organizational shortcomings.
venustrapsflies@reddit
“Most code does not benefit from tests” is an absolutely wild position outside of the most basic scripts IMO. I think a more defensible framing would be “it’s difficult to write legitimately good tests for most code”
dontyougetsoupedyet@reddit
You probably immediately start coding and don’t have a clear understanding of the program being constructed, and when you are “done” your programs are probably changed soon after. I know why you think tests are so important, and most of the time it’s due to organizational problems rather than what programs require. Most coders aren’t aware of any other way of working. I’m fine with you working that way, but I’m definitely avoiding those organizations. I like programming more than being a coder.
Dragdu@reddit
Sure buddy, you can visualize the next 20 years of development and write perfect code that covers it right at the start.
dontyougetsoupedyet@reddit
Buddy, that's very much beside the point. I am willing to wager that at your organization you also have more tests than product code, and not a single spec in sight.
pemungkah@reddit
Most places the test suite is a significant part of the specification! It’s simply switching a spec written in prose for one written in a programming language. There are definitely tools meant to try to cross this gap, like Cucumber, which uses a restricted language to describe a specification, making it much easier to convert the spec directly into a test.
I started out as a contractor at NASA so I 100% get the concept of a spec. We did multi-hundred-page specs for the projects I worked on because those were 100% waterfall projects. One I worked on was implementing a full-duplex bisync link between two different sets of computers; I was responsible for the middle and ensuring it met spec, but I had no programming access to the machines on either end. All the programming was done to the spec, and there were no automated tests (this was so long ago there wasn't even any source control, and backups were "punch the source onto cards").
Did it work? Absolutely, once it was deployed, it ran without changes for the life of the satellite. Was it easy? It was not. I spent many hours running the program, going upstairs to the machine room where the line monitor was, seeing an error, going back downstairs, making a fix, rerunning, lather rinse repeat.
Did I think of a way to automate the tests? No. No one was doing that at all then, and I had no way to get a loop back line set up to allow me to
TommaClock@reddit
The #1 reason is just inexperienced devs working at garbage-tier companies. Either startups/scammy consulting companies where no one has any experience with testing because they're new, or legacy companies where no one has experience with automated testing because they didn't do that in the old days.
I've reviewed too many (trivial ~30 minute) take-home submissions where we asked the applicants to test their submission and they either don't use a testing framework, or document a manual testing process.
And this often is from developers with multiple years of experience and/or "senior" on their resume.
nachohk@reddit
If you didn't explicitly specify that a testing framework should be used, then that is an obnoxious expectation for an interview assignment. Part of being "senior" is knowing better than to waste time on automated tests for a 30-minute throwaway piece of code. I say this as someone who writes automated tests for non-trivial code as a matter of habit.
TommaClock@reddit
The point of the exercise is to demonstrate a basic knowledge of best practices (and the prompt specifies as much).
Wouldn't be much of a test if all of the answer were specified, now would it?
nachohk@reddit
Best practices for a tiny write-and-forget program are not the same as best practices for larger applications. I think you are judging candidates by a stupid metric and are very likely losing out on the better ones. But that's your prerogative.
TommaClock@reddit
I've actually tried many times interviewing candidates who wrote otherwise good code but either documented a manual testing process or "tested" in their program entry point.
That's the entire reason why those submissions are discarded on sight.
nachohk@reddit
I think this is probably too stupid to be a serious reply. But at risk of feeding the troll, I will say that I have extensive experience with CI and automated testing tools. And if I was trying to put something together in thirty minutes, as you said, and using a testing framework was not explicitly communicated as a requirement, then I would absolutely not bother. Bringing in a testing framework for anything that can be written within thirty minutes is a farce. If you are truly filtering candidates based on this, then you are selecting for inexperience. Anyone who actually understands testing tools, methodology, and their benefits is going to assume that you are probably not this much of an idiot, and will not waste time on this when it hasn't been clearly stated as a requirement.
TommaClock@reddit
There is no perfect method of filtering candidates. Someone using a method which isn't leetcodes isn't a troll. This method has worked to hire some fairly good candidates.
ReversedGif@reddit
Testing frameworks are not an intrinsic requirement for anything. If you have a compiler for a Turing-complete language, you can do anything.
Kids these days...
ProtoJazz@reddit
You don't set up a full CI instance for your interview assignments? Damn.
One of the early interview assignments I did that I remember getting a frustrating reply to. It was a poker hand evaluator. Big fuckin words on it saying treat this like a real work assignment, and "ONLY IMPLEMENT THE FOLLOWING HANDS" and gave a small subset of possible poker hands.
Very specific about doing only those ones. My guess was there'd probably be a live portion and you'd build on it. Or maybe they just didn't feel like looking through a bunch of shit and got the idea after the first few.
One of the first things they said to me after I handed it in. "Well why didn't you implement all of them?"
I told them the assignment specifically said not to. They told me they wanted people who were willing to go above and beyond.
And I don't know about you, but I don't consider going above and beyond to be doing something you were specifically told not to do. If someone's up on a ladder and says don't move it, and you yank it out from under them, they're gonna kill you. They aren't gonna hit the ground and congratulate you on going above and beyond.
nachohk@reddit
Absolutely insane.
I have kiboshed otherwise promising interview processes over smaller, whiter lies than that. There is a level of stupidity that is hard for me to grasp in beginning one's professional relationship with a job candidate by giving them a reason to distrust the things you say. If they'll lie about the requirements for the work they want you to do, what the fuck won't they lie about?
TommaClock@reddit
I could explain further but I think next time I'll just tell Reddit that I hire on leetcodes.
dweezil22@reddit
I went to a large financial company, took a repo with zero tests and wrote a few hundred unit tests. They all worked, and while code coverage was still at best 20%, it was something. I asked all the other devs to follow my lead and write tests.
Two weeks go by.
"Dweezil, you broke the build"
"Oh sorry [takes look]. No I didnt break the build, you changed the behavior of a function without updating the test. You need to fix the test."
[1 day goes by]
"I don't know how to fix the test. How do I disable it?" [tells them]
I went back and fixed it later.
Repeat 20 times, and I stopped going back. Wait two years, and 80% of the tests are disabled. Nobody got fired, and mgmt accepts the near-zero dev velocity from the ppl who weren't offshored to lowest-bidder Cognizant folks. So... yeah...
My new place we write a lot of awesome tests and it's pretty cool.
fubes2000@reddit
The number of times a dev has asked for help, listened to my explanation, looked me in the eyes and said "thanks", and then gone back to their desk and done sweet fuck all is quite frankly staggering.
Especially when they have the balls to show up the next day and complain that the problem hasn't fixed itself.
Frown1044@reddit
Maybe this is a communication issue? If many different people seemingly don’t understand what I just explained, then it’s probably a problem on my end.
WearyAffected@reddit
If someone doesn't understand what was explained they should say so, not pretend they do. Proper response to not fully understanding is to ask more questions and continue the discussion whereas some people just nod and you can tell they don't really understand.
So yes, it is a communication issue, but from the person who is pretending they get it.
Frown1044@reddit
Asking for help is rarely an emotionless conversation that simply continues until the asker understands the explanation.
In my own experience many devs aren't good at explaining or communicating. They get easily annoyed or have a hard time breaking down complex topics to someone who knows nothing about it. When someone is a bad communicator, others don't want to interact with them either.
If many different people (like the previous poster implied) seemingly ignore your explanations, it's time to look within and not blame everyone else.
WearyAffected@reddit
You shouldn't let emotions factor in when you're asking for help. There is no defence for pretending you get something when you really don't. If the person you're asking help from doesn't explain things well the first time, you let them know. Or you ask someone else. Why would you say you understand if you don't actually understand? What benefit is there?
Frown1044@reddit
Emotions are an inherent part of communication. You can’t take it out of the equation. As long as you’re dealing with humans, you have to account for their emotions.
WearyAffected@reddit
The fact you ignored almost the entire comment speaks volumes.
Frown1044@reddit
Because you keep missing the point of communication. It’s not about what’s the most efficient and logical way to communicate. This mindset is why devs are stereotyped as having poor communication skills.
The other person has to be receptive to what you have to say. If they aren’t and they exit the conversation prematurely, you probably fucked up.
WingZeroCoder@reddit
Yeah, what’s with that? Why do developers act like they’re users?
I work with a few like this. They literally have access to all the same code I do, and they all spend more time on their phones than I do so I presume they have more free time.
So why when there’s a problem do they act like they need to report it to me to get fixed?
And when I suggest they give it a go themselves, and give tips to get started — why do they act like they somehow don’t have access to the same codebase we all have access to? Like they’re somehow prohibited from working on anything outside their own tiny little corners?
Dartht33bagger@reddit
A fair number of people are good at getting others to do their work for them. They know how to work the 'quick call?', the put-a-meeting-on-your-calendar-to-debug, the email-your-management-chain game.
FloRup@reddit
Maybe they learned to depend on you? Maybe they think "Coworkers X is going to fix that for me faster anyways". Maybe they just use your good will to offload work. Maybe they have some kind of "learned helplessness" where they think that anything more than their own little coded corner is "magic" and impossible to change.
Either way, no matter the reason, in my opinion the only way is to stop helping them. You shouldn't let yourself get dragged down by someone who uses you or is unwilling to change.
iiiinthecomputer@reddit
Some of them are also incompetent and leaning on others because they don't have a clue how to do the job. I've seen people last years this way.
chucker23n@reddit
All of that, but I’ll add: a transactional worldview. They learnt that if they put effort in a project that isn’t their management focus, they get no reward or even acknowledgment from management. Whereas, if they focus on their own stuff or just overall do less, they have smooth sailing, and management doesn’t mind or care. So they offload what they don’t have to do to (in their view) suckers who’ll do it. The thought that if people in a team help each other a little more, that feels good and productive and people are grateful and that’s rewarding, too, never occurs to them. It’s always “what’s in it for me?” first.
PrintfReddit@reddit
You need to get buy in for tests to be a blocker, and for disabling to not be an option. You also need to make them fix their own damn test
TheCritFisher@reddit
If you're fixing tests all the time, you're probably writing bad tests.
If your code is hard to test, you're not writing testable code.
Inject dependencies, where possible. Test integrations where possible. Mocks are a smell (sometimes). End to end tests are amazing, but slow. Unit tests are fast, but often not useful.
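A rough sketch of what that buys you (the clock injection and the OrderService name here are made up for illustration): the collaborator comes in through the constructor, so a test can hand in a fake without any mocking framework.

    from datetime import datetime, timezone

    class OrderService:
        # The time source is injected; production code uses the real clock.
        def __init__(self, now=lambda: datetime.now(timezone.utc)):
            self._now = now

        def is_expired(self, deadline):
            return self._now() > deadline

    def test_order_past_deadline_is_expired():
        fixed = datetime(2024, 1, 2, tzinfo=timezone.utc)
        service = OrderService(now=lambda: fixed)  # inject a fake clock
        deadline = datetime(2024, 1, 1, tzinfo=timezone.utc)
        assert service.is_expired(deadline)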
sparr@reddit
Most developers have no exposure to, let alone training or practice in, what makes code testable. The terms are entirely foreign to them.
booch@reddit
This is true of a whole host of topics. Many devs have no experience or skill at
The solution is to force them to learn the above things.
pemungkah@reddit
Please do not get me started on people who claim all code is “self-documenting”, or that “the comments just get out of date anyway”. Yes, they do, Sparky, if you’re a lazy goddam twit who does half their job.
booch@reddit
Exactly that. Part of your job as a developer is to keep the comments up to date. If you can't do that, then you can't be trusted to make sure the code correctly implements the requirements, either.
Even the best code, without comments/documentation, only tells you what the code does, not what it is intended to do. And those could be two totally different things.
pemungkah@reddit
And why. So many uncommented code problems are caused by well-meant changes that were done because something “looked wrong” but wasn’t.
Spare everyone the probably-repeated error and just say why it’s done the way it is. Maybe someone can indeed fix it better later, but at least they’ll know what’s going on before they try.
Chesterton’s Fence:
KevinCarbonara@reddit
This is why python devs are always so surprised that C#/Java devs don't have to write anywhere near as many tests
kuribas@reddit
Enforcing type hygiene in Python helps (rejecting pyright errors in the build). It’s an uphill battle; I keep hearing how typing doesn’t solve problems, etc. I am the only one who wants to get rid of all type errors at our place. The alternative is 100% coverage.
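Roughly what that looks like in practice (names made up; the point is that the checker runs as a build step and any error fails the pipeline):

    from dataclasses import dataclass

    @dataclass
    class User:
        name: str
        age: int

    def greeting(user: User) -> str:
        return f"Hello {user.name}, you are {user.age}"

    greeting(User("Ada", 36))     # fine
    # greeting({"name": "Ada"})   # pyright (or mypy --strict) rejects this
                                  # before a single test runs

    # CI then runs something like `pyright src/`, and a non-zero exit code
    # fails the build.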
Jonathan_the_Nerd@reddit
I would like to know more. How do you enforce types in Python? Can you point me to educational materials?
Asyx@reddit
Yep. We have 98% coverage and 80% of all conditionals. We do some typing but it doesn't help that the tools are kinda garbage for typing in python.
ben_sphynx@reddit
I think, rightly, it is that 'lack of typing hides problems'.
Eliminating the possibility of having a certain class of problems hidden in the code is a real good thing.
KevinCarbonara@reddit
Honestly, most unit tests are just a way of detecting runtime errors at compile time. Which is also what typing does.
kuribas@reddit
I had Clojure code with (:and foo bar), which just always evaluates to bar, since foo is not a map. I meant (and foo bar). It went undetected until I happened to look at that code and saw it. This would never happen in my typed code. Also, in dynamic languages errors propagate through the whole codebase: it may seem to work locally, but cause a bug somewhere else. Also in Clojure, tons of bugs with snake case vs kebab case, :my_symbol vs :my-symbol. Usually because of nil punning it just returns nil, and then propagates the nil. If nil is accepted it simply stays nil, or it falls through until a database write, where it gives a schema error. In my Python code, most of the bugs that weren't me misunderstanding the logic were due to the parts that couldn't be typed.
Sorc96@reddit
This is completely wrong. If a static type system makes your tests obsolete, you were writing the wrong tests to begin with.
KevinCarbonara@reddit
Absolutely not. If you don't have a static type system, and you aren't writing tests to ensure consistent typing, you are writing the wrong tests.
pheonixblade9@reddit
pyre exists, but it'd be nice if the language was just better :P
quisatz_haderah@reddit
Yeah I did not fully understand unit tests until I worked on js and python
Scroph@reddit
Or, as was the case in my previous workplace, the expected behavior changed all the time
TheCritFisher@reddit
Woof. Well fixing/rewriting tests is bad. It's either bad tests, untestable code, or straight up bad management.
Regardless of the reason, it ain't good.
dweezil22@reddit
I was a consultant at the time, I got my billable hours towards bonus and a morbid amusement, no hard feelings from me.
double-you@reddit
I see a lot of tests that are not well written or documented, and so updating those can be difficult.
It should be clear from the test what part of it is the actual test and what it is testing and why do they consider the result they are getting correct. I'd also prefer explanations of why things were chosen to be what they are, like how long the test runs, buffer sizes, ... why the numbers are what they are. "some random number" is acceptable since it tells you that it wasn't a critical part of the test (at least at that time).
As well as documenting the reason why a test was disabled: what broke it, why can't it be fixed, etc.
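Something like this is what I have in mind (a sketch with hypothetical names): every constant either explains itself or carries a note saying why it was chosen.

    def chunk(data, size):
        """Toy function under test: split a sequence into fixed-size chunks."""
        return [data[i:i + size] for i in range(0, len(data), size)]

    def test_last_chunk_may_be_short():
        # 10 items with chunk size 4: chosen so the input does NOT divide
        # evenly, because the uneven tail is the edge case this test pins down.
        data = list(range(10))
        chunks = chunk(data, size=4)
        # 4 + 4 + 2 - the short tail must be preserved, not dropped or padded.
        assert chunks == [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]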
ProtoJazz@reddit
Yeah, I've seen an awful lot of tests that really really really care about shit like the order items are in a list, or that a function is called with x, y and z params specifically
But they never actually test that the work being done by the code is what they want.
I actually got bitten by something similar not long ago. I'd written a significant change to something. Basically when something got updated in the warehouse, instead of just updating and saving, it went through a process of validation and logging the changes so that it could be followed up later. Who changed what and why, basically.
And I added new tests for the new functionality, but there were a lot of existing tests too.
Everything passed, thought it would be fine. But a couple days later we realized nothing was updating. I'd fucked up, and accidentally removed a condition. So instead of "if someone tries to do this specific type of update, reject it, put it in this state"
And instead just always put it in that state every single time.
I couldn't believe it. How were all these tests passing?
Turns out, we had a ton of tests. Not a single one of them actually verified the core functionality of "When I update this item, the item gets updated"
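The missing test was the boring-sounding one - something like this (hypothetical names, not our actual code): assert the observable outcome, not just that helpers got called.

    class InMemoryWarehouse:
        def __init__(self):
            self.items = {}

        def update_item(self, item_id, quantity):
            # validation and audit logging would happen here in the real thing
            self.items[item_id] = quantity
            return True

    def test_updating_an_item_actually_updates_it():
        warehouse = InMemoryWarehouse()
        warehouse.items["sku-1"] = 5

        assert warehouse.update_item("sku-1", 7)
        assert warehouse.items["sku-1"] == 7  # the core behavior: the value changed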
dweezil22@reddit
Agreed, a test with good docs can be the best real-world docs of all. My one complaint for the system I work on now is that many of the thousands of tests are fairly undocumented, so it's unnecessarily hard to find the exact right test that you want to explore how the system works or start a new feature.
aonghasan@reddit
this sounds like a bad test
dweezil22@reddit
Walk me through this theoretical bad test you're describing.
_rezx@reddit
Financial institutions are hilariously bad at listening to devs so the bar lowers until you get people that show up and throw up. (Like I did for 10 years) Hey, they only cared about money and in that way, we were aligned.
shoe788@reddit
Those managers probably got some fat bonuses for that move too lol
Canthros@reddit
On a previous project, I wrote a service to do something, with fairly copious tests. They probably weren't great, but they covered the high points. I then moved to different team to do other stuff.
A couple years later, I was contacted by my replacement's replacement looking for details on what the service was supposed to do, because it was throwing a lot of errors and so on. The best I could offer was documentation they already had and a recommendation that they check on the tests.
There were no tests. My replacement had removed them because they had 'no value'. Probably because nobody had bothered to maintain them. (I was contacted by his replacement, because he had since left.)
And so it goes.
jl2352@reddit
I’ve heard this excuse many times, and in practice it’s rare.
What is more common is management failing to drive testing as being important. The places where I’ve worked with great testing, were the places that C-level management would ask about testing in update meetings. Something goes wrong, and they ask why it didn’t have a test to capture it. They insist we add a test to prevent it happening again.
Management has a huge effect on pushing culture. When we don’t have people pushing for testing, it’s fine to let it slide.
break_card@reddit
The #1 reason I’ve seen is sheer laziness because they don’t want to figure out how to set up the dev environment
Ksevio@reddit
That's often an indication the process to set up a dev environment is too complicated. If you make it easy to test stuff then stuff will get tested more
moseeds@reddit
Disagree about it being management. In my experience nearly always developers, especially those lacking experience, just don't consider testing as development.
As a result they then compartmentalise development into 'actual development coding' and 'testing' as if they're divisible activities. This then leads to really poor time estimation and planning. Over time it becomes a habit and therefore culture. Which is then taken by that very same person into their management years.
chrisza4@reddit
It is a little bit more complicated than that.
I have found myself working quicker than many devs around me who do not write tests, 90% of the time. So the issue of “not having time” is not really true. You usually move quicker on average when you write tests.
From my experience, the common problems are
management doesn’t give people time to get past the automated-testing learning curve. Things will get slow for a while but then significantly faster.
Devs usually say at standup that “I need to fix the tests”, making it sound like the test suite is the blocker, but in reality around half of “I need to fix the tests” is the dev having actually broken something they weren’t aware of. So it’s more like “my code is not ready because I haven’t handled some edge cases I wasn’t aware of (and the tests told me this)” and not “my work is done, I just need to fix the tests”.
stahorn@reddit
This times 100 when you have to maintain old code that you wrote. The amount of time I've saved over the years because one of my old tests reminded me about some strange edge case is huge. This of course carries over into an incredible amount of time and money saved for our customers: if I hadn't had the tests, they would have gotten those bugs in production.
At times it's very hard (and quite boring) work to keep tests working. It's tempting to just "go fast" by skipping on the tests. Luckily for me, I have already tried to "go fast" many years ago, and I remember that it only took a few weeks until I regretted the decision. Now I just "work slowly" and progress much faster.
ProtoJazz@reddit
Something my grandmother used to always say when I was a kid, I've thought really applies well to testing.
As a kid I'd frequently rush through things, and do a shitty job. Homework, housework, that kind of stuff. My grandmother would insist I go back and do it right. And she would frequently say "If you don't have time to do it right the first time, you sure as hell don't have time to do it twice"
Which as a kid I always just interpreted as "OK, then I guess I just won't do it at all" so maybe not as applicable in all situations.
yegor3219@reddit
Just don't ask them.
ThrawOwayAccount@reddit
“Why is this person taking so much longer than everyone else to finish tasks?”
yegor3219@reddit
Could also be "why does this person get so many reopened tickets?"
timewarp@reddit
In my experience, it's just both.
no_displayname@reddit
Exactly.. a plumber doesn't ask if they should test if all the seals are watertight. They just do because it's part of the job.
946789987649@reddit
Once you have the initial setup of a framework, it's quicker to write with tests than without. You have to check your code works somehow, and if you can write a test in a few minutes and verify it the same way you would manually, then it's far quicker.
SirPsychoMantis@reddit
With this mentality you can easily fall into the trap of mocking up some things, say "test is green" and never actually verify your code ever works.
Yes tests are good, but they aren't foolproof.
946789987649@reddit
My mentality is that way because I don't just have unit tests. I'm full stack, so I will often do backend changes -> write integration tests (which actually test the end points themselves) -> write FE code -> write end to end tests
I cringe when I see people load up their backend and start manually inputting things into swagger.
GeoffW1@reddit
The real time saver for coders that don't write any tests is that they don't have to make the code actually work. Someone else does that a later date. :(
NeonVolcom@reddit
Lmao I was coming in to write more or less this comment. "Write tests? Sorry do I have time for that?"
SoFarFromHome@reddit
I worked at a company where:
The only solution there was to find a new job and quit, which is what I did. My new company doesn't write tests, either, but at least they're proactive at identifying and fixing crashes and bugs.
BobSacamano47@reddit
You don't ask your managers for permission to write tests.
debugging_scribe@reddit
It should be considered part of the code you MUST write.
MyTwistedPen@reddit
Agree, you are the expert, not the manager. You know what is required to do the job.
-grok@reddit
Yep, devs can't fix bad management.
luckymethod@reddit
Like bad management is the cause of bad developers. Sure there's definitely bad management but lazy developers are EVERYWHERE. Let's not pretend people that code are some kind of superior race.
-grok@reddit
Well....bad management technically is the cause. Management does the hiring and firing at every place I've worked.
Show me weak sauce engineering management, and dollars to donuts I'll find an organization full of weak sauce engineers.
rusmo@reddit
Devs can provide estimates that include time to write tests.
-grok@reddit
And the business can stop pre-selecting deadlines that don't have room for estimates that include time to write tests.
The truth is that only a minority of developers work under management that is appropriate for software creation. This results in chronic under-resourcing of development work, industry-wide.
FlyingRhenquest@reddit
Include it in the estimate. Inadequate testing turns development into a fire department, constantly putting out fires and not having time for anything else. Some developers seem to prefer the constant stream of emergencies they can point to as things they addressed while doing no work on the company's current goals. They can "fix" the same thing a couple dozen times, frequently without addressing root causes, and that's a lot easier than designing something new. Especially with "this company's shitty tech stack."
campbellm@reddit
And the managers who insisted on it have failed up and bear zero responsibility for the shit-show that inevitably follows.
FoxyWheels@reddit
It usually comes from upper management directly above in the same vertical (director or VP). Then they pressure the hell out of lower management, who are easily replaced and just want to hit targets to get their bonus, and finally falls on dev’s heads.
Tl;dr company culture of short term profit > long term stability ruins it for everyone.
HoratioWobble@reddit
It's part of you feature / bug / whatever. You don't tell anyone you've got a fix until you've written them. You don't need to ask for time to do it.
FoxyWheels@reddit
You either work for a better company, or live in a fantasy I have yet to find.
HoratioWobble@reddit
I work for an absolutely abysmal company. Doesn't really change the approach.
Tests should not take long to write, if you're building testable code. Minutes on a small feature, maybe an hour or two on a large one.
Unless you're telling management every keystroke of your change, no one knows you're not done until you tell them so you take a bit of extra time and add tests.
Better yet - start with tests.
FoxyWheels@reddit
UTs aren’t really the issue. They do take time as we have constantly changing features and requirements (detailed in my other comment). They also do not add a ton of value when compared with e2e tests. I can have all the perfect UTs and code in one module I want, but the product can still not function if it does not properly mesh with changes made to the other 15 modules it depends on for the feature.
It’s the e2e / integration tests that kill us. Take a large team, ~20 major moving parts, all with the constant requirement changes etc. writing / maintaining e2e / integration tests in this environment does take significant time, needs coordination between team members, and is not doable without support from management.
HoratioWobble@reddit
I work in an overall team of 140, distributed globally, working across 232 repos on a product that processes billions of transactions a day. We have about 80% coverage.
The company is so renowned for how shit it is, it was part of one of the biggest scandals in my country, spanning 30 years.
I got a ticket this morning, 2 hours later I was being badgered for a status update.
I still wrote tests and we have still E2E tests, written in Playwright in the individual code bases.
You absolutely have time. You've just convinced yourself you don't.
FoxyWheels@reddit
We’ll have to agree to disagree. It would take months to get any meaningful amount of test coverage on this project at this point. When your choice is deliver feature or face the axe, you deliver features. No matter how shit you say your company is, your direct management seems much better than mine.
phd_lifter@reddit
Why don't you factor the time for testing into your estimates then?
FoxyWheels@reddit
We do. It does not matter. Everything is p0. Requirements change weekly. Features shuffled around monthly. AI needs to be shoehorned into everything whether it makes sense or not. You then have PMs looking at it “working” in a dev k8s cluster and saying they already sold it to clients and need it now. So management tells you it must be shipped now, tests fixed later.
This has been my experience at 3 separate American fortune 100 companies.
Currently I have a tech literate director who will actually look at git and if they see you “wasting time” on tests “we can add after release” will get mad and force you to move on to another feature.
This is the reality of “agile development” in my experience. It is soul crushing to be forced to ship things I am not proud of and are not my best work, but I need to afford to live, so I do it anyway.
RiftHunter4@reddit
This. It's a leadership issue. Even if they don't want to, devs will test if leadership insists on it. I've seen people get fired for not testing.
agumonkey@reddit
or no team structure or appropriate design/modeling phase, which means it's a bunch of pieces that don't make sense and will always be a moving target full of holes... which no one wants to write tests on because they'll be scratched before the end of the week
NoMoreLatency@reddit
IIRC I actually saw a post where a guy took maybe three days or so to write a test suite. Didn’t even hold up his own work, still got everything done. After a few more weeks, people started adding more tests, and productivity started to improve.
Management fired them for wasting company time and resources. It’s an extreme situation but is a perfect indication of where priorities lie in an organization.
Photoperiod@reddit
This is my experience. I will admit, I have generally been bad about writing tests. I'm making a serious effort now, but because ridiculous P0 deadlines keep popping up out of nowhere, the time to do them never gets budgeted. You tell management we need unit tests or our tech debt is gonna continue to build and every future release is a risk. They just say we will work on that after this high-prio deadline is done. But then the cycle starts over.
BenAdaephonDelat@reddit
I've only worked for 1 company that had automated testing. Every other company isn't willing to take on the tech debt of making the existing code base work with automated testing. Most code bases I've worked on would require significant rewrites to even be testable.
b1ack1323@reddit
You're so real for this.
epicfail1994@reddit
Yuppppp we have basically no unit tests and it sucks
zerosign0@reddit
Everything will really depend on:
- Your other team members
- Tech lead
- EM, Head, heck VP or even CTO
- Team focus, dynamics & engineering culture
- How cumbersome the current codebase is (how easy it is to create one)
ck108860@reddit
A lot of devs don’t know how to or are not good at writing tests. That makes them not want to even more
ChrisR49@reddit
This is me. Wasn't taught in school, hasn't been required at any job I've had yet. I'm willing to learn, and know that I should, just haven't seen good examples of what and when to test for the projects I work on, from full stack websites to console applications.
ck108860@reddit
It was me too out of school and beyond a bit until I came to a place that required it. Then I slowly got better and better at it and feel I’m at a decent spot now. Definitely not a short amount of time to get used to doing it though
dimitriettr@reddit
That's the main reason.
People tend to patch the existing code just to fix the issue, and do not have time to understand the whole use case.
When a test fails, it is either poorly patched, or just disabled/deleted.
Repeat this process with a few different people and you end up with garbage tests, or no tests at all.
ck108860@reddit
expect(someFunction).toBeCalled()
Ok great that “covered” your code, but what did it test? Not much hah
ComprehensiveBoss815@reddit
I mean I assume it tests that the function doesn't make the computer explode.
lucid-node@reddit
IMO that's not the right way to test. We should be testing behavior, not implementation details.
ck108860@reddit
For sure
ProtoJazz@reddit
I've removed a great number of tests that are basically just
"mock x to return y"
"assert x returns y"
Like good fucking job, you've confirmed the mocking framework still works. Now leave that to the developers of that software and not us.
Ksevio@reddit
This is the other end of the spectrum with the devs that like to show off how good they are at writing tests (or sometimes unreasonable coverage requirements)
ck108860@reddit
lol yep. Testing that Array.push adds an item to an array isn’t a useful test
dimitriettr@reddit
If you can remove code and the function still works, is it really useless?
ck108860@reddit
No, it’s not useless at all. I’m saying you end up with this simple test and nothing more in the “garbage/no tests” cycle you mentioned above. Sure it’s better than nothing, but it didn’t test that anything actually happened, which means you could remove all the code from that function and it would still pass.
Kinny93@reddit
It depends if that class is defined in the file you’re testing.
For example:
If you’re instantiating a class and then calling a method/function from said class, that should be stubbed and the only test should be to make sure it was called.
However, if you’re inside the class where that method was defined, then you should be testing the method itself to make sure it behaves as expected.
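A rough illustration of that split, with made-up names and unittest.mock: stub the collaborator where it's only a dependency, and exercise the real method in the class that owns it.

    from unittest.mock import Mock

    class TaxCalculator:
        def rate_for(self, region: str) -> float:
            return 0.2 if region == "EU" else 0.1

    class Invoice:
        def __init__(self, calculator: TaxCalculator):
            self._calculator = calculator

        def total(self, net: float, region: str) -> float:
            return net * (1 + self._calculator.rate_for(region))

    def test_invoice_consults_the_calculator():
        # Testing Invoice: the calculator is stubbed; we only check that it was
        # consulted and its answer applied.
        calculator = Mock(spec=TaxCalculator)
        calculator.rate_for.return_value = 0.25
        assert Invoice(calculator).total(100.0, "EU") == 125.0
        calculator.rate_for.assert_called_once_with("EU")

    def test_tax_calculator_rates():
        # Testing TaxCalculator itself: no stubbing, exercise the real method.
        assert TaxCalculator().rate_for("EU") == 0.2
        assert TaxCalculator().rate_for("US") == 0.1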
psinerd@reddit
You can't call yourself a professional software engineer in 2024 and not know how to write tests or be fairly decent at it. 😡 That's the mark of a less-than-amateur skill level. Not sorry about offending anyone here-if you aren't writing tests for an overwhelming majority of code you write you should be ashamed.
Lonely-Suspect-9243@reddit
Indeed. Which is why I won't claim that I am an engineer until I cover my project with tests. Unit tests at least.
acommentator@reddit
Came here to say the same thing. I guess I shouldn't be surprised to see you downvoted, but I still am a bit.
WenYuGe@reddit (OP)
Me included sometimes... Some systems are a f*king ride to write tests for... and they end up flaky.
ck108860@reddit
A function with inputs and outputs - easy. A DAO with all sorts of external dependencies - much harder and requires learning the testing tool being used in order to mock things, etc. And then there’s UI tests…
oorza@reddit
I'm a crazy person out here on my ledge with my heresy, but I think making it possible to run end-to-end UI tests in production, and only writing end-to-end tests, results in more stable software for the same time investment. We write software to be used by users, so if code can't be reached from the edges of the system where the users use the software, whether it works or not does not really matter.
It's worth pointing out that UI end-to-end tests are very difficult and fragile and flaky, yet are likely the most stable of all end-to-end tests. I still think this is a better overall testing strategy than investing any time building out the bottom of the testing pyramid.
fdeslandes@reddit
One thing I found also really valuable is component testing for core UI components behavior. Somewhere between unit tests and e2e, you can use it to test components in UI libraries instead of a full product where what you want to test is the UI logic and not the business logic.
Cypress component tests have made that part work really well. Now, if only it was as easy to find actual screenshot comparison tests that work well and are not flaky based on pixel difference ratio... It would be nice to be able to ensure UI/UX stability when things are changed in html/css, because fucking up the UI because some devs cannot see the difference happens too often...
WhoLetThatSinkIn@reddit
Cypress is the absolute bees knees. We use it for api and UI testing at this point instead of involving another framework.
ck108860@reddit
I work at AWS and we test everything. I’ve heard this argument before and I agree with it to a point, it doesn’t work at AWS because we couldn’t care less about our actual UI, services need constant tests (canaries) regardless of UI so we potentially can know when services are failing before customers do.
But at a smaller company that does most of their transactions through their UI - write unit tests for thing that are easy to unit test (e.g. regex validation does what you expect it to), then e2e test the heck out of your UI and call it a day.
oorza@reddit
If your product's primary interface is an API, that's its "UI" as far as this discussion is concerned - it's how your software is interfaced with by its users. For a REST service, for example, an end-to-end test suite should just be a series of API requests it makes - that's functionally the same thing as Selenium clicking elements on a webpage.
WhoLetThatSinkIn@reddit
Did we just become best friends?
I argued with our QA team for WEEKS that they should be validating api results only via json schemas because we don't make the data.
Shockingly tests are flaky because they still validate data five years later.
ck108860@reddit
Yep, the API is the “end” or the surface or whatever you want to call it (why the term “e2e” is usually associated with UI tests is another topic lol). Test the things your users interact with and you’ll have (the most important) coverage you need
hbthegreat@reddit
We know you don't care about the UI. We have to use it. 🥹
ck108860@reddit
Don’t use it! Use CDK or some other IAC to manage your resources, so much nicer.
WhoLetThatSinkIn@reddit
Our console access is read only with break glass access requests.
It's awesome.
hbthegreat@reddit
I do. But YAML is the enemy.
ck108860@reddit
Certainly. Cloudformation itself sucks, but CDK is just normal code (TypeScript or whatever language you want) that builds into CFN. No need to read yaml ever again
WhoLetThatSinkIn@reddit
Wish you guys had tested whatever the two parameters in my RDS SQL options group that's attached to our master DB that's been up for three groups broke.
Went to restart one of our readers to update the cert and poof, gone. Literally unrecoverable.
Tried to create a new reader, nope can't do it.
Support said "you should restart your master with the default options group, but we can't guarantee it'll come back up and we're pretty sure failover won't work".
Thankfully we're an ooold company and nothing connects via SSL so we're not interrupted but I've spent two weeks drawing up plans to get a new one restored from backups, not interrupt any of the archaic Talend truncate and load jobs, and get DMS/CDC rolling in the downtime.
justin-8@reddit
The API is the customer interface for the majority of AWS services, so it makes sense and works quite well
nvn911@reddit
I mean by majority it's like 99.999% tbh
justin-8@reddit
Did you mean to reply to me?
iscottjs@reddit
I feel somewhat similar at times but it depends on what we’re building. Over the years I find myself prioritising unit tests less and often find myself pining for good quality UI tests because it feels closer to reality.
I end up going with a hybrid approach, where we’ll prioritise UI tests in the areas that make sense, but also still encourage unit tests for certain gnarly areas that just can’t fail, data sync, headless commands, background processes, anything involving maths or calculations or money, etc.
We’ve had unit tests protect us a few times when upgrading some libraries on our daily data import jobs broke something.
But I’m not a fan of writing unit tests for the sake of it, especially for the most trivial things.
MadKian@reddit
I’m currently on a team that’s obsessed with code coverage in unit tests.
I keep seeing the tests we wrote and I cannot see how people think they are really useful. Specifically comparing the effort it takes to write them and how often they interfere with a code push or at least make it painfully slow. (Because yes, we have a pre-push hook running all tests)
lunacraz@reddit
why would you run the whole suite? run tests on files you touched, beyond that, leave the test suite on CI
precommit hooks for linting is one thing... the whole test suite?!
liamnesss@reddit
The whole point of automation should be to free up the humans to do other things. Pre commit hooks might be okay for checks that run very quickly, but generally I think if it can run on the CI then it should be running on the CI. Watching tests running in a terminal is not being productive.
aholmes0@reddit
A perspective maybe missed regarding unit tests - they are invaluable for a fast developer feedback cycle. Knowing immediately whether you broke something takes much less effort and time to resolve than waiting for a full build and e2e tests. Ultimately I think this is what unit tests are for; not for proving a system works, but to provide feedback on problems as early as possible.
liamnesss@reddit
For UI tests I would start by looking to see if there is a Testing Library package for whichever rendering technology you're using. At least when in comes to testing web apps, things have gotten so much easier in recent years. Better tools are available and good testing practices have become commonplace.
StrangelyBrown@reddit
As a game dev, most of it is impossible to write tests for. If systems are kept clean then some isolated parts can be tested and we can do integration tests but unit tests don't cover more than a small percentage.
Scroph@reddit
I would assume that game dev is one of the fields where manual testing is at least somewhat fun
StrangelyBrown@reddit
Yes and no. Definitely sometimes fun. I always remember back when GTA vice city was in development I knew a QA guy and he said one of the tests was to run against every wall in the city to check collision. So it's an interesting mixture of fun and very tedious sometimes.
Scroph@reddit
Oh wow I take that back. It would be unfortunate if they do find a walk through wall and the dev team is like "ok we changed the collision detection algorithm, gotta test all the walls again"
Jonathan_the_Nerd@reddit
I remember falling through the floor (non-fatally) in an MMORPG once. I'll bet QA spent who-knows-how-many hours making sure the collision detection in that area was good, and I found the one pixel they didn't manage to cover. After I got back out, I tried falling through again and couldn't do it.
LosMosquitos@reddit
Have you seen the talks from Sea of Thieves devs about testing?
StrangelyBrown@reddit
Nope
LosMosquitos@reddit
You should check them out. Automatic testing in games is possible.
dyskinet1c@reddit
This is why it's important to code with testing in mind. Once I learned how to write code so it's easy to test, the quality of my work improved significantly.
ck108860@reddit
Agreed, but that untestable code isn’t always the engineer who is tasked with writing tests’ fault
Naouak@reddit
We usually say it like writing tests is complicated. It's not. What is complicated is writing code that can be tested. It asks developers to write in a way that makes the code less coupled, and coupling is usually a way to go really fast. It also asks the developer to learn to break down what they are doing into logical steps, which is harder if you don't take the time to think through what you are doing.
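A small sketch of what "less coupled" can mean in practice (hypothetical names): pull the decision logic out into a pure function so the interesting part is testable without touching the file system, network, or database.

    def parse_threshold_report(text: str, limit: int) -> list[str]:
        """Pure logic: return the names whose value exceeds the limit."""
        offenders = []
        for line in text.splitlines():
            name, _, value = line.partition(",")
            if value.strip().isdigit() and int(value) > limit:
                offenders.append(name.strip())
        return offenders

    def load_and_report(path: str, limit: int) -> list[str]:
        # The only part not covered by unit tests is this thin I/O shell.
        with open(path) as f:
            return parse_threshold_report(f.read(), limit)

    def test_only_values_over_the_limit_are_flagged():
        text = "alpha, 10\nbeta, 99\ngamma, 50"
        assert parse_threshold_report(text, limit=40) == ["beta", "gamma"]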
ck108860@reddit
Totally agree. a lot of new devs will also ask questions like “what do I test” because things are overwhelmingly not testable at first glance
TheOriginalSmileyMan@reddit
And the ones that are proud of it
jcddcjjcd@reddit
I developed Android apps for 13 years without a single unit test.
I did however vigorously test on real devices and identified bugs that way.
It worked for me.
WenYuGe@reddit (OP)
That's incredible! Do you work solo or in a team? What type of apps?
I'm genuinely interested in exploring if the 100% test coverage goal is like the book clean code, and should be taken with massive grains of salt.
deeringc@reddit
100% test coverage is a foolish requirement IMO. In any codebase there's ~20% of code that has very little value being tested and gets increasingly difficult to test. I don't think that coverage should be a score to achieve. Rather, it's a tool to spot testing gaps and trends over time.
GMNightmare@reddit
It's the correct requirement. You test all the code you write, period, simple. No more threshold games.
The ~20% of code you don't want to test is exactly the code you need to test. Not only is it the worst written code (hence why you struggle to test it), it's often where all the bugs hide (for largely the same reason).
Requiring 100% coverage does a few things.
1) Proves all the code is testable.
2) Every developer then has to write tests for all the code they write. Simple rule. Nobody is writing tests for somebody else's code, like:
3) Prevents the common occurrence of trying to make a simple change, such as a hard-coded value in a method, and then having to write tests for the whole class because the previous developer skipped tests for it by writing enough tests elsewhere. This is a constant problem whenever I've been on a threshold team.
4) Ensures you always get the value of highly tested code, instead of only if you're working on the part with tests.
The only arguments against it generally amounts to, "I'm bad at testing and don't want to." Like, people are outing themselves here.
sards3@reddit
The actual argument against it is that it takes a lot more time and effort, while providing not much marginal value in return. The goal of your project is to deliver some working software, not to deliver a quality test suite.
GMNightmare@reddit
Yes, that's the, "I'm bad at testing and don't want to" part. No, it really doesn't. Not when you start out with 100% coverage and simply maintain 100% coverage.
People who only have threshold goals are often confused because they keep running into having to write tests for code they haven't developed due to changes tipping a threshold, which is a huge drag on time and requires a lot of effort as it's code they didn't write.
If you're writing code that takes a lot of time and effort to fully test, you're writing bad code.
Patently false. It's an excuse by people who largely never worked on such a project. Value gains are huge. As I just outlined. Even just by the process alone. No more arbitrary thresholds, no surprises over coverage drops, no more excuses: If you write the code you test it.
sards3@reddit
I was referring to value gains for delivering working software products, not value gains for your testing process. It seems like you are saying that a requirement of 100% test coverage is better than an arbitrary requirement of 80% or whatever. Maybe, but they are both bad.
GMNightmare@reddit
I did not list only value gains to the testing process. I listed value gains in delivering working software products. Nothing I said was specific to tests.
Improvements to workflow are value gains in delivering working software products, since development is iterative.
My initial claim remains true: The only arguments against it generally amounts to, "I'm bad at testing and don't want to." Like, people are outing themselves here.
You have not made any statements otherwise. In fact, you didn't even try this last post. You just claimed 100% coverage is bad at this point. No, it's not. Developers who don't test their code are bad. Skill issue. Problem exists between chair and monitor.
psycoee@reddit
I work in a regulated field. Let's just say none of the companies I worked for had 100% unit test coverage. Some had no unit tests whatsoever. Sometimes unit tests are a good idea, other times they are impossible or meaningless. For example, driver or kernel code is extremely difficult to test with unit tests. Your test harness would have to replicate the behavior of the actual hardware in some sort of emulator, and most bugs occur precisely because the programmer doesn't precisely understand the hardware behavior in corner cases. A much better option is hardware-in-the-loop tests where you run the code on actual hardware and test it by feeding it simulated inputs.
Unit tests make the most sense for things like self-contained algorithms. It makes sense to test algorithms, and the tests are meaningful and document the behavior. It makes sense to unit test blocks that have complex logic. On the other hand, it's not useful to have tests that are just mirror images of the code. You have to use engineering judgement.
Unit tests don't really work for complex systems where most bugs are related to concurrency and interactions between modules or external factors (hardware, networks, etc). And having a large suite of unit-level tests can easily double the amount of code that needs to be changed if something is being refactored. The optimum is almost never 0% or 100% test coverage. You want unit tests for stuff that benefits from unit tests, and other types of tests elsewhere.
For example, if you are designing an ECU for a car, you probably want to put it in a test harness with a simulator of an engine and exercise it through various operating conditions. Unit testing might make sense for a number of modules, such as the communications stack or e.g. the real-time scheduler. However, it's not sufficient on its own and in many cases is not terribly useful (e.g. if it's code that isn't expected to change after it's debugged and thoroughly tested).
Sage2050@reddit
I work in hardware which sometimes means embedded systems and this is how we do all our testing, unit tests confuse me as a concept.
FredeJ@reddit
I agree. Everyone’s preaching unit tests everywhere and the example is something like apostrophes in forms.
And I’m sitting here thinking about how to unit test this driver. I can’t mock the hardware - 80% of the development is figuring out how the hardware works in the first place.
JustinsWorking@reddit
Its because they’re all web developers ;)
billie_parker@reddit
You could mock the hardware, but I acknowledge maybe the tooling available is insufficient and makes things more effort than they're worth
WenYuGe@reddit (OP)
This is end-to-end testing, which is still testing IMO. I think this is perfectly fine/valid
I've been to places where we did mostly E2E instead of unit testing.
xampl9@reddit
That’s pretty much how you used to do it before the introduction of unit tests. You ran the application and fed it test data.
DeathByWater@reddit
I've worked in places that fail the pipeline if it goes under 100% coverage.
It's fine - for most scenarios, there are similar existing test setups elsewhere. For something very different, it might take a bit longer to figure out the right mocking/stubbing.
It's also not enough - plenty of things still fail between services, at integration.
My preference is to require 100% coverage, but explicitly exclude certain low value files (and do your best to make them small)
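In coverage.py terms that can look roughly like this (a sketch, assuming pytest-cov; whole low-value files would be excluded via the coverage config's omit setting rather than in code):

    # Run with: pytest --cov=myapp --cov-fail-under=100
    def parse_port(value: str) -> int:
        port = int(value)
        if not 0 < port < 65536:
            raise ValueError(f"port out of range: {port}")
        return port

    def debug_dump(state: dict) -> None:  # pragma: no cover
        # Dev-only helper, deliberately excluded from the 100% requirement.
        for key, val in sorted(state.items()):
            print(f"{key}={val!r}")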
Worth_Trust_3825@reddit
Imagine the manhours you wasted clicking away manually when you could have a script do that for you.
teslas_love_pigeon@reddit
Yeah, no offense to them but how is that an exactly good endorsement?
"I wasted 13 years manually verifying easily automated behavior."
BlandInqusitor@reddit
Real question: how long has automated testing been a thing?
doinnuffin@reddit
Surely you don't mean CrowdStrike?
chihuahuaOP@reddit
This is why I adopted test-first development. I was creating notes with data for debugging anyway; by creating my tests first, all my "notes" are well organized and incredibly easy to reuse in my factories. All of this creates a super easy development experience and improves my efficiency.
notoriouslyfastsloth@reddit
if its not valued by the org why would one do it?
LetheSystem@reddit
Feel like I'm a bit of an outsider in this conversation.
I wrote a test. Once. 20 years ago. Decided it gained me nothing that a manual test harness wouldn't. A little bit of logging, done.
I've been writing code full-time since 1995. VB, C++, C#, a handful of other languages. SQL. It's not been uncommon for me to spend literally years doing nothing but write code. It's also not uncommon for my applications to run for a decade without intervention. I'm talking to a customer next week who's finally ready to rewrite an app I wrote them in 1998. Just upgraded a series of my websites which have been running since 2008.
Tests play a role in a certain kind of application development. I don't think they belong in every workflow, nor do they necessarily produce golden code. Code with tests isn't inherently superior, cleaner, freer of bugs. They're necessary in certain pipelines, fine. But they're not necessarily necessary.
jdrobertso@reddit
Considering that you say things like "...an app I wrote them..." and "...my websites", I'm going to guess that you are writing code in projects on your own, without a team who has to understand and maintain your code. You are then, it sounds like, passing off a "complete" piece of software with no expectation of maintenance.
In general, when people are talking about code that needs tests, they're talking about code being maintained by teams of 5-20 people, where multiple people are contributing daily and upgrading functionality basically constantly. I'm currently working on a super complicated piece of code that has been hand-developed over the years by one person, so it has no tests, no documentation, and none of it makes sense to anyone but this guy who left the company. When I make a change in section A of the code, and my teammate makes a change in section B of the code, we might accidentally step on each other's changes when the user clicks a button on page C, that neither of us has touched.
In the world of software development where you're working on a team, developing something that others are going to maintain, tests are absolutely critical. When you're making a Wordpress site for a mom and pop t-shirt printing shop? Sure, skip the tests.
LetheSystem@reddit
Generally I work alone, but have run teams of 12-15 for maybe four years, a handful of smaller teams, some colocated, some dispersed (California, UK, India, Malaysia). Applications of many millions of lines of code (for lack of a better measure). Running inside three biotech companies. Running the financial portion of the custom ERP of the largest ship management company in the world. Running robotic test systems.
It's just a different way of doing things. And the argument pisses me off, honestly, because people clutch testing so tightly and don't understand that a different way can also work, and has done for the majority of programming's existence.
I'm sorry your previous coder was idiosyncratic and you can't just throw it away. I've had the luxury of doing so several times. Or that you can't pick their brains, but that often doesn't work well either.
If testing helps you get where you need to be, awesome. I wish there were a better way to reverse engineer the thing than to code and test to see if it's good code, though.
Personally, though, I hope to die before I write a test, get approval for a PR, or can't publish straight to production. At billion dollar companies, who frankly don't care about any of those things.
adobeblack@reddit
You can qualify yourself however you want based off your YoE and team size, that doesn't change how stupid your position is.
LetheSystem@reddit
Calling someone stupid isn't a really good way to argue. Whatever.
schmuelio@reddit
I think your justification needs a lot more than functionally saying "I don't need it and I'm qualified to say that".
Your process is (self-admittedly) pushing untested code directly into production with no oversight, I don't know why you're surprised that "trust me it's fine" isn't enough for some people.
LetheSystem@reddit
Yes, there's much more to be said on this whole topic. Reddit isn't the place for that discussion, and I'm frankly sorry I said anything here, as I alluded to with my original comment.
As to "untested" - I don't believe I said that. I mentioned test harnesses if needed. One's own manual tests and debugging. I didn't mention designing the entire pattern of my application to fit into a testing methodology, no. That I didn't.
As to, "I'm qualified to say that," my presenting applications and credentials was in response to the person who attacked my qualifications by saying that I was building wordpress sites for mom and pop t-shirt printing shops. True, that was an appeal to authority on my part, and not good argument methodology.
I think it's fair to say, though, that there are different ways of going about the whole software development process than the test methodology currently en vogue. Pointing that out should be a reasonable thing to do, and shouldn't be met with pejoratives and attacks.
Plenty of large companies don't have stage / test and push straight to production, by the way, for whatever reason of their architecture, practices, etc. They have processes to make it work, not all of which involve automated testing.
I'm not surprised that some people can't get by without tests - it's how they've been taught, and how they work. It is not the only way, however. Other ways are just as successful, if not more, depending on the task and the players.
So, "trust me it's fine" isn't something I believe I've said, and I'm not surprised at the responses here.
schmuelio@reddit
I don't think anyone is saying "not testing" doesn't happen, they're saying it's a bad practice.
You said multiple times that you don't write tests. Including saying that you hope to die before you have to write a test.
Test harnesses aren't tests, they're frameworks for writing tests in. A test harness is usually a good idea for large software but it's by no means a requirement for having tests.
Being generous, there are three ways to read your comments so far:
Listing your qualifications is about the closest thing to justifying your position you've provided. I can ignore them if you don't think they're relevant, but that leaves you with even fewer stated justifications for thinking that way.
You didn't say it, but when you provide no reasons for thinking the way you do, then you are just asking people to trust you when you say it's fine.
Finally, to address the sort of underlying bits of what you said. I think you're severely missing the point here. Having bad practices be common and accepted things in industry doesn't make them good practices, and saying they're common and accepted isn't a justification for not pushing to improve those practices.
The reason why writing repeatable tests for your software is a good practice (compared to just not doing that) is that human beings make mistakes, they fail to consider every use case, and generally miss things. Tests provide an extra layer of checks to reduce the likelihood that those mistakes make it into the final product.
That doesn't mean that all tests are always better than no tests, tests are subject to the same mistakes and oversights that code is, but they do introduce an additional place where something has to go wrong for a mistake to make it into the final product.
Not pushing directly to production is an extension of that reasoning, it adds more places where something has to go wrong for a mistake to propagate through.
Also, this bit:
So this was interesting, because those "processes" to "make it work" almost always involve human beings manually checking that everything works as expected. This is testing, it's just explicitly less efficient and less effective than doing the same thing but automatically.
The reason why it's interesting is it sort of implies that you don't consider manual testing to be testing. Which implies that you do test things, you just don't want to have the computer do it for you.
LetheSystem@reddit
When I said that Reddit wasn't the place for the discussion, that was to say that the discussion is far more involved than comments, on a phone (made it to a computer now, whee!), where nuance is basically absent.
I believe "testing" in this discussion to mean "automated testing" rather than something performed by the developer - to mean tests which run as part of the check-in process, or what have you. I do not write automated testing, and do not work with teams which do, for one because I believe that they add less than they cost.
I'm not attempting to ignore my academic qualifications - I'm saying that I agree, my academic qualifications should not necessarily lend that much weight to my arguments, as they simply indicate that I got an education. I may feel differently, but I also have a degree in analytic philosophy (logic), which means that I understand that the logical argument of leaning on my degrees is flawed. I will certainly not ignore my professional achievements, however, which do include significant applications and leadership.
I'm not asking that anyone trust me. I would, however, like for people to consider the alternatives to automated testing. I'm sure I've represented that poorly in the comments.
As you point out, there are opportunities for failure all through the development process. I merely point out that slavish devotion to automated testing will not solve those failures, there are alternatives to that methodology, and that those alternatives are routinely ignored.
As to manual testing being less efficient and less effective, I'm sure there are metrics. I'm also sure that I can offer anecdotal evidence to the contrary of those metrics. Do I believe my experience over someone else's study? Yes, actually, because those metrics do not meet my experience.
I guess I would say: "having the computer do it for me" is an interesting framing - when I'm building my own app and exercising what needs testing or debugging, that already is having the computer do it for me. I'm just not going for code coverage or anything like that, and not building my whole application around testing (e.g. via dependency injection). What I do is what I would call debugging rather than testing - it's the developer working out the kinks. If they're good enough, enough kinks get worked out that it's a functional application. That's not to challenge anyone's skills, but to say it's a methodology you get used to: you get good at doing the work, or at recognizing where there's a problem. Test-centered programming... let's just say it's not for me, at the very least.
schmuelio@reddit
To bring it to a close, I'll note that you haven't really explained what your process is (or why it's better, but at the very least what it is would be good).
You've mentioned several times that there are alternatives, but I think it's noteworthy that you haven't actually detailed what those alternatives are. Let alone why they are preferable or what benefits they offer.
LetheSystem@reddit
What I do on a project?
I may spend longer working on the CASE tool than I do on the code I generate with it, and longer on that logic than anything else. The CASE tool generates the application pattern I want to generate, and if done well then I don't have manual edits to the code when I'm done.
In terms of the benefits: by the time I'm done with the CASE tool, I (should) completely understand the application I'm building. It's probably fair to say that I'm "testing" when I'm working on that tool - iteratively working out the code in aggregate or something.
As to why it's better. I once rewrote 2 years of a guy's work, in 2 weeks. It was just a pattern and he was hand-crafting it, we were stalled, it had bugs. I was tired of waiting (and young and cocky) and designed a CASE tool. Now his code wasn't buggy and we could get on with things.
I have a full career of applications behind me, quite a few of which are still running and have been for decades. In terms of benefit, there is another. My applications have generally lasted decades (and no, they're not trivial).
It's not much of a process, you'll say. Maybe. But I'll (tongue-in-cheek) quote Don Knuth and say it's the Art of Computer Programming.
schmuelio@reddit
You write a custom CASE tool per project?
That seems unnecessary, CASE is meant to be general purpose. Although I suppose there's a spectrum between DSL and CASE. If you're reusing CASE tooling (which you sort of should be, since the main benefit is that it's reusable) then you probably should be testing it, since the codebase for that tool will change over time as you need to add new stuff to it.
Either way from the sounds of things it seems like you don't do much code maintenance if that's your general process. Picking up a piece of legacy code and working with it (the vast majority of software and tooling companies) is a whole other ball game. Given how you're talking I'm guessing you're a contractor who's brought on to projects to write new software? If so I wouldn't say that's generally applicable to most dev careers.
LetheSystem@reddit
You're correct, I do reuse, and maintain those CASE tools as part of each codebase - they're left with the client and I'll upgrade them as opportunity allows. I'll blow away thousands of lines rather than edit one - doing a diff to be safe, etc. - so maintenance preferably looks like that. When it can't, it might be a total pain, and that's possibly where automated testing would be beneficial... if I'd spent the time to create the tests, of course. With my apps, maintenance tends to be very infrequent*, so I really don't think I could justify it.
Everything I've ever written has been my database and only one hasn't been my complete architecture. All greenfield development, all database centered**.
When I'm confronted with legacy code, it's to rewrite it. That means dissection and a brand new application. Which may mean writing an application to understand the legacy app, and that'll be more ad-hoc, tearing at the app, data, code.
And yes, contractor has been probably 2/3 of my career. Brought in to deliver a whole application, start to finish, from requirements gathering to team formation / hiring. Sometimes project management formally, sometimes more architect / team lead. Quite often solo.
I have sensed that what I do is radically different to what a lot of programmers do, how we work. I haven't ever had an opportunity to see how "modern" craft looks. Quite a bit of it sounds like anathema, as I'm sure I've expressed.
**Marketplace competition algorithms, demand planning, complex invoicing. Things running biotech processes, applications running on about 1,000 container ships. Databases running inside medical instruments. Pieces of a custom ERP running in over 100 countries.
schmuelio@reddit
To put it bluntly it seems anathema to you because you pretty much only write new tooling and applications that adhere to your design.
In "modern" (automated testing isn't new, Jenkins is 13 years old and I'm sure it wasn't the first of its kind) enterprise software development that rarely happens.
Also to be blunt, the examples you've provided are mostly not things that companies would bother maintaining, embedded software usually is left alone (I would count the container ship application as embedded), ERP would very likely get replaced rather than updated, biotech is full of legacy code (of which your applications would be one), and so on.
Most developers work for a company that produces (for internal use or for sale) one or more complex tools that they didn't write. That necessitates at least dozens of hands working on it over the course of years. By "complex tools" I mean things like Visual Studio, Outlook, compilers, document editors, analysis software, website frameworks, virtualization platforms, operating systems, design tools (like CAD), etc. These things have massive and wide-reaching use cases and get used by a ton of (usually untrained) people, and need to work anyway.
LetheSystem@reddit
I do not disagree. Totally different work, theirs and mine. There's a lack of knowledge about the other on both sides, and some looking down. Different development and application lifecycles.
I'm not sure it would justify automated testing on my end, to drag it back around to that. And that's the beginning of this: there are different, valid methodologies other than automated testing.
4THOT@reddit
It's not an appeal to authority if you are an actual authority, it's just being an authority on a subject.
LetheSystem@reddit
In the nitpicking, philosophical sense, it's an appeal to authority. ...when there is controversy, and authorities are divided, it is an error to base one’s view on the authority of just some of them. Because I accept that there are others with authority here, whose opinions differ, I cannot in fairness make statements on my authority and expect that authority to have sufficient weight to counter theirs.
4THOT@reddit
In the nitpicking, realistic sense, everyone saying 'appeal to authority' is using it as shorthand for 'fallacious appeal to authority'.
Treefire_@reddit
It is absolutely still an appeal to authority.
ReasonableLoss6814@reddit
I don’t think you understand their position, which frankly makes you look even stupider.
schmuelio@reddit
This is a terrible position to hold. Pushing your changes - untested - straight to production, with no oversight?
That's a recipe for disaster, and relies basically entirely on the assumption that you never make a mistake or overlook anything.
billie_parker@reddit
You realize that works against you, right?
LetheSystem@reddit
That I've studied, extensively, in the field? And don't believe that automated testing is worth doing?
billie_parker@reddit
The stereotype is that academics don't know how to program
LetheSystem@reddit
Ahh. Hadn't even seen that, thanks.
There's too little space / time in here to really give a coherent argument, or picture of oneself, one's skills, one's experience.
That I started on education after about 8 years of experience, because I wanted to become better at it.
Or that I wrote a medical records and billing system when I was 12, and upgraded it a few times until they bought something off-the-shelf when I was about 30.
It's annoying, not to give the full picture. It's worse than Covid WFH.
praesentibus@reddit
tf m8
WenYuGe@reddit (OP)
same. no disrespect. genuinely surprised this sentiment exists and would love to hold a conversation about why this is the thought process.
LetheSystem@reddit
"same" to whom, please? And which sentiment / thought process?
WenYuGe@reddit (OP)
To your experience and sentiment toward testing. I genuinely am curious how you ensure anything you write works and continues to work. I write tests mostly to convince myself that these things are working somewhat according to my expectations.
I'm curious about the other approaches and thought processes :D
LetheSystem@reddit
Well, this kind of formalized testing came about when I was ten years into this, for one thing. At that time, you created a test project and ran tests manually, and it was a whole lot of work for little reward, from what any of us (at HP) could tell. So we didn't look at it again.
We'd build a test harness - an app, usually console, with a bunch of methods we could use to exercise the functions we wanted to debug. In other places, we'd put in logging. And that's about it, really. Build a test harness if debugging it isn't possible in the ide. Otherwise, it's all down to debug skill.
Don't get me wrong: the things I put out have been tested, but by me or my team, knowing what we've written. Are there bugs that hit production? Yep. There always will be. I kinda justify it by saying we make progress. But really it's 'cause it's not life critical.
Yes, I'm a fossil.
I also don't give a damn about anybody reviewing pull requests, and I deploy straight to production. C'est la vie.
gpunotpsu@reddit
It really depends on who you're working with. I made a huge set of tests for some of my coworkers because I was sick of them breaking things. It's paid off in spades.
LetheSystem@reddit
This is the sort of use for tests I appreciate. Thanks!
ReasonableLoss6814@reddit
Sounds pretty normal to me, just a different way of testing. Also nothing wrong with fast feedback loops by deploying to production early and often.
praesentibus@reddit
I worked in finance for a while with a guy who said, I quote, "I don't believe in unit testing." That dim view of automation was fractal - he didn't have test databases, and didn't even use C++ smart pointers; he'd bump reference counts by hand, and his code leaked like a sieve. Whenever I'd point out a place where he forgot to decrement a refcount, he'd be like "sure" and would insert the line there, as if it was nothing to think about. Absolutely no concern for the pattern, only for the immediate.
His code was many shades of awful, way more complicated than it needed to be, with shitty names too. Nobody could scrutinize it and the guy had great job security. As an aside, he was a recent graduate from an Ivy League school and he was firmly convinced he had nothing to learn from his coworkers.
He was fired the day his code blew up in his face on an important day when an Asian market opened.
The fact that people harboring such views hold jobs in this industry is why we can't have nice things.
Polokov@reddit
It's simple really. Code that is written defensively, that builds its business logic on trusted invariants, that is thoroughly manually tested on all supported scenarios (plus some extra probing of the data space), and that is only modified in ways that respect those invariants or clearly define new ones, will live a great life.
Basically, automated testing works as a safety net for people who don't care enough about invariants, either in defining them or in respecting them. Automated tests or not, the only way to have a solid codebase is for it to be predictable when you make changes.
You've probably had cases where you knew exactly which tests needed to change after a code edit, and maybe even updated them correctly before running them, so everything was always green. Well, with the proper mindset and codebase, you can work that way 100% of the time, and manually test the code paths impacted by your change.
Swamplord42@reddit
That's why people don't want to hire old developers. No one wants someone that tried something once 20 years ago, decided that was enough to have an opinion and won't budge from it even though it goes against industry best practice.
Having the position that tests aren't worth it most of the time is reasonable. Having the position that automated tests are never worth it isn't. Not even re-evaluating whether this position makes sense periodically is insanely closed minded.
LetheSystem@reddit
Didn't say I hadn't kept up with changes in the world. Didn't say they were never worth it. Didn't say I hadn't reevaluated them. Said it was a different kind of development. Jeez, read, don't read into.
People don't want to hire "old" developers because of a variety of reasons, ageism is one, cost is another.
BigHandLittleSlap@reddit
Looking at your comment history, it seems you use mostly PHP and JavaScript.
Yeah, those would be madness to use without automated tests.
In the world of strongly typed languages like C# (ASP.NET), the value of tests is a lot lower because the type system does many of the same checks implicitly at build time.
Something you would need a unit test to discover in JS the compiler will simply report as an error in C#.
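(A rough TypeScript-flavored sketch of that idea - not anyone's actual code here, just an illustration: the kind of mistake you'd need a test to catch in plain JS gets rejected at compile time once types are involved.)

```typescript
// Plain JS: a mistake like this only shows up at runtime, so you need a test to catch it:
//   applyDiscount({ price: "19.99", quantity: 1 }, "10%")  // strings sneak in, NaN comes out

// With types, the same mistake fails at compile time, no test needed.
interface LineItem {
  price: number;    // in cents
  quantity: number;
}

function applyDiscount(item: LineItem, discountPct: number): number {
  // Returns the discounted total, still in cents.
  return Math.round(item.price * item.quantity * (1 - discountPct / 100));
}

// applyDiscount({ price: "19.99", quantity: 1 }, "10%");
//                        ^^^^^^^ error: Type 'string' is not assignable to type 'number'
```

Of course that only covers shape errors - a discountPct of 250 still type-checks fine, which is exactly the business-rule gap the replies below point at.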
FullPoet@reddit
Honestly I feel from your comment that you don't understand why you need automated testing, or have never built anything bigger than a toy project or basic CRUD software (where most of the time you could easily get away with no real backend).
The type system does not and cannot protect your application from producing data that is invalid from a business perspective but valid from a type perspective.
BigHandLittleSlap@reddit
Oh.. let me guess: you’re a user of MySQL too?
I use SQL Server and encode these rules into constraints I can trust to be enforced by the database engine.
FullPoet@reddit
What are you smoking? That isn't what you wrote - you said the .NET type system, not DB constraints.
Like I said, that works for basic CRUD applications. Sure, it's possible to write DB constraints for business rules, but why would you? Splitting business rules between the .NET application and whatever DB engine you're using is insane and prone to tons of errors.
I can't count how many times I've had to dig out edge-case business rules that weren't documented at all but were implemented as DB constraints, triggers, views, etc.
No, I don't use MySQL, it's not 2010, nor do I use MSSQL - frankly it's irrelevant which DB you're using. Let the DB store data and the application worry about business rules.
LukeJM1992@reddit
Yes and no. I would say type checking operates at a very shallow level of your business logic. The real meat and potatoes of unit testing happens a bit deeper, with your classes, methods and modules, and types won't get you nearly all the way there.
BigHandLittleSlap@reddit
This also depends on the type of code you're writing. Trivial read-only query code rarely needs unit tests. Complex business logic with "cleverness" is definitely the scenario where I would start sprinkling unit tests on.
newsreadhjw@reddit
If it’s important, customers will test it and let you know
CorrectCount2808@reddit
Has always been the biggest problem for me working with devs!
LessonStudio@reddit
I've been creating software for many decades. I've consulted with, hung out with, and known many people who have all worked with many companies.
The number of companies doing no tests would probably be 90% (or more); I don't count a few notional tests which haven't been run in 20 builds.
The number of companies doing notional testing (less than 30% code coverage) and not including them with a CI/CD would be the bulk of the remainder.
I would guess around 1% of companies are doing tests with more than 80% code coverage which is also at least somewhat part of the workflow. This could be CI/CD or at least part of a code review or some such.
I'm not even including two bit companies building wordpress restaurant sites. I'm talking people who make medical stuff, train (rail) stuff, oil & gas stuff, utility stuff, etc. By stuff I mean software and hardware with embedded code.
These are systems where billions are lost, people die, and ecological disasters happen if something goes wrong.
Yet the people doing these things will often claim what they do is "rigorous". Examine that rigour carefully and it turns out to be "rigorous" because it's done by electrical engineers who have a PEng. Or they'll claim they have a "rigorous" code review process, yet it doesn't look at unit tests, integration tests, or even static code analysis; it just looks at code style, comment style, file naming, etc.
Often in these high value systems they will have manual testing. Except they are often very complex spaghetti architecture systems where code in one spot can affect functionality almost anywhere, thus a manual test focusing on the changes could easily miss the fact that some other critical functionality has entirely crapped the bed.
Here's my own personal experience with testing: Once my system has become even mildly complex I like my tests. They often find weird little bugs; my tests tend to beat up the code pretty hard. Insane inputs, zillions of attempts, null objects, the lot. When a bug is found outside of a test, a test is then created to exercise the bug. Then, this test starts passing when the bug is eliminated. Regression is monitored through the tests.
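(A minimal sketch of that find-a-bug, add-a-test loop - Jest-style, with a made-up module and bug report:)

```typescript
// Bug report (hypothetical): parseAmount("1,000.50") returned 1 instead of 1000.5.
// 1. Write a failing test that reproduces the report.
// 2. Fix parseAmount until the test passes.
// 3. The test stays in the suite forever, so the bug can't quietly come back.
import { parseAmount } from "./money"; // made-up module for the sketch

describe("parseAmount - regression for thousands-separator bug", () => {
  test("handles comma-grouped input", () => {
    expect(parseAmount("1,000.50")).toBeCloseTo(1000.5);
  });

  test("still handles the plain case", () => {
    expect(parseAmount("19.99")).toBeCloseTo(19.99);
  });
});
```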
The code is also cleaner with the knowledge that I have to make it easy to build a test. More modular, less spaghetti architecture, cleaner API.
The tests are fantastic tutorials on how to use the API, but the unit tests are a great tutorial on how to use the functions within. Often the tests are all about some constraint or requirement. The test will be: System must allow for 1000 simultaneous logins per second; this test pushes this to 10,000 per second.
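(Roughly what a requirement-as-test like that might look like - everything here is hypothetical: the endpoint, the numbers, the Jest-style harness.)

```typescript
// Requirement: the system must allow 1,000 simultaneous logins per second.
// The test fires a 10x burst so the ceiling is found before users find it.
async function burstLogins(count: number): Promise<number> {
  const attempts = Array.from({ length: count }, () =>
    fetch("http://localhost:8080/login", {            // made-up local test target
      method: "POST",
      body: JSON.stringify({ user: "load-test", pass: "load-test" }),
    })
      .then((r) => (r.ok ? 0 : 1))
      .catch(() => 1)
  );
  const results = await Promise.all(attempts);
  return results.reduce((sum, failed) => sum + failed, 0); // number of failed logins
}

test("survives a 10,000-login burst", async () => {
  expect(await burstLogins(10_000)).toBe(0);
}, 60_000); // generous timeout - this is deliberately a slow, heavy test
```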
Timing the tests is great as it can reveal bottlenecks, or new slow crappy code. This last is often a sorta bug. New code might not be killer slow, but now some GUI which was super snappy is now taking 100ms. This is both a waste of compute time, but also means other similar slowdowns might make for a terrible GUI; so fix it now.
But what all this means is that new code is sitting on top of a clean well tested foundation. I will spend very little time fussing with the rest of the system trying to get my new functionality to work. This means my time is spent working on the actual problem, not fighting the crummy tech debt codebase.
This is no small thing. Tech debt of this sort is what grinds productivity to a halt. Features which should take hours are now taking weeks. Weeks of trying not to break a large complex fragile system.
There's a book on legacy systems which goes something like: it doesn't matter if you use DRY, PIMPL, OOP, or any of the other best practices in software development - if you aren't writing tests, you are writing bad code.
The usual attack I see on unit tests is that they don't guarantee good code. Absolutely true. But, no tests do guarantee bad code.
headhunglow@reddit
Yup. I wouldn't even know where to start testing PLC code... Where I work all the safety stuff is relegated to minimal safety PLCs or hardwired, so the PLC programmers can mostly ignore it. Still, a bug in a PLC can (and has) cost our customers millions.
LessonStudio@reddit
I would suggest some kind of external io simulator. At least this could exercise piles of inputs and check against desired outputs.
Then fuzzing where nutty inputs shouldn't bork the PLC.
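(Not a PLC person's code, just the shape of the idea: wrap the logic behind a simulated IO layer, replay known scenarios, then throw junk at it. The interface and names below are all made up, Jest-style.)

```typescript
// A simulated IO layer: inputs in, outputs out, no real hardware involved.
interface IoState {
  tankLevel: number;     // 0-100 %
  pumpCommand: boolean;
  alarm: boolean;
}

// Stand-in for the logic under test. In reality this would drive the PLC program
// through a vendor simulator; the toy body here just keeps the sketch runnable.
function scanCycle(inputs: { tankLevel: number }): IoState {
  const level = Math.min(100, Math.max(0, inputs.tankLevel)); // clamp garbage inputs
  return { tankLevel: level, pumpCommand: level < 95, alarm: level >= 95 };
}

// 1) Known scenarios: expected outputs for expected inputs.
test("pump stops and alarm raises on overfill", () => {
  const out = scanCycle({ tankLevel: 98 });
  expect(out.pumpCommand).toBe(false);
  expect(out.alarm).toBe(true);
});

// 2) Fuzzing: nutty inputs should never crash the scan or produce impossible output.
test("random junk inputs never bork the logic", () => {
  for (let i = 0; i < 10_000; i++) {
    const out = scanCycle({ tankLevel: (Math.random() - 0.5) * 1e6 });
    expect(typeof out.pumpCommand).toBe("boolean");
    expect(out.tankLevel).toBeGreaterThanOrEqual(0);
    expect(out.tankLevel).toBeLessThanOrEqual(100);
  }
});
```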
WenYuGe@reddit (OP)
I feel like I've no idea where those devs are. I've been at hip startups or tech focused companies all my short career. I am genuinely surprised to hear these numbers.
gpunotpsu@reddit
I've been writing software professionally since 1991. Until recently I've never written tests. I shipped a lot of stuff that made a lot of money. I like tests but tests aren't magic. There are many ways to achieve reliable software. The most important one is to not rely on tests for software quality.
Lt_Duckweed@reddit
The ~half of all developers who are in the tech sector and at hip startups are probably statistically more likely to be the sort of devs that are highly passionate about what they do, and are willing to push hard for things, like testing, that add value that isn't readily apparent to management.
Devs in other sectors are more likely to be the sort that got into development because they were good enough at it to use it as an easy way to 9-5 punch their way to six figures.
I certainly fall into the latter camp, and it's just not worth the time and effort to climb an uphill battle against management. I'm not all that passionate about development, it just has a really nice difficulty to pay ratio.
Maxion@reddit
My experience coding as a consultant for companies at various levels (Fortune 500-level companies down to mom & pop repair shops) is that how "good" the process is roughly correlates with the revenue of the company. The lower the revenue, the more sure you can be that everything is a crapshoot. The higher the revenue, the higher the likelihood that you'll encounter well-defined roles, requirements, documentation, CI/CD etc.
The higher the revenue, the older the system(s) tend to be, too. You'll see more Java, C#/.NET, Laravel, more XML, more SOAP, more "REST" (lol).
The lower the revenue, in general you see more Node, Django, React, Github/Gitlab, no tests, no/poor documentation, crappy management.
deeringc@reddit
It may vary per industry. 90% of devs not writing tests is absolutely not my experience in my almost 20 years writing professional software.
psycoee@reddit
What usually ends up happening is these medical device companies started out as 10-person startups with 1 or 2 extremely talented software developers doing everything. They are already working 80 hour weeks, so they are not going to write tests if they don't have to. That caliber of programmers can often write relatively bug-free code, especially since they are doing it from scratch, there are only two of them, and they are probably sitting next to each other. Eventually, the 10 person startup becomes a 10,000 employee company with a multi-MLOC codebase that traces its roots directly to the code written by those two guys. And since they were in survival mode that whole time, they were building functionality, not writing tests. At this point, the code works, the customers are relatively happy, and at this point it's hard to justify a $100M+ investment in writing tests for code that basically already works well enough.
Usually, this attitude changes only when you have a Crowdstrike-scale clusterfuck. But by then, the company is probably in maintenance mode and nobody is adding any functionality anyway.
cosmic_chef@reddit
Idk about these numbers, but I know my team is that 1% because of our culture. Full CI/CD for merges to main and when we tag. Security checks for hardcoded secrets, unit test code analysis that won’t let you merge under 85% new line coverage, test container builds, and when we do merge we perform automated e2e testing where we act like the user. Verify all inputs provide expected outputs. If that fails, we can’t promote anything to higher environments. Any stories we have, our definition of done includes unit testing.
Does this suck sometimes? Ya. But if you’re really feeling lazy you can typically get 80-90% of the way there with copilot. Just ask it to give you unit test cases for a particular class and you’re off refining those until you get full coverage.
Not everywhere I’ve worked is like this though.
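(For anyone curious what the coverage gate part looks like mechanically, the per-build piece can be as small as a threshold in the test runner config - a Jest-flavored sketch below, with illustrative numbers; the "new lines only" rule usually lives in the review tooling layered on top.)

```typescript
// jest.config.ts - if coverage drops below the bar, the test run fails,
// which is what lets CI refuse the merge.
import type { Config } from "jest";

const config: Config = {
  collectCoverage: true,
  coverageThreshold: {
    global: {
      lines: 85,
      branches: 75, // illustrative numbers, not a recommendation
    },
  },
};

export default config;
```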
campbellm@reddit
I've been developing since the early 1990's, and I haven't been in a company yet that didn't have some form of test runs that had to be passed in order to put to production. The 90% value he posted I'd have to see some data for, to be honest.
The companies I've worked for haven't had necessarily good coverage, or quality, or consistency, but there has always been SOME form of automated tests that were a "to-production" gate.
Oakw00dy@reddit
In a team environment, unit tests are great dogfooding. If devs are forced to actually use their own code, it tends to decrease the amount of write-only crap before it shows up in code reviews
thebuccaneersden@reddit
Yeh, it’s unfortunately very common. This be why it is very valuable to have a robust CI/CD pipeline and good leadership to maintain those things.
loup-vaillant@reddit
Oh, I know: in environments where the ones calling the shots aren’t engineers, we’re punished for being thorough, and we’re rewarded for being careless assholes.
It’s not exactly that (as /u/FoxyWheels puts it) management isn’t giving us time to test. Testing shouldn’t even be on management’s radar. They ask us to do something, we’re supposed to do the thing and give reasonable guarantees that the thing works.
Problem is, that second part often doubles the dev time. The temptation to cut corners and look faster is huge. And on teams that don’t test to begin with, you don’t want to be the only slow developer on the team.
But there’s worse: you know what happens to a feature you’ve thoroughly tested, and therefore just works? Nothing. We just forget about it, and you get zero reward. Had you delivered it twice as fast and full of bugs, not only would you look more productive, you’d also get to save the day once the shit hits the fan. Double reward for crappy work, isn’t that great?
And that’s if you haven’t already capitalised on your "productivity" and left for greener pastures, like my tech lead once did. I got to debug his code, and found out the guy was a fucking tactical tornado. His code was full of repetitions and useless comments ("loop over the list" was typical, and that’s an actual quote). The conclusion from my hierarchy? He was the productive one, and I was taking too long to fix simple bugs.
FoxyWheels@reddit
You hit it on the head. You need the majority of the team already testing. If you join a project that’s a few years in and everyone is a cowboy, guess what you have to be if you don’t want to be part of the yearly LR.
codyrat@reddit
I'll tell you another secret. I know a lot of devs that don't document either.
HarveyDentBeliever@reddit
I work at a pretty tight shop now where the team is adamant about total unit/integration test coverage. At first it was a bit agonizing but I’ve learned to appreciate it. There’s lots of peace of mind in that you know there’s a high chance it’s caught by the test suite if you break something when adding new functionality or bug fixing. And “verifying functionality” seems trivial until you start doing it and realize how often you overlook something simple and save yourself the time of having to eventually catch it manually.
PrefersEarlGrey@reddit
Because most of the time testing becomes an exercise in obtaining the x% code coverage mandated by management. Which is pointless and makes the tests do essentially what the compiler already does: verify the code runs as written. Functionally useless.
Edge-case unit tests and day-to-day usage integration tests have value in keeping code quality up over time, but that's a nice-to-have when there's always something higher priority that has to be done first.
WenYuGe@reddit (OP)
I genuinely do want to know how to better justify this investment to management though... or should we actually not try to hit 100% test coverage?
gpunotpsu@reddit
100% test coverage is a lie. 100% of what? There are infinite ways subsystems can interact. You cannot have infinite tests. With an infinite budget (time and money), then sure it's nice that something looks at every branch in the code, but that doesn't actually give you bug free code. With limited time and money there are much more efficient ways to achieve quality. You need experienced devs who understand what the root causes of most bugs are and know how to target the testing budget for the greatest return on investment.
PrefersEarlGrey@reddit
I think it comes down to managing expectations. Management gets the warm and fuzzies when they hear 80-100% unit test coverage because they can tell their leaders "this app has been tested and won't go down". But it's misguided and their true aim of the app being available most of the time is better served with well documented code and useful tests that mirror the actual usage of the app.
BilldaCat10@reddit
A number like 80-100% is a concrete thing and can be put into a contract as a condition that has to be hit.
Something like 'The app has test coverage against common failure conditions' is very nebulous when it comes to that same thing.
Not saying I like it, but it's what I see in the contracting world.
LukeJM1992@reddit
We write unit tests for most new code we write. We don’t have 100% coverage, but critical paths are tested to make sure the user experience is always stable. We have had zero major production bugs in 2 years. I think you need to advocate for reducing bugs both major and minor over a given period of time vs. lines of code tested.
How will management feel when your app makes an incorrect payment, or exposes sensitive information to unauthorized users? Without tests, we are just hoping we did this perfectly, and to that I ask: why would you take that risk?
Unit tests are cheap, and can be written alongside the code at the same time to validate it. They are artefacts of good engineering and process management.
deeringc@reddit
Don't try to hit 100%. Every codebase is different but that last ~20% adds very little value and gets increasingly harder to achieve. The tests get increasingly contrived just to get some extra piece of coverage but without actually testing something useful. It's during this "hard" and "low value add" phase that people and teams get turned off testing. Test coverage is just a tool, we shouldn't have an absolute target but rather it should be used for ensuring things are trending in the right direction (ie. Not dropping) and for finding test gaps.
chowderbags@reddit
Realistically the returns are going to be on an S-curve, where having a few crappy tests is pretty worthless overall, getting 80+% coverage is pretty good, and trying to squeeze from 99% to 100% might be a nightmare that will take more effort than it's worth.
shoe788@reddit
A percentage target is the absolute wrong approach. Tests are for reducing risk. You reduce risk until you meet the desired level of risk tolerance. Risk tolerance will be different everywhere. Management can't standardize "it depends", so they come up with percentage targets they can make everybody obey.
WenYuGe@reddit (OP)
Love this
LordoftheSynth@reddit
A properly chosen suite of build verification tests (and I'm talking like 100) for your product should get your coverage numbers above 60%. If it's not, your build verification has some pretty serious test holes.
Hitting 80% with a full functional suite is not hard. Again, if you're not getting there, something is wrong with your suite.
I do agree it is a game of diminishing returns: tasking SDEs or SDETs with writing increasingly specific tests is a waste of resources. You don't need a test to exercise every possible failure condition, for instance.
SilverCats@reddit
You don't need to test if you have no customers.
If you have customers you always test. Sometimes you test before your customers.
cdsmith@reddit
This is another one of those best practice articles that forgot to say anything interesting. Do I believe that there are a lot of developers that don't write automated testing? Sure. Does writing an article about it do any good? Not really. The problem here isn't that these developers just haven't been enlightened about the value of testing, nor that they don't have "discipline", whatever that means.
The real problems:
In some cases, as well, the people complaining that others don't test just have a narrow point of view. One project I worked on had a comprehensive set of representative real-world data that we were trying to do the best job on, and an elaborate system set up to monitor for changes in the quality of the result and attribute them to individual code changes. In that context, if you determine that adjusting a system parameter improves performance, you can confidently do it. But we still had the occasional "this is the best practice" types sending changes that would tweak one parameter, and then also write an elaborate test that verified that the system really did use that new number that their commit modified in the config file, sometimes even including refactoring to make the code many times more complex to "improve testability". When asked to remove that monstrosity and trust the system, I have no doubt some of them walked away wondering how we hadn't got the memo that testing is a best practice.
(That's not to say that integration testing makes unit testing unnecessary. Quite the contrary, if you are building abstractions, then it is immensely valuable to test at the level of those abstractions so that you have trustworthy building blocks to build with. But there are too many people for whom "write a test" is a checkbox they tick off without stopping to think why the test has value.)
FINDarkside@reddit
What many people here seem to miss is that your job is to get results. If tests don't end up with you producing better results, you're just wasting your time while more efficient devs are miles ahead. How many tests to write depends heavily on the ramifications of something not working, how complex writing those tests is, how fast it is to test manually, how likely it is that the code won't be touched for years, how much the company benefits from getting that feature out fast, and so much more.
george_____t@reddit
Agreed. A lot of code is just obviously correct and testing it is a waste of time. Getting to 100% coverage often doesn't seem the best use of finite developer resources.
I've often wondered to what extent the mentality stems from the use of weakly-typed languages where innocuous-looking code can fail in unexpected ways. I know from experience that a lot of JS/Python devs are horrified by the lack of tests in a lot of my Haskell projects. But I test what's most important, and they largely just work!
stayoungodancing@reddit
That’s a bit of a stretch
RupertMaddenAbbott@reddit
I agree that the structural problems you talk about here are also problems.
However, I have worked with plenty of people who say the things this article talks about and believe this is also an individual issue that needs to be addressed at the individual level.
In fact, I think these individual problems feed into the structural problems. I have observed planning meetings that have gone like this:
There is a competency issue when it comes to testing. Many developers are of the opinion that it is not their job, or they have a very low bar of quality for what is acceptable.
I completely agree with you that some developers take testing too far. I have also seen people put huge effort into testing and get almost zero value (sometimes negative value). This is the same competency issue refracted into a different result. These developers don't understand testing properly and thus are unable to strike the right balance.
1337_BAIT@reddit
No code review without unit tests
charcuterDude@reddit
I manually test code, but never unit tests unless it's a personal project. My entire career has been "push that shit out as fast as you can."
If you work somewhere that actually gives devs time to do a good job and you're hiring for remote .NET developers please DM me your career page. That sounds like a dream come true.
alwyn@reddit
They lack that important gene called "Attention to detail" and "Pride in your work". The only QA issues that ever came back to me were from QA's testing in their own hacked together environment.
njharman@reddit
I think I failed an interview because I wrote unit tests for their take-home coding challenge.
Talking to the team, I think I scared them. It was a tiny company and the team seemed self-taught and less sophisticated, more start-uppy. Which could be what's needed. I wasn't gonna be a good culture fit.
popiazaza@reddit
Really depends on whether the manager is willing to give enough time to write proper tests.
I'm not gonna work overtime for that.
They give me a strict timeline and say "we already have QA" (all manual testing, btw).
It's not worth fighting for, instead, I'll ask for time to fix the problem later. Same goes for security.
Don't blame the player, blame the game.
This is the agile they want, and I just want to get paid at the end of the day.
nextstoq@reddit
Same. I've been a dev for over 20 years, most of that in web development. Very rarely is there budget for unit testing. If you're lucky there are QAs who have automatic or at least some sort of structured testing. Bottom line for the client is their bottom line - it's usually better economically for the stuff I work on to get it live with a few bugs than to test it so it's "bug free".
Maxion@reddit
Exactly, there's 0 business value in a project that goes over budget, or is released late. There can be a lot of business value in buggy software released on time. This is what a lot of managers (and purchasers) have to deal with. Not every company in need of custom software is a 1-billion-a-year revenue behemoth; companies with ~10-20 million a year in revenue also need software, but they don't have the margins to pay for more than 1-2 developers. Hence you're always incredibly time constrained.
jackmans@reddit
I think this depends drastically on the project and the reason for the budget / timeline. Many timelines are highly arbitrary and being late by a week or two makes no difference whatsoever.
Maxion@reddit
That is true, I guess when I used the term "late" I meant it more in the "truly late" sense, when the project is no longer economically viable, and not in the "stakeholder late" sense, where it's a date some person or committee pulled from an RNG or the coffee stains on the conference room chair.
ROGER_CHOCS@reddit
We make billions a quarter and there is no room for unit tests.
LukeJM1992@reddit
I see your point, but unit tests should be written while you write your code (more or less), and if a client experiences a major bug in prod then that puts you immediately on the chopping block. It would seem sensible to write tests and get ahead of that, no?
nextstoq@reddit
Obviously it depends a lot on the project and the client. I am not involved in product development, nor any sort of life critical application, but tailor-made website/commerce solutions - which often and regularly require new additions and changes.
The clients I work with are simply not willing to pay for the time to write unit tests, knowing that they'll constantly require rewriting as requirements change.
They accept the risk of downtime. We do also run with QA-systems, and dark "live" systems for testing too. To be honest, downtime I have experienced due to programming bugs has been minimal over my career.
Kamay1770@reddit
Yeah, we all start with good intentions but time is money and management puts pressure on getting shit out the door.
They know lack of testing can lead to issues, but it's weighing up:
'how often your devs lack of testing actually causes issues'
vs
'how much time and money would be wasted needlessly testing stuff that is unlikely to cause issues'
vs
'how much would the damage cost if it did go wrong'
Then you decide where and when to test. You don't need unit tests for, or to test, everything.
Another thing is to just hire only very senior developers to reduce your risk of issues as they're more likely to get it right first time.
It's complex and nuanced, but it isn't always ignorance or laziness. It's often a business decision.
Savings_Row_6036@reddit
No tests, no testing, and not even writing the code they commit.
EvaUnitO2@reddit
This only gets fixed when the business begins to value testing in addition to valuing feature development. Otherwise, testing will always be pushed back down to the engineers as a thing they can do if they want, but that should not be eating into the rest of their effort toward new feature development. That's why testing often just never happens.
It's also why tech debt tends to mount and mount.
foxcode@reddit
I'm dumbfounded by the number of devs who advocate for 100% test coverage, with no consideration for the current state of a project, or an acknowledgement that tests do not all provide the same value. Context matters.
If a piece of code is handling resource authorization, then that code is more important than most random buttons in your user interface (exception for buttons that fire ze missiles!). Some code is more important than other code, and its correctness matters more for the business and/or developer sanity. Writing tests for that code has more value. This is the value factor.
Now there is the cost factor. Pure functions are generally easy to test. Tools like dependency injection can help you here, but there are limits to how pure your code can be. User interfaces again are an obvious example. You generally have to fake a lot of functionality, using a browser like environment, with many layers of magic to make a real integration test work.
What I'm suggesting is you could plot a graph. Ease of testing against the value of testing, and while I don't suggest actually drawing it out, I think this is a better approach to testing. 100% coverage requirements lead to box ticking, and huge amounts of time spent fighting with layers of magic that get updated far too often.
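(A throwaway sketch of how the two axes interact - the pure authorization rule below is high value and nearly free to test, while the button that eventually calls it is where the expensive mocking lives. All names invented.)

```typescript
// High value, near-zero cost: a pure function with real business meaning.
export function canEdit(role: "admin" | "editor" | "viewer", isOwner: boolean): boolean {
  return role === "admin" || (role === "editor" && isOwner);
}

test("viewers never get write access, even to their own resources", () => {
  expect(canEdit("viewer", true)).toBe(false);
});

test("editors can only touch what they own", () => {
  expect(canEdit("editor", false)).toBe(false);
  expect(canEdit("editor", true)).toBe(true);
});

// Lower value, high cost: testing the Edit button that calls canEdit needs a DOM,
// a fake router, a fake session... which is exactly the cost/value tradeoff above.
```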
Two final points.
Using a static strongly typed language can really help. While I don't enjoy typescript, I've found it helpful in large javascript codebases, both for readability and code quality. Where I can, I'm trying to use Rust, as that provides significant levels of safety.
I think there is a third factor: the likely subtlety of a bug. An error in some code is likely to lead to far more subtle or nasty consequences than in other code. This should be included in the value calculation.
the_unsender@reddit
As someone who has done lots of contracted development/devops work, I can tell you from experience that clients get PISSED when you bill them a significant amount for automated tests. I mean absolutely livid.
jackmans@reddit
This is true, clients tend to have strong opinions about how contractors do their job and how they're paid. However, you don't need to bill them for automated tests specifically. Why not just lump it in with development?
DullBlade0@reddit
Would I like to test? Yes
Do I have projects that could perhaps benefit from it? Yes
But when clients change requirements up to hours before going to prod (I'm not joking here) shit changes way too fast to make that a reality.
And on top of that I'm not getting paid for writing tests, nor do I get time allotted toward them, so what can I do?
RevolutionaryYam7044@reddit
Have you ever heard of this interesting little word called "No"?
Simply refuse last-minute changes. It's not possible in that short time frame. End of discussion.
DullBlade0@reddit
Then the whole thing doesn't work, then the client calls my boss, boss calls my manager, manager calls me, I explain it won't work, a bit of back and forth and in the end I still have to implement that thing because pushing back timelines for the client is a no-go.
jackmans@reddit
Okay so it's the classic issue between you, your manager, and the timelines and quality standards you've agreed upon (either explicitly or implicitly). It isn't going to be easy to suddenly switch from delivering shoddy code quickly to higher quality code slowly. Now it will require a fairly long conversation and argument about the value of testing, technical debt, faster development in the long term vs. the short term, etc. Maybe your manager comes around or maybe he doesn't, in which case the only way you'll really be able to do this is either behind his back (e.g. coming up with various excuses for delivering more slowly) or at a different company that doesn't have an ingrained culture of shipping shoddy code quickly.
farcryer2@reddit
Have you heard of these interesting little words called "Job", "Boss", "Salary"?
Go ahead. Those words mean nothing, right?
RevolutionaryYam7044@reddit
Idk what kind of company you work in, but I'm not afraid to lose my job just because I refuse to sacrifice software quality for unreasonable demands from clients.
If I say something is not possible then my manager has to accept that, because I'm the expert and he is responsible for managing client demands. My manager's job then is not to put pressure on me, but to negotiate with their client.
DullBlade0@reddit
Meh,
If the client wants to deal with downtime and their customers complaining about bugs it's their problem.
Boss/Manager/Me will just bring out the mail where we said they were asking for untested shit to go live and that if they want the fix they can wait the proper time to make one.
Traveling-Techie@reddit
In my career I’ve worked for about 17 companies that developed software, and only 1 or 2 did software testing.
ClubChaos@reddit
In my experience, unit, integration and e2e tests are at best a complete waste of time and at worst a mess that creates an unoptimized, normalized-for-the-sake-of-being-normalized codebase that is impossible to extend or modify without rewriting the tests.
All so some dev can be like "see? The tests are the documentation and proof of work" with a smug grin.
Testing is useless for 95% of projects and I completely agree with theo's view on it.
underNover@reddit
Think tests depend on the domain you’re developing in. Mission critical stuff like infrastructure software for law enforcement or interpol, or medical software? I’d just become a whistleblower. CRUD application to lose weight? Can be worth it.
From my experience though, even if it’s mission critical, management will sometimes still abandon such ideas. We only manually test for example, but bugs keep pouring in en masse after each release, even small ones. Hell, even had a case where one of our clients invoiced too much because someone broke the formula again, and it just got silenced to avoid brand damage.
aefalcon@reddit
Yeah, it's a constant struggle with people I work with. And those same devs usually write code that isn't modular, because writing testable code basically forces you to write modular code.
keepthepace@reddit
In most of the companies I worked with, we start coding without specs written. We make POCs, prototypes and then only products.
You can't write tests before specs especially when these are changing.
You write tests when other people depend on a specific behavior in your code, aka when you have reusable functions or libs. For these it is mandatory, but not all dev work is about that.
mosaic_hops@reddit
Reminds me of Donald Trumps approach to COVID testing… “well if we don’t test people, the numbers will go down right?”
Bloodcount@reddit
You must be new.
psycoee@reddit
I think a good way to think about it is in terms of risk and the cost of a bug. Unit tests reduce the risk of introducing bugs when making a change, and reduce the cost of finding a bug because it can be detected before code is even pushed to the main repo. If you are writing software to fly a plane, the cost of a bug might be in the billions of dollars. If you are writing the code for some entertainment app, the cost of a bug is much lower, possibly close to zero for something that few people notice. So that's one consideration.
The other consideration is bang for the buck. There are many ways to achieve software reliability, and unit tests are just one. There are many other ways. You can do functional tests, hardware-in-the-loop tests, manual tests, formal verification / theorem proving, et cetera. Usually there is a tradeoff between discovering bugs early and fixing them cheaply, and the overhead of maintaining the tests. You might already be doing a bunch of tests on the system level, and so unit tests may be less useful, particularly for code that is hard to test and is unlikely to have serious bugs (e.g. GUI dialogs). Unit tests may not be useful if your code is generated from a high level model, such as a state machine. It might be easier to formally prove certain propositions by examining the high level model.
The last thing is the process around changes. High test coverage is great if you want to have short cycle time, like a lot of DevOps environments. On the other hand, some industries take multiple years to release a new software build because of all the formal verification it has to go through.
I think the bottom line is, it depends. It's something that needs to be evaluated from the perspective of your specific project. A DevOps style web app versus an avionics module are going to have very different tradeoffs.
ProtoJazz@reddit
There's a couple types of tests that I like that I don't see implemented a lot
But they somewhat depend on how your company is structured and what you do.
Contract testing can be fantastic if you have multiple teams that depend on each other's APIs. One company I worked at did this really well. There was one team who made the API that a lot of other teams used, so each team would basically write a small test that captured whatever parts of the API they used.
Now it never really prevented anything from breaking. At least for my team. The API team basically always took priority and would just say "Hey, this next version is failing your tests". Which is not exactly what I'd want, but it's way better to know the upcoming version breaks something than finding out the version that just went out broke something.
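(A bare-bones version of that consumer-side contract, without any particular framework - the URL and field names are invented.)

```typescript
// Our team only relies on these three fields of the accounts API, so that's all
// the contract asserts. The API team runs this against their release candidate
// and finds out before shipping whether they're about to break us.
test("accounts API still has the shape we depend on", async () => {
  const res = await fetch("https://api.staging.example.com/accounts/123"); // made up
  expect(res.status).toBe(200);

  const body = await res.json();
  expect(typeof body.id).toBe("string");
  expect(typeof body.displayName).toBe("string");
  expect(Array.isArray(body.linkedCards)).toBe(true);
});
```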
Another good one is snapshot testing. This one is more UI focused. And depending on the complexity of your UI might or might not work.
But basically what would happen is if you made a change, and it changed the way something rendered, you had to acknowledge it and include the updated test in the PR. Now it wouldn't catch a lot of breaking changes, especially because looking at complex web pages as text is a nightmare.
But it did catch changes like ones where you change one small thing, thinking it's only used in one place. But it turns out that small thing is used allllll over the place and you've changed a lot more than you intended. Basically just a 2nd step of "Yes, these are the things I meant to change".
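(For anyone who hasn't used it, the flow described above looks roughly like this - Jest-style, with an invented component and props.)

```typescript
import * as React from "react";
import renderer from "react-test-renderer";
import { PriceTag } from "./PriceTag"; // made-up component

test("PriceTag renders the same markup as last time", () => {
  const tree = renderer
    .create(React.createElement(PriceTag, { amountCents: 1999, currency: "EUR" }))
    .toJSON();

  // First run writes the snapshot to disk; later runs diff against it.
  // If the change was intentional, you re-record with `jest -u` and the updated
  // snapshot goes into the PR, which is the "yes, I meant to change this" step.
  expect(tree).toMatchSnapshot();
});
```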
Fyzllgig@reddit
This is the nuanced opinion I dug through comments to see. Everything in our field is about weighing tradeoffs because almost everything has a variety of solutions. You can go as deep as you want and write your software from the transistors on up if you’ve got infinite time and the patience to do so. But that almost never makes sense and so we have to weigh our options all the way up the stack.
I’m someone who wants to have some easy-to-run unit tests which will at least verify that a known input to a function or system produces a known output. If 1+1 != 2 then I know I introduced a bug to go find. I don’t find test coverage to be a very good metric and typically don’t even consider it (I work at small to medium companies, mostly pre-IPO, so this is an oft-accepted opinion). If I can look at all of my code, look at the functions, have a unit test for each of them (each that can reasonably be tested - sometimes the web of external dependencies makes it impractical), and I’m covering the happy path and whatever failure scenarios and edge cases occur to me at the time, then that’s wonderful. If you find a bug when you’re running in a real deployment environment, find it, fix it, and write a new test case to catch it if possible, to guard against regressions.
Integration tests are best effort. If I inherit a system that's not designed in a way that makes it easy to run without having to emulate or mock tons of other systems (like one I recently got: a very basic API layer that has dependencies on Postgres, Elastic, Cloud Run / Cloud Tasks, and Firestore, as well as the other two system components the API layer connects to), then great! Let's get them written! Stand up Postgres in docker compose, maybe seed it with test data as we go if that's what we need, whatever. But when I've got too many images to spin up just to even run the system, I am displeased to say the least.
Even worse are “integration tests” that communicate with live resources, even if it’s just a staging environment. But even then, if that’s what you’ve got to work with then you do your best to make it work
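As a rough sketch of the "Postgres in docker compose" flavour (the connection string and table are assumptions; the database is expected to be started separately, e.g. with docker compose up db, before the suite runs):

    import { Pool } from "pg";

    // Points at the Postgres container started by docker compose.
    const pool = new Pool({
      connectionString: "postgres://test:test@localhost:5432/app_test",
    });

    afterAll(() => pool.end());

    test("users table round-trips a row", async () => {
      await pool.query("INSERT INTO users (name) VALUES ($1)", ["alice"]);
      const { rows } = await pool.query("SELECT name FROM users WHERE name = $1", ["alice"]);
      expect(rows).toHaveLength(1);
    });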
cjet79@reddit
Glad you wrote this. The original post and many of the comments seem unaware of tradeoffs. It kind of gives off vibes of "Testing is a religion. It must be done." This is a little extra surprising since they seemed like they were in finance.
Eirenarch@reddit
What a wise article. And yet Bill's code has been in production making money for decades, so if the article contradicts reality it must be wrong
BingBonger99@reddit
tbh a lot of tests you find in corporate codebases are dogshit anyway, written just to game the "100% coverage" requirements that people who don't understand tests put on the teams
bwainfweeze@reddit
Except in the case of very high functioning teams, if the code coverage exceeds 80-85% it’s because someone is gaming the tests.
Bad tests can be worse than no tests, because honest tests have low coverage in parts of the code that need more eyeballs. You know you have to do other kinds of tests in this code because you can break it without breaking the tests.
HappyPudding2936@reddit
Writing unit tests in frontend frameworks is bizarre and hard. Especially when you make sweeping UI changes. I've missed sprint deadlines and gotten in deep shit because of how long fixing the tests took. Plus, I've never had a unit test find an actual significant bug. I think it's a waste of time that people just do to seem vigilant.
bwainfweeze@reddit
You have to write code with testing in mind. If you try to write your code the same way then you won’t find bugs, and it becomes a self fulfilling prophecy that tests are a waste of time. Testing your code is a waste of time. Testing mine is only a waste of time if someone gets fixated on code coverage numbers.
My pet theory is that it’s difficult to test HTML UIs because browsers were mostly written before unit testing caught on.
If you had browsers written with test automation in mind, and an app that only had to run on that kind of browser (a very unrealistic situation), how much easier would automated testing be?
Selenium is a dumpster fire. The only really successful, unfrustrating project I had with it, I found a clever way to rig our API to send out of band data to the tests so they only looked at the web page when all traffic had settled.
Async logic is absolute hell on negative tests without this sort of information, because you could just not be waiting long enough for the page to update.
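One way to make that concrete, independent of any particular driver, is to poll an "all traffic settled" signal the app itself exposes before asserting anything; the window.__pendingRequests counter below is an assumption about how the app reports in-flight work, and the driver call is only sketched in a comment:

    // Generic polling helper: resolves once the condition holds, throws on timeout.
    async function waitForSettled(
      isSettled: () => Promise<boolean>,
      timeoutMs = 10_000,
      intervalMs = 100,
    ): Promise<void> {
      const deadline = Date.now() + timeoutMs;
      while (Date.now() < deadline) {
        if (await isSettled()) return;
        await new Promise((resolve) => setTimeout(resolve, intervalMs));
      }
      throw new Error("page never settled; a negative assertion here would be meaningless");
    }

    // Usage sketch: only assert "the error banner did NOT appear" after traffic settles.
    // await waitForSettled(async () => driver.executeScript("return window.__pendingRequests === 0"));
    // expect(await errorBanner.isDisplayed()).toBe(false);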
bwainfweeze@reddit
There’s a mental trap in coding.
You have to be a bit of an optimist to do things that either haven’t been done before or you are unfamiliar with. A realist looks harder for existing solutions.
But that optimism also tells you that gambles will be okay that definitely will not.
So it’s either hard for you to start things, which means you don’t show up, or it’s hard for you to actually finish things instead of convincing yourself prematurely that you’re done.
This is part of why teamwork matters. You need openers and you need closers, and those are different personalities.
golgol12@reddit
Almost the entire game industry lacks automated tests.
Most of the gaming industry's code is a burning dumpster fire.
WenYuGe@reddit (OP)
I think outside the active online tech circle, when I venture into the wild wild world of local dev shops, banks, and whatnot, their software practices always surprise me.
I like, don't understand how there are stories of devs not testing. I mean I've also seen companies that keep a running log of changes in files for the last 30 years and require manual formatting of files...
mrMalloc@reddit
As a consultant I have seen code that "shouldn't exist in production….." in businesses that both have the cash and should have the know-how.
I've also talked to devs about "unit tests", only to be scoffed at: why do I need to test my frontend? I can see it works……
sojithesoulja@reddit
I don't know how to test because I don't know how to develop properly. I think I'm close... been reading up on software architecture. Specifically, hexagonal + DDD. Google domain driven hexagon for a great github repo with nestjs. The concepts make sense but just require a fuck ton of practice and planning. I'll check this out tomorrow.
Also, fuck Optum. They never trained or taught proper practices.
On that note, maybe someday I'll escape this hell. I made a mistake 2 years ago quitting with only 2.5 years experience and I did so at the top of the tech job market.
WenYuGe@reddit (OP)
:kek: quality content.
tangoshukudai@reddit
It is very frustrating when the dev doesn't do any testing and throws it over to QA with zero verification. I get why you might not create a unit test or UI automation test, but it's unacceptable to not manually test your code.
kernel_task@reddit
Unit tests have almost no value. Integration tests and end-to-end tests are really difficult to write, especially correctly so they don't fail spuriously. Tests are extra burden on maintenance, so they better be of value. Lastly, even if you write the most beautiful tests in the world, bugs will still get through.
We're all trying to deliver quality software at the lowest cost and there are trade-offs to everything. I won't pretend I have it right. I'm trying to write more tests in general, but I have my job and its deadlines to think about, not just pleasing bloggers on the internet. Sometimes tests help with the goal of saving me time while maintaining quality, often they do not.
bbgun142@reddit
All I know is that most code put into this world is made of glue and bubble gum
sphygmomanometer_ch@reddit
Why waste time on expensive devs writing useless code on useless environments when you could have your customers test in production?
icpero@reddit
This is the way.
Paratwa@reddit
Me and you man. I can't grasp it. It should be tested by the dev, tested by an independent team member and approved, then unit tested, and then monitored pre- and post-change with the results captured along with any issues/bugs logged and noted.
Any downstream incidents/issues not captured in the above testing should be logged studied and evaluated for how they can be integrated into future monitoring and testing.
The amount of pain and horror caused by not testing baffles me. That above seems like a lot, and it IS, but an outage or error is worse, far far far worse.
All of the above can be done and becomes second nature with some good habits and automation.
vincentofearth@reddit
I want to write tests, lots of beautiful, thorough tests. I often don’t have time to or have the right tools and libraries.
If I’m feeling the pressure to deliver some part of a feature by the end of a sprint or to reach some milestone at a given date, tests will often be the first to go.
…And I know someone will say that's not how you do it, or that my team is committing blasphemy against the Agile gods and that they figured it all out (pinky-swear promise) if only us disgusting little plebs didn't muck it all up. Well, tough luck: this is the hand I was dealt, this is the "flavor" of "Agile" I've experienced in every workplace since college, and I only have so much energy and power.
Because apparently a broken or incomplete feature can still get someone a promotion, but a feature that never ships can't.
CathbadTheDruid@reddit
Back when Dinosaurs Roamed The Earth and I was still young and believed in things, I made my code rock solid, tested everything and rarely had any issues at all.
When I'd get a request for a change that would break things, or was just out of scope, I'd say "no".
And Life was good.
Then Management took over. They didn't care what was possible or reliable. They cared about what lies they could tell the customer to get more money.
At first I'd say "no" but they just kept coming. So eventually I said "fuck it". Want Pong inside the wait cursor for your banking app? Sure, IDC. Want the DB accessible on the internet because the VP likes to look at it and doesn't want a VPN? Sure. IDC. As long as I get paid.
All the current BS can be laid directly at the feet of bean counters who transformed software development from an enjoyable creative process with time to reflect, test, re-architect and think, to "how many widgets did you code yesterday and how much money did we make last month."
IntelligentSpite6364@reddit
deadlines don't always have the courtesy of giving you time to do automated tests, work with what you got
Neoshadow42@reddit
Was in a situation recently where a Senior who hadn't worked on the repo before submitted code - I asked where the tests were and they said "Oh this interacts with an external component so it can't be tested"
"Uh...you need to mock the component?"
"What do you mean, mock the component" is the response I got from a SENIOR software engineer.
I'm not surprised that it's an industry-wide issue because I've been in so many situations where I ask for tests to be written at any level and the response I get is 'But why'
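For what it's worth, "mock the component" can be as simple as putting an interface in front of the external thing and handing the code a fake in tests; the PaymentGateway and Checkout names below are hypothetical, Jest assumed:

    interface PaymentGateway {
      charge(amountCents: number): Promise<"ok" | "declined">;
    }

    class Checkout {
      constructor(private gateway: PaymentGateway) {}
      async pay(amountCents: number): Promise<boolean> {
        return (await this.gateway.charge(amountCents)) === "ok";
      }
    }

    test("a declined charge is reported as failure, without hitting the real service", async () => {
      const fakeGateway: PaymentGateway = { charge: async () => "declined" };
      const checkout = new Checkout(fakeGateway);
      expect(await checkout.pay(500)).toBe(false);
    });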
axilmar@reddit
I hate to break it to you, but:
Every development team should have at least:
That many companies are cheap enough to not provide the above 3 for any software project, it's a problem of the company, not of the developer.
Why shouldn't developers do the testers' job? It's simple: developers have an idea of how the code works, and they are always biased towards the thing working. Many times they do not see the use cases that may lead to trouble, or they are exhausted from trying to make the thing work, and they don't have spare energy for testing every goddamn case that comes from the requirements, or the lack of requirements.
_Pho_@reddit
Honestly I went full circle on tests. I used to hate writing them, I think because I didn't understand what they were good for. Also testing frameworks are still really horrible DX, especially in JS.
BlueGoliath@reddit
Maybe it's just me but I'm never sure what to test. A public API surface can be huge and testing everything is a lot of work.
Logging is similar but is made worse by potential performance issues.
WenYuGe@reddit (OP)
I feel like it's sensible to have a smattering of key happy flows tested at least. It helps make sure that nothing breaks as you add new things.
campbellm@reddit
The rule of thumb is test that, at the VERY least:
BlueGoliath@reddit
Dogfood testing basically? Better than nothing for sure. If the API surface is big enough there could be major gaps though.
WenYuGe@reddit (OP)
Yeah, perfect is hard to reach. It's also super scary to touch code in a low-coverage repo in my past experience. I don't know how many things can break when I change one method :kek:
Worth_Trust_3825@reddit
In my case it was always when I inherited an application that was built in a way that you could only do end to end tests. So you just wouldn't bother.
goomyman@reddit
“Having unit test being run as part of CI (Continuous Integration) on a system that mimics the specs of the deployment environment is the best way to validate a program…”
Ok, I am not a unit test purist (I don't care if a single test tests several things, or if the tests only touch a single method); you can call the tests anything you want, but for the love of god "unit tests" should not have any environmental dependencies. There is no such thing as a unit test that mimics a production environment. There should be no environment.
I am a big fan of metrics in production and continuous synthetic tests against production rather than integration tests - which I find expensive and flaky.
Thousands of fast environment independent tests -> synthetics monitors and metrics in pre production and production environments-> safe fast and reliable automated rollback on errors.
That’s the trifecta IMO.
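A concrete example of "no environment": anything the code would otherwise read from the outside world (clock, config, network) arrives as a parameter, so the same test runs identically on a laptop and in CI (names are illustrative):

    // Pure function: the current time is passed in, never read from the environment.
    function isTokenExpired(expiresAtMs: number, nowMs: number): boolean {
      return nowMs >= expiresAtMs;
    }

    test("token expiry needs no clock, network, or deployment environment", () => {
      expect(isTokenExpired(1_000, 999)).toBe(false);
      expect(isTokenExpired(1_000, 1_000)).toBe(true);
    });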
somewherearound2023@reddit
"I dont care what you call your tests, as long as you call them before you check your code in"
safetytrick@reddit
I mostly agree with you, except that I think there is a lot of value in learning how to write tests that aren't flaky. Flaky code happens in and out of tests, I've fixed a lot of flaky tests that were only flaky because the application under test was flaky.
goomyman@reddit
A test without an environment is rarely flaky
bramley@reddit
I test anything that does calculations. I was told not to test UI because that will get picked up in QA anyway. This is mostly because writing tests for React UIs gets complicated quickly, AFAICT.
All my other clients have lots of tests written for them. Especially the Ruby ones.
dethswatch@reddit
Fun, but can we also gripe about how bad the analysis on new items is? One-liners that barely describe what needs to be done? wtf
Razvedka@reddit
Some in senior technical leadership literally don't believe in testing, and they're bizarrely very proud of that stance. Quite brazen about it.
donatj@reddit
I work with a dev who regularly submits giant multi-thousand-line PRs that usually only kind of work. So often I am just yelling at my screen "did you even TRY this code?" and he's clearly peeved that I'm holding his code up to higher levels of scrutiny, but it's like, start writing some f*cking unit tests and maybe I'll trust it.
Zanthious@reddit
I found that making the devs develop all new code through unit tests, rather than building test apps and clicking through an app to reach a function that's 10 button clicks away, was easier than anything else.
This is a 2 way street though, i have had tech support people deal with little annoyances for years before even bringing it up to the dev team who solved it in 3 minutes.
Once I got them to code via unit tests, my stuff got a lot less buggy. *shrug* to each their own I guess.
drawkbox@reddit
I always test code via unit, integration and system testing. However unit testing can go too far and become a weight.
I am more annoyed by developers that don't test the final output on web, device, desktop than anything.
The project also can determine what is the appropriate level. A product that is years and years in progress should have many tests. A one off project that might last only a short time or a small project it may be completely unnecessary.
The best way to be set up is using interfaces and the ability to test; whether that is needed or not at the unit level is up to the project.
However not testing the output and final presentation is just not smart nor good for a product, even one that is ephemeral.
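A rough sketch of what "set up using interfaces" can mean in practice (TypeScript, names made up): the in-memory implementation exists purely so that unit-level tests stay possible if the project ever wants them:

    // Code depends on the interface, not on a concrete database client.
    interface UserStore {
      save(name: string): Promise<void>;
      count(): Promise<number>;
    }

    // The production implementation would wrap the real database (omitted here).
    // The in-memory one keeps the door open for fast tests.
    class InMemoryUserStore implements UserStore {
      private names: string[] = [];
      async save(name: string): Promise<void> { this.names.push(name); }
      async count(): Promise<number> { return this.names.length; }
    }

    test("signup flow writes exactly one user", async () => {
      const store: UserStore = new InMemoryUserStore();
      await store.save("alice");
      expect(await store.count()).toBe(1);
    });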
Neocrasher@reddit
The thing I don't like about testing is how easy it is to write bad tests. A lot of the time when I make changes to our codebase and then fail a test it's not because the functionality is bad or broken but because the test made certain assumptions about how the code would work that aren't true anymore.
OutsideDangerous6720@reddit
In a strongly typed language, most unit tests have negative value
fire_in_the_theater@reddit
testing doesn't get bullet points on impact resumes, it's really that simple
RddtLeapPuts@reddit
If you write tests during your technical interview, I’ll want to hire you right away. I don’t know why candidates never do this. It’s so easy
assert reversed(‘abc’) == ‘cba’
Write 3-4 instances of this. It takes a minute
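In a JS/TS interview that could be as little as the following, assuming the candidate just wrote a reverse helper (Jest syntax):

    function reverse(s: string): string {
      return s.split("").reverse().join("");
    }

    test("reverse", () => {
      // A minute's worth of cases: happy path, empty string, single char, palindrome.
      expect(reverse("abc")).toBe("cba");
      expect(reverse("")).toBe("");
      expect(reverse("x")).toBe("x");
      expect(reverse("aba")).toBe("aba");
    });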
omz13@reddit
There was (many years ago) a product demo in front of a client that went quite wrong (promised new functionality caused the app to crash when invoked). When asked "did you test your code?" the response from the programmer was "it compiled without error". There then followed a 10-minute, very profanity-filled lesson on the differences between compile-time and run-time errors. You would have thought this programmer would have learned: nope, did something similarly stupid a few months later.
GalacticalSurfer@reddit
My co-worker can't even grasp simple TypeScript. I tried to get him into it and I saw "any" all over the place. Imagine tests. I had to make some changes to a small frontend he developed in JS a while ago and it was a nightmare. I managed to make it work with the updates, don't ask me how.
I am in that group. As others said, not knowing how to write good tests and tight deadlines were the main reasons. I don't have many years of experience but unfortunately a lot of responsibility.
Recently I've had to take over a project from an ex-employee. Results were not going as expected. Again, what a mess… this time in TS, but it seemed like he was learning the NestJS framework. So back at it again fixing other people's shit, slowly starting to implement unit tests where I can. Not always possible, because writing tests and mocking a bunch of dependencies is also a pita.
GalacticalSurfer@reddit
Oh, and since this is supposed to be a microservice (you guys have watched this movie many times: distributed monolith), some entities are shared. He used a different name for one, and created an exclusively local entity with the same name as the other entity on the other service. Inconsistent relationship property names. It got me pissed off enough that I finished what I was doing to tackle this mess and do it correctly. I just changed the names of the entities, modules, services, etc. (without the relationship properties like "companyId" and "clientId") and we're looking at 50+ files changed for each entity. Then the reference properties, then migrations. Fun without tests.
With the tight deadlines I didn't have enough time to fix it in the beginning, and now the system is growing and it's becoming a mess. Of course part of it is my fault. Maybe I should've stuck to a plan and continued the wrong naming, idk. When things are a mess and you contribute to the code, you are also contributing to the mess.
Easy-Bad-6919@reddit
I test what I wrote, but I don’t write automated tests
larikang@reddit
It’s the first thing in the article…
Nothing more needs to be said. Everyone tests their code, at least manually. Writing this down as repeatable test cases almost always saves time in the long run, unless you somehow know that the code will never be modified again.
throwitway22334@reddit
I've worked with devs who write no tests and only verify the happy path before submitting a code review.
I've also worked with devs who don't even check the happy path before submitting code reviews.
I've also worked with devs who don't even bother compiling before submitting code reviews.
neotorama@reddit
Even crowdstroke skipped the test
SignificanceWild2922@reddit
around half of the candidates in tech interviews i've led can barely write a meaningful test in the stack of their choice...
OmnipresentPheasant@reddit
I'm dumbfounded the URL contains a comma and works
ezaquarii_com@reddit
"out client doesn't pay for tests"
tyjuji@reddit
Does anyone know a good Java framework for doing unit tests on a database?
Solax636@reddit
Tell me you've never worked in a 30 year old enterprise code base without telling me bla bla
captain_obvious_here@reddit
I'm dumbfounded by the number of people that believe the software development world is either black or white.
Just like everything else in the world, tests are not always needed. Sometimes they're absolutely necessary and not writing them implies tons of issues and tons of time to fix them. And sometimes they're absolutely useless, and writing them is a waste of time.
Everybody with a few years of experience will obviously have examples of both situations.
IsRando@reddit
Amen!!!
jfp1992@reddit
As a QA I don't mind devs not testing, but don't rush, and at least check the happy path if you can
awfullyawful@reddit
I'm the only dev for a startup I cofounded, I have no automated tests at all. If something does go wrong, I generally fix it within minutes. A lot of things that have gone wrong are due to unexpected/undocumented behaviour from legacy systems I'm forced to interact with. No way I could have tested for them anyway.
Snooze_Loose@reddit
Why did I not see this post yesterday???? Because of me all workflows are failing , this post reminded me of the mistake again...all devs please test your code ,now I am afraid that I will get escalated...wish me luck :(
WenYuGe@reddit (OP)
Honestly the pain of refactoring my first side project without tests vs with tests convinced me testing is for my own good
MyTwistedPen@reddit
This. I would refuse to refactor any code that does not contain test.
Known-A5@reddit
It's not the devs: Often it's a management decision not to write tests. Later they discover that test coverage is really low and expect devs to integrate testing retroactively and of course without accounting for time and effort needed. The shit always falls downwards.
headhunglow@reddit
I maintain legacy software (~50K C++, 100K Python). It doesn't have any tests. If there is a bug in production i fix it, deploy it and commit if it works. Then I hope that I didn't break anything else. I wish I had the time and budget to write some tests, any tests...
bordumb@reddit
I’ve never worked anywhere that put so much pressure on us that we couldn’t write tests.
Fuck that shit.
I’d be out in a heartbeat.
ruminatingonmobydick@reddit
I worked for a large fintech company that unofficially said testing was women's work. I'm not normally one to be derisive to brogrammers, but I think there's a pervasive culture of man-children in Silicon Valley and beyond that insults the three quarters of my life I've spent ordering ones and zeroes. My dad taught me that if you won't change a tire, you don't deserve to be behind the wheel. I'd say if you won't test your code, you shouldn't be a dev.
deftware@reddit
Sometimes you just want to make stuff happen ASAP, and tests are antithetical to that.
gjosifov@reddit
Of course devs don't test, but the problem can't be solved just by saying devs should write tests.
There are a lot of problems with the testing ecosystem.
Testing literature is very bad - most authors think they are smart when they make claims like "tests can only show the presence of bugs, never their absence". Thanks, Sherlock; how about a step-by-step tutorial on how to test?
This means that even the testing ecosystem doesn't take testing seriously
Most software today is component-based software - that is why it is easier than ever to produce software, because there are a lot of components (frameworks/libraries) that will solve your issue.
Nobody parses PDF files byte-by-byte; they use PDF libraries.
Well, in component-based software testing is hard, because most software components don't provide a testing part of the component.
This leads to third-party testing frameworks having to hack the component in order to test it
and this hacking has to be done by the devs
Can component-based software provide a testing part?
It will be hard if it is OSS and the authors don't have a budget for it, and most components are OSS.
and lastly
Management - given the nature of the literature and of component-based software, testing becomes harder and in most cases seen as useless.
So naturally management thinks testing is low priority and that it can more easily be done with manual QA.
When testing becomes easily understandable and easy to use (without hacking), testing will be a no-brainer.
Until then, customers are the ones paying to be QA - just ask gamers how that feels.
practical-programmer@reddit
This is the same as 100% automated unit testing. I had the "pleasure" to work at both types of places, and the place that does 100% testing is worse: so much dogma and rigidity, yet there are still tons of issues -_-. The place that does no automated tests is crazy too; at least they do manual tests, but still crazy looking back.
The best way to go about testing is to test hotspots, tricky logic, or high-value paths. Basically, think deeper about what you should test; don't go for percentage coverage, because some devs will just lazily try to hit the number. But thinking requires more time, and I haven't worked at a place where the company gives the dev team more time to think about meaningful tests (unit, integration, etc). Such is life.
shevy-java@reddit
I test, but I don't use most of the default test software, as I find it either awful or counter-productive. I test for feature set and completeness of feature set. I don't think it makes sense to do e.g. unit tests.
I think the main question is: how to test.
Dragdu@reddit
I have honestly never worked at a place that didn't have tests and didn't gate merge behind the tests passing in CI.
Yet, in pretty much every survey you can see lot of people not writing/having tests. So the question is, WHERE DO THEY ALL WORK?
ROGER_CHOCS@reddit
I work for one of the world's largest companies, you all have used us, and I have never seen a unit test. You don't need unit tests to make billions of dollars..
But it's not boeing or anything real life critical like that, in that case I would have a much different opinion.
Majestic-Extension94@reddit
In my last 6 work engagements in South Africa, lack of automated testing from developers and QA has been a common problem. Most of these companies claim to be agile, but the team has no say in how they conduct themselves. At my current work engagement I worked over December getting 1 service to be mostly tested: 81% code coverage.
They have Liquibase for DB migration scripts. They were using Spring Boot 2.4, so I updated that to 3.2.x at the time. Wrote all the tests, demoed it to the manager, SM, team, tech lead, etc. The dev manager (who was also a dev in this code base not long ago) vetoes it because *he* had a bad experience with Liquibase (this is contradicted by the team lead).
So I have put it to the dev manager: then what is your solution to solve the lack of dev testing? Basically manual testing, but he has not even conveyed that.
I have 27 years of dev experience and it is disheartening, because turning their fortunes around will take effort, but it's not impossible.
nfjsjfjwjdjjsj4@reddit
Where i work (not as a dev, as an end user of our in house software) no one tests anything. If i report a bug they ask me to test if the fix works on a good day.
In some areas they dont even have the permits required to test.
kingius@reddit
Seems like the quality of developers might be dropping over time, if this is true. Perhaps AI generation is giving developers a false level of confidence in the code they are checking in.
PM5k@reddit
You have some tickets on the board.. some familiar from the past, some are not. Perhaps you pick a novel challenge… You write garbage to make something work. You write it fast and loose cause the sprint has other shit on as well. So you do the PoC and then it works. You go back and refactor it so it’s not so shit anymore. As you do, you might think - I wonder how this could break… so you write a test. Maybe five tests. You’re confident it won’t break now. A worm burrows in your ear. “It’ll still break” the worm whispers. So you write some edge case tests. Somewhere in the office, a chaos monkey slams two cymbals together — the junior dev broke prod. You smile, that used to be you a long time ago. Now you have your tests and your polished code and you push that to staging and watch the test suite pass, success as expected and the worm is pleased. You stand up and go make a coffee before yanking a ticket off the board for your next task. When you make your way back to the desk you notice a slack message from QA - “your X broke when I just Y”. You scream internally, the worm screams, in the distance you hear the cymbals.
Junior devs either test literally everything or don’t test at all - either way they’ll break shit. Senior devs test only what makes sense from experience or not at all. Those more senior still will test behaviours over units or not at all. And QA have no clue that they are part of the test suite - perhaps one of the most important parts. Because sometimes you just don’t know every edge case, you can’t possibly predict what will break. And you’re biased anyway so what the fuck do you know about testing your own code the way a clueless user or uninvolved QA tech would?
Jaded people don't bother because shit will break anyway. Responsible people will test what they have experience in seeing break and leave the rest to God. Psychopaths without families or a desire to have free time will test everything and obsess over tests. Shit will break. Sane and practical testing is still worth it. Don't bother trying to read tea leaves for the rest. Just let someone break it early so you can go fix it before it happens in prod.
ShenmeNamaeSollich@reddit
Unless you “learned to code” entirely in the last ~10yrs, chances are none of your initial exposure to programming included instruction on how to write tests … Especially more complex things requiring mocked services etc.
You learn variables & data types & control structures; then you learn OOP; then maybe you build a game or a website frontend & backend & a database. Then you play with DS&A.
At no point in many undergrad CS classes or bootcamps is “testing” more than an afterthought.
Same was true of most online tutorials, programming language books, video courses, framework documentation, etc, that I saw from 2010-2020 when I was first muddling along.
You wanted to work with Angular or React or iOS or some other hot new thing? “Testing” was literally an appendix or throwaway chapter at the very end of the Docs for all of those.
Sure, you could maybe buy a separate dedicated book about “testing in [language/framework],” but who has time for that?
Literally only in the last ~5yrs or so have I seen an emphasis on, and materials that explain, specifically how to write tests with examples beyond something stupid & trivial like “assert(1+1).equals(2)”. (Shoutout to Google’s Android courses that incorporate realistic unit tests & mocks & e2e tests early on).
Testing has very much been a “draw the rest of the fucking owl” endeavor for me. Devs who aren’t taught to test as part of “learning how to code” overall aren’t going to write tests on the job either, unless there are already good examples in place to build on.
Testing may be viewed as “additional work” as opposed to being an integral part of the process or outcome.
On top of that, we have a solid 15-30yrs of entrenched legacy web & mobile code out there written in ways not very conducive to adding tests afterward. There’s a lack of good examples and a lack of mgt support for retrofitting things.
drinianrose@reddit
I used to have a developer who would swear that he tested his code but it nearly always failed QA. I finally asked him about how he was testing…. his definition of testing was compiling it. If the code compiled, he said it was tested.
rperanen@reddit
The biggest benefit of testing is not necessarily the tested code but the known dependencies. Good developers are lazy, and if writing tests takes a long time then they change the code to be more testable, which forces it to be a bit more loosely coupled.
End to end tests are great but fixing a bug is faster if code is properly tested. In that sense unit and end to end tests complement each other.
Sadly, humans are slaves to their habits. I have had long and tedious arguments with some self-proclaimed geniuses who simply do not want to take care of unit testing. They'd rather run code like it's the late 90's or early 00's than change their way of working. The cherry on top is that writing tests takes too much time when the project is late, and the project is late due to shortcuts on quality assurance.
agk23@reddit
So I started my own software company. I was the original developer (self taught over 20 years) but now have 9 software developers. I sold the company in March, but still run operations and drive product decisions.
We have 0 tests. Most of our application is basically a CRUD app, and I think automated testing in that situation doesn't provide a lot of value. That being said, I'd never advocate doing 0 tests, and hopefully we get to a point where we implement testing, but honestly the value of faster dev outweighs the risk for us right now.
ykafia@reddit
Unit tests are not sacred; I am in charge of 3 BRMS services and never wrote unit tests.
Main reasons for that are :
I preferred writing myself a tool that analyzes changes and generates a dashboard ready to be used by business to understand the impact of changing one rule.
What I'm trying my best to do for future devs that would replace me is keeping everything simple, as explicit as I think is best and documenting everything, from business motives to the code I write.
acroback@reddit
You will be surprised to know that a lot of developers I manage think QA is there to test their code and thus they don't have to do any system testing.
This ticks me off more than anything else.
MaxwellzDaemon@reddit
QA is a somewhat thankless and extremely valuable practice.
10000BC@reddit
At the end of the day it comes down to "values" and passion for the craft, 2 things that need to be explored before hiring. The team needs to care for the craft and stay strong together.
seventomidnight@reddit
"We let our customers test for us." Not even kidding, I had a boss that told me this once when I complained about not enough testing being done before deployment.
master_mansplainer@reddit
It's not really that common in gaming to have a lot of tests written. Backend, sure, that's a different story. But gameplay code tends to have a lot of aspects you can't easily control or mock, even if you did manage to convince management it's worthwhile.
Versaiteis@reddit
Hi, game dev here. What's a "test"?
Jokes aside, modern game engines actually have some decent utilities for test writing but they don't make it super well known or the easiest. I know Unreal has like 3 or 4 different test suites built into it
Bash4195@reddit
One thing I haven't seen mentioned here is that it depends on scale. Enterprises and larger companies should of course be testing, reliability of the product is super important. But startups and agencies don't have time for that. Plus writing tests is not fun
psycoee@reddit
The problem is that every large company wrote most of their code when they were a small startup. With a small number of exceptions, large companies rarely create new things on their own, they usually just buy startups. The bureaucracy created by a large organization makes it almost impossible to do anything other than maintain the status quo.
AxeLond@reddit
I think the harder it is to run your code in the expected environment or hardware the more essential testing becomes.
If you're doing something simple and you can just run it and see if it works, then I can understand skipping tests. However, it's probably still a bad idea due to all the regressions you'll have.
hairlesscaveman@reddit
At one previous job I integrated with a payment gateway. It had a test mode but it was really slow to respond to test requests. To speed up the dev I created a mock service to test against, and left this in the newly created CI pipeline. All tests passing, integration working as expected, payments start coming in. All good.
A couple of months later I go on holiday for a couple of weeks. I get back, ask how things are going, another dev mentions there was a blip in payments but he fixed it. I think nothing of it.
2 weeks later management call me into the office: no money is coming into the bank account. I go and check and the payment service was in test mode. And had been for 4 weeks. Turns out that the other dev got a report that there was a payment issue, flipped on test mode in the gateway while investigating, and suddenly all the payments "started working again". Problem "fixed", he goes back to other work. Except now we have a few thousand transactions that were faux-authorised. I've never had that sinking feeling so hard in my life.
Thankfully, the payment provider was able to replay all the test transactions as real ones for us. I think we had 2 payments that failed, one of which was the original payment that caused the investigation. Cos there weren’t enough funds on the card.
I spent the next 3 months pairing with that dev to hammer home good testing practices.
code_munkee@reddit
Ain’t nobody got time for Boehm’s Law, I got features to push!
uniquelyavailable@reddit
you aren't living life until you're developing directly on production in real time
Positive_Method3022@reddit
I was bullied a lot for trying to use best practices, then I gave up. I realized that I can't fight against the environment.
confuseddork24@reddit
I was at an e-commerce shop helping build out the data warehouse. They used Google analytics and the business logic we had to implement was super fragile and spaghetti because the Google analytics implementation was consistently inconsistent across web, iOS, and Android. I brought up standardizing naming conventions, string formats, and some other basic things and asked why they don't test the tagging implementation so they don't accidentally break downstream analytic tables. Turns out they didn't have any testing, at all, period.