That's not an abstraction, that's just a layer of indirection
Posted by fernandohur@reddit | programming | View on Reddit | 162 comments
robin-m@reddit
There are abstractions whose goal is not to hide stuff, but to uniformize stuff. It’s when you add an adapter to be able to use some legacy module as if it had the same interface as the rest of your code. Such abstractions are indeed 100% a level of indirection, but the cost is not in the abstraction; it’s the existence of a legacy module that doesn’t follow the architecture and conventions of the rest of your code. I totally agree that adapters are a nightmare to look through, but they are the messenger, not the root cause of the issue. The issue being not enough time allocated to clean-up and refactoring.
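A minimal sketch of that kind of uniformizing adapter (all names here are hypothetical, not from the thread): the adapter is pure indirection by itself, and the real cost lives in the legacy module it papers over.

```python
from dataclasses import dataclass
from typing import Optional

class LegacyUserStore:
    """Stand-in for a legacy module with its own conventions:
    positional args, tuples instead of domain objects."""
    def fetch(self, uid):
        return (uid, "alice") if uid == 1 else None

@dataclass(frozen=True)
class User:
    id: int
    name: str

class UserStoreAdapter:
    """Uniformizes the legacy store to the interface the rest of
    the codebase expects."""
    def __init__(self, legacy: LegacyUserStore) -> None:
        self._legacy = legacy

    def get_user(self, user_id: int) -> Optional[User]:
        row = self._legacy.fetch(user_id)
        return User(id=row[0], name=row[1]) if row is not None else None
```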
OkMemeTranslator@reddit
Well put. The article completely fails to understand why abstractions exist in the first place; it's not like people want to use abstractions, it's just the lesser of two evils. It's something you won't understand until you run into old legacy software that doesn't have any abstractions but now needs to be changed drastically.
Noxitu@reddit
It goes even further than software engineering. Even numbers themselves are an abstraction. Fundamentally, there is no such thing as "2 apples": there is an apple, and a different apple. Abstracting away the fact that they are different is the fundamental premise of numbers.
Fuerdummverkaufer@reddit
You don‘t even have to reach for a legacy module. Creating adapters for third party libraries is also perfectly valid. Serialization modules and SSL libraries come to mind. For example, you may want to support both Ring and OpenSSL at the same time.
lturtsamuel@reddit
On the other hand, sometimes people try too hard to unify things. They create sub-optimal interfaces for things that are actually very different, and introduce a new layer of indirection for little benefit.
agumonkey@reddit
an operational level of indirection that removes a semantic level of indirection so net zero :D
Isogash@reddit
Code with lots of abstractions is sometimes difficult to understand, but code with few abstractions is almost impossible to change.
kingdomcome50@reddit
This is comically false and one of THE MOST misunderstood concepts in programming. The number of times this has been repeated…
It’s not code with few abstractions that is difficult to change. It’s code that is poorly factored that resists change. I know I know… “aren’t those basically the same thing?”. Not exactly (something something a square is a rectangle yada yada).
Well-factored code doesn’t have to be abstract, rather, it just needs to be structured according to the functional requirements of your system and organized. What does that mean?
It means that the “creases” (or points of possible abstraction) in your codebase are properly separated from the inescapable business logic. Think sane function/method signatures, project structure, etc.
You don’t need some abstract “adapter” or “repository” interface to separate the code that uses a specific shape or persistence solution from the code that depends on the above. A single function/method is fine (and not that hard to abstract further if/when necessary)
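A toy sketch of that point, with made-up names: the persistence "crease" is a pair of plain functions, and swapping the in-memory dict for a real database later means editing this one module rather than threading a Repository interface through everything up front.

```python
from typing import Optional

# Today: in-memory storage. The storage choice is isolated behind
# two plain functions -- the "crease" where abstraction could go later.
_orders: dict[int, dict] = {}

def save_order(order_id: int, order: dict) -> None:
    _orders[order_id] = order

def load_order(order_id: int) -> Optional[dict]:
    return _orders.get(order_id)

def place_order(order_id: int, items: list) -> dict:
    # Business logic calls the functions; it never sees the storage choice.
    order = {"items": items, "status": "placed"}
    save_order(order_id, order)
    return order
```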
Structured programming is, itself, an abstraction.
NullField@reddit
I think where so much goes wrong is that this is basically the entire point of things like clean/hexagonal/DDD architecture, but they always fail to mention the fact that you can have something extremely similar to them with basically zero abstraction.
Their usefulness is overstated in things like testability and being able to swap out implementations, when in reality the primary objective is clearly defining boundaries and data flow. Neither requires abstraction.
TheOneWhoMixes@reddit
IMO this is partially a marketing problem. It's a lot easier to get programmers to read your blog/article by saying "We were able to swap from Mongo to Postgres with only 3 lines of code" than it is by saying "we were able to consistently ship iterative features on time".
Perfect-Campaign9551@reddit
People forget that a function name IS also an interface abstraction
Entropy@reddit
Amen
Structured programming is not an abstraction. It is almost metaphorically equivalent to normalization for relational databases - a set of constraints rather than something you program against. For a relational database, normalization constrains layout to keep data relationships well-factored. For imperative languages, structured programming constrains control flow to keep the spaghetti straight and well inside the box. You do not consume either of these, so they cannot abstract. You employ them as a bedrock deep engineering principle.
fernandohur@reddit (OP)
Use abstractions, but use them wisely. My experience is that many software engineers will create abstractions without even thinking too much about it. Good abstractions are rare. Bad abstractions aka layers of indirection are everywhere.
ryo0ka@reddit
Good abstraction: that just happened to predict how the software would be scaled in the future
Shookfr@reddit
And then there are also abstractions of abstractions ...
My dumbass colleagues think it's a good idea to wrap everything.
AWS SDK = wrapped, because using an already in place abstraction doesn't make you look smart you gotta create your own.
Terraform module, let's build a module around it without even providing proper implementation.
kaptainlange@reddit
Unless your domain is in the business of managing AWS resources, abstracting away the AWS SDK seems like a good idea though. Even then you might want to have an abstraction because you're likely placing your own types and behaviors on top of the AWS SDK.
If you need a queue in your business logic, your business logic doesn't need to know that the queue is backed by SQS, does it?
KevinCarbonara@reddit
Why else would you be using the AWS SDK?
kaptainlange@reddit
For example, I have a domain that involves receiving Foo requests, adding them to a Foo queue, reading from the queue, and then doing various processing on those Foo and marking them Done. None of that is defined in terms of AWS. It's only defined in terms of my domain.
The queue interaction in the domain business logic revolves around the following:
queue.Add(Foo)
queue.Get() Foo
queue.Done(Foo)
The domain does not need to know anything about how the queue is implemented. It just depends on the abstraction and the contract that abstraction communicates, and the abstraction is fulfilled by various implementations.
My domain business logic doesn't need to know which implementation is being used, because it depends on the abstraction.
AWS SDK is just a detail of interacting with AWS. It's not even the only way to interact with AWS. So hide it away in your adapter concretions so your domain logic isn't tightly coupled to that specific implementation choice.
This makes it much easier to change the domain logic (which happens frequently) or to swap out the queue concretion (which happens rarely, but is a pain in the ass when it does happen if your domain logic is tightly coupled to it).
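A minimal Python sketch of the queue contract described above (names are hypothetical; an SQS-backed adapter would implement the same three methods, but only an in-memory one is shown so the sketch stays self-contained):

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional, Protocol

@dataclass(frozen=True)
class Foo:
    id: int

class FooQueue(Protocol):
    """The contract the domain depends on -- no AWS anywhere."""
    def add(self, foo: Foo) -> None: ...
    def get(self) -> Optional[Foo]: ...
    def done(self, foo: Foo) -> None: ...

class InMemoryFooQueue:
    """One concretion; an SQS-backed class would be another."""
    def __init__(self) -> None:
        self._pending: deque = deque()
        self._in_flight: set = set()

    def add(self, foo: Foo) -> None:
        self._pending.append(foo)

    def get(self) -> Optional[Foo]:
        if not self._pending:
            return None
        foo = self._pending.popleft()
        self._in_flight.add(foo.id)
        return foo

    def done(self, foo: Foo) -> None:
        self._in_flight.discard(foo.id)

def process_next(queue: FooQueue) -> Optional[Foo]:
    # Domain logic: depends only on the FooQueue contract.
    foo = queue.get()
    if foo is not None:
        queue.done(foo)
    return foo
```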
KevinCarbonara@reddit
I understand all this, but I still don't see how you're actually using the AWS SDK here. Its purpose is to manage AWS resources.
kaptainlange@reddit
Yeah, the point is that the domain code is not using the SDK. The adapter code is. So I was responding to a complaint about wrapping the AWS SDK in an abstraction with a counter that it would actually be a good idea in most cases to do so.
GordonFreemanK@reddit
Let's create a layer that will have fewer features than the underlying tech, become obsolete quicker than a TikTok fad, and have no community or online support to help you use it.
Not only are the people who do that idiots, they do it because they think they're smart.
They genuinely think they have the time and ability to create a layer over a project dedicated to public use and worked on day in day out by literally the biggest company in the world. How deluded can you be?
mr_birkenblatt@reddit
The next day.
Corporate: we're moving to azure
renatoathaydes@reddit
Devs like the parent: "hm... I am sorry but we wrote all our code based on how AWS works... to change to Azure means refactoring basically ALL code. I'm afraid we're stuck with AWS unless spending several months on the migration is acceptable".
The idiots the parent was talking about: "oh ok, we'll implement our interfaces on top of Azure, should take a day or two".
ub3rh4x0rz@reddit
Zero value abstractions are one of the absolute worst antipatterns to find in a codebase
James20k@reddit
On the other hand, having a zero value abstraction is the best thing ever when it turns out that you have to swap out the underlying technology that it's built over
nikolaos-libero@reddit
If you want children from every random stranger, I guess you can skip the condom. 🤷
As for me I prefer not having third-party code rooted into a thousand files.
ub3rh4x0rz@reddit
A good abstraction that was good by mistake is an exception to an otherwise toxic mode of development. You shouldn't make an abstraction until the pattern it abstracts over has materialized or you are 99% certain it will materialize, and you have the experience to back up that certainty.
hippydipster@reddit
Sounds like signing up for analysis paralysis. People afraid to make abstractions until they're 99% certain they're right? Forget it, start learning instead to make more easily modifiable code.
ub3rh4x0rz@reddit
That's not at all what I said. The tl;dr is that people make abstractions in anticipation of a ton of instances of a lower level pattern; this is a mistake. Instead you should rack up enough instances of the lower level pattern for the abstraction to be worth it before making one. More often than not this never actually becomes a problem and the code is far clearer using the unabstracted bits. More experience is required to correctly anticipate that there will be enough instances and frequent enough change; the cost of false positives is very high and of false negatives is low. Grepping and updating 5 places with the same 10-20 lines of code is better than prematurely abstracting.
hippydipster@reddit
There are a ton of reasons to make abstractions and a ton of different kinds of abstractions. I suppose if we just think abstraction is a way to reduce code duplication, then you have a point, but this is not how I usually think about code design and abstractions.
ub3rh4x0rz@reddit
I think anyone who's done this for a while recognizes my point regardless of any caveat you're trying to add. Premature abstraction is a net negative to a codebase, and amounts to obfuscation and needless entanglement. More people need to be told to exercise more restraint in creating abstractions than need to be told to create more abstractions, by a very wide margin.
renatoathaydes@reddit
This is really difficult. If you "wait to rack up enough instances of lower level pattern" before you reach out for some "abstraction" you may never do it because by then, lots of code has already been written without that and refactoring all that would be too costly. You end up with a terrible code base where the abstraction would really have helped you if you were brave enough to do it with incomplete information. OTOH if you went for the abstraction too early, you would have high chances of creating a bad abstraction as you had too little information. There's no way to tell which case you'll be in until it happens. With more experience, chances are you will get better at creating such abstractions, so it may be worth it to start with one.
hippydipster@reddit
Well I've been doing it for 30 years and completely disagree. The problem isn't abstraction yes/no, now/later; the problem is people choose poor abstractions. And one of the reasons they choose poor abstractions is they get this funny idea that abstraction is about reducing lines of code.
ub3rh4x0rz@reddit
"Reducing lines of code" is more of a correlated effect, never a sufficient reason to do it. If you're creating abstractions and the result is more LOC, your abstractions probably suck.
hippydipster@reddit
Well I couldn't agree more. Unfortunately, when people delay creating abstractions and wait to see what redundancies pop up, the usual motivation at that point to create abstractions is to reuse implementation and reduce boilerplate.
Which aren't good motivations for abstractions, IMO. It too often leads to bad inheritance plans and inflexible framework designs.
Good motivations for abstractions start from the get-go, and are looking at things like letting implementations speak the language of the problem space rather than solution space, and they're looking to create restrictions that lead future code writers to work correctly. Good abstractions are vehicles of communication to other developers, both now and in the future. And you want to be communicating clearly as soon as possible.
ub3rh4x0rz@reddit
I don't subscribe to GoF pattern heavy, traditional (inheritance based) modeling, which sounds like what you're advocating. In fact I don't think even Fowler advocates for inheritance at this point. So yeah, I don't think up front abstraction from first principles is ever the right move. Similarly I think hard core hexagonal architecture is an antipattern, even if strategy pattern is great once you have the need. It's far better to know how and when to abstract than to front load it. Some will wait too long or go about it the wrong way when there's a material cause, but whoever writes a shitty abstraction at that stage was going to write an even shittier one if they were forced to imagine future use cases.
corbymatt@reddit
I would change that slightly: programming is about humans understanding and making understood data and behaviours, specifically when in relation to expressing instructions to machines.
I agree yours sounds better though.
OPtoss@reddit
YAGNI
ub3rh4x0rz@reddit
exactly
ryo0ka@reddit
I think the same. The worst, burn-out-inducing abstraction is the one that attempts to cover every possible case of the future.
KalilPedro@reddit
YESSSSS abstractions that try to cover every use case cover none well and are a pain to extend and implement by a new module.
ub3rh4x0rz@reddit
It takes a certain amount of experience and humility to accept that copy/paste and grep are the first order tools, and abstractions have to earn their keep.
Empty-Win-5381@reddit
What do you mean by this materialization?
nanotree@reddit
One or two layers of abstraction is usually my maximum. If you design it right, your average OO-style interface is the only layer of abstraction you should really need. Rarely is inheritance a good idea. I've personally dealt with code that used inheritance heavily, and damn, it gets obvious how bad it is for layers of classes to inherit from one base class. When those child classes access protected methods and fields of the parent classes, it becomes a strongly coupled dependency.
seweso@reddit
Just like people every layer should hide secrets and shit which should never see the light of day!
ChadtheWad@reddit
Hard disagree there. If you're lucky and the changes you need to make fit what an abstraction predicts, then it's easy to change. But 99% of the time I've spent on projects with "abstractions" is dealing with unanticipated changes, and then suddenly all those abstractions become a blocker.
I think modularization is a much better tool than abstraction for ease of change.
There are a few related articles/talks on this. I'd check out Volatility-based Decomposition and this talk Simple made easy by the creator of Clojure.
Isogash@reddit
Modularization is abstraction.
ChadtheWad@reddit
Yeah that's true, technically modularization is a form of abstraction, but it works differently from code that has lots of abstractions.
If you take a 1000-line function and break it up into sub-functions, it's a whole lot easier to understand in contrast to code with lots of abstractions because it's broken up into simpler pieces that perform a single task with explicitly declared inputs and outputs. This is in contrast to, as you say, code with lots of abstractions that can be difficult to understand.
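A toy illustration of that decomposition (hypothetical names, a tax rate made up for the example): each piece does a single task with explicit inputs and outputs, and the top-level function reads as a summary of the steps.

```python
def parse_prices(lines: list) -> list:
    # Each input line looks like "name,price"; keep only the price.
    return [float(line.split(",")[1]) for line in lines]

def total_with_tax(prices: list, tax_rate: float) -> float:
    return sum(prices) * (1 + tax_rate)

def report(lines: list, tax_rate: float = 0.2) -> str:
    # The top-level function is now just the sequence of steps,
    # with no hidden state shared between them.
    prices = parse_prices(lines)
    return f"total: {total_with_tax(prices, tax_rate):.2f}"
```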
When we start looking at those functions and their parameters and define common interaction models and shared data models, that's when abstraction can be dangerous. It gets really bad when people start trying to anticipate changes by nesting their code heavily inside abstractions -- such as taking what could be simple functions and wrapping them in classes to start, and trying to move function parameters into shared abstract classes.
fear_the_future@reddit
Lotta assumptions there. It is only simpler if the functions perform a single task, if they are named well, if they have meaningful declared inputs and outputs... Instead what I often find written by colleagues are functions like
doThing
immediately calling reallyDoThing
just to separate the error handling; functions that do multiple things but with no discernible commonality; functions that have output types that don't tell you anything; or if you're especially unlucky you're using a shit language like Python that doesn't even have a usable type system.
chakan2@reddit
This is situational, and I've seen it go both ways. Sure, in a well designed abstraction, I only have to change code in one place.
HOWEVER...changing that code has a massive impact on your system and often in very unexpected areas.
There are situations where I'd rather have just fixed the 3 or 4 duplicate chunks of code than deal with the root implementation and all the changes it caused through the layers of abstraction.
I get it, it's a general feeling article that's loose on examples. The problem it's describing is very abstract, and frankly, you don't see the problem it's talking about until possibly years after your implementation (think about this from a slow moving fortune 50 type company).
I like the spirit animal of the article. Don't abstract just because the book told you to. Think about the problem and whether anyone will ever actually need that abstraction before you write it.
flukus@reddit
All too often it turns out that you only want those changes in 1 or 2 of those duplicate chunks anyway. Go through a couple of iterations of changes like that and each caller essentially has its own code path through the complex abstract code anyway.
tehRash@reddit
It's more than just difficult to understand imo, it's often about the burden of maintaining or extending overly generic code. I'm firmly in the camp of only write abstractions when you truly understand what you are abstracting, which you seldom do the first or second time you write it.
Oftentimes overly abstracted code is even more difficult to change, since you have to jump through several layers of functions calling each other/cognitive indirection until you realize that all those functions only had one caller (which satisfied the original use case that didn't benefit from any abstraction), so now you have to make five changes for some simple tweak because the abstraction didn't properly consider this new use-case that popped up.
I don't know, it often feels that when you abstract something you quietly say "I wholly understand all the future use-cases of this piece of code" which is often not the case.
valarauca14@reddit
I believe it was Dennis Ritchie who once said something along the lines of "code isn't mature until its 3rd rewrite".
Isogash@reddit
Stop trying to jump through several layers of functions at once in trying to understand them then! That's what's making these abstractions difficult to understand: you are working against them.
Is it your first instinct when using a library to follow the code flow through every function you call in order to understand how it works? No! You just accept the abstraction at face value and examine the behaviour externally to confirm that it's working as expected. You might even read the documentation!
You should treat abstractions within your codebase the same way. Granted, sometimes people overdo the abstractions and build something that "feels" abstract rather than that is actually useful in abstracting away a concern. You can refactor these into useful abstractions if you prefer.
And the measure of success of a good abstraction is that people don't bother jumping into your code, because they trust that it works!
falconfetus8@reddit
Guess what: not all code works. If the broken code is on the other side of an abstraction, guess what you need to do.
Schmittfried@reddit
As they said, most abstractions are just layers of indirection. You recognize them by the fact that you do need to jump around to understand what is going on. They don't abstract anything away.
kobriks@reddit
Idiotic implementations of layered architecture are the worst offenders here. Some dogmatic morons insist you have to go through each layer step by step instead of going straight to the layer you need.
Empty-Win-5381@reddit
Absolutely
acc_agg@reddit
If you can't understand it you can't change it.
Isogash@reddit
Code with few abstractions is not often any easier to understand.
shipshaper88@reddit
The eternal struggle.
seanamos-1@reddit
A lot of bad abstraction is both difficult to understand and difficult to change.
Really, the use of abstraction, or lack thereof, doesn't say anything about the readability/changeability of a code base. Some great code bases have little or none, others have many, the same for bad code bases. It depends entirely on what you are doing.
I tend to err towards abstracting only as necessary, and try to make sure they are decent ones.
JarredMack@reddit
How does the "abstraction" that has 10 layers of logic paths baked into it as new requirements came up over time and the developer couldn't be bothered abstracting again factor into this assertion?
doesnt_use_reddit@reddit
So long as it's the right abstraction! The wrong abstraction can make things harder to change as well
Cheraldenine@reddit
Depends on the change, and how much repetition there is.
Without knowing more, easier to understand and fewer lines of code implies easier to change.
jhartikainen@reddit
I think this is something that occurs especially when devs blindly follow guidelines/rules about function length.
Splitting the function into smaller ones doesn't help much if it's on the same level of abstraction as the original function... but I think it can sometimes be challenging to determine what is the level of abstraction of something.
hardware2win@reddit
Ah yea, the small functions of "clean code"
jhartikainen@reddit
I think Clean Code gets a bad rap because people take the advice way too literally :)
4THOT@reddit
Clean Code gets a bad rap because it's dogshit advice and makes no sense and makes your code run like actual ass.
"A book about inserting glass rods into your dick and smashing it with a rolling pin gets a bad rap because people take the advice way too literally."
Sometimes an idea is just bad, and there isn't actually a trade off because someone says there is.
I'm sure I could write a semi-convincing 300 page Bob Martin-esque book about how pleasure and pain sensations in the brain are remarkably close in structure, and that through the novel sexual paradigm of smashing rods of glass in your dick you can theoretically achieve new levels of both pleasure by training pain tolerance to experience the exquisite pleasure of glass being broken within your dick.
That's all you're doing with Clean Code. You are just doing the software equivalent of smashing glass into your dick and insisting that 'the upsides outweigh the downsides' or 'you just shouldn't be so literal, you use the Rust™️ Glass™️ Urethra-Checker™️ and it's totally fine!'.
You don't see anyone writing real software using "Clean Code" or OOP. Not NASA. Not kernel developers. Not game developers. Not driver developers. Not shader developers. You can see database developers try it in MySQL and it loses performance to no appreciable gain in features with an explosion of cool bugs so people abandon it for a better database system (just fucking use Postgres).
If any of this shit actually worked we would have seen some meaningful returns in the last 30 years of dick smashing.
Find me a single OOP/Clean Code development house that is pushing a ton of feature rich and bug free software (that is more than some shitty website with broken parallax scrolling), I'll even give up the performance argument entirely. Why hasn't Uncle Bob out-competed anyone, anywhere, outside of consulting?
jhartikainen@reddit
I'd love to see an actual honest critique of Clean Code for once. I just see people really polarized about it on Reddit for some reason.
I read it a long time ago and I seem to recall it's mostly pretty regular advice on stuff like naming things, functions doing one thing at a time, the law of Demeter, and so on; the usual things considered reasonable. I think it had one or two things in it which were a bit odd on first sight, like functions should have no parameters, but it's quite clear the intent is to reduce parameters where it makes sense, not enforce some unenforceable "zero params" rule.
DavidJCobb@reddit
A lot of the code in the book is low-quality, and this is code that the reader is explicitly meant to learn from.
Speaking from my own experience, I occasionally see people write code that is messy and labyrinthine specifically to follow Clean Code recommendations. Things like dividing a single function up into a class with several microfunctions, and avoiding passing arguments by instead using class fields, such that the code becomes complete spaghetti where they've basically built a miniature global scope and polluted it to hell. The ridiculous absolutist rules that Bob Martin promotes -- and from what talks of his I've watched, the man is absolutely hardline about them -- make it impossible to write genuinely clean code and enormously difficult to get anything done, so people have to rules-lawyer them and build unhinged messes.
jhartikainen@reddit
The article you link is reasonably argued, but many of the arguments seem to stem from misunderstanding what the book says.
For example, "Martin's reasoning is rather that a Boolean argument means that a function does more than one thing" - This is not his reasoning for it. It's just something that could be an indication.
Similarly, the points on functions not containing nested control structures, functions being short, etc. - these are all sort of ideal things to strive for, not something you force your code to follow at the cost of legibility.
And same with the criticism on testing/tdd advice: Making a test-specific "DSL" is actually a good idea, but obviously not in a toy-example. At sufficient complexity, absolutely.
There's definitely some weird code examples in there, no disagreements on that lol - but it seems some of these concepts may be difficult to present in such a constrained format.
I think most readers seem to ignore the most important thing the book says... That you should not take the book as claiming to be absolutely correct. It literally says you should seek information from other "schools of thought" as well.
I think the advice has value, but the reader needs enough experience to have the perspective to see the actual meaning/use. If not, then exactly what you said will happen - people will just blindly adhere to the perceived "rules" and write a bunch of garbage. Perhaps the book is not as well written as it could be for this reason.
I've not watched any of his talks so can't really say for any of that stuff.
4THOT@reddit
Congrats on not knowing what anyone here is talking about?
jhartikainen@reddit
Yeah, it's kinda hard to tell what people's problem with it is when they talk about it using metaphors like sticking glass into their sensitive bits lol
4THOT@reddit
https://www.computerenhance.com/p/clean-code-horrible-performance
here's a dickless explainer
jhartikainen@reddit
Well that seems like a reasonably fair critique. I don't think anyone can argue that using more complicated abstractions is good for performance.
I think the important thing to note here is that at no point does the author have any actual critique on it besides the performance point of view. There's no critique on the rules purely from a software design and architecture standpoint, which is generally what Clean Code is about - it isn't performance focused.
For many developers, they don't need this level of optimization and performance, but obviously if you do, then it certainly makes sense to think twice about following the Clean Code suggestions.
4THOT@reddit
Nothing is gained from following "clean code".
MardiFoufs@reddit
That's funny, I've seen more migrations towards MySQL recently.
BestPseudonym@reddit
What's "real software?" Anything not written by NASA, kernel devs, game devs, driver devs, and shader devs? Or is it the use of OOP that makes something real software? Also what definition of OOP are we using here? I don't have much experience in the other fields but game devs absolutely use OOP
4THOT@reddit
As in "actually needs to run as if someone cares about performance, debuggability, or safety".
That's super cool for them.
The fact I have to dig through 10 "abstractions" to know what a single function does. The fact that none of the prescriptions make any actual sense. The fact that there is no evidence this works.
It's super easy to write off. I write the instructions I want the machine to execute in the most straightforward and direct manner possible.
BestPseudonym@reddit
The fabled "just write the code" method, I've been trying to learn this
Sokaron@reddit
How are you supposed to take a coding book's advice, other than literally? This isn't self-help or philosophy. The entire book is in-the-weeds, written code examples of "unclean code" (per Uncle Bob) vs. "clean code". "You need to take it with a massive grain of salt" is an indictment of the book, not a defense of it, IMO.
jhartikainen@reddit
Well, as an example, I've seen criticism against the "functions should have zero parameters" rule because "it's stupid to force all functions to have zero parameters". But the book doesn't tell you you have to slavishly make all your functions have zero parameters.
That's what I mean by taking it too literally - I guess it's more like "the people who I've seen criticize it seem to lack basic reading comprehension" but I'm pretty sure saying that here will just get me flamed.
jrochkind@reddit
"duplication is far cheaper than the wrong abstraction" -- Sandi Metz
And I don't think she actually means in terms of compute resources, but in terms of developer attention and time.
https://sandimetz.com/blog/2016/1/20/the-wrong-abstraction
Uberhipster@reddit
too late to extract any meaningful discourse, but von Neumann was famously irritated with one of his grad students who invented assembler code (as an abstraction over operation codes)
von Neumann's principal objection, reportedly, was that the student had employed the machine, which was designed for mathematical computation, to do clerical work that was the student's own responsibility
IDK what that means in terms of wrong and right abstractions but I think von Neumann was prolly rolling around in his grave by the time COBOL rolled out (but I like to think he would have enjoyed and supported LISP)
anyhoo - that's my 2c
bring_back_the_v10s@reddit
Can someone please help me understand the difference between abstraction and layer of indirection? I honestly don't see the difference.
nan0tubes@reddit
my crappy example
bring_back_the_v10s@reddit
I think I get your point, but isn't ErrorLogger an abstraction? To me it seems to be an abstraction on top of another abstraction, and each abstraction is also a layer of indirection.
fear_the_future@reddit
Every abstraction is indirection, but not every indirection is a (useful) abstraction. The abstraction layer can only successfully abstract if you can use it without knowing details of the implementation.
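A minimal sketch of that distinction (hypothetical code, borrowing the ErrorLogger name from the parent comment): the first function merely adds a hop, while the class hides the implementation behind a contract callers can use without knowing the details.

```python
import sys

# Pure indirection: callers still need to know this writes to stderr
# and what the format is -- it only adds a forwarding hop.
def log_error(msg):
    sys.stderr.write(msg + "\n")

# Abstraction: callers program against a stable contract and need not
# know whether errors go to stderr, a file, or a network sink.
class ErrorLogger:
    def __init__(self, sink):
        self._sink = sink  # any object with a write() method

    def error(self, msg):
        self._sink.write(f"[ERROR] {msg}\n")
```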
bring_back_the_v10s@reddit
The "useful" adjective completely changes the direction of this conversation.
Yeah I agree but again this is not the point of the discussion.
Look, I don't want to sound pedantic, but in our field we need to get fundamental concepts right if we want to be able to solve real problems. Abstraction and layer of indirection are interchangeable terms as far as I understand; they're the same thing. The author creates a big confusion by incorrectly drawing this false distinction. He's not talking about abstraction vs. indirection; instead, as you correctly pointed out, he's talking about unnecessary abstractions (or indirections). But you see how much confusion is created by not clearly and correctly establishing the basic concepts?
fear_the_future@reddit
Well, I don't agree that they're the same thing. Abstraction requires something more that is hard to pin down. It actually has to reduce a concrete thing down to a more general concept. The fact that many people call mere indirection an abstraction doesn't make it so.
chakan2@reddit
It is...but someone needed a PhD and came up with the term indirection. I'd consider it a specialized abstraction.
abw@reddit
It's a good example and I totally agree with the point you're making.
However, I have found myself writing code like your second example to "uselessly" wrap a third party component.
At some point in the future we might want to switch to a different third party component for some reason. Perhaps the author abandons it, changes the licensing terms, refuses to fix a bug you've found, becomes uncooperative, or whatever. Or more often, releases a new major version with an incompatible API.
Sometimes abstractions/indirections are useful to isolate your application code from a dependency on a particular implementation that you might want to change at some point in the future. On the surface it might look pointless, but it's there to hide away (aka abstract) the details of an implementation.
Of course, being a good software engineer is knowing when this is likely to be useful and when it's YAGNI.
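A sketch of that kind of isolating wrapper (generic names, not from the comment), using the standard json module as a stand-in for the third-party dependency:

```python
import json

class JsonSerializer:
    """Thin wrapper isolating the rest of the app from the concrete
    serialization library. If we ever switch libraries, or a new major
    version breaks the API, only this class changes."""

    def dumps(self, obj) -> str:
        # sort_keys gives deterministic output for diffing and caching
        return json.dumps(obj, sort_keys=True)

    def loads(self, text: str):
        return json.loads(text)
```

Application code depends only on `dumps`/`loads`, so swapping the underlying library later is a local change rather than a codebase-wide one.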
__Maximum__@reddit
Feels a bit AI-written: lacks details and real code examples. Just a vague idea expanded with too many words.
OkMemeTranslator@reddit
Also it's all based on false premises.
False premise #1: Abstractions slow down the code (and that it matters).
Not only are there actual studies concluding that abstractions speed up your product in the long run, but what kind of code are you working on where a few extra function calls per high-level API call are such a huge performance issue that your CPU is in trouble?? We're talking about 0.001% of real-world use cases; this article is a nothingburger and horrible premature optimization at best.
False premise #2: TCP somehow being different than other abstractions(??).
The article plain out states that TCP is a great abstraction, a living proof that abstractions are good. Yet when OP's abstractions don't work, the fault is somehow in abstractions in general and not OP just being bad at software engineering?
This applies to literally anything. The REST APIs in my company are very complex and convoluted, do not use REST APIs!!! Of course there's X, Y, and Z that are great REST APIs designed by someone else, but mine don't work so don't use REST APIs!!
Also TCP is really really good for transmitting data. Yet when I come up with my own data transfer protocols, they're always bad. Therefore data transfer protocols are evil!!
josefx@reddit
That study only shows that you can optimize a completely unoptimized code base while also making sure that an unrelated metric goes up.
And where do you draw that line?
chakan2@reddit
I think you have to have coded for a while to really feel this article. The examples you want aren't concise little 10-line snippets. They're convoluted rat's nests that end up 5 to 7 layers deep and, from the outside, look like good code. It's not until you really dig into them for that elusive bug that you realize you've hit a quagmire of garbage.
For example. My last job. We had a tool that had a bunch of integrations with other tools. Great, out of the box they just kind of work. One of our architects got it in his head that he wanted to dynamically inject credentials and job information into these integrations, so he wrote an abstraction layer on top of them. Seems reasonable...
Guy leaves the company and I show up to take all this over. The code looked good, I've got a handle on it, first requirement comes through to add a new parameter...Ok, no big deal...lets get into it.
I decide to start at the bottom and work my way up. The call to the product's integration, easy, add the parameter. Start to add it to the wrapping function...it's a spread operation. Ok, nbd, extract that and put it in.
Get to the next layer up... hey, boss, you know where these environment things are coming from, and what they actually are? No? Shit... OK... There go a couple of days tracing through our pipelines to figure out how all that stuff is injected.
Up another layer...uh...boss...why do we even have this layer. "Because it's the way." WTF does that even mean? I get an hour long diatribe about corporate politics and so forth that ends up with...I think Co-Pilot wrote this layer, not sure if we need it at all, but there's a layer of unit tests and automation built around it that's very costly to change.
THEN, FINALLY, I get to the calling code. I did it, finally, I'm done...No...there's a special function you call and pass in this function to make it all work (I think it was a decorator that relies on yet more abstract environment setup).
Oh fuck this...I quit.
TL;DR: Don't abstract shit until you're sure you need it. The reason why isn't simple.
OkMemeTranslator@reddit
I've been a software developer for almost 25 years, and I have no feel for this article. It's literally just "some people at my company wrote bad code" disguised to make it sound like it's an issue with abstractions specifically, like those people wouldn't have written bad software anyways. Also considering how the author seems to blame abstractions specifically, I'm pretty sure they suck at abstractions themselves.
One small UML graph will easily describe that. Would have taken them 2 minutes to draw on a tool like draw.io. Not an excuse for not providing any proof or examples while making extraordinary claims.
"Extraordinary claims?" Yes, our entire world runs on abstractions. Whether it's computers and software (starting from transistors through assembly to high level languages to the frameworks you use), or how businesses are run (not like the CEO knows every detail in the company), or how a car is driven (most drivers don't know what happens in their car), everything is abstraction on top of abstraction. Yet the author asserts abstractions are bad and provides zero proof or examples.
What you described is a bad developer at your company. I too knew a bad developer, they stored everything into arrays. Never maps, never objects, always arrays. Something like:
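(The original snippet didn't survive; a hypothetical reconstruction of the pattern being described:)

```python
# user[0] = name, user[1] = email, user[2] = is_admin
user = ["Alice", "alice@example.com", True]

# Far away in the code base, magic indices everywhere:
if user[2]:              # index 2 happens to mean "is admin"
    recipient = user[1]  # index 1 happens to mean "email"
```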
Except they didn't even have those comments. Now should I write an article about not using arrays and indexing (without providing the above example even)? Or about commenting your code better?
No, this has nothing to do with the tool being bad. This has everything to do with a bad software developer using the wrong tool for the wrong job. And there's nothing you can do about it but fix it yourself and - in the case of content creators - teach the correct way.
Yet the OP isn't teaching how to use abstractions properly. No, they're not even showing how they can be used wrongly. They are just stating that abstractions can be used poorly by poor developers. What a useless article.
cdb_11@reddit
Sorry, but that sounds like projection to me. In your other comment you did exactly that, and posed it as a false dichotomy of either spaghetti code or overabstracted code:
Meanwhile, the article author acknowledges that some abstractions can solve real problems. What the author is arguing against is premature abstraction.
chakan2@reddit
Oh oh oh...I see...yea...um. Well...I've never seen a UML that actually fixes bad code.
The array indexing thing is easy to fix. I can make those descriptive variables in an afternoon. That's a really minor thing to fix.
A bad layer of abstraction could be months of work depending on how ingrained that is in the code base. AND...If it's legacy code, you're straight fucked. No one will invest in fixing that.
I think my favorite abstraction of all time...and this is by the book gang of 4 OO java...I saw a company abstract Booleans. I shit you not. And...if you really go by the OO standards laid out in Java, that team was supposed to abstract their Booleans.
That project was another example of what this article is getting at. We had what was essentially a 500 line ETL job in C#...It was performant and holding up under excessive load. The architects came in, saw what we did, and put a Java team on it. That layer exploded into a 10k line behemoth. 9000 lines of that were just abstractions into the different payloads we dealt with.
Yes, they had a slick UML of the whole thing, and it was by the book OO abstraction.
I can't tell you how many times an enterprise architect has come in, said "abstract the problem away", walked out, gotten promoted away, and left said implementation team stranded. That's happened at multiple companies of multiple sizes.
I liked the article. It's putting words to a very abstract problem that's pervasive in the industry.
OkMemeTranslator@reddit
What? I simply wanted the author to describe their problem with UML, not fix anything. I genuinely can't be arsed to read further, if there was anything important then maybe consider not majestically fucking up the very first thing you say in your comment the next time.
Kinglink@reddit
You are acting like you have to be a senior to understand the article, but you REALLY talk like a junior.
And it really sounds like you're complaining about bad documentation, not bad code.
If you have a problem with some guy's code, did you ever think maybe it's that specific implementation/programmer's work you don't like? But also, the amount of insults you throw out really makes me wonder what you're like to work with. Yikes, man...
raze4daze@reddit
Nonsense. It’s actually the opposite. You have to have coded professionally (no, school doesn’t count) for no more than a few years to feel this article.
Based on your comment on not understanding why these layers exist and then deciding to go on a long diatribe about corporate politics (wtf….), I’d say you’re in the same bucket.
smackson@reddit
I am genuinely surprised how hard you missed the point of the article.
If anything, I'd say the author is advocating against premature optimization.
different than some other abstractions.
Yes. Yes it does
a living proof that some abstractions are good.
No. You're missing the point so infuriatingly obviously here. Author never states that abstraction in general is at fault. He is saying that not all abstractions are created equal.
Author never said that.
The problem, as I see it, is that the article's point is nuanced, and in order to complain about it, you're claiming that the author is more dogmatic than they actually are.
It's a common trope in internet wastes of time. You interpret someone's ideas as more polarized than they actually are, and thereby you are guilty of doing the polarizing.
OkMemeTranslator@reddit
He's advocating against abstractions on performance grounds. He says it like 12 times during the one-page article. Abstractions are not there for performance gains; they're there for extensibility (e.g. swapping to a different implementation).
Yes and physical exercises (abstractions) are bad. Sure, some exercises (TCP) are good, but others are really bad (I just won't provide any examples of such exercises, maybe my company used uranium weights, you figure it out yourself, I'll just assert that such exercises exist).
If that's his point, then sure, he's not lying.
Only some exercises are good for humans. Definitely not the ones where you're in contact with uranium, though. I had those at my company, now I just avoid all exercises to be safe. I should probably write an article about how you should be careful with choosing your exercises.
Not all exercises are created equal.
So you're implying that he simply wanted to write an article about "some people are bad at their job", with no specific point about abstractions being different, but decided to spend 10 paragraphs talking about abstractions specifically while ignoring every other aspect of every job that has ever existed on this planet? Yeah, what an intelligent author!
I'm sorry if I assumed that people who write articles have a point besides "some people are bad at their jobs". What's next, an article about water being wet, but in 10 paragraphs instead of three words?
QuodEratEst@reddit
That's the author 😂
OkMemeTranslator@reddit
LOL
Well, changes nothing. At least I called him out truthfully, not sugarcoating things knowing it's him.
QuodEratEst@reddit
No one other than the author would have made it that far down the comments and responded like that, right? Lol
VulgarExigencies@reddit
what a weird take
QuodEratEst@reddit
How the fuck is it weird??
VulgarExigencies@reddit
you're deep in the comments responding, why would someone who agrees with the article have to be the author to be doing so as well?
Sufficient_Meet6836@reddit
Not really. Your "false" premises 2 and 3 aren't presented or implied anywhere in the article. Your comment is a pretty awful representation of the author's argument.
OkMemeTranslator@reddit
If you would kindly read my earlier comments in this very comment thread you're replying to before actually replying, you could then clarify where exactly I'm going wrong with my line of thinking. As far as I'm aware, either it's the most useless article anyone has ever written ("some people do their jobs poorly" but 10 paragraphs about abstractions instead), or my assumptions are correct. Feel free to provide a third option I haven't considered, but thus far your comment is as useless as their article.
Sufficient_Meet6836@reddit
I did read your other comments. I actually agree with the point that the article needs more substantiation. But you don't need to make up false claims to make your argument better. Your premises 2 and 3 were completely made up by you. It's intellectually dishonest. Do better
OkMemeTranslator@reddit
Ah, found the author's alt account. Maybe if you keep asserting this as a fact a few more times then it will suddenly become one? Not like we expect people to explain themselves when making bold claims or anything, nah your word is the truth mate!
Sufficient_Meet6836@reddit
🙄🙄 Insufferable and more intellectual dishonesty.
You made the claims. You have to back them up. Those premises do not occur in the article anywhere.
OkMemeTranslator@reddit
I already backed them up more than enough in my previous comments. I already admitted that they are implied premises based on common sense. The only alternative I could think of was that the author wanted to write an article about "some people are bad at their jobs" but decided to disguise it as an article of abstraction instead, when it could have been about any job in the world ("paving roads").
I would ask you to back up your claims next and explain where I'm wrong, but I already gave you plenty of opportunities and chose to block your stupid ass instead.
Equivalent-Way3@reddit
You can't be serious here. He says in literally the first paragraph:
Of course he doesn't think they're inherently bad.
He literally states TCP as an example of good abstraction.
People write about poor use of good tools all the time. That helps people use those tools better.
OkMemeTranslator@reddit
How did this article help anyone to use abstraction better? It's not about good and bad uses of abstraction, it's about "some people have used abstractions poorly". What does that teach me? How can I avoid that in the future?
It's a nothingburger.
Equivalent-Way3@reddit
I'm not arguing if the article succeeded at its goal. I am pointing out you're either commenting in bad faith or you have poor reading comprehension when you say
OkMemeTranslator@reddit
Ironic coming from you, considering my very next sentence was:
Kinglink@reddit
Imagine if, instead of saying "oh, some mythical abstraction is bad", she actually cited examples...
Nah dude you're defending a BS article, it's poorly written, and just written to be written.
OkMemeTranslator@reddit
I decided to read even further and it just gets worse.
Uncited performance cost referenced again in a new chapter(??).
Incomplete assumptions of what abstractions are for:
Depends who you ask. Yes, instead of having to hard-code support for UDP, TCP, Modbus, CANbus, and 21 other transmit protocols, I can just use one abstraction. It does make things a lot simpler, no? Also, in my experience their main purpose isn't to make things simpler per se, but to make things simpler to change. Good luck swapping your HTTP requests all over your code base to UDP packets if you haven't used any abstractions. I'll just change my new HttpClient() to new UdpClient().
Also, some more uncited assertions on abstractions just magically "not working":
Reality based on what, you being a bad developer? And which is easier, learning the rules and interfaces of the abstraction layer, or learning every single protocol in the entire world and somehow implementing switching between them based on different clients' needs?
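The transport-swapping point can be sketched like this (hypothetical names; the commenter's example was C#-flavored, this is the same idea in Python, with the network calls stubbed out for brevity):

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """The one interface the application code talks to."""
    @abstractmethod
    def send(self, payload: bytes) -> None: ...

class HttpTransport(Transport):
    def __init__(self):
        self.sent = []
    def send(self, payload: bytes) -> None:
        self.sent.append(payload)  # a real version would POST over HTTP

class UdpTransport(Transport):
    def __init__(self):
        self.sent = []
    def send(self, payload: bytes) -> None:
        self.sent.append(payload)  # a real version would fire a UDP datagram

# Swapping protocols is a one-line change at the construction site;
# none of the code calling transport.send() needs to change.
transport: Transport = HttpTransport()  # later: UdpTransport()
```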
hrvbrs@reddit
Ironic… this article preaches about the dangers of abstraction while being a complete layer of abstraction in and of itself.
kobriks@reddit
It's hard not to be abstract when talking about abstractions
eightslipsandagully@reddit
The fundamental point is sound, and really it's just another dressed up version of "don't be dogmatic"
BarneyStinson@reddit
In my experience most programs suffer from insufficient abstraction. It is funny how so many developers readily accept the layers upon layers of abstraction they are building upon, but reject the idea of creating abstractions of their own.
Often improvements to the code are not even considered because it is too hard to implement (or even think about) them with the present level of abstraction.
That said, identifying good abstractions is a bit of an art. I would be more interested in an article giving some guidance about finding and implementing good abstractions.
twitchard@reddit
This article would be way better if it had a couple examples of pure abstractions. Without examples it's a little too... well... abstract.
prouxi@reddit
From the title I thought this was /r/programmingcirclejerk
TangerineX@reddit
Ironically, this article abstracts the concept of making abstractions, and falls into the same pitfall it warns against. This article would be much better with real world examples of what to do, and what to do instead.
LovesGettingRandomPm@reddit
Abstractions should be clear to understand and be completely independent, that way you don't have to go down another level of abstraction.
So you should use them sparingly imo
yieldsfalsehood@reddit
Does this article hide complexity or add a layer of indirection regarding what abstraction is?
bring_back_the_v10s@reddit
Functions, methods, classes, interfaces, are all abstractions by definition, regardless of whether or not they hide complexity. It seems the author doesn't have a clear conceptual base in his mind about abstraction and indirection.
plexiglassmass@reddit
This is why I refuse to use UDFs. And no built-in library imports either. For example my python scripts are just
But seriously, I think the struggle to balance too much indirection against too much coupling is one of the hardest things to strike.
Also, this article could have been a paragraph. Not much to write home about here.
Dwedit@reddit
One-line accessors and mutators are pretty silly, especially when you go up a class hierarchy just to expose some member in a third-level child class. A lot of busy work. But it does achieve encapsulation, even though you manually need to poke the holes.
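In Python, for instance, property gives you the same encapsulation hole without the accessor boilerplate (a generic sketch, not from the comment):

```python
class Sensor:
    def __init__(self, raw: int):
        self._raw = raw  # internal representation stays hidden

    @property
    def celsius(self) -> float:
        # computed on access; callers read it like a plain attribute,
        # and the internal representation can change without breaking them
        return self._raw / 10.0
```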
bwainfweeze@reddit
I think my first real view into the sins of indirection came when I found the architects talking about changes for the major+1 version. I spent a good bit of time talking them out of an architecture layer: they had a facade layer for receiving actions and sending them to the implementation, but the only thing that talked to it was another abstraction layer for sending the actions in the first place.
I was adamant that having abstractions that only talk to abstractions is waste. You should only need one abstraction between sender and receiver. At least for the number of solutions we had for the same problems.
Internally I was thinking architectural astronaut.
Existing-Charge8769@reddit
Example of this: Langchain
jjeroennl@reddit
Just remove it then? Removing abstractions is relatively easy?
I have never seen software fail because of too many abstractions. I have seen software fail because of too little abstractions.
If you have bad abstractions they can slow you down at worst, but again, removing abstractions is much easier than shoehorning them in afterwards.
4THOT@reddit
Is it possible to be so blind that your eyes begin to emit light?
jjeroennl@reddit
Is it possible to give actual arguments
4THOT@reddit
Sure, MySQL is losing performance due to OOP indirections making their calls take longer, has bugs so severe that the developers would rather hide them.
https://smalldatum.blogspot.com/2024/08/mysql-regressions-update-nonindex-vs.html
jjeroennl@reddit
I have no clue what you are arguing against lol, but according to your own link:
I also wouldn’t call MySQL a failure lmao…
I didn’t mean abstractions can never have bugs, just that I have never seen projects fail because of too many abstractions.
4THOT@reddit
See what I mean by blind?
"bugs were fixed it's fine"
"performance regressions? doesn't look like anything to me"
jjeroennl@reddit
You’re literally arguing against I have never said lol.
I said I have never seen PROJECTS fail because of too many abstractions. Not that they can’t have bugs, not that they cannot make bad abstractions. Just that the project doesn’t FAIL.
I have, however, seen MANY projects fail because the code became unmaintainable through over-coupling, spaghetti code, and under-abstraction.
bring_back_the_v10s@reddit
Also people often confuse "too many abstractions" with "bad abstractions"
jjeroennl@reddit
True. And then again, I still prefer a bad abstraction over no abstraction at all. Most bad abstractions can be fixed relatively easily.
Replacing over-coupled, messy code with an abstraction is much harder than fixing (or sometimes removing) a bad abstraction.
lunchmeat317@reddit
Abstractions are great when done well.
Unfortunately, they usually aren't. This is actually reinforced by language design as well.
Functional abstractions are the best if your language supports it. Classical abstractions are tolerable at best and awful at worst.
The GoF patterns are useful if your classical language doesn't provide alternatives to solve the problems that the patterns solve. With modern languages, they aren't always needed and can make things messier than they need to be.
enraged_supreme_cat@reddit
How can i reply to this article if there's no code example?
Paddy3118@reddit
Raymond Hettinger said something similar when talking of replacing all classes in a Python codebase that had only one method with a plain function call to reduce complexity and increase speed.
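A hedged illustration of that refactor (generic names, not Hettinger's actual example):

```python
# Before: a class with a single method and no state worth keeping
class Greeter:
    def __init__(self, name):
        self.name = name
    def greet(self):
        return f"Hello, {self.name}!"

# After: the indirection removed -- a plain function does the same job
def greet(name):
    return f"Hello, {name}!"
```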
Kinglink@reddit
Lol no.
TopHatX@reddit
I love this article. The essence is that all abstractions are leaky, so when you start using an abstraction you eventually, if you use it deeply, end up needing to understand the underlying layer of abstraction to be performant. This is true at every layer from transistors to assembly to C to python.
shevy-java@reddit
Sometimes indirections are necessary. For instance, I am working on a cross-UI layer (e.g. where button.on_clicked {} will work on the web as well as in traditional GUIs), and some toolkits support more things than others. In the module that ties them together, some of what it does is just indirection and delegation to sub-modules that handle these things properly for that particular toolkit. I feel the notion in the title is not convincing, since it assumes that an indirection can never be an (or any form of) abstraction, which I think is incorrect. For some method calls I can pass things 1:1; for others I need to handle things differently based on the toolkit at hand. For instance, on the web I handle things mostly via JavaScript functions. In GTK I handle things mostly directly (I guess I could also use gjs and JavaScript, but boy, I hate JavaScript so much that I want to use it less rather than more when possible).
daedalus_structure@reddit
Indirection is an implementation of the Abstraction interface.
You're welcome.
CatolicQuotes@reddit
I don't like this article full of presumptions and vague talk lacking real examples
Grandpas_Plump_Chode@reddit
I find the author's use of "abstraction" strange.
Typically when developers talk about abstraction we talk about abstractions in relation to coding practices, using things like interfaces, parent classes, etc.
While it's technically correct, I find it odd to call TCP an abstraction here. It's also technically correct that any video game allows us to "operate as if the underlying complexity doesn't exist." That's basically the entire point of any software project.
Just because a concept is a great abstraction as a whole doesn't mean it avoids abstraction within its codebase.
This is such a silly criticism. All abstractions absolutely should be derived from the thing they are supposed to be abstracting. You don't waste your time creating an interface unless you've already created (or know you will be creating) several classes with the same basic structure.
Even at a high level (such as the TCP example) this is true, you fundamentally need to know the thing that is being abstracted in order to create a good, meaningful abstraction layer.
The real issue with abstraction is that many times people try to abstract preemptively without truly understanding the thing they are abstracting.
ub3rh4x0rz@reddit
Premature abstraction is the root of all evil in web/app dev. A pathological effort to be DRY needlessly couples code that might start out "the same" and gradually the need to diverge causes that abstraction to crumble under its own complexity. Mid level mistake.
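The failure mode described above, sketched with hypothetical names: two endpoints start out "the same", get merged for DRY's sake, and every later divergence becomes another flag.

```python
# Two render paths looked identical once, so they were merged...
def render(obj, kind, include_links=False, legacy_dates=False):
    # ...and every later divergence grows another parameter, coupling
    # both call sites to all of each other's special cases.
    data = {"id": obj["id"]}
    if kind == "user":
        data["name"] = obj["name"]
    elif kind == "order":
        data["total"] = obj["total"]
    if include_links:
        data["link"] = f"/{kind}s/{obj['id']}"
    return data
```

At this point, splitting the function back into two plain renderers is usually cheaper than maintaining the shared one.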
hamsterofdark@reddit
Is “indirection” a bad word? I'd say it doesn't have a positive connotation, but it can certainly play a constructive role in software development. An apt analogy: you are driving from A to B (the letters are actually freeways). You indirectly take a very long and winding system of ramps and loops. It's 200 ft as the crow flies, but you drove 0.3 miles. And that's OK, because the alternative is traffic lights and less total throughput.
YesIAmRightWing@reddit
most of the time, people reaching for abstractions really just want a facade.
ShinyHappyREM@reddit
And more importantly, the Big-O curve in relation to the expected usage
seweso@reddit
The worst abstraction layers are tightly coupled with whatever it is they're trying to abstract, by having their own version of pretty much everything they're abstracting (value objects, enums, interfaces, exception types, etc.).
Just be more leaky on purpose or do some actual work in your abstraction layer. If most of the code you write only needed one braincell.....you are probably doing it wrong :P
Apprehensive-Soup405@reddit
Keep it simple: abstract as you need to, and not before. Don't try to predict future use cases unless you are completely sure you know everything about them; otherwise it's just technical debt.
C-Tez-43@reddit
Didn't read it