What are some past "fad" fields of computer science that didn't age well?
Posted by playerNaN@reddit | ExperiencedDevs | 797 comments
There have been plenty of fields in CS that had a huge spike in popularity and then many people moved on such as cloud computing, parallel computing, and big data. Some of the "fad" fields are still heavily used like the ones I listed, but others haven't seen much practical use and most people who know what they are doing have moved on. Blockchain being a recent example.
What are some of the fields that were really popular for a time but were ultimately forgotten about or are now looked back on with a cringe?
TangerineSorry8463@reddit
Aspect Oriented Programming, to the best of my knowledge, has been reduced to a couple of logging libraries - mostly cause it's really dependent on consistent naming of things.
(might be talking outta my ass)
PedanticProgarmer@reddit
Yeah. The usage of AspectJ has dropped significantly when compared to 2015.
For good reasons. This was a tool for „smart” devs who abused quick hacks.
No dude, it’s not smart to change what my method returns in production while keeping it untouched in tests.
Background-Rub-3017@reddit
Railway Oriented Programming too
Regular_Zombie@reddit
It's not largely referred to as that, but effect systems in functional languages tend to be implementations of railway oriented programming.
Background-Rub-3017@reddit
Oh speaking of functional languages... they are also a fad.
chodmode2@reddit
It's used in a lot of systems that you consume every day.
Regular_Zombie@reddit
Really? One of the seminal books in software engineering (and computer science) is the Structure and Interpretation of Computer Programs. It was first published in the early 80s. It used a functional language from the 1960s...
David_AnkiDroid@reddit
Railway Oriented Programming is a teaching tool for functional concepts.
It's heavily used, but it's mostly called 'good design' when implemented.
mikkolukas@reddit
Is that bad? (it sounded cool, last time I heard about it)
Scott Wlaschin presents it in a very positive light
Background-Rub-3017@reddit
Sounds cool but not practical. I heard about it like once and never heard about it again. Who's using it?
Ghi102@reddit
I mean we are. Why wouldn't it be practical? It ensures that the error control flow is explicit and removes reliance on exceptions (outside of real exceptions like division by zero errors). Makes it very clear if a part of the code can fail.
In our functional repos, it is also very easy to write.
LloydAtkinson@reddit
Literally every functional codebase ever. Showing your ignorance a little there.
mikkolukas@reddit
The concept is actually used in Rust, Haskell and other programming languages.
In some way it is also how exceptions work in procedural code, although languages that use exceptions do not require you to handle every case where the program can fail (you can choose not to write any exception handling for parts of your program).
Railway Oriented Programming is a way to teach always handling both the happy path and validation, logging, network and service errors, and other annoyances, in a functional way.
The goal of Railway Oriented Programming is to flatten the learning curve for less experienced programmers, to have a visual model of what you are actually doing, instead of deep diving into monads and other terms that can scare people off.
Of course, as soon as you get more experienced, you will rely on more abstract mental models for reaching your goals.
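A minimal sketch of the two-track idea in TypeScript (hypothetical Result type and functions, not taken from any particular library):

```typescript
// Each step returns a Result: success stays on the "happy" track, failure switches tracks.
type Result<T, E> = { ok: true; value: T } | { ok: false; error: E };

const validate = (input: string): Result<string, string> =>
  input.trim() !== ""
    ? { ok: true, value: input.trim() }
    : { ok: false, error: "empty input" };

const parseAge = (s: string): Result<number, string> => {
  const n = Number(s);
  return Number.isInteger(n) && n >= 0
    ? { ok: true, value: n }
    : { ok: false, error: "not a valid age" };
};

// "bind" chains steps: it only calls the next step if the previous one succeeded.
const bind = <A, B, E>(r: Result<A, E>, f: (a: A) => Result<B, E>): Result<B, E> =>
  r.ok ? f(r.value) : r;

console.log(bind(validate(" 42 "), parseAge)); // { ok: true, value: 42 }
console.log(bind(validate("   "), parseAge));  // { ok: false, error: "empty input" }
```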
---
ChatGPT summary:
Railway Oriented Programming (ROP) is a functional programming approach to handle errors in a way that's clean and intuitive, especially for developers new to functional concepts. This approach, using a "railway" analogy, illustrates how to manage both the "happy path" and various error scenarios without cluttering the main logic.
Key Highlights:
The approach, while effective, isn’t meant to replace standard monads but to simplify error handling for newcomers in F#.
rodw@reddit
Aren't a lot of "mixins" and @Annotations - the kind that are all over the Java world, notably the Java Spring / Spring Boot framework - essentially AOP?
yxhuvud@reddit
That is part of the point - it never reached outside the Java ecosystem.
smors@reddit
Annotations are also used quite a bit in C#.
hungry_dawoodi@reddit
It’s quite alive in mobile development (android uses annotations aggressively)
Swifts property wrapper also seems eerily similar
rodw@reddit
I mean, it's not my platform of choice, but Java continues to be among the top 2 or 3 most widely used platforms and (depending on whose numbers you look at) still growing in popularity - and that's not even counting alternative JVM-based languages like Kotlin (and to a lesser extent Scala and even lesser, Groovy). You could do a lot worse in terms of influence than being strong in but rarely reaching outside of the Java ecosystem. I wonder how many developers almost exclusively work in Java?
And while I don't worry too much about the pedantic technical definition of "aspect oriented" (so, like the grandparent commenter, I may be speaking out of turn), stuff like Ruby's mix-ins and TypeScript decorators feel a lot like AOP to me. For that matter, raw JavaScript's prototype-rather-than-class-based inheritance lends itself to Aspect-ish reuse-through-composition, although it definitely isn't always used that way.
Maybe there's a more precise or specific dimension to AOP that makes this declarative proxy/decorator/middleware/IoC/compositional injection-style stuff not quite true AOP, but it seems to me that AOP has had and probably continues to have influence outside of Java (in general, and maybe AspectJ in particular). We just think of it more like a decorator pattern and less like a wholly new programming paradigm (as it was kinda hyped originally).
I could easily be wrong though.
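For what it's worth, a minimal sketch of that decorator-style flavor of AOP in TypeScript (assumes TS 5 standard decorators; the logging "aspect" wraps the method without touching its body):

```typescript
function logged<This, Args extends unknown[], Ret>(
  target: (this: This, ...args: Args) => Ret,
  context: ClassMethodDecoratorContext<This, (this: This, ...args: Args) => Ret>
) {
  // Replacement method: the cross-cutting concern (logging) is woven around the original.
  return function (this: This, ...args: Args): Ret {
    console.log(`entering ${String(context.name)}`);
    const result = target.apply(this, args);
    console.log(`leaving ${String(context.name)}`);
    return result;
  };
}

class AccountService {
  @logged
  transfer(from: string, to: string, amount: number): string {
    // Only business logic lives here; logging is applied by the decorator.
    return `moved ${amount} from ${from} to ${to}`;
  }
}

console.log(new AccountService().transfer("a", "b", 10));
```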
valbaca@reddit
still exists, just called middleware now
thashepherd@reddit
Spring/EJB, right? I remember that! Back in, oh, '11?
Nasty implementation, that. I always thought it was a neat concept. Reminds me of web server middleware, or maybe IoC/DI (none of which were fads or flashes in the pan at all).
Haven't so much as HEARD of AOP since 2011 tho.
ChortleChat@reddit
do you even lombok?
TangerineSorry8463@reddit
Lombok didn't fall out of favor once Java adopted its features? Huh, maybe my info was wrong
appogiatura@reddit
Java has added nice things like Records but nah Lombok still has a lot of good features. My fav is having the builder constructor annotation
vuwu@reddit
Lombok has a lot of better features than records, and it doesn't require syntax sugar or new versions of libraries. I never understood beans, and I don't understand records. Public final fields work just fine. It's just the tooling that doesn't support it, but the tooling doesn't support records, either.
jasie3k@reddit
I don't know about this one, the whole Spring framework's so-called magic is pretty much built on two things: dynamic Proxies (which can be interpreted as AOP) and reflection.
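Not Spring, but a rough TypeScript sketch of the same dynamic-proxy trick (hypothetical repo interface): every method call on the target gets intercepted, which is roughly how that "magic" weaves in logging or transactions.

```typescript
interface UserRepo {
  save(name: string): string;
}

const realRepo: UserRepo = {
  save: (name) => `saved ${name}`,
};

// Wraps any object in a Proxy that times every method call before delegating to it.
function withTiming<T extends object>(target: T): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      if (typeof value !== "function") return value;
      return (...args: unknown[]) => {
        const start = Date.now();
        try {
          return value.apply(obj, args);
        } finally {
          console.log(`${String(prop)} took ${Date.now() - start}ms`);
        }
      };
    },
  });
}

const repo = withTiming(realRepo);
console.log(repo.save("alice")); // timing "advice" runs around the real call
```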
TangerineSorry8463@reddit
Look man, today is not the day I will be defending Spring and AOP.
metaconcept@reddit
You know how object-oriented programming splits your code up across a code base and makes it hard to follow?
Aspect-oriented programming is that, on steroids.
w08r@reddit
Use it a lot in emacs lisp but it's just called advice
Familiar-Flow7602@reddit
I would not call it a fad.
LloydAtkinson@reddit
It’s come back in again for .NET with source generators.
budding_gardener_1@reddit
Did it not just get reinvented as DDD?
fortunatefaileur@reddit
I had not thought of the phrase “cross cutting concerns” in years until just now.
grumpy_autist@reddit
I don't remember the name but some idiots invented a testing framework with steps described in an HTML table, so product owners (???) with no coding skills could write tests for some reason.
They convinced business people to do that at a really well-known big IT vendor, and fast forward a few years, all developers needed to write tests in HTML because the product owners decided not to touch it.
Imagine writing tests without loop support or any error handling except pass/fail. And of course the integration pipeline ran for 72 hours.
PedanticProgarmer@reddit
The entire idea of BDD is based on a delusion that there exist product owners who can write acceptance criteria. I haven’t seen it much, but in the projects where Gherkin was used it was the developers and developers only who maintained the tests.
ghmcintosh@reddit
Selenese?
koreth@reddit
This was doomed from the start even without the technical issues. IME it’s a rare product owner who has the ability or the mental framework to even think through a good set of concrete test cases, let alone describe them to a computer. Heck, a lot of developers have a hard time coming up with good tests.
WhyIsItGlowing@reddit
Nah, Gherkin syntax is quite good because it kinda forces people to think about acceptance criteria in a structured kind of way that can be mapped to test coverage nicely, in a way that can make the tests fairly composable.
Where it falls down is all the frameworks to map it to test code don't really add much over just structuring the tests like that.
__deeetz__@reddit
Oh man, behavior driven design! It's such a horrible horrible thing. My old shop bought into the fad, and my new one tried to, but I made a hard case for this being absolute nonsense.
It wasn't HTML though, but "Gherkin", the language used by "cucumber", the test runner for this.
EveningFine4974@reddit
We'll agree to disagree on this one. I've found Gherkin to be one of the best ways to communicate test cases with the QA Tester, Engineering Manager, Technical Service Officers and Marketing Assistants. It gets everyone on the same page so there is no ambiguity on what the Hardware / Software Integration does. The Gherkin files become part of the source code and bind to the unit tests + build pipeline. I use it at home on my hobby code and also enjoy it.
WhyIsItGlowing@reddit
Gherkin syntax is good for working out ACs and making tests that map to them, but the likes of Cucumber, Specflow etc. aren't that necessary a lot of the time because it's straightforward to just structure tests like that without using the framework. For people who are in the code it's not really adding anything, and for those who aren't, the level of detail they care about is usually encoded in the test names, so exporting results to something that can be browsed is usually 'good enough'.
The only time I could see it as a big benefit is if you're working in a team with lots of vehemently manual-only QA people.
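A minimal sketch of what "just structure the tests like that" can look like, assuming a Vitest-style test runner: the Given/When/Then lives in the test name and comments rather than in a Cucumber step file.

```typescript
import { test, expect } from "vitest";

test("Given a cart with one item, When the item is removed, Then the cart is empty", () => {
  // Given
  const cart: string[] = ["sku-123"];
  // When
  cart.pop();
  // Then
  expect(cart).toHaveLength(0);
});
```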
BH_Gobuchul@reddit
I actually kinda liked being in a team that used cucumber heavily. It's just code underneath so it doesn't get in the way much, but it does force you to break each test into reusable steps, which was an improvement on the 150 lines of low-information boilerplate that every single non-BDD test tended to turn into
__deeetz__@reddit
I get that and have heard the argument before. But I would argue that is a discipline question more than an intrinsic benefit. The 150 lines could just as easily have been a fixture; there's no fundamental benefit there IMHO.
Venthe@reddit
Fascinating. So far BDD has supported the most robust systems that I've seen yet, so it's definitely contextual.
__deeetz__@reddit
In all fairness: the old shop had the best engineering I've experienced so far. However this particular aspect was for sure not instrumental. It could've been more expansive unit or integration tests, with a code-based DSL instead of the rather complex infrastructure of a client/server setup plus undiscoverable driver code.
Venthe@reddit
Yeah, I'd guess as much. The place where I've seen it work (because, obviously, it hasn't worked universally) was in a product shop that was big on iterations on 'how' to do certain actions, while 'what' stayed relatively the same. A "perfect use case" for BDD if you will.
thashepherd@reddit
Ah! I've heard of Gherkin!
Flash in the pan - if the reqs were so good that they completely described the behavior of the system then they'd just be code, etc - but maybe some positive influence on test case naming?
__deeetz__@reddit
I actually think the descriptions in some sort of semi-formal style are good. It’s the hooking up of code and making it an actual test that was the insane part.
It was basically impossible to know which part of a "Given I have a logged in user X" was fluff and which was regex-extracted parametrization.
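To illustrate the confusion: in Cucumber-style runners a step definition binds the prose to code via a pattern, and only the captured groups are parameters. A hand-rolled sketch of that idea (hypothetical, not Cucumber's actual API):

```typescript
type StepFn = (...args: string[]) => void;
const steps: Array<[RegExp, StepFn]> = [];

function Given(pattern: RegExp, fn: StepFn) {
  steps.push([pattern, fn]);
}

// Only the (\w+) capture is a parameter; the rest of the sentence is fixed "fluff"
// that must match exactly, which is what makes the two hard to tell apart by reading.
Given(/^I have a logged in user (\w+)$/, (userName) => {
  console.log(`creating session for ${userName}`);
});

function runStep(line: string) {
  for (const [pattern, fn] of steps) {
    const match = line.match(pattern);
    if (match) return fn(...match.slice(1));
  }
  throw new Error(`Undefined step: ${line}`);
}

runStep("I have a logged in user X");
```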
sudosussudio@reddit
My favorite BDD “story” I got assigned in Jira was something like “As a website visitor I want to be able to use a well designed website, so I can use the website.” And the expectation was that I’d implement the design on the front end via that single story. I quit not long after that.
jcl274@reddit
Imagine working for a company so far up the BDD butthole that they created their own fork/flavor of gherkin that they maintain. Ask me how I know 🫠
RetroRarity@reddit
Robot?
MagicianSuspicious@reddit
Fit & FitNesse? I see these as more early work in connecting customers and programmers, than as anything that reached fad level.
davimiku@reddit
Was this the application used as the code examples in "Clean Code" (Robert Martin)?
grumpy_autist@reddit
Yeah, something with Fit in the name. I got PTSD flashbacks just seeing this name.
rudiXOR@reddit
Behavior driven development
couchjitsu@reddit
Code generation never really took off to the degree people thought. There is definitely some codegen stuff, but I remember the days of "Just build your UML and the tool will create the code for you" that never quite worked out.
IAmTheBirdDog@reddit
But it wasn't for a lack of trying. The industry was pushing RAD in the 90s.
Sausagemcmuffinhead@reddit
Code gen from openapi specifications is useful though. We generate SDKs directly from our api spec. There are some different choices I might make if we rolled our SDKs by hand but being able to ship SDKs in multiple languages with very little work is a huge win.
gyroda@reddit
In general, code generated from other code is pretty successful. And I'm including an OpenApi spec as code.
Code generation to avoid needing to write any code is where things fall flat.
dethswatch@reddit
Uml: I worked with some tools that'd round-trip to code and back.
The problem is just that if you were good enough to code every day in that language, banging out the code to define the objects is actually pretty quick, and doing that in a diagram always took longer.
And even as an analysis tool, seeing a diagram of the objects wasn't very useful - in any codebase, you're not using hundreds of objects, you're probably dealing with fewer than 10 new-to-you custom objects at a time, and the IDEs are good at navigating them.
a_library_socialist@reddit
Most people don't understand that coding is the fun and easy part of the job. It's deciding what to code and how to prepare for future coding you don't yet know about that's hard.
bugkiller59@reddit
Testing
ScientificBeastMode@reddit
So true. It’s amazing how fast you can bang out a functional application from scratch if you just have all the requirements handed to you and you don’t have to worry about maintenance or the future evolution of the product.
That’s why prototyping is actually pretty effective, but you lose all that effectiveness down the road if you don’t completely scrap the prototype and do it the right way with all the things you learned while building the prototype.
a_library_socialist@reddit
Oh god, the amount of PoC's I've been ordered to put into production in my career is mindblowing . . .
And of course we won't take 2 weeks to clean it up - it's MUCH more efficient to spend 3 months after release trying to fix those problems in place!
ScientificBeastMode@reddit
Yeah, I have sometimes refused to use prototypes in production, but you have to build up a lot of social capital with your superiors to make that kind of call.
Andrew_the_giant@reddit
1000%
Abadabadon@reddit
I've found them useful when I communicate our design / steps to a stakeholder.
They know when the api "put user in person db" is invoked, it puts the user in the person db. So if a new stakeholder joins the team or an existing person needs a refresher, visually it's easy for them to remember the steps.
dethswatch@reddit
Right, that's a good role.
I've taken to showing the RESTful API routes (for backends, because that's what I mainly do) and that makes things pretty clear - the devs can typically go and use the IDE to figure out which objects are being used from there.
pmirallesr@reddit
Yeah I'm using IBM Rhapsody rn, which does roundtripping. And whenever I need to diagnose an issue or learn about something, I use the IDE. It's just easier. So in the end, it does feel like the diagrams end up not fulfilling much of a role.
To be fair we only use them for structural design, the dynamics of the software are hand-coded, so the diagrams contain only a small part of the useful info.
I'd say the most useful part of codegen so far for us has been to keep patterns consistent and avoid pitfalls in C.
In fact the only diagrams that we regularly use to actually communicate with each other are draw.io diagrams made expressly to describe specific situations
dethswatch@reddit
you know what's been the most useful to me overall has been ER diagrams or -anything- that tells me how to join A to E, because while I can make a lot of assumptions, it's hard to guarantee I'm right sometimes.
ERwin (is it still alive?) was OK at best and cost so much that only a DBA or two ever had it.
couchjitsu@reddit
Yep, I agree.
We had Rational Rose at one point and that was more helpful for generating the diagrams from the code, and you could take those into design meetings. But going the other way was never used.
But honestly, even then (circa 2001), it was best run overnight because on the single-core Pentiums it took quite a while to generate UML from our codebase.
Kiylyou@reddit
I still use this program. It is 2024.
handsoapdispenser@reddit
In the early 2000s we did this thing where you would define your domain model in an XSD and then a build step would generate all the Java beans. That also made it dead simple to serialize and deserialize objects over the wire with type checking built in.
yourfriendlyisp@reddit
Open api? Protoc gen?
SideburnsOfDoom@reddit
Mostly I generate OpenAPI off my code, not vice versa.
And even if it was the other way around, that would be generating an API client, not the services that do the work behind the HTTP endpoints. UML promised to generate those, and now UML is nowhere.
StTheo@reddit
The problem I found with that was that it was much easier to hide bugs in the generated spec.
A couple of examples my team encountered were one spec that omitted pagination parameters, and another spec that didn't use the correct types for id (number instead of number/string union, they were migrating ids).
So in both examples, the backend worked fine, but the frontend couldn’t use the generated code without maintaining their own “corrected” spec (fixing the backend code was too much hassle when the teams maintaining it were overworked). When both frontend and backend had to generate their code, that particular issue went away.
kingmotley@reddit
Same here. I generally would never use the generated client, but when working on a team with people of various skill levels, yeah... I just generate a client and hand it off to the team cause I don't want to have to explain how to deserialize a X into Y correctly including handling escaping, encoding, and codepages correctly and do it async AGAIN.
yourfriendlyisp@reddit
Good point, I do the same, then generate client SDKs with the generated OpenApi
Viedt@reddit
I use OpenAPI to generate code in most everything I do. Write that first and then the server and client can start working from the same spec on day one. The issue with people that do it code first is they are missing that point. The code doesn't need to be written on the server before the client can start their work. Then the code gen does the controller and all the models, and you can focus on business logic.
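As a rough illustration, the generated artifacts for a hypothetical `GET /users/{id}` operation might look something like this (names are illustrative, not from any particular generator):

```typescript
// Model generated from the spec's schema; both teams code against it from day one.
export interface User {
  id: string;
  email: string;
}

// Generated client method; the server side gets a matching controller/handler stub.
export class UsersApi {
  constructor(private readonly baseUrl: string) {}

  async getUserById(id: string): Promise<User> {
    const res = await fetch(`${this.baseUrl}/users/${encodeURIComponent(id)}`);
    if (!res.ok) throw new Error(`GET /users/{id} failed with status ${res.status}`);
    return (await res.json()) as User;
  }
}
```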
GrumpyDog114@reddit
If code generation is the most effective solution for a set of applications in a given language, the choice of language may be the real problem. That's a big hint that the language lacks the ability to abstract what the code generator is doing for you.
thetoad666@reddit
I did my master's dissertation on generating a web API with SOLID principles from a UML diagram and highlighted 2 problems. First, you must write the code that needs to be generated; second, some things are near impossible to represent in UML without using its version of pseudocode, which is more difficult than actual code! This was in 2020, and I had 20 years of commercial experience before that.
Best_Fish_2941@reddit
What is codegen?
RaccoonDoor@reddit
My company uses loads of codegen. We have in-house software that takes complex data structures defined in XML files and spits out entire Java packages to store the data. All we have to do is define the template in XML format.
ScientificBeastMode@reddit
I worked at a company that did something similar for TypeScript. They would write their backend code in Haskell, and they had a tool that could transform their Haskell API layer data types into TS types and generate all the functions for making requests, validating responses, and mapping them to our redux state. It was one of the coolest dev tools that I’ve ever seen!
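A sketch of the kind of output such a tool might emit (all names hypothetical): a type mirrored from the backend, a runtime check, and a typed request function.

```typescript
// Type mirrored from the backend's Haskell data type.
export interface Invoice {
  id: string;
  totalCents: number;
}

// Generated runtime validator so a drifting backend fails loudly instead of silently.
function isInvoice(x: unknown): x is Invoice {
  const v = x as Invoice;
  return typeof v?.id === "string" && typeof v?.totalCents === "number";
}

// Generated request function the frontend calls instead of hand-writing fetch logic.
export async function fetchInvoice(id: string): Promise<Invoice> {
  const res = await fetch(`/api/invoices/${encodeURIComponent(id)}`);
  const body: unknown = await res.json();
  if (!isInvoice(body)) throw new Error("response does not match the Invoice type");
  return body;
}
```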
ClittoryHinton@reddit
I think humans have come to terms with the fact that text is actually the easiest way to describe precise logic and procedures (aka programming languages).
GUIs are fine for well defined problem spaces where you just need a few plugins. Hence why they still see use in web design with limited interactivity or cookie cutter data pipelines. But introduce any requirements/logic complexity and those tools just become a huge barrier.
ScientificBeastMode@reddit
So true. Visual diagrams can definitely help with understanding the high level structure of a system, but if you actually dive into the fine details with a diagram, you’re in for a lot of pain.
Imagine a microservice system where events and DB writes/reads form a sort of informal state machine that handles complicated business logic… That is an extremely difficult thing to perfectly diagram. That’s not a thing you can easily scale. At some point you just need to express it in code.
mizzerem@reddit
At least two of the FAANG companies were very successful with this pattern… not sure how you can say it didn’t take off.
dashdanw@reddit
It works pretty well with OpenAPI 3.0, I’m implementing and leveraging generated clients for my current company and we even have third party developers that have their own code generation tools that they feed our schema into. It’s extremely satisfying.
ub3rh4x0rz@reddit
No. grpc, openapi, and sqlc are counterexamples that immediately come to mind, and the first two are wildly popular
Independent-Disk-390@reddit
Yeah that was kind of hopeful at best. Regarding that I think a lot of people misunderstood what NLG is.
Potpourrri@reddit
You sure about this? UML and T4 never took off but it's heavily used in things like grpc
ExpertIAmNot@reddit
There is a ton of codegen out there still it’s just not always for plain data modeling. Just in AWS land alone their Smithy project is codegen for all their SDKs. Projen is codegen for project configuration files. CDK is codegen for CloudFormation code.
dhemantech@reddit
IDE’a generating the basic variable declaration and getter/setters !!
Resident-Trouble-574@reddit
Now it's "Just build your prompt and the tool will create the code for you"
dangling-putter@reddit
Hasn't it sort of moved to planning and jitting (esp in ML applications these days)?
Personally, I think this is one of the coolest projects around:
https://www.fftw.org/
Justneedtacos@reddit
Inheritance centered OO
bluetista1988@reddit
Base classes are the root of all evil
UsualCardiologist403@reddit
But DRY! 😉
Pleasant-Database970@reddit
You can be DRY without inheritance
ITAdvance@reddit
~~Inheritance centered~~ OO
vuwu@reddit
OO in general, honestly.
thashepherd@reddit
Nah, you can still make a few hundred LoC go "poof" when you refactor long procedural methods into object behaviors.
The fad was just taking it too far and turning object graphs into Biblical genealogies. And of course the whole Java AbstractFactoryManagerAdaptorService thing.
Venthe@reddit
I'd rather see this in reverse. There is a lot of AbstractFactoryManagerAdaptorServices, because there is a lot of enterprise code. The creation of AbstractFactoryManagerAdaptorService is valid in principle, but in turn a lot of enterprise code is maintained by less experienced devs, so this leads to bloat and a complete misunderstanding of where to apply certain patterns and - crucially - where not to.
thashepherd@reddit
Very fair. It's difficult for junior devs to understand and work within an environment like that without guidance and someone explicitly explaining how it's supposed to work to them up front.
Equivalent_Lie_2978@reddit
Can you give an example? I’ve never seen object behaviours simplify code in terms of the code itself. The only thing I’ve seen OOP simplify is the bridge to the domain problem in an abstract sense.
thashepherd@reddit
Sure. My specific example is damn-near-procedural code in the service layer of a backend API that factored out into tiny, testable methods on Pydantic models such that the service layer methods basically turned into one-liners. Think validation type stuff.
100% agree with your edit WRT assigning LoC to complexity btw. 3 maintainable lines are better than 1 complex line, etc. This was a clear case where the ancient ones were working too hard when there was an OO tool that provided a clearer, cleaner solution.
I also want to be clear that I'm not talking about massive object dependency graphs here. Just....cutting an object or 4 and giving them props and behaviors.
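Not the poster's code, but a small TypeScript analogue of that kind of refactor: validation that used to live as procedural checks inside a service function becomes tiny, testable methods on the model, and the service method collapses to roughly a one-liner.

```typescript
class DateRange {
  constructor(public readonly start: Date, public readonly end: Date) {}

  // Small, independently testable behaviors on the model itself.
  isValid(): boolean {
    return this.start.getTime() <= this.end.getTime();
  }

  overlaps(other: DateRange): boolean {
    return this.start < other.end && other.start < this.end;
  }
}

// Service-layer method: all the procedural checks moved onto DateRange.
function canBook(requested: DateRange, existing: DateRange[]): boolean {
  return requested.isValid() && existing.every((booked) => !requested.overlaps(booked));
}
```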
vuwu@reddit
That's true. I think the way we do object oriented programming is completely different than what the original designers of things like Smalltalk intended. If we were really following that, things might look a lot different.
UsualCardiologist403@reddit
I agree, but in the same breath game engine devs would disagree.
It's all about the right tool for the job.
reboog711@reddit
I'd argue, from an academic perspective, if you aren't doing inheritance, you're not really creating an OO architecture.
The bulk of what we call "OO" today is some conglomerate of concepts from OO, Imperative, and Functional approaches.
Siduron@reddit
I do prefer composition but inheriting can be a good choice sometimes.
Venthe@reddit
The issue is, most developers never learned what "sometimes" means.
I've used inheritance maybe a dozen times over the past 5 years; everything else is far better off with composition.
EconomixTwist@reddit
Underrated answer
a-priori@reddit
Maybe it's because this is when I came of age as a developer, but the 1990s and early 2000s were a time of a lot of fads that went nowhere:
DCOM / CORBA: The idea was that you'd build distributed systems where code would be able to call functions across a network as easily as it does within the same process.
It was abandoned as an idea when people realized you can't hide the fact that you're working in a distributed system. It was replaced with protocols like gRPC, which explicitly models the network and its failure modes.
XML everywhere: The idea was that XML was a general purpose data format for representing structured documents. It became the de facto standard way of representing data, so if you were going to persist data, then it was probably going to done in XML. If you needed to exchange data between systems, it was going to be done by exchanging XML documents, perhaps via XML-RPC. If you needed to transform data, you'd do it with XSLT — an XML document describing how to transform an XML document into another XML document. These documents needed a specification to describe their structure, so you'd use an XSD (XML Schema Definition), an XML document describing the structure of another XML document. And so on. Everything was XML documents.
From this movement we got XHTML, which was an attempt to turn HTML into an XML format. It fizzled and died because it was finicky and offered no real advantages over HTML. Some things survived: Microsoft switched its Office document formats to an XML-based format in this era (the "X" in the ".docx" file extension stands for XML). We also got abominations like SAML which is still used for authenticating between systems, but has mostly been replaced with OpenID.
These days XML still exists, but is no longer the de facto standard data format and has mostly been replaced with JSON and YAML. There is similar tooling for validating documents against schema definitions for these formats, but it's not nearly as common because people realized this had limited utility.
Development methodologies: This was the heyday where various interrelated methodologies started, such as "extreme programming" (1999), "agile" (2001), "scrum" (2002) and "rational unified process" (2003). These movements attempted to create a proper way of doing things, with planning processes and roles.
These movements still exist today, and many of the techniques they created have soaked into the processes used at many companies, such as regular standups and "sprints". These methodologies, especially extreme programming, also popularized the use of automated tests and code review, which were important developments for the field. But the evangelism around big, haughty "methodologies" mostly died out by the late 2000s.
andy_nony_mouse@reddit
I recently had to parse an XML document and was surprised to find that XPath parsers are dying. Those that exist are old and have security issues. Finding one was difficult.
JadeBorealis@reddit
I really wish agile and scrum would just die and disappear
a-priori@reddit
That’s fair. If it makes you feel any better, no one really, actually does Scrum or Agile, they just use the words and pick and choose a few techniques. So in a sense they have already mostly died and disappeared.
JadeBorealis@reddit
No one does "true" scrum or agile - that's why I want them dead. :)
being subjected to "scrum" psychological warfare from penny pushing heartless business douches made me have an undying hatred for the whole thing.
Live_To_Run@reddit
Yeah … all people seem to be worried about is whether a story will "roll".
davidblacksheep@reddit
Disagree with you on development methodologies. I think agile, etc, has been successful for the industry, in dividing and conquering developers and being able to micro-manage them.
varuntinkle@reddit
Isn't gRPC and service architecture the DCOM/CORBA model? Or do you mean DCOM/CORBA + failure handling is what was needed for gRPC?
a-priori@reddit
It’s the evolution of the idea. But the key difference is that gRPC doesn’t try to hide that it’s a remote operation. It makes it clear in its ergonomics that there’s network latency and the potential for failures, and it becomes the application‘s problem to handle these. With CORBA and DCOM it was supposed to be the framework’s problem to paper over those details.
That said this may not have been the real reason these failed. See the other thread with a postmortem on CORBA that gives an alternative explanation.
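A sketch of what "not hiding the network" means in practice (hypothetical quote service and URL, plain fetch rather than actual gRPC): the caller has to choose a deadline and handle unavailability explicitly, instead of the stub pretending the call is local.

```typescript
type RemoteResult<T> =
  | { kind: "ok"; value: T }
  | { kind: "unavailable"; detail: string }
  | { kind: "deadline-exceeded"; detail: string };

async function getQuote(symbol: string, deadlineMs: number): Promise<RemoteResult<number>> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), deadlineMs);
  try {
    const res = await fetch(`https://quotes.example/api/${symbol}`, { signal: controller.signal });
    if (!res.ok) return { kind: "unavailable", detail: `status ${res.status}` };
    return { kind: "ok", value: Number(await res.text()) };
  } catch (err) {
    // The network failure mode is part of the return type, not an invisible exception.
    return controller.signal.aborted
      ? { kind: "deadline-exceeded", detail: `no reply within ${deadlineMs}ms` }
      : { kind: "unavailable", detail: String(err) };
  } finally {
    clearTimeout(timer);
  }
}
```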
comrade-quinn@reddit
I remember these days well; especially DCOM, and I agree with your take.
While the idea was interesting, I felt even at the time it was doomed to failure. A network operation simply cannot support the same failure abstractions that represent a local operation failure, and no amount of trying to stuff the round peg in the square hole was going to change that.
The other issue with it was that you often define network operation flows differently to local ones.
For example, local operations can be chatty, calling back and forth between themselves to compose some data structure or other, they can also share memory between them in various ways.
Network operations, however, are better implemented as document-based operations where as much data as possible is collated up front and as few calls as possible are then made over the network to transmit it. In response a new 'document' is returned, perhaps, and then local data is updated with it.
Essentially, attempting to act as though the locality of constituent operations in a system made no difference to how they were implemented was untenable for many systems.
TraceyRobn@reddit
Yes. There was perhaps a feeling in the late 1990's that software development should become more formal, as this would make it better.
Software would become an engineering field with rules, not just designing and hacking code together.
This failed. It failed for many reasons, the top ones in my opinion were that these early attempts at formalism were mostly BS, and that it is more expensive and slower to develop software this way.
Quality costs more and most users don't care.
a-priori@reddit
I have a different take there.
The thing we realized is that engineers, especially civil engineers, don't do heavy processes because that's how you get quality. They do it because the artifacts of engineering work are usually hard to change and the cost of errors is high. On the other hand, in most kinds of software development, the artifacts that software developers make are relatively easy to change, and the cost of errors is low. So it doesn't make sense to treat them the same way.
That's not always the case though. You can build software the 'engineering way', and some fields do operate that way: I spent some time early in my career working as an embedded software developer on avionics systems, and that work is very process-heavy (look up "DO-178c" sometime if you want some light bedtime reading). But you make some severe trade-offs by operating that way. It's not strictly better, and the rest of the software industry is not lazy or undisciplined by not following these sorts of standards. These fields are the exception, and for almost every other case, trying to make software development 'more like engineering' is just cargo culting.
That's why you see efforts more recently in software development to lean into those differences: to make it even easier to change through processes like continuous delivery, to reduce the cost of errors through better release management, and to give developers greater insight into their systems through deep observability. That's where the software industry has gone more recently, and I think that's a more promising way of achieving the goal of professionalizing software development.
TraceyRobn@reddit
I agree with your take that most software failures are not as deadly as a bridge falling down. Oddly enough I also worked in avionics and Ada.
I generally don't think that the quality of software has improved much in the past decade. The fact that everyone is connected to the Internet and software can be patched, means that companies do little QA and patch bugs on the fly. Windows and games are prime examples.
Equivalent_Lie_2978@reddit
DCOM / CORBA, not really. BEAM does a fantastic job in this domain and has been growing a lot in the past few years, while always having occupied a big part of distributed computing
Aside from BEAM, we also have lambda functions.
The big issue with DCOM was always state management. BEAM handles this really well with the genserver/ agent etc architecture
funkdefied@reddit
DCOM/CORBA sounds a lot like some of the BEAM concurrency primitives (eg Nodes). The BEAM is making a comeback with Gleam. Hopefully it gathers more steam.
Equivalent_Lie_2978@reddit
Elixir is the front-runner, with Gleam still very much in early development.
BEAM works a lot more like an OS than a runtime.
montdidier@reddit
While I don’t dispute what you have said about the failure of CORBA is a widely held view I think the failure was more nuanced in reality. After reading this account on the rise and fall of CORBA it resonates more with me than the more apocryphal view.
a-priori@reddit
That’s really interesting, thanks for the correction!
Of the three I listed here, the distributed object one is the one I’m least familiar with. That’s why it’s the shortest one.
I knew it was complex from my experimentation with CORBA brokers, and then it kinda disappeared as a thing. The explanation about it being a problem of error propagation was something I heard later.
robhanz@reddit
This idea works fine, so long as you start with the semantics of distributed operations in the first place.
A remote operation done locally is just fast. A local operation done remotely is a bug farm.
Background-Rub-3017@reddit
No-code
andy_nony_mouse@reddit
Oh yeah. I have been hearing about how it's going to eliminate programmers since 1991. And I'm sure attempts go back further than that.
No_Technician7058@reddit
python is low code
Background-Rub-3017@reddit
Relevance?
No_Technician7058@reddit
python is very popular and successful and aged very well
BobbyThrowaway6969@reddit
Calling Python low level is like saying the ISS is a few feet off the ground.
calnamu@reddit
Do you know what low code means?
brianly@reddit
No-code is a perennial. More money has been made failing to make this successful than had it been successful.
Around the edges, things like Zapier or Power Automate are relatively successful but the blast radius is reduced. Tools that attempt code generation and the like fail most of the time.
bcameron1231@reddit
I don't think any no-code systems exist. Zapier and Power Automate, I tend to classify as "low-code"
No_Technician7058@reddit
anything written in python
ParamedicIcy2595@reddit
What a dumb thing to say.
No_Technician7058@reddit
python is a low code tool
bcameron1231@reddit
It's neither "low-code" or a "tool"
No_Technician7058@reddit
its low code compared to c
Polite_Jello_377@reddit
Found the junior dev 😄
No_Technician7058@reddit
found the python dev
bcameron1231@reddit
That's not what the term "low code" means.
Background-Rub-3017@reddit
I didn't say anything about low-code. Jeez
No_Technician7058@reddit
oh yeah
xku6@reddit
What a take. Python can be pretty succinct but it's literally nothing but code.
SignificanceNo3189@reddit
Yeah, Netflix, Uber and Reddit are surely low-code... (sarcasm)
No_Technician7058@reddit
if they are using python then those aspects of it are low code
tonnynerd@reddit
Technically, there's at least 1: https://github.com/kelseyhightower/nocode
SlowMotionPanic@reddit
God yes. I've unfortunately been brought in to fix problems caused by no/low code many times because they just can't scale. And, past the most basic of CRUD apps, it tends to take someone with an actual CS background to know how to set things up and optimize.
Low code is at least a decent place for people to cut their teeth and see immediate results before diving deeper into development I guess.
Salesforce is another one, albeit with certs (not unlike what MS offers, although less ephemeral since MS doesn't know wtf it wants to do with them), but unlike things like power platform, it lets you dig both wide and deep if you want/have the access.
kauthonk@reddit
Power automate is buggy as all hell
garciawork@reddit
Pega LMAO. That is how I got my start originally. Only lasted a few months.
itsthekumar@reddit
I almost got into Pega, but didn't get through the interviews.
dpgraham4401@reddit
I was just at a company internal developers conference where no-code was being touted as the future lol
SpaceBreaker@reddit
Going through this now. It's still a thing for dumbass management to remove liability from them.
posisam@reddit
There are plenty of successful low/no-code and visual programming solutions for specific use cases. Particularly in areas where you have semi-technical people with a high level of domain knowledge.
who_ate_my_motorbike@reddit
Absolutely!
Orange data mining / weka / knime
Labview
Flowise / rivet / langflow
Powerflow
dsaint@reddit
This goes back to at least the 90s when it was called Computer-Aided Software Engineering (CASE) Tools.
dashingThroughSnow12@reddit
That one comes up big once or twice a decade.
One of the reasons I am not particularly afraid of “AI” taking my job is that to me it feels like three no-codes in a trenchcoat.
bladdersux@reddit
Vincent adultman !
SerFuxAIot@reddit
Unreal made it work though
dipstickchojin@reddit
The whole thing will be obsolete, I bet, once LLM-based UI agents mature to a point where certain classes of apps become obsolete in favour of simply synthesising and calibrating them for your use case with NL commands.
BringBackManaPots@reddit
We've actually used it to some decent effect for smart buildings
BasicAssWebDev@reddit
Flutter can eat my shorts.
HashMapsData2Value@reddit
How is Flutter no-code?
NootScamanderrr@reddit
He might mean Flutter Flow. Which can eat my shorts as well.
When migrating one of our old applications to Flutter, my dev manager bragged that he “rewrote the whole app” and just needed us to get it to the finish line. He had only done some drag and drop and color picking within Flutter flow. No api calls, local storage, authentication, etc 💀 the generated code was not very SOLID
HashMapsData2Value@reddit
Oh wow
pemungkah@reddit
SOAP.
jimsmisc@reddit
definitely on my list of "least favorite technologies I've ever worked with"
CallNResponse@reddit
I want to thank everyone who contributed to this subthread re SOAP and EJB and CORBA and so forth, because I had to work with that stuff for years and it always seemed “off” in some way, but I was buried deeply in a mushroom farm where everyone thought SOAP and EJB etc were just super-duper keen! It does my soul good to learn that yes, it wasn’t just me, and I’d been gaslighted.
Material-Resource-19@reddit
I was just thinking that when I saw XML above. SOAP, WSDL, XML - that first big iteration of web-enabled services/API was just so heavy.
pemungkah@reddit
Oh Lord, I'd forgotten WSDL. Thanks for reactivating that trauma.
Fuzzy_Garry@reddit
My company still uses it.
philosphercricketer@reddit
Visdul, I'm coming back
DogmaSychroniser@reddit
I wish I could. I recently had to go set one up for a WCF service and it was a pain.
MidnightPale3220@reddit
Nothing wrong with accessing a SOAP service and parsing XML. Granted, it may feel bloated, but that depends on the use case. It's still used a lot in various industries. A complex document is a typical example.
It doesn't really matter to me which kind I parse. At least most XML these days comes with an XSD. Sadly, in my experience, a JSON schema isn't present that often.
wvenable@reddit
XML-RPC came before SOAP and was the inspiration for it and it was basically like JSON but XML.
Then it got "enterprised" and became SOAP. People are trying to do the same thing with JSON but luckily it doesn't seem to be catching on.
CpnStumpy@reddit
gRPC is the modern iteration, this is all perennial
jenkinsleroi@reddit
You forgot CORBA, which came even before XML existed.
thashepherd@reddit
EJB still gives me flashbacks
dashingThroughSnow12@reddit
People who prefer YAML over JSON because of JSON’s “bloat” never passed around gigantic XML files.
Wulfbak@reddit
It was kind of a stopgap technology in the early to mid 2000s that gave way once more robust technologies came along.
robhanz@reddit
XML is just such a misuse of technology. Using a document markup format as a data markup format is just.... full of impedance mismatch.
__deeetz__@reddit
"The S stands for simple"
One of the greatest pieces of internet tech writing ever.
CpnStumpy@reddit
Please find me the original link for this. I've tried a number of times and failed. It's so spectacular
__deeetz__@reddit
http://harmful.cat-v.org/software/xml/soap/simple
Empanatacion@reddit
They did mean "Protocol for Accessing Simple Objects" but I guess they liked the backronym "SOAP" better than "PASO"
appogiatura@reddit
electronic language protocol for accessing simple objects
EL PASO
Interesting_Long2029@reddit
~for~ para
SSA22_HCM1@reddit
Stick an "Object-Limited Dynamic" in front of it and it's a fiesta.
dacracot@reddit
Stupid Obnoxious inArticulate Product
Resident-Trouble-574@reddit
It was simple at the time. It's not their fault that today's programmers are snowflakes that cannot handle it.
__deeetz__@reddit
It was never simple. I suggest you read the essay I referred to, to learn about the absolute cluster fuck of a specification and broad tent of incompatibility this one was. The only thing that was simple was being in the Microsoft ecosystem with its inane code generation tools that hid this atrocity from you, as it had hidden COM and DCOM before. The latter nominally a cross-platform standard, too, by the way.
FastidiousFaster@reddit
I can unfortunately attest to the fact that SOAP is still very much around. Yes there's less new stuff being made with it but it is nowhere near gone.
ColossusAI@reddit
Salesforce has entered the chat…
jrib27@reddit
And Syteline. Ironically, their SOAP API is significantly more performant than their REST API.
ChortleChat@reddit
SOAP is still around in other incarnations
lost12487@reddit
It's still around in its original incarnation. Thanks Workday!
Text_Original@reddit
Thanks SAP!
CpnStumpy@reddit
Disagree about this being a fad, SOAP was huge for years, effectively creating services all over the place that have lived on
It was a pain in the ass, sure, but not a fad. ActiveX was a fad, it failed at its intent and died without anyone ever succeeding in its use broadly. SOAP is guaranteed to still be running successfully in tons of financial backends and countless enterprisey places
Fidodo@reddit
Woah, I totally forgot about that
Fluid_Cod_1781@reddit
every time you open a word document from onedrive/teams/sharepoint you are using soap
ChiefNonsenseOfficer@reddit
CORBA!
dipstickchojin@reddit
Omg corba in fact
khooke@reddit
I’d say CORBA was far more of a fad that didn’t make it prime time compared to SOAP. It overpromised but was far too complex to be easily adopted. I remember doing some proof of concepts with Iona’s Black and White server (if I remember right), but it was hard going to get anything working and the company I was working for at the time rightly decided to abandon that path, and later adopted EJBs instead, which was odd in the early days since there was a lot of similarities until EJB was simplified with 3.x.
SOAP APIs for their time were heavy but solid and reliable, definitely not a fad, they were widely used on plenty of large enterprise systems, especially ones which integrated with other systems - later replaced with simpler alternatives but a fad, I’d say no.
pemungkah@reddit
AAAAGH!
xaervagon@reddit
I'll admit I was one of the three people that actually liked SOAP, but I get why a lot of people disliked it. There was something about being able to get a few descriptor files and pushbutton a full, working library implementation that I really liked. The obvious downside was that dealing with API updates or changes pretty much required blowing up the existing implementation (but given it was pushbutton, who really cares?). Today you need Swagger codegen and a bunch of other crap to get half the convenience, but you do get a lot more control over the implementation details.
dietervdw@reddit
grpc and protobuf FTW
xaervagon@reddit
Google did a great job on both projects and together they're peanut butter and jelly.
flanger001@reddit
I wrote SOAP stuff last week… 😭
Herrowgayboi@reddit
THERE IT IS...
Got triggered even just seeing this.
northrupthebandgeek@reddit
Unfortunately it's still alive and well in the enterprise world.
lIIllIIlllIIllIIl@reddit
The whole ecosystem of distributed system is full of fads: CORBA, SOAP, Java RMI...
How hard is it to use TCP sockets or HTTP?
abe_mussa@reddit
I wish this was something that I’d forgotten about
Third party vendor made us a new API as part of a contract we had with them. And then we get this outsourced SOAP mess
AnonymousUser1000@reddit
Working on legacy soap stuff here and there.
SOAP was originally called XML-RPC, that has helped me cope (at least to some degree!)
LloydAtkinson@reddit
The last couple of weeks I've been working with WCF - but its web model. Imagine if someone decided to do the opposite of REST or any normal conventions.
I absolutely loathe this project.
prolemango@reddit
I spent the last two years building a SOAP server. It was very soapy
NGTTwo@reddit
Did you have to bend over?
another_newAccount_@reddit
Thank the fucking Lord
ConstantinopleFett@reddit
Not really a field, but HATEOAS. It never really worked in the first place as far as I can tell. Some people seemed to think it was brilliant but it was unusable and went nowhere.
OneHumanBill@reddit
Onshore development (US).
Araganor@reddit
Just my perspective, but when I was first getting into CS I saw a lot of tech blogs posting about how much SQL sucks, and that MongoDB and its equivalents were the future and everyone needs to learn it to stay relevant.
Now, that's not to say that NoSQL DBs never had any use cases (though I personally dislike the term because it has all the descriptiveness of saying "not red paint"). But, ultimately it turns out there are many things that SQL is just better at, especially at scale. So I don't see it pushed nearly as hard anymore.
rodw@reddit
What domain are you working in where the pendulum has swung away from NoSQL? I feel like we're still stuck in the opposite of OP's scenario with NoSQL solutions: as you noted NoSQL has its place, but it still seems to me that as an industry we're enamoured with NoSQL (and anti-relational hype) and end up recreating a half-assed RDBMS solution using tools like MongoDB where PostgreSQL would have been easier, faster and better-suited at literally every stage of development. SQL databases are crazy mature and the set-theoretic data and query model they are based on is extraordinarily powerful and widely applicable. I'd wager 7 times out of 10 a modern RDBMS would be a better option where NoSQL is used in new development today.
Unsounded@reddit
For a lot of teams you don't actually need the benefits you get from something like Postgres; solutions like Mongo/Dynamo are quick, straightforward, and don't have a lot of the pitfalls of relational databases like pesky maintenance cycles, highly variable reads, and queries that need to be optimized. Most systems don't actually need consistent reads or one place with highly relational data; you can use microservices and stitch the information together with caching and web calls, and have some of the work you'd have a query do done via code you maintain.
For larger companies that's a major benefit because you have full control, and it's easier to scale operations-wise because it offers a more predictable workload on your systems and takes ambiguity and more arcane knowledge out of the path. I agree with your stance on most use cases being solved with a modern RDBMS, but most places don't need to use the most optimal tool. Sometimes it's better to use the more mediocre hammer and nail but stick with it because it's simple and does the job they need to do. Other tools do exist if required, but they can become second choice for teams trying to move fast.
Fudouri@reddit
I must be old when the nosql solution is considered the simple version.
Araganor@reddit
I work in Fintech/online banking, so admittedly not exactly bleeding edge on adoption of new tech. Not one of the big banks you've heard of, but I would say we are still ahead of the curve compared to a lot of competitors.
We store almost all of our data through Microsoft SQL Server, but we're trying to migrate to Postgres as the licensing fees are eating heavily into our profits.
That being said, we actually have recently implemented DynamoDB as a resiliency layer for durable caching. But we still rely primarily on SQL for pretty much all user/account data (at least that I'm aware of). So it's not like we don't use them at all, just that our default would be to reach for relational and only consider other options for specific use cases where it makes more sense.
rdem341@reddit
SQL DBs have also added better support, such as JSON columns.
People also realized these NoSQL options are expensive.
grulepper@reddit
SQL DBs can also be expensive lol, really comes down to the implementation and/or what your providers are offering you
Odd_Soil_8998@reddit
Yeah, as it turns out bolting on hierarchical semi-structured data support wasn't really all that hard
a_library_socialist@reddit
The main attraction of NoSQL to lots of people was it bypassed existing DBA power structures.
danishjuggler21@reddit
The main attraction of NoSQL was that it’s web scale, unlike MySQL: https://youtu.be/b2F-DItXtZs?si=EijFuiVZDs-rjb-U
a_library_socialist@reddit
A classic!
AModeratelyFunnyGuy@reddit
This is a super interesting perspective I hadn't heard before. Just curious, do you have any links which expand on this idea. (Totally fine if this is just an anecdote based on your experience haha)
trying-to-contribute@reddit
You might want to venture into an oracle shop where the development environment is still kinda waterfall.
An archaic DBA-centric development environment is often toxic. A lot of shops (then) wrote/edited stored procedures/functions on the fly and deployed them in a change window with little revision control. The DBA world came to agile very, very late, much later than the rest of the operations side of things, when most of system and network admin used devops methodologies, even when infrastructure living on prem was the only option. Often, DBA departments had their way of doing things and they were not interested in mingling with the rest of the tech department practices.
This in turn leads to the scenario where, in order to do anything in the DB environments, especially if much of the computation was done in the DB (i.e. Oracle land), every ask is minimally a ticket; DB teams had their own product schedule and their releases were not always in sync with development efforts in other teams. A lot of DBA <=> Development interfacing happened between product management meetings. It was direly painful sometimes when database teams would fight the rest of development over features.
Things like: "This function is too slow. Can you look into rewriting it?" vs "Are you sure you are writing your queries properly?" or "this overall database is really slow, what's going on?" vs "we are running a large batch process and it occurs every night at this time." to "we have no insight or say into how to shape the etl process and we'd like some because we'd like to control how the data is shaped" vs "No, it is our unspoken policy to put everything in 1st normal form if we can help it. It's good database practice and we don't trust run of the mill developers to know anything about proper data management!"
So, bad DBAs, like bad sysadmins in the past, have fiefdoms as well. And fiefdoms are an impediment to agile and the business process. As more shops had the development team own the database processes and didn't do computation in them (e.g. mysql centric places), especially in a time when this was also the rise of MVC coding methodologies so schema design and changes were dictated by the code, the less this fiefdom was tolerated.
HowTheStoryEnds@reddit
Tell me about it.
I basically work on 2 different pseudo-legacy 'ETL' processes that are nothing more than transforming an existing badly normalized db-structure to an even more denormalized one (with 70+ column tables), because we can't control the DB nor the underlying structure, nor are we allowed to change it to something that's more in support of the actual OLAP functionality desired, and it runs on underpowered hardware. (And this underlying db engine is not cheap)
It's supercalifrasqlistic-superdbdocious Even though its application is quite atrocious. If you ignore the devs long enough you'll always look precocious.
trying-to-contribute@reddit
That really sounds painful. Sorry to hear. Hope you don't have to revisit this platform often...
HowTheStoryEnds@reddit
Honestly the code ranges from insanely old to lots of SQL. It's just the red tape around it that usually kills the soul. The standard way of life of bigger non-tech-oriented corporations.
AModeratelyFunnyGuy@reddit
Amazing write up, ty so much!
trying-to-contribute@reddit
Appreciate you. I got a friend who is a "Senior Programmer" at a Federal Bank who is mostly dealing with Oracle in his day to day. He's on the other side of the fence and I get plenty of user complaints from him as well. The majority of his clients are not developers. They are people who read reports, and much of his department's responsibility is to generate thousands of reports every night. I joke that he generates paper dashboards.
He doesn't really know where these reports go to, he can't really change the report generating code because some client down stream would complain that the numbers might not be consistent anymore.
IMHO, It's not a fun place to be a DBA right now.
a_library_socialist@reddit
Just anecdotes.
AModeratelyFunnyGuy@reddit
Fair enough, ty!
WeedWithWine@reddit
Based on all the examples I saw around that time, it was more that it allowed any developer to store data even if they didn’t understand basic relational databases. It’s still big with the coding boot camp crowd.
BrobdingnagLilliput@reddit
FTFY. I don't think anyone with any experience thought it was a good idea to eliminate the involvement of people who'd been thinking about data storage for 20 years.
a_library_socialist@reddit
I mean, there's LOTS of places that have really shitty schemas they've refused major overhauls on for years.
In those cases, NoSQL was an improvement - because while you lost the advantages of SQL, you did also lose the disadvantage of having to constantly do that weird join that Doug added on a coke binge in 1988 and has been limping along ever since . . . .
BrobdingnagLilliput@reddit
Sure.
But 10 years down the road, you'll have to constantly do that weird thing with the NoSQL instance while also doing that weird join.
Local solutions to global problems are the kinds of things junior developers suggest.
a_library_socialist@reddit
Oh sure - I'm not a big fan of NoSQL, just talking about how it was used as regards tech politics, not whether it's a good tech solution.
QueasyEntrance6269@reddit
I wonder if docker being a thing killed NoSQL indirectly. It’s a trivial task these days to create a database and tear it down in seconds
EarthquakeBass@reddit
Administrating it is a whole other beast though. Containers are hard to make work well with anything stateful. RDS I’d say had a bigger impact than Docker. But I think mostly people started to get bit by NoSQL (schemas are annoying until they’re not) and the pendulum swung back
ultraswank@reddit
From my experience it was always version 2.0 of a project that killed the enthusiasm for NoSQL. "OK, our app allows you to connect with friends and list all your favorite books. Now add a widget that shows the people in your friend's group who also enjoy the same authors as you do and the number of matches." Structuring your data just bakes in a lot of functionality that you miss when it's gone.
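(To make that concrete: here's a minimal sketch, with invented table and column names, of how that "friends who share your favorite authors" widget stays a single query once the data is relational. Reassembling the same answer from denormalized documents means doing the join by hand in application code.)

```python
import sqlite3

# Illustrative schema only -- table/column names are made up to mirror the
# "friends who like the same authors" widget described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE friendships (user_id INTEGER, friend_id INTEGER);
    CREATE TABLE favorite_authors (user_id INTEGER, author TEXT);
""")

# With relational structure, the v2.0 feature is one query: friends of :me
# who share at least one favorite author, with a match count.
query = """
    SELECT f.friend_id, COUNT(*) AS shared_authors
    FROM friendships AS f
    JOIN favorite_authors AS mine   ON mine.user_id   = ?
    JOIN favorite_authors AS theirs ON theirs.user_id = f.friend_id
                                   AND theirs.author  = mine.author
    WHERE f.user_id = ?
    GROUP BY f.friend_id
    ORDER BY shared_authors DESC;
"""
me = 1
for friend_id, shared in conn.execute(query, (me, me)):
    print(friend_id, shared)
```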
LetterBoxSnatch@reddit
I got paid pretty well for writing code that should have just been an SQL query but instead we needed to pull documents and page through them from multiple document types and apply the relationships before returning, so it's not all bad I guess
crappyoats@reddit
It was always so much fun having to start baking in relations to the NoSQL bc 98 percent of data works like that lol
donjulioanejo@reddit
Only for local development. I would not want to run Postgres/MySQL in docker in production, even in Kubernetes.
I mean you can, but your database is your pet. With Kubernetes, even clusters are pretty disposable in the cloud. Just use managed SQL in the cloud or let the DBA team deal with it on-premises.
belkh@reddit
Postgres operators on k8s have gotten so good that it's worth it: PITR backups, read replica clusters, and connection pooling out of the box.
carsncode@reddit
Containers don't solve anything that's challenging about database admin though
ltdanimal@reddit
I don't really follow here? I haven't really seen Docker have any real use case that would mean NoSQL would die down. I never saw the "I can stand something up and tear it down" as a factor in the tradeoffs.
Mysterious-Rent7233@reddit
I don't ever remember it being hard to set up a transient database for dev or test. Certainly not in the time since "NoSQL" became a thing. Docker doesn't manage the complexity in prod and installing a DB engine on your laptop is easy, so I'm not sure what Docker has to do with it. Connecting to a central cloud one is also pretty easy.
TheseHeron3820@reddit
I don't wanna toot my own horn, but I've always been able to tear a database down in seconds.
popopopopopopopopoop@reddit
Are you called little Bobby Tables?
TheseHeron3820@reddit
How do you know me so well? Did we go to school together at the XKCD Academy for Nervous Bois?
economicwhale@reddit
Especially if it’s prod 🫡
dbxp@reddit
A lot of people had issues when they found out how eventual consistency worked and that they had no transactions. Stack overflow saying they used just a few SQL instances had a lot of people questioning the nosql movement
No-Shock-4963@reddit
BINGO
eloc49@reddit
This is related (pun intended) to my problem with most relational DBs. The user/developer experience feels like they're meant to be used by DBAs first and by applications/code as an afterthought, when often the reverse is true.
Antares987@reddit
This holds true for cloud infrastructure and network administrators as well.
iris700@reddit
https://www.youtube.com/watch?v=b2F-DItXtZs
Araganor@reddit
Thank you for sharing this, I will be sending it to everyone I know 🤣
rubberband901@reddit
NoSQL actually stands for "not only SQL"
braino42@reddit
An issue with this sub I've noticed is people discussing their specific experience but then expressing that experience as generally applicable knowledge.
In this case, my experience at Amazon runs counter to yours. A decade ago the entire company moved off of Oracle and required "critical" services to use nosql datastores like dynamodb. There were obviously contractual issues with Oracle, but another influence was Amazon's inability to scale it any further.
I get the "oh amazon scale is a unique case", but my team specifically has a low scale ddb table with about 1k items in it. It costs $0.10/month. A comparable sql setup is $90/month. It even has relational data that we manage with sort keys.
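A minimal sketch of what that sort-key layout can look like (the table name and key scheme below are invented, assuming boto3):

```python
import boto3
from boto3.dynamodb.conditions import Key

# Hypothetical single-table layout: the "relational" shape is encoded in the
# sort key, e.g. order and line-item rows sharing one customer partition.
table = boto3.resource("dynamodb").Table("my-small-table")  # name is made up

table.put_item(Item={"pk": "CUSTOMER#42", "sk": "ORDER#2024-001", "total": 99})
table.put_item(Item={"pk": "CUSTOMER#42", "sk": "ORDER#2024-001#ITEM#1", "desc": "widget"})

# One Query call fetches an order and its items together -- no join needed,
# as long as this access pattern was designed in up front.
resp = table.query(
    KeyConditionExpression=Key("pk").eq("CUSTOMER#42")
    & Key("sk").begins_with("ORDER#2024-001")
)
print(resp["Items"])
```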
Not saying the pro-sql comments are wrong, but no detail on the specific use case is provided typically, which makes it impossible to judge.
hoppyboy193216@reddit
There’s also a significant operational aspect to this - anecdotally I’ve very rarely seen significant outages caused by ddb, whereas I’ve lost count of the number of times I’ve seen some absolute shit caused by improper management of relational databases.
I’ve seen countless instances of critical tables being locked for 10s of minutes by non-concurrent index creation, apps failing to recover connection pools after a writer failover, hot path query execution plans suddenly switching to an insanely inefficient method, obscure resource bottlenecks, weird locking problems, extension incompatibilities, data adjustments wiping out entire tables, replication problems, etc on relational databases.
I know that all of these failures are technically avoidable, and I know ddb isn't perfect either, but I'm increasingly finding it difficult to justify the use of an RDBMS for any high scale or critical application that ddb can feasibly be used for.
northrupthebandgeek@reddit
You could almost certainly get pretty close to that degree of cheapness using SQLite (plus something like Litestream to automatically replicate changes to S3).
There's also no fundamental reason why a DynamoDB-style SQL service with DynamoDB-style pricing couldn't exist.
braino42@reddit
Sqlite is great and I've used it for a mobile app I worked on. With that said, why am I trying so hard to use sql to find a solution that meets par at best? Ddb is a simple cloudformation resource I can create/manage with the cdk, whereas sqlite would require more upfront effort and an unknown operational effort.
northrupthebandgeek@reddit
What I'm getting at is that "cheap database" and "SQL" are orthogonal. You don't need a NoSQL solution like Dynamo to get a cheap database. If you don't want to use SQL then don't use SQL, but that's programmer preference driving that decision, not economics or "scalability".
(The fact that the software I write using SQL can run anywhere, not just AWS, is also pretty important to me - but obviously might not be as important to you if you're literally working for Amazon)
braino42@reddit
Point taken; I was trying to provide a counterpoint to a cost argument made elsewhere, and did not mean to imply ddb is always cheaper in every situation. If your company has the capability to operate a sql db at <$0.10/month i think that's great and worth sharing.
I think this difference in our experiences and others in this discussion is the existing environment and context these types of decisions are made in, like what you're saying at the end with sql being more portable but that potentially not being as important if you're all in on a cloud provider.
My points are that the answer to nosql or sql database when comparing scale vs cost or whatever criteria is "it depends", like is commonly said here. In order to help others, we as developers can be more specific in our recommendations and express our context that may influence that, which is difficult.
Remote-Car-5305@reddit
SQL-ity and serverful-ness are independent dimensions
deathhead_68@reddit
I find this too, a lot of people just seem to try and make dynamo work like sql and then conclude dynamo is shit. Dynamodb is THE shit when used well.
Horses for courses really.
Araganor@reddit
To be fair, I did preface my comment with "Just my perspective". I never claimed that my experience was representative of everyone else's.
I have no doubt there are plenty of shops where the pros outweigh the cons, and it's always up to the engineers to make the best determination they can based on their data and use case.
donjulioanejo@reddit
Yeah NoSQL was definitely a fad, but it also has some pretty legit use cases.
Put relational OOP data in a normal SQL database. Then put unstructured or arbitrary data in NoSQL like MongoDB/DynamoDB so you're not storing megabyte-size json blobs in Postgres.
frontenac_brontenac@reddit
My understanding is that DynamoDB isn't exactly great at large blobs. https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/ServiceQuotas.html#limits-items
abe_mussa@reddit
Oh god having flashbacks to early in my career, using NoSQL instead of SQL for a new project. Didn’t really make sense, given the experience of the team and the requirements of the project itself
We also just made life awful for ourselves by designing it exactly as we would an SQL db and scratched our heads wondering why it was so difficult to use.
Fidodo@reddit
My mongo db/coffee script/stylus phase aged worse than an emo haircut.
maleldil@reddit
Living this every day, with Apache Cassandra as our primary database. What a nightmare.
Fidodo@reddit
SQL doesn't suck but the syntax sure does. SQL is incredibly powerful but it's not fun to write anything complex.
MagicianSuspicious@reddit
Also: postgres is a better fit for many non-relational storage use cases than these "NoSQL" databases are.
Material_Policy6327@reddit
Yeah honestly I remember the nosql hype where everyone somehow thought it was better than sql until they realized they still needed relational info lol
a_reply_to_a_post@reddit
i think the hype was largely around things like Mongo being "schemaless" like it was a great idea because a lot of people weren't using sql migrations as common practice when the M-E-N part of the MERN stack was blowing up
dustyson123@reddit
Mostly about horizontal scalability rather than being schema less. Though that was appealing too because at the time dynamically typed languages were also trendy.
a_reply_to_a_post@reddit
i just remember when all that stuff started coming into fashion in like 2009/2010ish...i was working in agency-land and was pretty productive during the time period after flash died with CakePHP (then Laravel) + vanillaJS..
myself and another senior dev there were pretty productive with PHP (yeah not sexy, but it wasn't wordpress either), but the younger devs we worked with who thought wearing fedoras was a good look also really loved coffeescript and mongo, which alone made me kinda reject it for a while haha
at the time, i was still used to javascript being a pain to work with because i remember the "DHTML"/Web2.0 era, and the idea of serving a website with javascript instead of like Apache or something proven to work for years seemed nuts to me, but i've evolved my thinking slightly over the years :)
dustyson123@reddit
Ok similar experience and thought process, but coffeescript was actually very cool and solved a lot of the problems I had with JavaScript at the time. There's a reason a lot of coffeescript constructs like arrow functions and destructuring made it into ES6+.
Material_Policy6327@reddit
True. The startup I was at tried to swap to it and they quickly realized schemaless didn’t mean easier
inkydye@reddit
I mean, it came out of shops like Google, which genuinely needed a scale that wasn't reachable by existing RDBMSs, and at the same time didn't need some of the standard features. For them, it solved a problem.
Then everybody wanted to act like they needed the same problems solved.
fortunatefaileur@reddit
My related theory is mongo just didn’t enable auth by default and that became the critical feature that won mindshare vs everyone having to remember how to edit pg_hba.conf on their workstation.
SideburnsOfDoom@reddit
AWS DynamoDb and Azure CosmosDB get used a lot. They're NoSQL. They're not Mongo, but some similar ideas.
But yes, there are cases where they work really well, and cases where you want SQL. They will not make SQL obsolete.
pydry@reddit
There are cases where they work acceptably and there are cases where they fail horribly.
I've never heard of a use case where mongo was better than postgres.
Ghi102@reddit
In our case, we have a subset of data where there are essentially no relational relationships in-between the data. We never do anything like a "join", we don't even compare pieces of data in-between each other. They are all essentially independent pieces of information that get updated separately. They mostly have the same format, but sometimes have differences that would make it hard to manage in a regular SQL database.
This is pretty much the ideal case for going with NoSQL database and it works quite well. That's not to say that we are not using SQL. A big part of our domain is relational and we are using SQL for that. But a significant chunk is quite clearly non-relational and would not be served well by a SQL DB.
ChicagoJohn123@reddit
Lemme ask you, do you then reassemble all the data into a warehouse so you can do analysis?
Fun_Hat@reddit
Not who you asked, but nope. That's what Presto is for. No pre-assembly required.
ChicagoJohn123@reddit
But then you’re just doing full database queries.
melikefood123@reddit
Much like Redis? We used that to store a ton of key value pairs, and misc data outside of our relational db.
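For reference, a minimal sketch of that keep-the-misc-data-beside-the-relational-DB pattern (assuming redis-py and a local instance; the key names are illustrative):

```python
import redis

# Stash key/value pairs and misc blobs next to, not instead of, the relational DB.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

r.set("session:abc123", "user-42", ex=3600)                       # expiring session token
r.hset("user:42:prefs", mapping={"theme": "dark", "lang": "en"})  # misc preferences

print(r.get("session:abc123"))
print(r.hgetall("user:42:prefs"))
```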
SideburnsOfDoom@reddit
I think it's deceptive to start the argument with "they work acceptably", "they" meaning NoSql database in general; and then focus on how bad Mongo is.
I mean it is, but it's not all of "them". We knew to avoid it in favour of the AWS and Azure options mentioned above. And I have definitely come across cases that fit DynamoDb better than postgres.
pydry@reddit
Cosmos is basically Microsoft's overpriced hosted mongo. Same applies to that.
Ethesen@reddit
It also has Cassandra API (and more).
ChicagoJohn123@reddit
I’ve never met someone who was happy about their decision to use dynamodb.
deathhead_68@reddit
I've used DynamoDB for loads of stuff both in startups and for my current FAANG-adjacent job. I find a lot of devs misuse it tbh, but when its used properly its incredibly powerful
Fun_Hat@reddit
This. My manager is former Amazon and she said that even there it was widely misused. They had to have trainings on how to correctly use it.
Once you adjust to the paradigm it's great though.
toolatetopartyagain@reddit
Having to know beforehand what your queries will look like before using DynamoDB is something I will never be comfortable with.
Araganor@reddit
Right that's exactly what I meant when I said I dislike the term. It's kind of useless as a term because "databases that are not SQL" is a huge umbrella.
Antares987@reddit
I love this post so much. It's been something I've been saying for years. I believe that NoSQL is the result of people following NIH (see: https://en.wikipedia.org/wiki/Not_invented_here ).
Graph theory and set theory are their own types of math, like how we have algebra and geometry. Since it's gotten more attention recently, it's like how Turing famously realized there were large portions in the number domain where solutions could not exist, simplifying what needed to be tried using brute force in code breaking.
"Oh what's this archaic COBOL-esque all uppercase syntax?" Why are we using this "outdated technology, I can do better because I'm smart and young." -- it's always a young buck, and I was guilty of talking a lot of shit to Joe Celko back when I was a teenager. By my mid-20s, I had my name in one of his books and in my 30s, I found myself getting flight instruction from none other than an 86 year old Geoff Lee, the granddaddy of SQL Server.
NoSQL systems have a very specific use-case when the data isn't all that important, there's a lot of it, and most of it doesn't matter for the purpose of whatever the end user is doing. Social media, Amazon, et cetera. They're also good for people who don't understand data, so if you want to hire an army of "cheap" offshore labor for a big project and end up with a warehouse full of servers to do what could be done on a single rack (*cough* UBER), you use NoSQL.
Fun_Hat@reddit
That's a whole lot of words to say "I don't understand why someone would want to denormalize their data".
Antares987@reddit
I do understand why. And there are specific cases where it makes a lot of sense. Things like user preferences, complex layout information for charts and graphical designers that are incorporated into an application, data models that are subject to change during development so utilizing an object store to speed up development time is something that I'm a strong proponent of. Anything that's subject to potential Combinatorial Explosion where the complexity can be avoided with relational stores should be relational.
wrex1816@reddit
Once you're around long enough you see these things go in endless circles.
You learn something, then you're told "Pfft, don't you know nobody does that anymore". Wait a year or two, there's a brand new paradigm that everyone says is the hot new things, except, you realize it's just a rehash of the thing you originally did and then was told is outdated.
There's a lot of bullshit in this field.
Ilookouttrainwindow@reddit
I think "nosql" databases are great, but they are an addition to SQL, not a replacement. Just like SQL can't solve every problem out there. I do remember mongo declaring how "shit" all of SQL was, how dinosaur it was. Look at the latest version - schemas, acid transactions, all of SQL stuff. But can it do join yet? Have they finally done away with map/reduce? Then there's redis - little beast that saves everything.
nt2701@reddit
We even have a certain decent sized product that currently is using NoSQL DB planning on moving back to SQL DB. 🧐
allKindsOfDevStuff@reddit
It was funny when people tried to backronym it years later and say it stood for “Not Only SQL”
TimonAndPumbaAreDead@reddit
In my experience, relational databases really do just kinda suck? They're good for a specific subset of mid-size, transactional data management, but most things are either small enough that you don't need the overhead that comes with a full MS|My|Postgres SQL installation or big enough that you're in index/join hell and should be moving your data into a reporting database anyway. Like, it's okay? It's fine. It's just not really great for anything. Too big for small data, too slow for big data.
kingmotley@reddit
It is great for a lot of things. Anything that needs point-in-time consistency and atomic transactions across possibly multiple entities. If you don't care about those types of things, then there are other solutions that might fit better, but you need to spend time standing up and implementing a different tech stack for those marginal gains. In my line of work, I've never found the complexity of running multiple datasources worth the gains. That means having cyber security understand the intricacies of it, the infrastructure teams being experts on it, the site reliability/deployment groups understanding it, the additional licensing issues, and of course the developers, architects, and to some degree the QA and BA teams.
It is great as a general solution that can solve most problems at reasonable performance and price.
metaphorm@reddit
the vast majority of data in business applications is inherently relational and a relational database is the best tool to handle the data.
there's not a lot of "overhead" here. popular RDBMS' like Postgres are extremely effective at handling both small and large data sets. "Index/Join hell" doesn't make sense to me. I'm not sure what you're referring to. Any database system is going to require indices to make your queries sufficiently performant, and joins to correlate data. A relational database can handle this natively. Other datastores often require the application code to handle joins instead, and this is almost always much less efficient.
economicwhale@reddit
Do you know of any useful resources comparing performance of relational vs non-relational databases at different sizes of data stored and queried?
TimeToSellNVDA@reddit
Okay, I'm going to start the flamewar 🔥
I want them to die in a fire, and I don't hear people talk about them much any more - other than ironically. It's how managers think you can get more or better quality done if you add one more engineer.
optimal_random@reddit
TDD is only awesome for testing the Sum functions or other trivialities always shown in examples.
Once your code has network calls, database data fetches, and all sorts of abstractions, then the whole concept falls on its ass.
Don't get me wrong, Unit and Integration testing is really important to ensure the quality of software, and I use it extensively. But the whole TDD gospel is just unbearably ridiculous.
MargretTatchersParty@reddit
There is a limit to how far you can go with TDD and pure unit tests. The resource calls _should be_ abstracted away. Having good unit tests shows that you're in that situation and forces you to limit the mixing of unit-testable and non-unit-testable code.
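A minimal sketch of that separation (every name below is invented): the business rule stays purely unit-testable because the resource call sits behind an abstraction that tests swap for a fake.

```python
from dataclasses import dataclass

class RateGateway:                     # production version would call an HTTP API
    def current_rate(self, currency: str) -> float:
        raise NotImplementedError

@dataclass
class FakeRateGateway(RateGateway):    # test double, no network involved
    rate: float
    def current_rate(self, currency: str) -> float:
        return self.rate

def convert(amount: float, currency: str, gateway: RateGateway) -> float:
    # The unit-testable core: no network, no database, just the rule.
    return round(amount * gateway.current_rate(currency), 2)

def test_convert_applies_rate():       # the TDD-style test, written first
    assert convert(10.0, "EUR", FakeRateGateway(rate=1.1)) == 11.0
```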
frontenac_brontenac@reddit
Extreme Programming is still the #1 way to ramp up new products and new teams when you have in-house expertise that isn't already in the software teams' heads
thashepherd@reddit
TDD is a good tool to have in the ol' mental toolbox, just don't expect that to be the default programming style.
Ditto pair programming.
XP was before my time; I always saw it as sort of a proto-Agile: it is what you make of it.
HighBrowLoFi@reddit
Absolutely right. These approaches are great for training junior developers (and even then only part of the time), but otherwise are not only unproductive, but miserable for many developers. The success of XP and Pairing often implicitly relies on extremely opinionated and predictable development. You can get much better bang for your buck by just empowering a team to collaborate effectively and then giving them space to be productive
TheWhiteKnight@reddit
Came here to say this. TDD was interesting, tons of people that had no idea what it was would say they were doing it, but I have yet to see anyone actually do it anywhere.
I stopped telling people that just because they're writing tests doesn't mean that they're doing TDD. "Yeah, ok".
XP is the other one. I joined a company with extremely smart folks but while there were pair stations, they were rarely used. Our perforce commits had to have a "pair" assigned in the commit message. But they were just reviewing the code. Not pair programming / XP.
Still, we had a bulletproof testing strategy. We had tests that tested our test framework (but what tests tested the tests that test the test framework? LOL). Very thorough, learned a lot. Our tests would create their own Jira tickets, upload log files to them automatically, and assign them to the triage person for the team responsible for that area of the code. Tons of lab machine automation. Spin up various OSs on specific machines with various RAID configs, build the code on that machine, run the tests, capture log files, open Jira tickets, tear down the machine, and put it back into the pool for other tests to grab.
EarthquakeBass@reddit
TDD is an interesting idea but I always kinda got the impression it ends up over fetishizing tests and trying to cram everything into a test to the point of monotony when that’s not always the right tool.
XP I never tried but seemed to be a way to try and spice up a process that’s inherently fairly unpredictable yet boring.
Pair programming personally I kind of hate. I mean I’m all for hopping on with colleagues from time to time but the idea of two people sit there and one person drives is (1) exhausting because you are constantly being monitored, (2) inefficient because speaking to communicate is far inferior in bandwidth to writing, (3) inefficient because, well, those two people could be doing things in parallel. I have a feeling pair programming caught on largely to sell more seats for consultants — after all, why bill out one competent programmer when you can bill out one competent programmer and one incompetent programmer for twice as much.
Non-taken-Meursault@reddit
I fucking hate pair programming. I get it when you're new and getting used to a new codebase as a junior dev, but its benefits wear out quickly and it ends up being one dude programming and the other one chiming in occasionally while mortally bored.
OblongAndKneeless@reddit
It doesn't work when the bored one has ADHD. They're off doing something else in their head.
DiligentComputer@reddit
Pair programming without an intent to either a) investigate a tricky bug together and learn as a pair (which you could argue isn't actually pair *programming*), or b) have one party actually intending to teach some part of the system or some concept to the other, is actually a downgrade of team productivity. In those two cases, though, I've found it to be quite a good technique.
CubicleHermit@reddit
Pure TDD is awesome for certain specific kinds of greenfield business logic or infra code. Since working with someone who was NUTS for TDD about 10 years ago, it's been absolutely amazing for three very specific, very critical classes in that time.
Said nut wanted to use TDD for every single bit of code, no matter how legacy, and no matter how many side effects you were stuck with because of networks or UIs. smh
cfalone@reddit
I would say Agile has become the opposite of what was originally intended.
Kapri111@reddit
IoT
WearMental2618@reddit
This is not a fad, Amazon just dominated it.
Kapri111@reddit
Very few jobs around for the amount of talk it had.
thaddeus_rexulus@reddit
I think this is likely because a lot of the jobs around IoT are actually data jobs rather than IoT specific jobs. I did consulting for a massive concrete producer and they had hundreds of thousands of sensors collecting data. The necessary redundancy (including human redundancy) for IoT devices that actually participate in manufacturing pipelines made it so that the ratio of sensor to non-sensor devices is something like 500:1 or higher.
ripreferu@reddit
Can you provide a source for that claim about Amazon?
I must have missed something.
WearMental2618@reddit
Ring. Alexa in everything. They also host a lot of the IoT cloud (Azure too)
ripreferu@reddit
Ring and Alexa are services for customers, but what about business to business uses?
The manufacturing company I work for relies on IoT sensors and creates its own mesh. They are probably not using Azure, nor Amazon for this.
hashtag-bang@reddit
I think this is probably bigger than most of us know about. Like more popular in traditional industries rather than say it being a smart home thing.
kondorb@reddit
Web 3.0 is almost forgotten. It was an amazing tech desperately looking for a problem that simply didn’t exist. And then it was abused for quite a few high profile scams on top of it.
David_AnkiDroid@reddit
Web3 != Web 3.0
I'm personally sad that Web 3.0 didn't catch on, unencumbered RDF triples are such a 'pure' way of storing data
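For anyone who never ran into them, a minimal sketch of what those triples look like (using rdflib; the URIs are illustrative):

```python
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import FOAF

# The subject-predicate-object model: every fact is just a triple in one graph.
g = Graph()
alice = URIRef("http://example.org/people/alice")

g.add((alice, FOAF.name, Literal("Alice")))
g.add((alice, FOAF.knows, URIRef("http://example.org/people/bob")))

print(g.serialize(format="turtle"))
```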
thequirkynerdy1@reddit
The semantic web to me was much cooler than the obsession with everything on the Blockchain.
It always saddened me that it didn’t take off.
NostraDavid@reddit
A lot of work to set up the semantic web, but what kind of value did it bring? I can't recall any 😅
Familiar-Flow7602@reddit
That's because there is no one Person or Bank or Money; it depends on the perspective you are looking at it from
Spare-Leather1230@reddit
I love the idea of semantic web but hate the implementation. That being said I’m OBSESSED with DbPedia and want it desperately to work
cajunjoel@reddit
I hope it will, too. The Library of Congress has adopted the same linked data structure for BIBFRAME to describe books and book like things. It will ultimately replace MARC which has been around, I kid you not, since the 1960s. That said, academia is all over linked data, but it hasn't caught on in the commercial world as far as I know.
If you like dbpedia, check out wikidata.org. There's a fair bit of overlap there.
kondorb@reddit
“Web3 (also known as Web 3.0)”
That second thing is so obscure and so forgotten that the term was up for grabs.
funkdefied@reddit
“Web 3.0 (not to be confused with Web3)”
PM_ME_SOME_ANY_THING@reddit
You should check out the first line of that Web3 link. Kinda dumb that since one didn’t catch on they just reused the name for something else.
gajop@reddit
The Semantic Web, yeah. I did my master thesis on it. Lots of cool tech around it (especially logic reasoning), but the core premise is problematic. Companies don't want to openly expose all this data to the public, and allow you to navigate/parse it how you will. FB isn't going to expose their user network as a FOAF, because their userbase is extremely valuable and they want to be in charge of presentation.
So it ended up just being used to give meta attributes to web 1.0 to help with SEO
Morphray@reddit
I think it was dreamed up by people who were in the old mindset of the internet being a utopia of decentralization. (Tim Berners-Lee being a perfect example.) They didn't realize that for at least 25 years the internet has become very centralized.
cajunjoel@reddit
I think the semantic web at its core is a beautiful idea, just as I think Java is a beautiful language. It's the ecosystem that developed around it that fucked it all up. Everything I've ever encountered surrounding linked data seems half baked and never really complete. It's frustrating.
Familiar-Flow7602@reddit
It's not beautiful and it leads to bloated models. That's because there is no one Person or Bank or Money; it depends on the perspective you are looking at it from. It could not succeed and this is a good thing.
frontenac_brontenac@reddit
Java is an esoteric programming language. It belongs with Smalltalk.
itsthekumar@reddit
I'd love to see people who were once advocating for it eat their words now. But of course they never mention it anymore.
I remember an MBA on Tiktok was pushing it hard.
hmmnda@reddit
Ah yes the "Semantic Web" with all the subject predicate object promise of making the whole web easy to understand for machines.
northernmercury@reddit
Using a java cross-platform UI framework.
Java applets on the web.
OdeeSS@reddit
I recently had to upgrade the Java version on a large network of apps using Apache Tiles. Business wanted it upgraded to resolve security issues and gave me two weeks to do it. I had to make a fork of Apache Tiles, upgrade the Java version, and then upgrade the apps that way. Now I'm terrified of what I've done and struggling to explain to Product that we either need time to rewrite the front end or they're going to need a dedicated team to maintain the forked repo.
GronkDaSlayer@reddit
You can add servlets to that list.
DanookOfTheNorth@reddit
I was going to post “applets” but you beat me to it.
fuzzynyanko@reddit
They were alright until Java applets started to get a mess of security vulnerabilities
brazzy42@reddit
The thing is, the vulnerabilities were always there. The whole architecture was insecure, first because it was based on the old Netscape Plugin API, and second because Applets ran in a regular JVM with full access to the OS, which was merely supposed to be restricted through the SecurityManager mechanism. Both of these provided far too big an attack surface, and as soon as hackers started looking into them seriously, an endless stream of vulnerabilities ensued.
rodw@reddit
Speaking of security vulnerabilities it feels like Macromedia/Adobe Flash is what originally ate Java Applets' lunch, a couple of years before raw HTML/CSS/JS became rich-/dynamic-/productive-enough to address almost any use case an applet might have been used for.
For that matter Java as front-end GUI framework at all seems pretty rare today (with a few notable exceptions).
fuzzynyanko@reddit
Agreed. YouTube especially began to take off. Flash crashed a lot, ate up RAM, gave us annoying ads, and pretty much followed in Java's footsteps. People made jokes about it. SPOILERS: Dragon Ball Z Kai Abridged 2 .
Flash for game making was actually pretty cool.
TangerineSorry8463@reddit
I'm so glad that Apple didn't invent TypeScript cause now we would be dealing with googling AppleTS and getting vintage ancient Java bullshit.
ScientificBeastMode@reddit
Lmao that would be hilarious and horrifying at the same time…
davy_crockett_slayer@reddit
Oh god. The horror. I remember first discovering Java Applets when I started playing SFCave back in 2001/2002 along with Runescape.
bizkitmaker13@reddit
Brooooooooo, my first CS job was an internship in college for a company with a proprietary CMS built with J applets. Nightmare.
No_Shine1476@reddit
Now we just have HTML and a buttload of JS everywhere lol
mrmcgibby@reddit
It integrates far better than applets.
DoNotLuke@reddit
Applets … I was there … 20 years ago when sun Microsystems was coming of age
ParamedicIcy2595@reddit
Sun Microsystems wasn't coming of age 20 years ago. They're an older company than that. 20 years ago was 2004.
DoNotLuke@reddit
And you made me feel even older
Wulfbak@reddit
I remember 20 years ago, in the days of asp.net web forms, many of us thought server controls would wind up making JavaScript obsolete. Who knew?
fuzzynyanko@reddit
Oh man. The whole WE NEED THIS TO BE MULTIPLAT, EVEN THOUGH WE ARE ONLY RELEASING ON ONE PLAT thing in business bugged me to no end. It's fine when they actually did release on more than one platform. Resume-driven development probably
jwezorek@reddit
A corollary to this was the thing where we had to be cross-platform and maintain the look-and-feel of the platform on each platform supported so it was not enough to get cross-platformness by implementing a cross-platform GUI framework once and letting platform independent code do all the rendering etc. because Windows needs to look like Windows and Mac needs to look like Mac (and Linux needs to look like ???). This concern was annoying because it was always so clear no one really cared about it except for maybe Apple and Microsoft who loved to have developer lock-in.
Then fast forward 20 years to when web apps take over and all of a sudden no one cares about platform look-and-feel any more. Which like, okay, no one did to begin with but it would have been nice back in the day to just use Qt + the Fusion theme or Java + Swing and not have people coming at you pretending to care about this issue.
mackfactor@reddit
Oh man! Applets! I had almost forgotten about that.
abe_mussa@reddit
cries in google web toolkit
Rain-And-Coffee@reddit
Hey I remember GWT ! I was there
itijara@reddit
I feel like Java applets were a good idea but not implemented well. I sort of wish that we had a better implementation of native-like apps that worked in the browser.
RougeDane@reddit
Flash
Sinusaur@reddit
The Java mascot Duke was everywhere at one point.
MrDiablerie@reddit
Applets were horrible, glad those didn’t last.
ceilingscorpion@reddit
2 years in the future - Generative AI
Live_To_Run@reddit
I really hope so. The push to incorporate AI into everything is so counterproductive
StoryRadiant1919@reddit
i know I’m gonna get it for this one….Agile.
Live_To_Run@reddit
I’m pretty sure most developers hate agile.
PragmaticBoredom@reddit
To actually answer the question: XML.
For a while XML was in all the books and it was talked about as the de facto way to structure things.
It went on for a few years before real developers started rewriting things without XML and showing how it could be simpler, faster, and less fragile. The speed difference from dropping XML was an easy win in so many optimization exercises.
wvenable@reddit
In a way though, thank god for XML, because it seems to have replaced a whole hell of a lot of proprietary binary formats. Even Word documents are XML now.
I have some crappy 3rd party applications needed for work that are very buggy but thankfully modern enough that they store their data as XML and if needed I can just open notepad and edit them.
Brainvillage@reddit
Ya, anyone dogging on XML now doesn't remember the pre-XML hellscape. They pushed it too far, but a little bit of XML is still totally fine to work with.
CallNResponse@reddit
As critical as I’ve been about XML in the past, I believe this is absolutely true. I can’t count the times where I’ve simply been trying to figure out how some utility etc worked, and the ability to suss out some meaning from XML was what saved me. I doubt I’m alone, and in fact I’m willing to believe that human-readable XML is a major contributor to the current (reasonably advanced) state of computer science. I think that if everything was all-binary format, all the time, a lot of people who have achieved success in the software biz today would instead have burned out early and given up. To put it another way: I think there’s significant goodness in how XML made many things easier, and thus allowed developers to make rapid progress.
Northbank75@reddit
We still deal with EDI regularly and JFC
dacracot@reddit
The other markup/machine-parseable structures also lack something that XML has had almost from the beginning: XSLT, Extensible Stylesheet Language Transformations. It is a language specifically for the transformation of XML into other constructs and it is extensible to other output languages. Most browsers have had a built-in XSLT engine since their inception.
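A toy illustration of that idea - a hedged sketch using Python's lxml binding rather than a browser engine; the document and stylesheet are invented:

```python
from lxml import etree

# Turn an XML fragment into an HTML list via an XSLT 1.0 stylesheet.
doc = etree.XML("<books><book>SICP</book><book>The Mythical Man-Month</book></books>")

stylesheet = etree.XML("""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/books">
    <ul><xsl:apply-templates select="book"/></ul>
  </xsl:template>
  <xsl:template match="book">
    <li><xsl:value-of select="."/></li>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(stylesheet)
print(str(transform(doc)))   # an HTML <ul> built from the XML, one <li> per book
```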
CallNResponse@reddit
Oh God, XSLT. Did they ever finalize a definition of it? I worked with it back when - mid-1990s? - and it was interesting, but I recall that it kept changing significantly - like there was some kind of battle going on between the people tasked with defining it, and there was so much churn that it was impossible to use.
ProfBeaker@reddit
Eh, XML is still a perfectly good way to structure data, and honestly better than some of the replacements.
The problem with XML was that it got turned into this bizarre panacea way outside what it should've been used for. XML database, XML structured code, XML dependency injections (Spring XML configs.... **shivers**).
Lots of fads _around_ XML, for sure.
latkde@reddit
XML was for sure shoved into a lot of places where it shouldn't be, but I think its main problem is that the data model (ordered trees of elements with attributes, mixed content, and namespaces) doesn't fit the data model of most programming languages.
You can define various mappings between these worlds, but it will never be as seamless as an object–relational or object–JSON mapping. So XML implies a lot of unnecessary complexity compared to alternatives.
Where XML is absolutely fantastic is as a syntax for custom text markup languages, but that is such a niche use case.
carloscrmrz@reddit
yes, data models with XML are weird. Currently working with international wires through SWIFT and a clear example I have is the Settlement Amount & Currency.
For some reason the model for that is having your Currency as an attribute and the Amount as the node's value, which is not horrible but just doesn't feel quite right.
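Roughly this shape - a simplified, illustrative element in that style (the tag name mimics ISO 20022-style messages) and how it reads from code:

```python
import xml.etree.ElementTree as ET

# Illustrative only: the currency rides as an attribute, the amount is the text.
node = ET.fromstring('<IntrBkSttlmAmt Ccy="EUR">1500.00</IntrBkSttlmAmt>')

currency = node.attrib["Ccy"]   # attribute  -> "EUR"
amount = node.text              # node value -> "1500.00"
print(currency, amount)
```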
ProfBeaker@reddit
By "object-relational" you mean that thing which has entire mapping frameworks to try to hide the complexity?
And by "object-JSON" you mean that thing which has huge serialization libraries, and still struggles with advanced concepts like dates?
They're simpler in some ways, but "seamless" is a wild overstatement, IMO.
Though I'd take any of those over YAML.
yxhuvud@reddit
But it is not supposed to represent arbitrary data for arbitrary languages. It is supposed to be a way to represent documents, and it is perfectly ok doing that. But don't use it as a configuration language or RPC language, because it is really horrible for that.
burger-breath@reddit
At least XML had comments!
duva_@reddit
Hence the saying "XML is like violence. If it doesn't solve your problem you ain't using it enough"
dacracot@reddit
You realize that HTML is just a dialect of XML. Literally one of the most common constructs of CS for several decades now.
RougeDane@reddit
XML is alive and well in a lot of enterprise domains. Latest example I've come across: EUROCONTROL (the European collection of air traffic authorities) are finally deprecating the old way for airlines to file flightplans prior to flying. January 1st 2026 this new system called FF-ICE will be in production.
I will give you exactly one guess what the document format is...
robhanz@reddit
As I said in another comment, XML is just a horrible misapplication of tech (specifically as a subset of SGML). SGML is a document markup language, and using it for a data format is just full of impedance mismatches.
username_or_email@reddit
Thank god for that, I fucking hate xml
magical_midget@reddit
We still carry the same pains, json, yaml, etc are alive and growing.
For some reason, unless it is a DB, people want to be able to store/transfer data in human readable form.
ThenCard7498@reddit
one shot debugging, can poke the data ez
Araganor@reddit
I still have to manage JSON configuration settings inside of SQL tables at my org. Somehow managed to get the worst of both worlds with that setup!
mrcarruthers@reddit
As much as it's not efficient, I get it. It's easy to read and parse, all of our tooling is geared towards it, and we can send examples to non technical people without having to explain how to read it. If protobuf or something really catches on, and every tool we have can parse it to something human readable, I can see it catching on more.
muddy651@reddit
Fuzzy logic!
Brainvillage@reddit
Super Furry Animals fan I see.
TaXxER@reddit
Not nearly used as much as it should be!
Winter_Essay3971@reddit
Is Rust still big? Feels like I heard more about it 2-3 years ago
3ABO3@reddit
I think Rust is great for certain use cases, and especially so if you hate Go syntax
Purple_Experience984@reddit
Rust is great if you hate yourself
rodw@reddit
I'm not sure they were ever mainstream popular but DAE remember Java Aglets? "Movable software agents" that could serialize their run-time state and move to a new execution environment for reasons that were never very clear to me.
I guess the generic term is maybe mobile agent
alfalfa-as-fuck@reddit
Jini!
CallNResponse@reddit
This thread is a lot bigger than I expected.
Did anyone mention Virtual Worlds yet?
Zenalyn@reddit
Blockchain
CS_Barbie@reddit
Yep
to-too-two@reddit
Is this one a fad?
I know it was a buzz word but isn’t there no crypto without blockchain? And cryptocurrencies are certainly here to stay.
mezolithico@reddit
It's still in its infancy, and there is some interesting use cases, especially in the finance world. Will it survive? Nobody knows.
backdoorsmasher@reddit
Genuine question, but what are some good use cases for it?
mezolithico@reddit
It eliminates counterparty risk. You could use it as the backbone of equities exchanges: instant stock trade settlement, preventing naked short selling. It can be used for bond payments. Cheaply sending money internationally for interbank transfers. It could really make insurance more efficient, for instance life insurance policies: a trusted org like the Social Security office can send a list of deceased folks' SSNs, and a contract would automatically pay out death benefits.
ICanHazTehCookie@reddit
What's the point of a trustless system when you still need to trust the entity (insurance org) that's using it? As soon as blockchain has to bridge to the "real world" (which almost every good use case does), it falls apart.
mezolithico@reddit
It can be used to eliminate counterparty risk trustlessly. Like if you're gambling, the counterparty can't refuse to pay. The contract rules dictate how the money flows with no take backsies.
It can ALSO be used for increasing efficiency of existing infrastructure which would be the benefit there instead of eliminating counterparty risk.
ICanHazTehCookie@reddit
If you're gambling entirely on chain then yeah that's legit, fair. My issue is when you're gambling on real world events and have to trust an "oracle" to submit the true outcome to the chain.
Blockchain is anything but efficient in my mind, but maybe I'm missing some new developments.
almost1it@reddit
Look into "optimistic oracles" if you're interested. You're right that there is currently no known way to post real world data to the chain in a completely trustless way.
AFAIK, the current meta is to rely on systems that ensure participants are incentivised to do the right thing. These prediction markets (betting on real world events) work because the entity who submits the data has to lock in a bond that can get slashed if there's consensus that it was wrong.
Of course there are a lot of edge cases around this system. But it's at least an interesting application of game theory to align people's incentives in certain ways, which I think is only possible to do with an immutable network.
ICanHazTehCookie@reddit
That is neat, thanks for the info. I do wonder how possible it is for external incentives to outweigh the system's incentives? But regardless it seems like a step in the right direction
SimbaOnSteroids@reddit
Putting my money somewhere the fed can’t seize in the event I become a political undesirable.
Some_Dumbass_408@reddit
Forgive my ignorance, but how is a 15 year old technology still in its 'infancy'?
Spiritual-Theory@reddit
Mobile phones were invented in 1973. Were you using one in '88?
Mysterious_Comb9550@reddit
Check the Bitcoin price bro
MiAnClGr@reddit
Yes it only has one use case, speculative investment. All the other smart contract web3 stuff hasn’t stuck.
your_poop@reddit
Although there's 110 billion dollars locked in smart contracts at the moment
MiAnClGr@reddit
And how much is stolen on a regular basis?
your_poop@reddit
Probably a lot, but not from the well established protocols anymore. But I agree with you, the industry needs more good engineers who build safer smart contracts
honor-@reddit
yeah but did everyone start integrating blockchain into their products like hypesters said we would?
Mysterious_Comb9550@reddit
Maybe web3 hasn’t taken off yet but bitcoin is rapidly becoming new global currency.
Regeringschefen@reddit
Bitcoin is becoming the new way of gambling with your money, not a globally used currency.
PreparationAdvanced9@reddit
You are in the pump phase, get out while you’re up
Mysterious_Comb9550@reddit
You don’t understand. Please check out r/bitcoin for how it works. There are lots of literature and videos that can help you achieve financial literacy.
RobertKerans@reddit
I think "rapid" means fast??
TopBantsman@reddit
People were saying this a decade ago
Mysterious_Comb9550@reddit
Yup and I bet you wish you had bought then
TopBantsman@reddit
I did
dvali@reddit
Bitcoin will not be a global currency any time soon, likely never. It stopped being a useful currency a long time ago. It's far too volatile for that. Maybe some successor to bitcoin that is more resilient to abuse.
honor-@reddit
yeah, I guess if you're just doing all your shopping on dark web.
valbaca@reddit
"It's a really interesting technology but it just has limited use-cases"
How many times I've heard that and have seen exactly one use-case: grifts
Clearandblue@reddit
Yeah bloody hell when product people say "we need AI" without elaboration you can at least use your imagination for a couple use cases where you could at minimum just call an API and tick that box for them. Or "we need an app" you could sometimes handle with a PWA and improved mobile experience.
But with blockchain I had no clue what they were on about. Even squinting a bit I couldn't see a way to shoehorn it in.
BasicAssWebDev@reddit
I was coming in here to say web3.0
rodw@reddit
The worst part of the "web 3.0" moniker is that it sure would have been useful to use that label for the LLM/generative AI web. It feels like not enough normies recognize "web 3” to call this "web 4” and the generative AI influenced web is a LOT more closely aligned with the web 1/web 2 level of paradigm shift than blockchain stuff.
Calling it web 3.0 was stupid to begin with. It's largely infrastructural at best. Even in the wildest crypto-bro fantasy blockchain itself was never going to be as transformative as these other examples.
Eric848448@reddit
I’m surprised this wasn’t the top comment.
BrokerBrody@reddit
Because Blockchain was more buzz than actual implementation.
Many corporations actually tried to implement tech like NoSQL and goodness it was terrible for nearly all but niche use cases.
xacto337@reddit
TDD. May still be around, but popularity seems way down.
abe_mussa@reddit
I feel like TDD is so misunderstood and such a useful tool
Wouldn’t say I’m a purist though, there’s a time and a place for it (but it is basically the default for me)
ImSoFuckingTired2@reddit
Honest question: where do you use it?
I believe TDD cannot succeed in an environment where requirements tend to change, which is the most common scenario. When this happens, TDD invariably leads to constant test rewrites and thus wasted time.
double_en10dre@reddit
This makes no sense to me. I find TDD is even more valuable if the requirements are frequently in flux. It is a demonstrably true description of the current behavior.
And I HATE updating code that doesn’t have tests, it takes 10x longer because I have to infer all of the possible consequences.
Do you seriously have a different experience?
ImSoFuckingTired2@reddit
> I find TDD is even more valuable if the requirements are frequently in flux. It is a demonstrably true description of the current behavior.
A demonstrably true description of the desired behaviour is, well, just a description of it. It does not need to be written in the form of tests, nor would it be useful to other stakeholders if so, since most are non technical.
> And I HATE updating code that doesn’t have tests, it takes 10x longer because I have to infer all of the possible consequences.
I don't follow. Are you suggesting that you use tests only as proof of requirement fulfillment? Then either your tests are incomplete, or your requirements are always complete, which I highly doubt.
abe_mussa@reddit
Are we talking about having unit tests in general here? Or TDD?
I like writing the test first since it’s a place to think ahead about interfaces of things, expected output etc. It’s not simply a chore to validate code I’ve written, it informs the implementation itself
Whether you think unit tests are necessary or believe they make life harder is a different issue, in my opinion. But I’ll share my own opinion on that either way
I disagree that constant changes make it pointless. If anything, having tests helps us ship faster with more confidence. If you’re rapidly making changes, having that quick feedback that you’ve not accidentally broken another requirement is useful.
If small changes are causing significant test rewrites every single time, I’d question the way the tests are written. Do they care about too much implementation detail?
I think you’re missing the point with the demonstrably true point. It’s a description of the current requirements - with the implementation validated against.
Tests are not just to validate the code we’ve just written works as expected - it’s to make sure we don’t unintentionally break a requirement we care about tomorrow, 1 week from now, 1 year from now etc
Own-Contribution1618@reddit
Works well for microservices. When I write lambda functions for AWS I start with a test and mock out all the aws stuff. No need to push anything to aws until tests pass locally
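A minimal sketch of that workflow (the handler, bucket, and event shape are all invented): the test stubs the AWS client, so nothing needs to be deployed until it goes green.

```python
import boto3
from unittest import mock

def handler(event, context, s3_client=None):
    # Real client when running in AWS, an injected fake in local tests.
    s3_client = s3_client or boto3.client("s3")
    s3_client.put_object(Bucket="my-bucket", Key=event["key"], Body=event["body"])
    return {"statusCode": 200}

def test_handler_writes_payload():    # written first, runs entirely locally
    fake_s3 = mock.Mock()
    resp = handler({"key": "a.txt", "body": b"hi"}, None, s3_client=fake_s3)
    fake_s3.put_object.assert_called_once_with(Bucket="my-bucket", Key="a.txt", Body=b"hi")
    assert resp["statusCode"] == 200
```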
jasie3k@reddit
TDD is most useful when fixing bizarre bugs. Write the test to cover the weird scenario, see if it fails the same way it does for the reporting person, fix the bug, get the test to green.
MagicianSuspicious@reddit
As a TDD practitioner, I sadly agree with this re: its popularity.
I think that any practice that is difficult, and requires multiple years to learn, is going to face a difficult battle for wide adoption... since many people will try "TDD" and claim it didn't work for them, therefore it's not useful.
(quotes intentional as often the practice they are trying does not resemble definitions of TDD, nor what an experienced practitioner would identify as TDD)
Also given the long-tail distribution of experience in the industry, you'll always have more beginners than advanced folks for any such difficult-to-learn practice, so the online discourse will always feature more of the beginner (or: advanced beginner) voices, than it will the voices of people that have crossed that chasm.
ImSoFuckingTired2@reddit
> since many people will try"TDD" and claim it didn't work for them, therefore it's not useful
And they are right.
I think that TDD, like agile, is one of those practices that are so difficult to adopt correctly, that their hypothetical benefits are outweighed by their factual drawbacks.
MagicianSuspicious@reddit
The benefits of any practice are "hypothetical" until one spends the required effort to gain facility with the practice. Personally, I've found the benefits of practicing TDD to be very non-hypothetical.
"Agile" is not a practice.
ImSoFuckingTired2@reddit
> The benefits of any practice are "hypothetical" until one spends the required effort to gain facility with the practice.
I agree with that.
But I have found that, when a practice is both hard to adopt and contested by many, it does not matter how beneficial may it be for some, since it won't become widespread.
> "Agile" is not a practice.
The practice of agile is, well, a practice.
The framework, or methodology, or whatever word you want to use, is the idea of it, and on paper, agile sounds great. The practice, or implementation, is abysmal, though. Again, it does not matter how good the original intentions were, or if a minority has implemented it correctly and reaped the benefits, if the vast majority out there are essentially using waterfall with extra steps.
MagicianSuspicious@reddit
My primary goal is to be effective, and help organizations that I work with be effective. My experience is that TDD is a key part of fulfilling each of these goals. Does it take a lot of practice? Of course.
Frankly, I think most people find TDD daunting because it exposes gaps in software design skill. There are two ways out of this: gain design skill, or give up (and optionally blame TDD).
The software construction skill and discipline of the median team in this industry is, frankly, terrible, at least in part for the reasons I outlined above around experience:
The experience balance skews toward the beginner. An example is this subreddit, which defines "experienced" as "3 or more years". A developer with 3 years of experience is still in the beginner phase of their career. I wonder if telling someone at that stage of their career that they are a "senior" makes them less likely to acknowledge that they are still in the beginner stage, and thus stunts their growth.
It takes a lot of practice to gain facility at building software. For me personally, TDD accelerated that process greatly. I've also seen it accelerate that process for others with whom I've worked, and those that I've mentored and managed.
re: "Agile" as a practice:
There is no standard "implementation" of "agile". It's just a grab bag of ideas and ceremonies -- mostly from scrum -- that are commonly applied to project management.
I do agree that many organizations have sprinkled "agile" ceremonies atop wasteful/comfortable practices, and miss many opportunities for improvement as a result.
But I think, on balance, software project management now is generally more effective than pre-widespread-agile adoption practices. I was over a decade into my career when agile gained popularity, so I worked in actual waterfall shops, wrote detailed design documents with the best of them, worked with throw-it-over-the-wall QA teams, shipped software on floppies, studied Steve McConnell's books about software process, etc. before ever trying "agile". I really do think that what we have now is better than those days.
ImSoFuckingTired2@reddit
I'm so glad.
Every single person I know who has tried TDD at work has failed miserably, or when they succeeded, the end result took so long to be deployed that it didn't make any financial sense.
a_library_socialist@reddit
TDD is still a good method, and people use it, it's just no longer considered revolutionary.
bart007345@reddit
That's not a field of computer science. It was never taught at university.
Ghi102@reddit
It was at our university. We had a whole class around testing and TDD was covered as part of it
RelevantJackWhite@reddit
Obviously not, it's a way of designing and developing software. That doesn't mean it's irrelevant to the question.
mrcarruthers@reddit
I had a class where we had to do it for at least one assignment. It was a pain in the ass, but it was taught.
Grumblefloor@reddit
I interviewed at a company recently where the CTO was a TDD nut. Didn't get the job because I didn't buy into doing TDD to the exclusion of everything else.
It wasn't even a small company; I can almost guarantee most people in the UK have heard of them.
OutsideMenu6973@reddit
Getting rid of C++. It’s still the first cs class you have to take
OneWingedAngel09@reddit
Perl. My first job as a junior was on a Perl project. The language is easy enough to learn but it lacks strong typing.
For weeks all of the senior devs were plagued by a math bug that would offset the cart total by a penny. Drove them nuts.
The solution was to add 0 to the $$ amounts to force Perl to treat them like numbers instead of strings.
Good times.
ablue@reddit
Cryptocurrency
victotronics@reddit
Parallel computing is everywhere in science. There is no program that doesn't use it, whether by some threading model, or more likely through distributed computing at massive scale. How else are you going to keep 100 thousand cores working on your astrophysics simulation?
No, take Data Flow. That was supposed to be the way to get "true" parallelism. (There is an interesting interview from around 1990 where the speaker claims that hooking up workstations by ethernet is not "true" parallelism.) And despite some interesting experiments in hardware and software, it's basically gone.
ravigehlot@reddit
Adobe ColdFusion
Wulfbak@reddit
VRML. You, lawn, off.
ravigehlot@reddit
Wow…this one…for sure!
Least_Bee4074@reddit
My senior year of college, I took a graphics class and had to implement a VRML browser in C
dannyhodge95@reddit
Embedded Flash had to be forcibly removed from the internet, that was pretty bad
whoji@reddit
Used to be a useful skill to memorize FCKGW-RHQQ2-YXRKT-8TG6W-2B7Q8
Very obsolete tech now.;)
sharptoothy@reddit
Same, but for Windows XP: RY7B2-K9QQ4-H9H34-7DPX9-XQ7MG
Venthe@reddit
XTQJC-T8CCG-4BWVT-8TG6X-MX9QG
Last time I've used it/typed it was maybe 15 years ago or so. This shit is burned into my memory.
kaeptnphlop@reddit
I can still see the letters in sharpie on freshly burned CDs
JadeBorealis@reddit
? what is this.
joke went whoooosh
valbaca@reddit
it's a microsoft product code
except they were so easy to find
JadeBorealis@reddit
hot take, but this is purposeful.
I think there was a quote from microsoft "if they're gunna pirate, we want them to pirate our stuff"
basically get total and complete market control and acceptance so that people are only talking about and working with windows
Aggressive_Ad_5454@reddit
Hey, that Office 2000 DVD just installed on my laptop. Thanks for the license code 😇
OblongAndKneeless@reddit
Lucky. I'm still on Windows 95 using notepad.
HashMapsData2Value@reddit
How's functional programming doing these days?
PM_ME_SOME_ANY_THING@reddit
You mean the default paradigm for React? Currently the most popular and largest frontend library?
yeah666@reddit
React is more "decidedly not imperative" or functional-lite than it is truly functional. There's a world of difference between React and going all in on Haskell or Ramda.
qalc@reddit
modern javascript is essentially a functional language at this point
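Roughly what people mean by that — first-class functions, the built-in array combinators, and immutability by convention. A made-up sketch (TS annotations added just for clarity, nothing framework-specific):

```typescript
// Plain modern JS/TS, no libraries: data in, data out, nothing mutated.
type Order = { id: string; total: number; shipped: boolean };

const orders: readonly Order[] = [
  { id: "a1", total: 40, shipped: true },
  { id: "b2", total: 15, shipped: false },
];

// Pure transforms composed with the built-in array combinators.
const shippedRevenue = orders
  .filter((o) => o.shipped)
  .map((o) => o.total)
  .reduce((sum, t) => sum + t, 0);

// "Updates" build new values instead of mutating the old ones.
const markShipped = (o: Order): Order => ({ ...o, shipped: true });

console.log(shippedRevenue, orders.map(markShipped));
```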
loxagos_snake@reddit
I see disproportionately more people complaining about how OOP sucks and how much better FP is, than I see actual programmers use functional programming.
Not to bash on a paradigm that I've never tried at all, but I wonder if that's yet another case of a loud minority?
a_library_socialist@reddit
People that think OOP and functional are opposed generally don't understand either.
frontenac_brontenac@reddit
At their cores FP is algebraic, OOP is coalgebraic. They fulfill different use cases.
Inheritance is 100% cursed though.
sintrastes@reddit
Gonna soft disagree on that.
If they mean OOP and FP in the sense of language features and techniques, you're correct. You can easily mix and match, and many do.
If you mean OOP and FP in the sense of design methodologies, yes, although there is a bit of overlap I do think they are generally opposed.
There's a big difference between the "Functional" way of solving problems (using pure functions and immutable state) and the "Object-Oriented" way of solving problems (using encapsulated "objects" which primarily do work via updating their internal state in response to methods).
One of the issues with discussions like this is how many different ways you can define FP and OOP. I've covered two possible definitions here, but there's definitely more out there.
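To make that contrast concrete, here's a toy counter written both ways (made-up example, TypeScript used purely for illustration):

```typescript
// OO style: an encapsulated object that does work by mutating its own state.
class Counter {
  private count = 0;
  increment(): void {
    this.count += 1;
  }
  value(): number {
    return this.count;
  }
}

// FP style: immutable state, and a pure function that returns the next state.
type CounterState = { readonly count: number };
const increment = (s: CounterState): CounterState => ({ count: s.count + 1 });

// Usage: the object changes in place; the functional version threads new values.
const c = new Counter();
c.increment();

const s1: CounterState = { count: 0 };
const s2 = increment(s1); // s1 is untouched
console.log(c.value(), s2.count);
```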
a_library_socialist@reddit
You can do OOP quite nicely with most objects being immutable - it's just that very few people do.
I blame Java being a teaching language primarily - people think if you're using an OOP language or any form of object it's OOP, and don't bother having any sort of domain model, etc, and no enforcement of single responsibility or SOLID.
Basically they do procedural with some objects, call it OOP, see it makes a procedural mess, and then claim OOP doesn't work and therefore FP is the solution to this.
FP is generally easier to enforce good behavior with language requirements, so often it can be better in this situation. But again, that's not inherent to the paradigm, it's how it's been used.
I've seen similar things done with Scala and F# codebases, where they're making all fields mutable, god data objects, and the like, to get around the restrictions which make good software. Unfortunately no language is a match for lazy and arrogant programmers who think they're uniquely brilliant.
frontenac_brontenac@reddit
Functional programming is absolutely next-level for writing easy-to-read and correct software, it's just kneecapped by the learning hump. Mainstream languages have been slowly sneaking it in, for example TypeScript's type narrowing is pattern matching in imperative clothing. It's still the future, just unevenly distributed.
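For anyone who hasn't seen it, this is the kind of thing meant — a discriminated union plus a switch that the compiler narrows branch by branch (the types here are hypothetical, but the narrowing behavior is real TypeScript):

```typescript
// A tagged union, the bread and butter of FP languages...
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "rect"; width: number; height: number };

// ...and a switch on the tag. In each branch the compiler has narrowed
// `s` to the matching variant, exactly like a pattern match would.
function area(s: Shape): number {
  switch (s.kind) {
    case "circle":
      return Math.PI * s.radius ** 2;
    case "rect":
      return s.width * s.height;
  }
}

console.log(area({ kind: "circle", radius: 2 }));
```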
Regular_Zombie@reddit
The minority is very loud, but lots of functional programming concepts have moved into non-functional languages. I see much less 'pure' OOP now.
Chezzymann@reddit
I think like everything in life there is a balance. Sometimes you don't need classes for everything.
WheresTheSauce@reddit
It makes sense. Utilizing FP concepts in an OOP language is the best of both worlds IMO
playerNaN@reddit (OP)
Also, as a programmer that really likes FP, I'd argue that OOP vs FP is a false dichotomy. Immutable classes (like case class in scala) are an example of the union of FP and OOP.
Odd_Soil_8998@reddit
I mean, every time I get a chill FP job where things are running smoothly, some new CTO comes in and demands we rewrite everything in C# or Java. This has happened to me 3 times in the last 5 years. I love working with Haskell and F#, but it's just not worth chasing those jobs at this point because you're always one pointy-haired boss away from losing it.
vuwu@reddit
Yes, let's take what works and break it, because.... reasons. Those poor PHB, how will they justify the mass layoffs if they can't hire outsourced labor for pesos on the penny? /s
Ghi102@reddit
I think a main difference is that a lot of functional features are being brought into OOP languages. Often working about 80% as well, but that's often good enough.
Honestly, I think half of the features released for C# over the past 10 years, for example, were created originally in F#. Heck, generics are parametric polymorphism, which was first pioneered by functional languages.
deathhead_68@reddit
I remember saying to someone once 'what do you think LINQ is'?
whateverisok@reddit
Best way to put it! Other examples are lambdas especially with streams
dustyson123@reddit
You're using FP likely. React on the FE, Rust on the BE just as two examples both utilize many FP concepts.
thashepherd@reddit
Great, actually. The pure FP languages never really went mainstream but the really pragmatic bits definitely infiltrated the 3GL languages we know and love, to good effect.
ceilingscorpion@reddit
My company uses it but with Python rather than a purely functional language like Haskell or Go. Though Go is pretty popular so I’d say it’s doing just fine
gomihako_@reddit
Well there’s Elixir and Gleam, and ECMA has a proposal to add native pipes https://github.com/tc39/proposal-pipeline-operator
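For context, the proposal would basically make the hand-rolled version of this native syntax. A rough sketch of what people do today (the `pipe` helper is made up for illustration, it's not the proposed operator):

```typescript
// Without a pipe operator you either nest calls inside out...
const nested = Math.round(Math.sqrt(Number("16.9")));

// ...or hand-roll a helper to get the left-to-right reading order.
const pipe = (value: unknown, ...fns: Array<(x: any) => any>) =>
  fns.reduce((acc, fn) => fn(acc), value);

const piped = pipe("16.9", Number, Math.sqrt, Math.round);

console.log(nested, piped); // both 4
```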
mrcarruthers@reddit
Some functional concepts have made their way into other languages, mostly for the better. Use the functional bits when it's beneficial, use procedural when it's not.
whateverisok@reddit
Yep! Easy examples are lambdas especially with streams
EarthquakeBass@reddit
I think the most approachable ideas from it have been sucked into other programming languages so less incentive to switch entirely
Zephos65@reddit
I use functional programming at least weekly, but probably several times per week during some weeks.
I use python and rust. Probably 70% of my functional programming is python list comprehensions.
Flashy_Current9455@reddit
Some parts have been imported through Kotlin and Swift
Flashy_Current9455@reddit
Some parts have been smuggled over in Kotlin and Swift :-)
alfredrowdy@reddit
thin clients
comrade-quinn@reddit
Really? I feel it’s the other way round. Mainframe is fat client, that’s the OG of mainstream remote client computing. Then things like desktop apps backed by servers came along but have increasingly been replaced by websites.
Websites were originally always thin clients. The last decade or so saw a trend towards fatter clients in the form of SPAs but that is now being reversed with a push to simpler, leaner HTML where possible and a push to SSR where not.
Mobile apps of course are fat clients, but like desktop apps, I think they exist as they’re forced to. For example, Windows encouraged desktop apps as they created vendor lock in for MS.
Apple does the same as iPhone would be goosed as a cash cow were it not for the App Store. Were Apple to embrace PWAs, I think you’d see a bigger uptake in them rather than Apps.
This is because, fundamentally, thin clients reduce work in terms of duplication and make updates to the system easier. As such, I think when all things are equal, they’re the default to which we as a tech community will gravitate. It takes something to push us away from that (network latency, walled gardens, technology limitations etc), and that does happen frequently, but whenever such obstacles are removed over time; we centralise again. And that looks like thin client.
reboog711@reddit
Don't most schools use Chromebooks, which are thin clients?
dangling-putter@reddit
How tf are cloud, parallel and big data fads?
They were novel, we figured out the runbook and moved on, so much so that nobody speaks about them exactly because we have runbooks.
However, try working on a hyperscaler, all these problems are alive and well, and we spend a lot of time thinking and working on them.
sobe86@reddit
The fad of big data wasn't the concept of "a lot of data", it was the idea that there was precious gold buried in your mountain of clickthrough events. The people shilling it professed that basically everyone should a) store every possible event in a mapreduce database, and b) hire a team of data scientists to use 'algorithms' to optimise their business.
It most definitely was a fad that has mostly died now, even people who used to push it think so.
https://motherduck.com/blog/big-data-is-dead/
epelle9@reddit
On the other hand, plenty of big business survive purely because of the data they can gather and sell to advertisers.
sobe86@reddit
Sure, and some companies genuinely make money from any fad you can think of, they're fads because more people/companies bought into them than was warranted.
epelle9@reddit
Big data is as much of a fad as GPS. It’s not a fad.
It’s in basically any big software/hardware you use, people don’t pay much attention to it anymore, but it’s important for most apps you use daily.
Just because it blew up at one point doesn’t make it a fad, it never died down, people just stopped talking about it.
sobe86@reddit
Have a downvote right back mate :b
My guess is you were not a developer in the 2012-2014 era - if so, you have no idea how stupid things got.
sobe86@reddit
You're looking through a pretty distorted lens - 'the apps I use daily' are not what we're talking about here. I would guesstimate 80-90% of developers do not work at a company that _truly_ needs big data - that was the fad, the rest of the companies buying into the hype.
itsthekumar@reddit
Just curious how/why you say companies don't have the scale of data to make any real use of their data.
I would think even a few "insights" would help tweak their business, even if only marginally.
GrapefruitMammoth626@reddit
They weren’t fads, but if I understood the sentiment correctly, they were hyper buzz words at their respective times when they were first gaining traction. They seemed like terms a non technical person at the company might latch onto to feel like they are at the front of the pack. I still recall a telecom provider ad where some average Joe points up at the sky and says “it’s in the cloud”.
benjackal@reddit
I guess they used the word fad incorrectly, moving to the cloud was definitely a kerfuffle. I feel as if a lot of companies rushed their migrations and put a lot of teams under the pump.
I get that there are reasons to have on prem, but just very unlikely for me to work with it again. Having to deal with an internal team to provision resources weeks in advance haha no thanks.
PM_ME_SOME_ANY_THING@reddit
Everything is a fad until it isn’t
ChuyStyle@reddit
They are fads in the sense that the return on investment has not yet actually been realized. Really the big players have use for these but a garbage company trying to modernize their system? Maybe dumping millions on these is not the best move.
There is a reason it is now all about AI. Cloud and Big data were promised solutions for making 💲💲💲
valence_engineer@reddit
Databricks and Snowflake are worth something like $100 billion combined right now. Google isn't paying them for their services. Tons of companies use them and get value and pay for it. Probably 100x versus a decade ago. Even more so for cloud.
ChuyStyle@reddit
I absolutely agree with you about the big players. But when we look at the non faangs who use those services, they are all in the hole trying to make a profit and those services help them streamline their profit margins. But after 15 years, Uber, Airbnb and other 2010 start ups have failed to make a profit with the use of cloud and big data.
Databricks and Snowflake are valuable because they are selling the shovels. The ones using it, not so much.
Empanatacion@reddit
Wouldn't that make shovels a fad? If it's a tool widely used by profitable companies, it's not a fad.
Also, both Uber and Airbnb are in the black. With 20-30% profit margins. There are quite a few other profitable companies that were cloud based from day one.
ChuyStyle@reddit
And it wasn't the tech that made them profitable. It was price hikes
ddarrko@reddit
Uber are profitable.
Plenty of Snowflake's clients are profitable. My company uses them and is profitable... What is your point here?
valence_engineer@reddit
Those startups burn money because they chose to, through a blitzscaling strategy. It's not a question of tech but of business strategy fueled by zero interest rate VC money.
ChuyStyle@reddit
I see your point.
Qkumbazoo@reddit
Big data is a real thing, why not walk over to your data platform department and ask how they store and make available PBs of data? It's an absolute pain to work with, and to date there's no cost-viable cloud solution. Still, it's incredibly powerful and profitable once tamed - where do you think all the predictive models come from?
playerNaN@reddit (OP)
That's why I put it in quotes, sorry if it wasn't clear what I meant.
dangling-putter@reddit
Sorry for kneejerking, it's been a long day 😭
playerNaN@reddit (OP)
Nah, I wasn't clear. Sometimes things make sense in my head but have a completely different connotation when I type it out.
valence_engineer@reddit
OP should look up the Gartner hype cycle diagram.
Logical_Iron_1028@reddit
Databases in particular seem to be stuck in a series of endless loops: https://db.cs.cmu.edu/papers/2024/whatgoesaround-sigmodrec2024.pdf
No-Shock-4963@reddit
XML in general, and writing the view layer of an MVC app in XSLT in particular
serial_crusher@reddit
I'll say "Design Patterns". Like, it's still a real thing, and you'll still frequently reference a known pattern when planning a project etc... but in the mid 2000s, there was a fad. Interviewers would ask questions like "what's your favorite design pattern" with no context for what kind of problem you're solving.
I worked at a big company that had an architecture committee and any time you built a new service you had to present it for architecture review, and there was a blank where you had to list at least 4 design patterns you used. Again, no context, just "yes, this has some Singletons, an Observer, and a Chain of Responsibility"
On early-to-mid-2000s interviewing fads, those stupid brain teasers like "how many bags of cheetos can you fit on a 747" were all the rage at the same time because some guy from Microsoft wrote a blog post about how he liked to ask them.
gnosnivek@reddit
I still remember some of my classmates listing all the design patterns they knew on their resumes in the early 2010s. Those were some....interesting resumes.
SheriffRoscoe@reddit
Design Patterns were for cargo cult programmers. Along with Best Practices. Lots of newly-minted barely-even-junior programmers wasted far too much time on them.
reboog711@reddit
One of my interview questions is along the lines of "tell me about a time you used a design pattern in code and what did you use it for". Hopefully that is phrased better than asking about a favorite.
thashepherd@reddit
Eh. The book itself is...not obsolete, but long in the tooth. I wouldn't throw it at a junior in a book club like I would back in the day.
The concepts? They're out there, every day, in post-Smalltalk 3GLs. You just don't perceive them any more than a fish knows it's wet.
theothermattm@reddit
God I remember this… The young idiot version of me probably even asked this question. Glad that’s over.
CyberBrownie@reddit
AI
rwilcox@reddit
Domain Specific Languages ?
Felt like for a decade any nice API was called domain specific, even if it was NounlyClass.doingItMethod()
jek39@reddit
jetbrains recently released an IDE for creating DSLs https://www.jetbrains.com/mps/
frontenac_brontenac@reddit
This is extremely old lol
rwilcox@reddit
I forgot about that. Wonder how well it sells.
Non-taken-Meursault@reddit
Gradle is written in a pure DSL though, and it's a pretty popular tool
LloydAtkinson@reddit
No, DSLs are very much alive, like YAML and the Vue/Angular garbage attribute-based templating languages. I think there’s also a ton of handlebars-based templating in DevOps and Kubernetes too.
Fortunato_NC@reddit
3D in the browser. In the late 1990s I briefly worked for one of many startups that was trying to take the 3D tech from video games and push it into the browser via a plugin. They all failed because computers and GPUs weren’t really fast enough yet for companies to do anything useful with 3D, and even if they had been, asking people to download yet another plugin was getting old, even in the halcyon days of Flash Player. Heck, browser plugins in general aged really poorly - absolute security nightmare.
Aggressive_Ad_5454@reddit
UML for everything. Enterprise message buses. Token ring networks. PL/I.
SheriffRoscoe@reddit
Alas, PL/I. 50 years ago, it was extremely influential. Wirth's Pascal inspired a lot of Structured Programming thought, but IBM's PL/I was, for a while, the SP language that actually got used.
Aggressive_Ad_5454@reddit
And it had such great features, like ON CHANGE for reports, that haven't really been replicated anyplace else. Alas and alack indeed.
Orangy_Tang@reddit
My hot takes:
Expert Systems (never really worked, replaced by modern ai approaches).
Semantic web (I have never seen an actual concrete proposal of how this would actually work. Kinda morphed into web3).
Transputers (too weird, throws out too much established tech)
Software voxel rendering (beaten when actual GPUs arrived. Kinda resurrected with Gaussian splats).
Image based rendering (see above).
90s era neural nets (turns out we just needed to wait for compute power to ramp up x1000).
ActiveX (wtf were we thinking).
Roundtrip UML/code generation (never really worked, lots of extra effort for negative gain).
SheriffRoscoe@reddit
Especially the UML one!
bluetista1988@reddit
Does anyone remember Unobtrusive Javascript, and its cousin Progressive Enhancement?
At the time I thought it was viable, but it was during a whole different era of the web. Blackberry devices were the major web-enabled mobile devices we had, we were still dealing with IE6/IE8, and both ES6 and Google's V8 engine were a ways away.
This concept seems all but dead in modern development.
SheriffRoscoe@reddit
At the time, it felt like a necessary evil. Nobody had an inkling at the time that the mobile world would settle on two very different implementations of very similar, and high function, capabilities.
martinbean@reddit
This was all the rage when I got started in web development in 2005-ish. It was how I learned to build websites, and it’s how I still do it today. Semantic HTML, CSS, and JavaScript added as an enhancement rather than a requirement.
FelixStrauch@reddit
Object-oriented programming, back in the good old C++ days at the turn of the millennium.
By OOP I don't mean one class derives from another, I mean monstrous class hierarchies 20 objects deep.
Everyone bought into this. Questioning it was like questioning unit tests today. It just couldn't be done.
BeDangerousAndFree@reddit
Macromedia flash
reboog711@reddit
Flash still had a successful life after Adobe bought them.
engineered_academic@reddit
Lots of languages like FORTRAN, Ada, and even BASIC. There were adventure fiction books written in the late 80s / early 90s with BASIC programs you could put into your computer to simulate what the protagonists "did". There hasn't been anything like that lately.
Clojure and the LISP-likes were hot around the early 2000s-mid2010s and then it nosedived hard.
Anyone still use JQuery?
Back in my day everyone knew what a "cgi-bin" was.
Blockchain and NFTs were hyped. I'm gonna catch shit here but Bitcoin and most other altcoins are just pump and dump scams, and someone's gonna be left holding the bag. People are making out like bandits though in the meantime.
Nosql and lowcode/nocode systems haven't emerged. LLM hype will be gone within 5-10 years, if that.
3ABO3@reddit
jQuery is present in something like 70% of top 1000 websites
reboog711@reddit
IS that because Wordpress uses it?
engineered_academic@reddit
Up until a few years ago there was still IE support built into most websites.
bcameron1231@reddit
Which is a scary statistic.
fuzzynyanko@reddit
If someone said "I have no idea what NFTs are used for" I would reply "I am not surprised". NFT profiles were a solution looking for a problem, a problem that Xbox Live had mostly solved at one time or another. Minecraft solved a bunch of the others.
Unlike NFTs, the old Xbox Avatars actually made it into 3rd-party games.
Flashy_Current9455@reddit
Unit-testing
ctorstens@reddit
You don't unit test? I think of this as foundational.
Flashy_Current9455@reddit
Mostly kidding about it being a fad. But I haven't actually worked in any codebases with much unit-testing beyond specific low-level parts of the code
Ok_Raspberry5383@reddit
You sound like a junior who works as a software engineer at their mother's cleaning company
Flashy_Current9455@reddit
Oh, slam! You got me.
But that's the spooky part...I'm way senior and talking from experience from many companies
Ok_Raspberry5383@reddit
And you don't write unit tests? Sure... Senior can mean a lot of things...
Flashy_Current9455@reddit
Yes, most of the time it just means that I'm old :-D
There's plenty of interesting discussion to be had about unit-testing, but I don't think this thread is suited for it. I mainly started the comment thread as a light trolling :-P
I think the top comment about "TDD" is the more serious and interesting version of my silly comment
serial_crusher@reddit
I could see an argument that the zealotry of "unit tests" and "integration tests" being different things was kind of a fad. Like just write the appropriate test that makes sure your stuff works, bozo.
It was largely a case of ivory tower professors having strong opinions, but I've definitely seen some poorly written test suites that tried too hard to maintain that distinction.
Ok_Raspberry5383@reddit
There's a very clear distinction if you don't get it that's on you
3ABO3@reddit
This mentality is how you end up with slow test and slow CI
another_newAccount_@reddit
I see we work at the same company
Flashy_Current9455@reddit
:-D
intinig@reddit
Lol
jeremiahishere@reddit
wat?
Flashy_Current9455@reddit
/s
chrisza4@reddit
Dependency Injection via an external configuration file (usually XML) instead of code. This one, I'm not sure what the hell we were thinking. Matching a class name in XML to a real class, with no IDE help and the potential for runtime errors - how was that a good idea? I would say people in that 200x era were way too tunnel-visioned on the concept of "configurable apps". (Rough sketch of the failure mode below.)
SOA (Service-oriented architecture)
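Re: the XML wiring above - a toy TypeScript sketch of why string-keyed config hurts (all names made up, not any real container's API):

```typescript
// Code-based DI: the compiler checks the wiring.
interface Mailer { send(to: string, body: string): void; }

class SmtpMailer implements Mailer {
  send(to: string, body: string) { console.log(`smtp -> ${to}: ${body}`); }
}

class Signup {
  constructor(private mailer: Mailer) {}
  register(email: string) { this.mailer.send(email, "welcome"); }
}

new Signup(new SmtpMailer()).register("a@example.com"); // a typo here is a compile error

// Config-file DI boils down to looking classes up by string at runtime.
const registry: Record<string, new () => Mailer> = { SmtpMailer };
const wiredClassName = "SmptMailer"; // one typo in the "XML"...
const MailerClass = registry[wiredClassName];
if (!MailerClass) {
  // ...and you only find out at runtime, not in your IDE.
  throw new Error(`no class registered under ${wiredClassName}`);
}
```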
Beginning-Comedian-2@reddit
From what I've heard, Domain-Driven Design.
https://trends.google.com/trends/explore?date=all&geo=US&q=%2Fm%2F03czw2b&hl=en
Venthe@reddit
And it's a shame really, because on a strategic level, I can't imagine doing enterprise projects without DDD - it's just that beneficial.
Side note, it seems that it gains in popularity after a sudden drop, but now the interest is based in asia: https://trends.google.com/trends/explore?date=all&q=%2Fm%2F03czw2b&hl=en
tomugon@reddit
OOP
anthropaedic@reddit
Agile
Venthe@reddit
Hard to say it's a fad. It's still the best way to develop software product, bar none.
What was a fad, was to "introduce" agile without changing anything at all; then surprised pikachu it did not deliver. Shocking.
RangeSafety@reddit
Microservices, which make everything more complex and expensive than it should be.
Cloud, in other words someone else's computer.
Agile, the source of endless bullshit. I am sorry, the facilitation of bullshit. Be sure to use 10 dollar words.
Ok_Raspberry5383@reddit
If you're lifting and shifting onto instances in the cloud then yes it's just someone else's computer. But that's a you problem and means you're using the cloud wrong unless you have some very specific use cases which warrant this...
havecoffeeatgarden@reddit
Agreed with microservices. We went way too extreme with the practice and now the pendulum is swinging back, from companies having hundreds of services to maybe just a handful.
softwaredoug@reddit
I have this radical idea called "services" that are neither monolith nor micro services :)
havecoffeeatgarden@reddit
hi there, would you be interested in becoming this year's turing award nominee?
softwaredoug@reddit
Yes.
And I will be writing a book on this subject called Goldilocks and the three architectures.
LloydAtkinson@reddit
Or as it’s always been called: Service Oriented Architecture
bart007345@reddit
They are alive and kicking.
RelevantJackWhite@reddit
They are alive and auto restarting!
Torch99999@reddit
Hopefully not for much longer.
a_reply_to_a_post@reddit
Y2K consulting
AR driven e-commerce
NFTs
Ok_Raspberry5383@reddit
Y2K was a very real problem... It was thanks to enormous efforts years before that meant there was no melt down
ImSoFuckingTired2@reddit
> AR driven e-commerce
A fellow VRML hater here.
a_reply_to_a_post@reddit
i used to bang out a lot of freelance website jobs like 15 years ago and some of my friends who are artists/designers will hit me up once in a while asking if i still do websites, etc...
during covid, that was the big hype...everyone wanted a store in "the metaverse" where your avatar could go try on some sneakers and buy an NFT
that was the start of feeling like i'm getting too old for this shit, because i do remember experimenting with shit like AR in Flash during the tail end of the actionscript era
ImSoFuckingTired2@reddit
I had the exact same feeling.
This friend of mine was trying to convince me to join some company doing a metaverse play, with a similar pitch.
It sounded awfully similar to what companies tried to do in Second Life, back in the mid 2000s, which in turn was really close to the idea of 3D e-commerce websites right around the dot com collapse.
SnooStories251@reddit
Quantum computing
Ok_Raspberry5383@reddit
Still very much in development, and highly experimental. It's never been widely adopted, so I don't see how it could be a fad
Prog9999@reddit
We had a module on functional programming in my degree using Standard ML.
Anyone ever used it in the real world? This would have been 1991/92
Ok_Raspberry5383@reddit
Scala? Basically anything big data until python took over as the main language for ETL
Best_Fish_2941@reddit
IoT
Ok_Raspberry5383@reddit
How's IoT a fad? You just don't hear about it anymore because it's ever more embedded in everything as standard - not sure it's quite a fad
viniciuspc@reddit
When Apple killed Adobe Flash. It had a lot of resistance from people, me included. But the world is a better place without relying on a proprietary Adobe technology for us to have interactive web pages.
slashdave@reddit
Blockchain is not a CS field.
The_Muffin_Man15@reddit
You’re being pedantic, blockchain is a fundamental idea behind Web3… a field of CS
Bayul@reddit
In what universe is Web3 considered a field in CS?
The_Muffin_Man15@reddit
No matter your opinion on Web3, you are naive to think that this isn't a field of CS where a large amount of both research and work is being done.
devoutsalsa@reddit
Teddy Ruxpin Firmware Developer
elkakapitan@reddit
UML. Fuck that shit
Siduron@reddit
Yup, I've had to draw UML diagrams for everything up until my graduation. Never made one ever again in the real world.
taelor@reddit
Do you not use data model diagrams?
papawish@reddit
It's perfectly ok to model data.
It's not okay at all to model logic or workflows. Which is essentially what the GoF smarties tried to impose on us
vsamma@reddit
How about sequence diagrams? Or they’re not UML?
papawish@reddit
In the 21st century, in most industries, workflows change faster than you can update those docs.
Data structures usually are way more stable. Especially when stored on disk.
vsamma@reddit
Well we do deploy a bit more often :) But documenting just ER diagrams or whatever you use for data structures is not enough.
So what’s the best way of creating documentation that is actually easy enough to update and not get outdated?
LeetcodeFastEatAss@reddit
Learned it in one course in undergrad, seemed rather useless. I have not used it since.
IDontEnjoyCoffee@reddit
I've used that at all my jobs
davy_crockett_slayer@reddit
Web 2.0 and video streaming. I remember in 2005-2008 Web 2.0 was the hot new thing, and everyone was starting a video streaming site or service.
kazabodoo@reddit
Might not be old enough but I hated working with AWS CloudFormation in YAML, still hate it to this day and can’t believe people write this
ceilingscorpion@reddit
What do you use now?
kazabodoo@reddit
AWS CDK and Terraform but need to try pulumi too as its being recommended a lot
dietervdw@reddit
Pulumi FTW
ceilingscorpion@reddit
I do like Pulumi
800808@reddit
😂 this hit too close to home
thetoad666@reddit
NoSQL! I'm glad that seems to have lost traction! For all those who thought it was something new, what did they think we had before SQL?
ShatGPT4@reddit
Today's patterns are tomorrow's anti-patterns. Just about everything that is attractive today will be hated tomorrow, but eventually, if you stick around long enough, it comes back in vogue.
martinbean@reddit
Webrings.
secretaliasname@reddit
Those were good days before the internet was so centralized and ad monetized.
theothermattm@reddit
This made me laugh.
WeedWithWine@reddit
GraphQL. There’s very little real world use for this and saving fractions of a millisecond not transporting extra data isn’t really much of an optimization.
Zenalyn@reddit
agree. Though I prefer it over REST for headless CMS
ObscurelyMe@reddit
I think this falls into the YMMV category. There are definitely some (larger) companies where I think the benefits of GQL will outweigh the costs. Smaller companies with either a monolith or few microservices are just overengineering if GQL is what they are gravitating to.
t0astter@reddit
GraphQL was a pain in the ass that provided, imo, marginal benefit relative to the headache it created.
I remember my company even had a massive push for EVERYTHING to use GraphQL. Thankfully that died off.
Then-Boat8912@reddit
Enterprise Service Bus
valbaca@reddit
No Code in all of its iterations, including the current one. Surely this time we'll replace coders... and not end up just needing more coders to maintain the mess the tool spit out.
SOAP CORBA XML J2EE Applets Flash UML
Basically all the things that they tout to business people that if you just follow Their Methodology you'll write less (or no) code and it'll be adaptable and let you be Agile (TM)(R)(C)
Best_Fish_2941@reddit
PHP and Perl
Best_Fish_2941@reddit
angular and jQuery
Best_Fish_2941@reddit
Map reduce
Best_Fish_2941@reddit
bioinformatics
Best_Fish_2941@reddit
Crypto
alfredrowdy@reddit
Semantic Web.
cscottnet@reddit
Artificial intelligence.
moduspol@reddit
Custom AI chatbots being shoe-horned into everything.
Oh, sorry, you said “past.”
rapatachandalam@reddit
WAP - wireless application protocol
Non-taken-Meursault@reddit
GraphQL was suddenly everywhere and then, all out of the sudden, it disappeared as fast as it came.
Definitely not a fad, but the obsession with turning everything into microservices also vanished quickly. I mean microservices are great, but people soon realized that using them everywhere (as so many people suggested) was pretty much a shot in the foot.
thashepherd@reddit
Ooh, great call (and I only just realized that GQL faded, I'd always thought it was REST+1).
You're right. One might even call "extreme micro services" the meta fad.
rapatachandalam@reddit
VB ? :)
mrequenes@reddit
I used to work on RAD (Rapid Application Development) tools back in the late 90’s
In the early 2000’s, a “NextGen” platform
Everything was Xtreme or eXtreme for a while then, too.
reddit_again_ugh_no@reddit
Enterprise Service Bus
SideburnsOfDoom@reddit
4GLs, and drag-and-drop visual coding tools.
TimonAndPumbaAreDead@reddit
SSIS just. Won't. Die.
bothunter@reddit
My God. I hate SSIS with a passion. I was forced to build a data pipeline with that shit. At least, once I got it working, I assigned the PR to my boss and he complained that he couldn't understand the giant xml blob that I was trying to check in. Told him it was not my problem because I objected to using SSIS in the first place.
TimonAndPumbaAreDead@reddit
I once said, "the best thing you can say about SSIS is that it isn't SSRS. And the best thing you can say about SSRS is that it isn't literally Hitler"
thashepherd@reddit
....what would a world where Hitler started out in SSRS instead of art even LOOK like?
TimonAndPumbaAreDead@reddit
He probably would've killed himself sooner
LloydAtkinson@reddit
What’s the difference? I forgot
SideburnsOfDoom@reddit
Hitler's the German one with the moustache.
LloydAtkinson@reddit
Lmao
TimonAndPumbaAreDead@reddit
SSRS is for generating reports, SSIS is for ETL. They're both bordering on low/no code solutions targeting people with little to no development experience outside of Excel macros but who have database access for some reason, and they make it real easy to end up with a giant unmaintainable mess
LloydAtkinson@reddit
Ah yes I remember now thanks. It’s like SharePoint, no one at a company can ever justify why it’s used or even explain how it’s used. All anyone knows is it simply exists and is somehow core to the whole business… somehow.
FearlessAdeptness902@reddit
Thank-you Azure Data Pipelines.... I am forbidden from writing them in anything but the GUI.
Anytime I see a Low-Code system I cringe.
Psengath@reddit
Why spend minutes writing a one-page sproc when I could spend days drag & dropping a Frankenstein approximation of my intent?
ok_throwaway161@reddit
I remember it from the times it was DTS on SQL server 7.0. I feel old now.
CubicleHermit@reddit
The latest versions are "low-code/no-code" and are still just as pointless.
chamric@reddit
In like ‘98-2000 range I saw an article in an acm magazine about neural net based ai and ai agents. It was a fad that went away. For a while. Boy is it back.
ctorstens@reddit
Scala. Used to be the next big thing.
thashepherd@reddit
...so you completely forgot that Groovy existed.
Unsurprising.
ceilingscorpion@reddit
Scala 2.0 killed Scala
Flashy_Current9455@reddit
Replaced by Kotlin and Swift in some facets
TangerineSorry8463@reddit
And in terms of data science, it's a distant Xth place to Python, Rust, R.
LloydAtkinson@reddit
I don’t know anything about it other than some constant shit stirring between factions.
tdatas@reddit
It's a shame. Scala 3 is an absolutely incredible language but with a limited audience because of poor governance decisions.
TimeToSellNVDA@reddit
:(
In the end, I think Kotlin is simply the better language.
itsthekumar@reddit
Low code tools.
A lot of people have made careers out of them, but a lot of that experience isn't translatable to anything else.
SizzlerWA@reddit
Only NoSQL can be “web scale”. XML, SOAP, CORBA, TextMate.
Qkumbazoo@reddit
Graph databases. They were hyped up pre-covid and pushed as a replacement for every use case, including ACID-compliant DBs. Where is GraphQL being used today?
No_Technician7058@reddit
idk i think neo4j and pg AGE are both pretty popular at this point
papawish@reddit
I've used Neo4j recently for a perfect use case.
First time in my life I encountered such a usecase lol
maleldil@reddit
GraphQL is being used all over the place, but it's not a Graph database.
CoconutDesigner8134@reddit
Flash websites
CompetitiveNight6305@reddit
That’s the one I was trying to remember. Thank you!
chunkypenguion1991@reddit
VR comes to mind as a big flop. Unless they can make them look less like nerd helmets I think that fad is over for the near future. And that was even with Meta blowing billions on the tech
Electronic-Walk-6464@reddit
big bads: OOP SOAP SAML
emerging bads: Agile Cloud Microservices
npiasecki@reddit
After doing this for 20 years, I would say all of it. All of it!
We can build a bridge that lasts 50 years but we as an industry are incapable of building anything that won’t be barely-compilable trash after 10.
VB6, CORBA, CVS, ActiveX, ClickOnce, Applets, FoxPro, Delphi, SOAP, XML, SVN, .NET Framework … when can I get off this bus?
I am so tired of rewriting for API deprecations that basically do the same thing but different, I guess because the new batch of devs didn’t understand how the old one worked.
It’s why I refuse to use the term “software engineer” because we don’t do anything of the sort.
theothermattm@reddit
I agree with you except for bash scripts. At end of the world the only things still alive will be rats, cockroaches and bash scripts.
rashnull@reddit
Flash
No_Technician7058@reddit
osgi
Codrobin@reddit
Silverlight
myevillaugh@reddit
It was a great tool, but it arrived ten years later than when it was needed.
bluetista1988@reddit
I spent the early part of my career rebuilding Silverlight apps as web apps.
It wasn't a horrible idea for its time given that Java applets and *shudders* Flash websites were a thing, but its time came and went.
At one point I think Netflix's web player was Silverlight.
3ABO3@reddit
Oh man I totally forgot about it
Repulsive-Philosophy@reddit
Oh god...
MaCooma_YaCatcha@reddit
While i agree with the majority, blockchain tech is not retarded. Elections can be implemented on blockchain but we are not mature enough (ring signatures). Also NFT got abused, but real world use cases are contracts, especially loan contracts, that can be enforced. This brings us options and futures. All this got overshadowed by money-grabbing-scam-projects.
We could also save signatures of documents on blockchain so we can verify validity.
And im sure there are many more use cases.
ceilingscorpion@reddit
A few I haven’t seen mentioned yet. GraphQL - no one is moving away from REST any time soon.
Due-Ad-2322@reddit
MS Silverlight
Mysterious-Rent7233@reddit
Aspect Oriented Programming
Blockchain
"Structured Programming" in the dogmatic style with no early returns, no breaks, no continue, etc.
XML and SOAP
bushidocodes@reddit
Semantic Web, XML, SPARQL, etc.
jwezorek@reddit
"Knowledge Engineer" when the AI meant "expert systems" in the late 1980s. "Expert systems" were rule-based systems that were supposed to capture the knowledge of a human expert. "Knowledge engineers" were going to be the people who created them, but it was never a real thing. Was always just this weird fad which finally died out when machine learning / statistical approaches of the "connectionist" school AI started having real successes in the 2000s.
Kind_Syllabub_6533@reddit
JavaScript
mad_pony@reddit
What?
Kind_Syllabub_6533@reddit
The full stack javascript dev jobs will never come back the way they were before
mad_pony@reddit
And what is there to fill the niche?
Kind_Syllabub_6533@reddit
Low code / no code solutions. There will still be a place for specialized front end code. Less need for express apps that just construct some JSON from a database query result. Fewer bootcamp jobs overall. Of course I could just be dumb and wrong
mad_pony@reddit
Low code/no code solutions still generate code. The only alternative to JS for frontend is HTML5, but it's not even a programming language, so it's quite limited.
Regarding backend, NodeJS is very popular due to its simplicity and portability; I don't think it will go away any time soon.
PM_ME_SOME_ANY_THING@reddit
👍
JadeBorealis@reddit
I see kubernetes going this route. it is following the pattern of a fad.
- everybody and their mom has to have it
- suddenly it's on every fucking cloud infra job app
- people trying to bandaid it in places it really shouldn't be used
probably gonna get some hate from cloud folks but that's ok
maybe it's just me, but if your problem is killing ants *you don't need a bazooka for it.*
I see it as a red flag when I see a small series A / B startup trying to immediately use kubernetes. like seriously why are you over complicating things so damn early...?
Which_Grass_9043@reddit
So many:
- reactive programming
- full stack JavaScript
- functional programming (for all the things)
- serverless architectures taken to an extreme
Although it’s more interesting to think about the things I thought were fads, and was wrong about:
- docker
- golang
- …probably more I’m forgetting about
3ABO3@reddit
Does React Native count?
3ABO3@reddit
BDD and cucumber tests
alinroc@reddit
Enterprise Java Beans (EJB)
bssgopi@reddit
Virtual Reality and its extensions
I know that it isn't dead yet. But with the hype around Oculus, everyone was convinced that this was the next big thing. Google Glass did something in parallel and convinced people that this could be a game changer in augmented reality, even before the term was invented. In a relatively short time, everyone jumped in: HTC Vive, PlayStation VR, HoloLens, Google Cardboard, etc. The unintentionally funny consequence was Facebook rebranding itself as Meta, showing how deeply everyone believed this could be THE future (which I do believe too, but not in the way it played out).
Unfortunately, consumers didn't jump on the bandwagon with the same excitement as the creators. From my analysis, the reasons boil down to the cost-to-benefit ratio and the perceived value gained overall. This in turn was due to the technology still being immature and competition cashing in on the hype more than genuinely wanting the technology to succeed.
Today, VR / AR / MR are all reduced to just niche products aimed at very specific use cases - industry, luxury, entertainment.
General-Jaguar-8164@reddit
MapReduce
Ok-Introduction-244@reddit
ASP.NET WebForms
If you weren't into .NET maybe you missed this one. Classic ASP was fine, a simple language to generate a string that gets passed to the browser. Like PHP I think.
ASP.NET WebForms tried to recreate the experience of WinForms development on the web, and I hated it. I really did. Maybe I never learned it enough to appreciate it, I don't know. I just avoided it.
I used to be embarrassed to talk about it, because like, everyone knew that it was the cool thing and asp was lame.
Until one day, after years, everyone collectively seemed to wake up and acknowledge that it sucked. Then people would openly mock webforms and say how you should use this other new cool thing.
rudiXOR@reddit
nekokattt@reddit
Several of these are debatable
rudiXOR@reddit
Well true but every single one was overhyped.
BigRedThread@reddit
For all the hype blockchain and crypto were getting (remember the metaverse?) even a couple of years ago, it has almost completely fallen off the map since it just did not have a strong enough case for it
metadaddy@reddit
I’m surprised nobody mentioned WS-*, aka “WS-Deathstar”.
At one point in time, Microsoft tried to persuade developers that applications should exchange XML messages with schemas spread across about a dozen different specifications: WS-Notification, WS-Topics, WS-Addressing, WS-Federation, and all too many more.
It was almost impossible to understand how any of this shit worked, since almost every paragraph of one spec incorporated part of another spec by reference. DRY taken to pathological levels.
sarky-litso@reddit
The newest cross platform UI tool
Pelopida92@reddit
Crypto/web3/nft, NOSQL, GraphQL…
whoji@reddit
Not fields, but these techs all had fad status: MATLAB, Perl, CGI, JSP, Ruby, PHP, SVM (XGBoost and all the other non-NN based ML algorithms and models), Scala, Flash, MIDI, disk, tape readers
TheseHeron3820@reddit
Active records.
mezolithico@reddit
Loosely typed languages. Great for prototyping. Huge PITA for big projects and debugging. Most companies use add-ons for Python, Ruby, and JavaScript (TypeScript) that enforce typing.
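Classic example of what those add-ons catch (made-up snippet; plain JS would happily concatenate the strings):

```typescript
// In loose JS, "10" + 0.5 quietly becomes the string "100.5".
// With type annotations the bad call doesn't even compile.
function addTax(price: number, tax: number): number {
  return price + tax;
}

const priceFromForm = "10"; // form inputs, env vars etc. arrive as strings
// addTax(priceFromForm, 0.5);      // error: string is not assignable to number
addTax(Number(priceFromForm), 0.5); // explicit conversion, checked at build time
```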
Tablessvim@reddit
Template meta programming
Smart_Constant8706@reddit
Hadoop!
EarthquakeBass@reddit
It’s still alive and powering a lot but I get the impression everybody basically just switched to Spark or paid Snowflake.
CubicleHermit@reddit
Spark is very often implemented on top of HDFS, which is one of the core parts of Hadoop, and often used the Hadoop cluster manager (YARN.)
Definitely complementary and not a full replacement, although things moving to cloud services vs running your own on-prem/colo-ed cluster for it is the big reason for many companies that Hadoop is less common.
PM_ME_SOME_ANY_THING@reddit
Everyone wanted to do big data until they realized their data is shit.
mrcarruthers@reddit
Hadoop was a stepping stone to better/faster projects that are very much in use nowadays. People use Hadoop's successors now, but they all owe their origins to Hadoop.
ripreferu@reddit
Not very alive as a project, but still the basis for a lot of big data software used today. Cloudera claims they are still in business with Apache Ozone. The Hadoop ecosystem is still alive. Apache Hive is still widely used. The successor will probably be one of Apache Iceberg, Delta Lake, or Apache Hudi. Nothing clear yet.
Brambletail@reddit
AI
Beneficial-Neat-6200@reddit
Active X
ChubbyVeganTravels@reddit
Silverlight
Brings back nightmares
ScaryYogaChick@reddit
XML
timhottens@reddit
Clean Code. Not in the general sense but Robert Martin’s book and his ideas specifically. So many completely incomprehensible codebases where it takes you 10X as long to understand what’s even happening in your program.
w08r@reddit
Bottom up parsing ceased to be quite so in; still used a lot, of course, but maybe some of the early motivating factors are quelled.
business logic in stored procedures; slightly holy war still, but I think many are gradually giving in to the in-app-logic side
checked exceptions
flash
amunra__@reddit
OOP with class inheritance
raikounov@reddit
At one point, there were various projects that tried to convert code from one programming language to another. I think this was during the period when Python was popular but not performant enough to "use in production".
ChappaSL@reddit
XML
sawser@reddit
Block chain.
A couple years ago my bosses asked if we could implement it and I explained it was just an immutable database, and that we should just have good permission controls and audit controls, along with regular backups to compare with instead.
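For anyone curious what the "immutable" part boils down to: it's roughly an append-only log where each entry hashes the one before it. A toy sketch using Node's built-in crypto (nothing consensus- or chain-specific, names made up):

```typescript
import { createHash } from "node:crypto";

type Entry = { data: string; prevHash: string; hash: string };

const hashOf = (data: string, prevHash: string) =>
  createHash("sha256").update(prevHash + data).digest("hex");

// Append-only log: every entry commits to everything before it.
function append(log: Entry[], data: string): Entry[] {
  const prevHash = log.length ? log[log.length - 1].hash : "genesis";
  return [...log, { data, prevHash, hash: hashOf(data, prevHash) }];
}

// Tampering with any earlier entry breaks every hash after it.
function verify(log: Entry[]): boolean {
  return log.every((e, i) => {
    const prevHash = i === 0 ? "genesis" : log[i - 1].hash;
    return e.prevHash === prevHash && e.hash === hashOf(e.data, prevHash);
  });
}

let log: Entry[] = [];
log = append(log, "alice pays bob 5");
log = append(log, "bob pays carol 2");
console.log(verify(log)); // true - until someone edits an old entry
```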
EarthquakeBass@reddit
I think there was a sense at one point that containers would take over all of the entire SDLC and everything would be dominated by Docker. Instead it ended up being a lot more fragmented and many of us still develop natively, outside of containers then deploy to them because the problems with local dev never got fixed.
fogcat5@reddit
not really a fad, but some technologies change and things become profitable or not - like RISC CPUs, which looked like the future back in the early 90s, or bubble memory, which has been just around the corner for the last 30 years, or holographic storage, or tape storage like 9-track but super high density with lasers.
yojimbo_beta@reddit
Object orientation
/Ducks a chair being thrown
AcademicGlass1995@reddit
Don't come @ me, but blockchain beyond cryptocurrency!?
sneaky-pizza@reddit
A lot of “fads” had people doing the work before, and people still doing the work after the fad passed. It’s almost like a cool label gets put on something, it becomes a fad, then the next label comes along.
Gh0stSwerve@reddit
Big data runs the world noob
mad_pony@reddit
Cloud computing and big data technologies are corporate standards for many companies these days.
imFreakinThe_fuk_out@reddit
Blockchain 100% and calling stuff ai that is not actually ai.
Professional-Lab7907@reddit
NoSQL, Blockchain, Metaverse, BPMN, Serverless
mad_pony@reddit
NoSQL and serverless are hugely popular
PragmaticBoredom@reddit
I think your definition of a “fad” is excessively broad. Cloud computing is more popular than ever. Parallel processing is everywhere now that our CPUs have 10 cores or more.
playerNaN@reddit (OP)
> I think your definition of a “fad” is excessively broad. Cloud computing is more popular than ever. Parallel processing is everywhere now that our CPUs have 10 cores or more.
True, I just mean something that got a lot of attention all of a sudden and then was no longer the "cool" thing to go into
dmazzoni@reddit
I don't think most people got into those areas because they were "cool", they got into them because they solved real problems that people were having.
Prior to the cloud, we were spending a lot of time managing datacenters. Now we can let someone else do that part so we can focus on our core business. Totally worth learning kubernetes to make that possible!
playerNaN@reddit (OP)
Experts got into them because they solved real problems that people were having, but I'd argue most new grads got into them because they were also "cool"
grain_delay@reddit
Nah working on a cloud service is still pretty cool. Building services to handle 10s of millions of requests per second with 99.999% avail is not easy
playerNaN@reddit (OP)
There's a difference between cool and "cool" lol. Solving real world problems is cool, internet blogs saying something is the next big thing is "cool," but cool and "cool" aren't mutually exclusive, internet blogs can be correct.
grain_delay@reddit
Ah shit we still talking blogs
playerNaN@reddit (OP)
It's all medium posts now (or it was, maybe I'm out of touch lol)
aphelion404@reddit
They were new areas that opened up. Now, in a lot of ways "Cloud" (for example) is either 'just' Distributed Systems or a set of tools, depending on whether you're on the builder or consumer side, but knowing those tools can still have real value for a lot of enterprises. Early on though, there was an intermediate world of figuring out what the patterns and tools should be.
Nowadays, most of those pieces have been figured out, and unless you're working at a hyperscaler or someone otherwise building their own data centers, you're probably primarily building atop those tools and have simply integrated them into some version of a product engineering flow or an infra/platform for product, depending on the size and scale of the company.
Antique-Echidna-1600@reddit
Modern tech stacks are a fad. lol
bravopapa99@reddit
"AI", Winter is coming, again.
BlinisAreDelicious@reddit
Everything spec'd out in UML.
Sequence diagrams are nice. I keep those
drguid@reddit
TDD (probably).
I've never worked anywhere that used it a lot. I just think it's too costly to write and then maintain all that code.
fuzzynyanko@reddit
I only know a handful of developers that use it, and it's used at specific times. It's mostly used when it's a pain to get the code to actually run. This is where it excels. In one case, it involved working with a .DLL that gets registered with Windows, and if you manage to screw up, you need to reboot.
With me, I often cuss and then switch over to TDD when the situation comes up where it's beneficial. In another case, I was doing an incredibly difficult job of bug whack-a-mole, and TDD was the fastest way to fix it, working closely with the QA, and especially thanking her for her patience
kingmotley@reddit
Codeless programming. That was so awesome.
Regular_Zombie@reddit
Some things are 'fads' because they're new and shiny and enjoy a surge of popularity before fading: blockchains, NFTs, NoSQL.
For lots more things the novelty fades but we still have them and they're large parts of the industry: mobile apps, big data, cloud computing.
There is a third category of bad ideas which seem to come and go in cycles and maybe one day will deliver: low-code, AI.
Material_Policy6327@reddit
Cloud and big data just became “normal” enterprise stuff. Blockchain for sure, just because it seemed to springboard off topical use cases. ML is in a fad phase but it’s quickly becoming enterprised, so it will just be another business function of CS
metaphorm@reddit
I don't accept your premise.
Cloud Computing is still extraordinarily popular and the large public cloud providers (AWS, Msft Azure, GCP, Oracle Cloud, etc.) are industry leaders used as infrastructure for a huge number of companies.
Parallel computing is arguably more important now than at any previous time. Nvidia's market cap alone should be a testament to that.
Big Data has become a standard part of corporate business intelligence. Access to cloud computing is what made this possible in the first place. The continuing success of companies like DataBricks is strong evidence of the established importance of this.
You seem to be conflating "amount of internet and media discourse about the subject" with whether or not something is a fad. Don't get distracted or misled by the ever-shifting chattering of the media. It's bullshit. Just look at the business fundamentals.
IdealBlueMan@reddit
Ada as the programming language for US DoD work.
Natural language as programming language.
roger_ducky@reddit
Everything is a “fad” by your rephrasing.
It’s not that we use one thing or another for no reason.
Buying and maintaining your own Hardware too expensive and renting computing time much cheaper? You do that.
Buying your own hardware way cheaper than the computer/time rentals? You do that.
You want the capability of a mainframe but it’s too expensive, and you have a lot of smart CS peeps assuring you they can build a better system using a cluster of PCs? You let them try.
Database engine everyone is using can’t keep up with the volume you had? You change it to something else.
Your app clients doing a bunch of over-fetching because your API endpoint people can’t keep up with the backlog of specialized endpoints for every single screen? Hope you’ve got an architect that can fix that, so clients get exactly what they want while freeing up the backend team’s backlog.
I intentionally didn’t mention any names because the reason is more important than what it is. Any sales brochures touting the virtues of a system beyond those original reasons never mattered, which is why everything seems “faddish.”
Managers who just heard that a famous company did something decided they should do it too, because that’d somehow make their company like that one. Most of them failed to understand the original reasons why it made sense there, so it gets implemented, shown to suck for their specific use case, and eventually scrapped.
Other people that had similar reasons use those exact technologies happily without any issues. It’s not a “fad” for them.
srb4@reddit
CORBA
MagicianSuspicious@reddit
Class-oriented design, and in particular implementation inheritance.
WholeRazzmatazz7658@reddit
I think all of those things are super useful and work well when they're chosen as the right tool for the job, but this industry has a tendency to prescribe them as a cure-all for every problem space. I would add JS frameworks, functional programming, object oriented programming and I think we'll look back on generative AI in the same way. They're all great in the right application, but they aren't going to solve all of the world's problems.