Which present-day technologies are here to stay because they're good, which will stay because of switching costs, and which will go?
Posted by Antares987@reddit | ExperiencedDevs | View on Reddit | 52 comments
This is tangentially related to this topic: https://www.reddit.com/r/ExperiencedDevs/comments/1gqh5yk/what_are_some_past_fad_fields_of_computer_science/
We've all seen things come and go. Sometimes the best technologies don't last, as it is natural to regress toward the mean -- things that are awesome often miss adoption, their developers lack the ability to market or support them, and better ideas are met with opposition due to the culture of the organization. The old accountants I know talk about how superior Lotus 1-2-3 was to Excel. Delphi was clearly superior to Visual Basic, and we are fortunate that Microsoft sniped Anders Hejlsberg from Borland. Years ago I spent a few days doing transition training in one of my airplanes with Geoff Lee, the Lee in Britton Lee. Sybase came out of his team's work, and that lineage is now SQL Server.
The Mythical Man-Month was every bit as true in 1975 as it is today. The IBM studies from the 1970s cited in this 1985 advertisement for the SGI IRIS workstation hold every bit as true today as they did back then: https://www.youtube.com/watch?v=9EEY87HAHzk . And Jim Clark could have been what Elon Musk is if he had worked a little harder, though in those days he might have become another John DeLorean, as flying too close to the sun back then was a real problem. Fortunately, the people supporting the sun of those days are now retired or dead.
Richard Feynman was one of the first to work with computers at Los Alamos, and what he describes in his lecture "Los Alamos from Below" holds every bit as true today as when he observed it in the 1940s. https://www.youtube.com/watch?v=uY-u1qyRM5w
I remember reading an anecdote from (I believe) Bill Atkinson where he talked about Apple engineers drawing networks as point-to-point meshes, and he felt like he missed the boat on what Clark picked up on when he left SGI to found Netscape with Marc Andreessen. The idea of backbones and trunks like Sun Microsystems used was not how they thought. He was on the right track with HyperCard, and the HyperCard home page evolved into what the iPhone home screen is today.
HyperCard in 1986:
The "Phone" stack (stacks were to HyperCard what apps are to the iPhone today) would open a dialer and dial out through the modem; your desk phone plugged into the modem and passed through to your phone line. Some of us still had PULSE dialing back then for rotary-dial phones. If you're unfamiliar with the pain of rotary phones and Rolodexes, here's Louis CK explaining what it was like to actually use a rotary dial phone: https://www.youtube.com/watch?v=WQyF099TKM8
We all hear about how Xerox gave away the kitchen sink when it allowed the team from Apple to tour its facility, but perhaps that work would have gone nowhere otherwise, as happens all too often. Those Xerox engineers should not have felt like they gave up the keys to the castle, but rather found a way to be happy seeing what they seeded.
Jack Slocum changed the world with ExtJS, and that brought about the monolithic javascript frameworks that are now pervasive. I had lunch with him a few years ago and when I told him what I didn't like about the newer versions, he disclosed that was when he left the company. I could not have given a developer a better compliment than to have called out his work versus those that followed, like in Boondock Saints, "Good shooting, shitty shooting."
The topic that I mentioned at the start of this post called out noSQL.
As of right now, I'm thinking that silicon-based AI is going to be short-lived. Florida has banned lab-grown meat, which suggests it was at, or approaching, commercial viability, and had been for long enough to make it through the legislative process here in Florida. The question I ask is: if we can grow meat, we are growing cells, and we can grow neurons. The thermodynamic efficiency of neurons is on the order of 10^9 times that of silicon. And in the original story of The Matrix, the line "Fate, it seems, is not without a sense of irony" referred to humans being used not for energy but for computational power, though "a penny saved is a penny earned" and the net result of efficient biological computing would be the same as using us for energy.
But on the more common things, I believe there's a lot of hoopla around cloud-based microservice "architecture" (I hate the terms architecture and engineering when it comes to tech). I feel that what happened is that it worked for Amazon, and Amazon started offering it to other organizations. Microsoft saw the cash cow that it is and got in on it. It appealed to developers and large organizations because it could be used to remove the political barriers created by network admin departments, and the "microservice" nature of things also seems to match the Unix paradigm of small, simple programs.
The "tell" for this is that when I got my SQL Server certifications decades ago, those who made the exams put a lot of effort into ensuring that people knew how to efficiently load data and tune indexes for performance. BULK INSERT and bcp still scream, and there are tricks to really make them perform. Most bright younger developers I interact with have never heard of these techniques and know little about how data is stored. MS pushes SSIS instead. Why? Well, when running shit on the cloud, if it runs like shit, customers simply scale up infrastructure, often at great expense.
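For anyone who hasn't seen those techniques, here's a minimal sketch of the idea; the table, file path, and connection details are made up, and pyodbc is just one convenient way to drive it:

```python
# Minimal sketch (hypothetical table/file names) of a set-based load: one server-side
# BULK INSERT with a table lock and large batches, instead of row-by-row INSERTs
# issued from application code.
import pyodbc

cnxn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;DATABASE=Sales;"
    "Trusted_Connection=yes;TrustServerCertificate=yes"
)
cur = cnxn.cursor()

# TABLOCK plus a heap (or empty clustered index) and SIMPLE/BULK_LOGGED recovery
# lets the engine do a minimally logged load, which is where the speed comes from.
cur.execute("""
    BULK INSERT dbo.OrderStaging
    FROM 'C:\\loads\\orders.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', TABLOCK, BATCHSIZE = 100000);
""")
cnxn.commit()
```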
So, now, with all that prefaced, here's my take: Agile development methodologies, Docker, Kubernetes, NoSQL (as mentioned earlier), Redis, RabbitMQ, large javascript frameworks, et cetera, are not long for this world.
leafygiri@reddit
Thank you so much for a well-written article. It's a nice read while relaxing after a Sunday lunch.
I think Docker is here to stay. The concept of containerization is going to last. Redis and RabbitMQ are going to last too.
Large JavaScript frameworks may not last another decade. Depends on whether another programming language becomes available for front-end development. Some of my colleagues are praising Rust (can be compiled to web assembly) as an alternative to JavaScript. I guess time will tell.
I don't have a strong opinion about the others.
Antares987@reddit (OP)
I appreciate the acknowledgement. I agree with you on lightweight virtualization. Part of me wonders if microservice pipelines won't be replaced by neural networks, which may reduce the demand for virtualization. I spoke in another comment about voting engines like those on autopilots, where multiple systems get a vote on an action so that if one is out to lunch it doesn't upset the aircraft.
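A toy sketch of that voting idea (the numbers and names are made up):

```python
# Toy 2-of-3 voter: take the median of three redundant channels so a single
# out-to-lunch reading cannot drive the output.
def vote(a: float, b: float, c: float) -> float:
    return sorted((a, b, c))[1]  # median of three

# One channel is wildly wrong; the voted output stays sane.
print(vote(101.2, 99.8, 4096.0))  # -> 101.2
```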
Here’s what I’m thinking. A lot of the data that’s out there in cloud systems is stored using noSQL models. As the cost of storage drops and volume of stored data goes up, massive tokenization of complex data can take place. AI systems can subconsciously handle this tokenization to the point where requested information is validated against a hash and only as a last resort is the so-called stored graph rebuilt.
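One concrete, if speculative, way to read the hash-validation part is content-addressed storage: look data up by its digest, verify it against that digest on the way out, and only fall back to rebuilding the graph when verification fails. Everything below is a hypothetical sketch:

```python
# Hypothetical sketch of content-addressed lookup with a rebuild fallback.
import hashlib

store: dict[str, bytes] = {}

def put(blob: bytes) -> str:
    key = hashlib.sha256(blob).hexdigest()
    store[key] = blob
    return key

def get(key: str, rebuild) -> bytes:
    blob = store.get(key)
    if blob is not None and hashlib.sha256(blob).hexdigest() == key:
        return blob      # fast path: requested information validated against its hash
    return rebuild()     # last resort: regenerate the stored graph

key = put(b"serialized graph")
print(get(key, rebuild=lambda: b"recomputed graph"))
```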
CVisionIsMyJam@reddit
here are my opinions.
Agile: some version of this is here to stay, but "scrum master" as a full-time position that people would be hired for and do nothing besides facilitating agile is gone forever.
linux namespaces & docker are here forever. way too much money tied up in this now for it ever to go away. containerization is a useful tool that likely hasn't been fully exploited yet, dev-containers are still a relatively new thing, for example.
Kubernetes: here forever. it's an IaaS-agnostic deployment target. anyone building an enterprise version of their SaaS product inevitably lands on docker container(s) & helm chart(s) as the point of hand-off.
NoSQL is a very broad term, so it's hard to address. MongoDB-like NoSQL will remain popular among tutorial writers as well as for niche use-cases. I haven't heard people talk about using these kinds of db as primary data stores since 2018 or so.
NoSQL databases which focus on things like performant full-text search, graph queries, etc. will continue to be used where an rdbms is not sufficient for one reason or another.
Redis: here forever; it's simple, easy to use and does the trick for a lot of simple caching use-cases. I think if they were to make a shim to allow clients to treat redis as if it was a kafka cluster it could supplant kafka as well. "redis streams" isn't quite there.
RabbitMQ: could see this dying off. it's not at the top of any benchmarks and IMO there are better options these days; redpanda & redis to name two.
Large JavaScript frameworks: there are a large number of programmers who do not know any kind of development outside the context of large javascript frameworks. here forever.
deadron@reddit
RabbitMQ provides durable queues and full transaction support. Redis is all in memory. It's not really fair to compare the two. With that said, cloud versions have so few outages it's less of an issue these days.
CVisionIsMyJam@reddit
redis supports persisting to disk as well actually
deadron@reddit
I am not overly familiar with this feature in redis, but durable has to do with guarantees that things won't be lost. I am unsure if this feature really offers that
CVisionIsMyJam@reddit
i meant redis isnt necessarily all in memory, it has persistence options as well. i do agree its somewhat unfair to compare them but i find rabbitmq sits pretty awkwardly between kafka / redpanda and redis utility-wise.
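For reference, a small sketch of the persistence options in question, via redis-py (these are normally set in redis.conf); whether they count as "durable" for a given workload depends mostly on the fsync policy:

```python
# Sketch of the two Redis persistence mechanisms: the append-only file and
# RDB snapshots. Host/port are assumptions for a local instance.
import redis

r = redis.Redis(host="localhost", port=6379)

# AOF: every write is appended to a log and fsynced per the policy below.
r.config_set("appendonly", "yes")
r.config_set("appendfsync", "everysec")  # up to ~1 second of writes can still be lost

# RDB: point-in-time snapshots on a schedule, or on demand.
r.bgsave()
```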
cortex-@reddit
I could see tech like Kubernetes and Redis eventually being replaced with more performant/economical/featureful alternatives, as infrastructure needs and tastes change, while maintaining the original APIs.
CVisionIsMyJam@reddit
possibly but I think it's unlikely for k8s. k8s has a quick release cycle so keeping compatibility with it would be tough for any rewrite. I do think something more lightweight / stripped down for edge devices has potential and there is development happening in this space, but nothing that's really wow'd me yet.
If we consider technologies being replaced by lighter weight alternatives, docker is already on its way out, with oci-compliant containerd serving as its replacement in many projects now.
redis has a ton of money backing it so it does seem difficult to replace it as well. but I could see it happening. tbh I feel like the redis team added tons of features specifically to make it difficult for small teams to replicate their api.
LightofAngels@reddit
How is redis streams not there yet? I am quite curious 👀
CVisionIsMyJam@reddit
it's mostly just stuff like no custom retry logic for messages, no way to do strict ordering of processing out of the box, and stream-level partitioning basically left as an exercise for the application developer, in comparison to kafka where it's out of the box.
love redis streams I just think they could extend it to support kafka clients with a bit of extra dev.
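To make the comparison concrete, here's a small sketch of a Streams consumer group with redis-py (stream and group names are made up); the pieces Kafka gives you out of the box -- partitioning and a redelivery policy -- are the parts left to the caller here:

```python
# Redis Streams consumer-group sketch; partitioning and retry logic are manual.
import redis

r = redis.Redis()
stream, group = "orders:0", "billing"  # "partitioning" = the producer picks the shard

r.xadd(stream, {"order_id": "42", "amount": "9.99"})

try:
    r.xgroup_create(stream, group, id="0", mkstream=True)
except redis.ResponseError:
    pass  # group already exists

# Read new entries for this consumer. Unacked messages sit in the pending list;
# any redelivery/backoff policy (XPENDING / XAUTOCLAIM) is the application's job.
for _, entries in r.xreadgroup(group, "consumer-1", {stream: ">"}, count=10):
    for msg_id, fields in entries:
        print(msg_id, fields)
        r.xack(stream, group, msg_id)
```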
GiannisIsTheBeast@reddit
Interested as well since the product I work on makes use of them. They work for what we need.
Antares987@reddit (OP)
Until Blazor, with the exception of the now defunct Silverlight, I haven't been impressed with a web framework since the rendering model of WebForms, which Microsoft got really right. The event model and how they integrated it with Visual Studio ruined it in my opinion. I can defend my position if someone disagrees, and I believe that I can change others' opinions on this (it's also a blueprint on how to change from WebForms to a newer stack). And, with Blazor, we got WebAssembly and Blazor Server, which I think will pull us away from the large javascript frameworks -- and not even in the .Net sense of WASM, but WASM in general. Yes, I agree that those javascript frameworks are not going anywhere, but not because they're better -- no more than how those of us who did web development during the days of Netscape, IE5 and Chrome existing simultaneously dealt with rendering issues.
I'm not knocking Docker / Kubernetes, but I see those getting out of hand frequently. Probably my disdain for those goes hand-in-hand with premature abstraction and the use of multiple containers where probably one custom container would make more sense in many cases -- in the Elon Musk sense, "delete, delete, delete". And perhaps the issues are more related to there not being a strong push for antidotes to the toxic psychology that causes such environment creep. In Feynman's "Los Alamos from Below" that I posted in the OP, he talks about the "disease" that afflicts people in computing, which I find all too relevant today.
That also brings me to Reddit. Reddit is very much like USENET, which I dearly miss.
And, for a little more nostalgia, two products from the past that I reminisce about for how good they were back in the early 90s were "FirstClass" and "TeleFinder": https://en.wikipedia.org/wiki/FirstClass and https://en.wikipedia.org/wiki/TeleFinder . In the days of dial-up and Legend of the Red Dragon, both allowed simultaneous file transfers while doing browsing-type activities.
john_rood@reddit
If you compare frameworks by performance, you’ll find that Blazor actually does far worse than most https://krausest.github.io/js-framework-benchmark/2024/table_chrome_130.0.6723.58.html Have you tried any of the smaller frameworks like Svelte or Solid? If you’ve been in .NET world for a while, you might recall Knockout.JS, which used fine-grained reactive updates. Solid and Svelte are similar in that regard.
Antares987@reddit (OP)
Thank you for this. I bookmarked this page. I've only worked with Blazor Server for any sort of production application, with the exception of hosting WASM within flash storage on the ESP32. The thing I like about Blazor Server is that the SignalR/WebSocket wrapper "just works", and it's efficient without the header overhead we have with RESTful services.
I feel a bit sad to see some of my comments downvoted and I tend to get a lot of hate from what I perceive to be developers who did not pay their dues in the same way. At one point I fought like hell with Joe Celko on USENET until I realized he was correct.
Speaking of performance of browser-based technologies, there were no SPA web applications in 2000. The closest thing to the state of the art back then is what eBay is today. I would use hidden IFRAMEs to send requests and use javascript to talk to the container. Here's an old technology demonstrator that I wrote in December of 2000. The site is still live, and has been since 2002, but unfortunately, newer Edge browsers won't run it, even in compatibility mode, which is why I posted the YouTube video. If someone really wanted to see it run, they can do it using a VM running XP or maybe an early build of Windows 7, but it shows that it was possible and this performed on an early Pentium machine. This was an IE5.5 tech demonstrator I wrote at 20 years old. https://www.youtube.com/watch?v=CnH9TVoHc1o
HowTheStoryEnds@reddit
I sort of miss IRC and old fashioned forums. I really dislike the discordisation of technical communities. There's not that much difference to it though so I just ascribe it to me growing old and fat and old and...
hitchdev@reddit
Im pretty sure unit testing frameworks will die off at some point and be supplanted by something that can do separation of concerns right. Currently the state of automated testing is a bit like pre-framework PHP - a mess of duplication and poor abstractions.
deadron@reddit
It's really weird to hear this. Unit testing frameworks barely do anything as it is; it's still relatively new that some of the frameworks support basics like parameterization or use more than a method name to give context to a test. The biggest change I have seen is an emphasis on testing from the ground up in bigger frameworks. This primarily affects integration tests though, as unit tests should be running primarily on smaller pieces of code. It's much easier these days to have a test start an entire stack and run a test against it.
Antares987@reddit (OP)
In my experience, testing frameworks are a lot like designing software through flow charts. While there are pieces of business logic whose paths and behaviors can certainly be diagrammed, and flow charts might have had some practical uses when designing logic in small memory spaces with logic ICs or in machine language, they very quickly fall short when dealing with pipelines of set-oriented logic that could be thought of the way an optical device has stacks of lenses and prisms. If the software needs a redundant algorithm to verify it's working, instead of writing a test, write a second implementation and have it throw an exception when they disagree, as the concentrated effort that would go into testing could be used for refinement elsewhere in the product.
That last sentence makes a lot of sense. I'll go one further and suggest that the testing framework itself not validate the output but, per what I mentioned in the previous paragraph, run the product through automated means, have redundant processing throw an exception when there is disagreement, and have top-level exception handling do the reporting.
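A minimal sketch of the redundant-implementation idea (the invoice example is made up):

```python
# Two independent implementations guard each other at runtime; a disagreement
# raises and bubbles up to top-level exception handling.
def total_fast(prices: list[float]) -> float:
    return sum(prices)

def total_reference(prices: list[float]) -> float:
    total = 0.0
    for p in prices:  # deliberately simple second implementation
        total += p
    return total

def total(prices: list[float]) -> float:
    a, b = total_fast(prices), total_reference(prices)
    if abs(a - b) > 1e-9:
        raise RuntimeError(f"redundant implementations disagree: {a} vs {b}")
    return a

print(total([19.99, 5.00, 0.01]))
```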
Without fail, no matter how much testing is done through automated means or otherwise, there will always be issues that were not predicted. It goes hand-in-hand with something I frequently say: The cost of performing the cost-benefit analysis often exceeds the cost of implementing all proposed approaches.
deadron@reddit
This has not been my experience. In theory a high-level integration test framework COULD be like this, but testing frameworks like junit, mocha, MSTest, go test, and others just act as a way to execute small bits of code with little to no structure. There are higher-level frameworks like Robot, but the problem with the high-level frameworks is they require a lot of fiddly code to be implemented underneath them and performance is awful.
Antares987@reddit (OP)
I occasionally will use unit tests in certain types of projects to validate a tight algorithm for parsing, bit-banging, et cetera when it can be faster to develop against the test rather than write a console application or script to test the functionality while I can focus clearly on it. If it's a case of "well, I can develop it against a test with the same or less effort than building a small application", then I'll develop against a test.
hitchdev@reddit
Why is it weird? Testing is a mess, people hate doing it, and there's so much that could be automated and isn't. There's a mountain of potential improvements here that could be built upon better abstractions.
On top of that I personally feel that documentation and testing ought to undergo a merger. It isn't a widely rejected idea, but it's one I find people struggle to think about (as people in 1993 might have about, say, cloud computing).
yojimbo_beta@reddit
That's an interesting take.
One of the ideas bouncing around my head is a programming language where hexagonal architecture and TDD are front and center. Basically, if you can model your module system to make injecting dependencies a first class citizen, and your type system encourages separating effects from logic, pretty soon you nudge your users into very testable code.
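A rough sketch of what that might look like, in Python rather than a purpose-built language (all names here are hypothetical): the port is an explicit interface, effects live in adapters, and the core takes its dependencies as values, so a test just hands in fakes.

```python
# Hexagonal-ish sketch: pure logic separated from effects, dependency injected
# as a first-class value through an explicit port.
from dataclasses import dataclass
from typing import Protocol

class PaymentPort(Protocol):            # the port: only what the core needs
    def charge(self, cents: int) -> bool: ...

def order_total_cents(items: list[int]) -> int:  # pure logic, trivially testable
    return sum(items)

@dataclass
class CheckoutService:                   # core logic with its dependency injected
    payments: PaymentPort

    def checkout(self, items: list[int]) -> bool:
        return self.payments.charge(order_total_cents(items))

class FakePayments:                      # test adapter; a real one would hit an API
    def __init__(self) -> None:
        self.charged: list[int] = []

    def charge(self, cents: int) -> bool:
        self.charged.append(cents)
        return True

assert CheckoutService(FakePayments()).checkout([199, 300]) is True
```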
What do you mean by separation of concerns, though, with regard to testing? Do you mean the very blurry lines between different levels of tests and where developers draw the boxes? Or something else?
Antares987@reddit (OP)
One thing I've noticed in younger developers, many of whom are absolutely brilliant, is this putting of testability at the forefront of design. On the surface it sounds great, but what I see in reality is that dependency injection and inversion of control come at the cost of encapsulation, and it turns into code bases that reach a level of complexity where the developers code themselves into a corner, unable to find anyone capable of taking over the project because it requires too much intimate knowledge across the entire system.
I believe strongly in automated testing for component-level functionality where there are a large number of components that match a particular interface. An example would be something that does tax calculation for shipping from one address to another. Does the shipping address require taxes to be collected? What about the seller's address, the recipient's address, or the address of the buyer? This is a narrow example, but if one is writing an image processor that performs some sort of color correction and needs to do it efficiently, or a Sudoku solver, or Fourier transforms on market data, the complexity, development time, and costs go up exponentially.
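A small sketch of the narrow case where I think tests earn their keep (the interface and rules are hypothetical): many components implement one interface, and a single table of cases drives all of them.

```python
# Many small calculators behind one interface, exercised by a table of cases.
from typing import Protocol

class TaxRule(Protocol):
    def applies(self, seller_state: str, ship_to_state: str) -> bool: ...

class OriginStateRule:
    def applies(self, seller_state: str, ship_to_state: str) -> bool:
        return seller_state == "FL"

class DestinationStateRule:
    def applies(self, seller_state: str, ship_to_state: str) -> bool:
        return ship_to_state in {"NY", "CA"}

CASES = [  # (rule, seller_state, ship_to_state, expected)
    (OriginStateRule(),      "FL", "GA", True),
    (OriginStateRule(),      "TX", "GA", False),
    (DestinationStateRule(), "FL", "NY", True),
    (DestinationStateRule(), "FL", "GA", False),
]

for rule, seller, ship_to, expected in CASES:
    assert rule.applies(seller, ship_to) is expected, (rule, seller, ship_to)
```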
I frequently talk about John Carmack, who I imagine spending late nights hand-tuning his 3D rendering algorithms in assembly to get Wolfenstein 3D to run on the PC processors of the day. I could not imagine what life for him would have felt like if he was hired as a corporate developer in today's environment.
Downtown_Football680@reddit
Postgresql is here to stay.
waffleseggs@reddit
I expected this to be at the very top.
Fast_NotSo_Furious@reddit
AI as we know it today is unsustainable in its use of fresh water. With the growing climate crisis, unless these large corporations plan on building their own desalination plants, I wouldn't be surprised if in the next 2 years it's done due to government intervention and cost.
Impbyte@reddit
You do know that data centers don't consume/destroy water, right? They use water to cool things, but it's a closed loop and the water isn't contaminated or anything like that.
I have no idea where you heard that we are running out of water either, everything in your post is either ignorance or a result of misinformation.
I really want to drive home the point. Water is never created or destroyed, only moved.
Fresh water is not scarce and never will be.
Water scarcity is a per-region issue and fluctuates due to weather patterns.
Antares987@reddit (OP)
The ideologues are quick to show up and downvote your well-thought-out post. Those pushing for renewables are also unaware of the thermodynamic inefficiency of on-demand peaker plants or of running combined cycle at lower efficiency.
A brief summary of the argument goes like this: combined cycle power plants are roughly 60% efficient, with about 50% of the fuel energy reaching the fence. The problem is that reaching that level of efficiency takes a long time and a lot of operational planning. Demand can be forecast to within 1-2%. When demand exceeds the output, low-efficiency "peaker" plants must be brought online. Let's think of those as being 25% efficient, with everything else released as heat, just like the waste energy from combined cycle that doesn't reach the grid.
While demand can be accurately forecast, the availability of renewable supply cannot, so when the wind stops blowing or the sun stops shining, peaker plants must take up the slack and/or combined cycle plants must run at lower efficiency so the power is available to come online.
These numbers are less than perfect, but good enough to get the point across. If 50% of your power comes from combined cycle and the other 50% is supposed to come from renewables, and those renewables fall short to where they only produce half of what they're supposed to, then half of that 50% -- 25% of total power -- must be produced by peaker plants that have half of the combined cycle efficiency. In other words, 50% combined cycle plus 25% peaker consumes the same amount of fuel as 100% combined cycle with no renewables on the grid.
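Running the same rough numbers in one place (the efficiencies are the assumptions above, not measured figures):

```python
# Back-of-the-envelope check: fuel burned = delivered energy / efficiency,
# summed over sources; renewables burn no fuel.
def fuel_needed(demand, mix):
    """mix: list of (share_of_demand, efficiency); efficiency=None means no fuel."""
    return sum(share * demand / eff for share, eff in mix if eff is not None)

demand = 1.0  # one unit of delivered energy

all_combined_cycle = fuel_needed(demand, [(1.00, 0.50)])
renewables_fall_short = fuel_needed(demand, [(0.50, 0.50),    # combined cycle
                                             (0.25, None),    # renewables delivered
                                             (0.25, 0.25)])   # peakers cover the gap

print(all_combined_cycle, renewables_fall_short)  # 2.0 2.0 -> same fuel burn
```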
Germany is learning this right now. There is a crossover point with efficiency and renewables where fuel consumption actually goes UP and not down with more renewables going on the grid.
Fast_NotSo_Furious@reddit
Sure man, you continue believing that and I have a bridge to sell you.
CVisionIsMyJam@reddit
This is super nitty, but it's not even close to viable. The growth medium alone costs almost $20k per kilo. At the moment it's not clear it will ever be viable.
iamiamwhoami@reddit
Also we may be close to being able to grow neurons, but being able to program neurons is still in its infancy.
Antares987@reddit (OP)
I'm under the impression that they program themselves. "Life finds a way."
Abadabadon@reddit
Less than $10/lb for ethically sourced meat doesn't sound bad.
ChimataNoKami@reddit
It’s not ethically anything. The resources to produce the growth medium and all the energy inputs far exceed the strain of traditional farming practices.
Abadabadon@reddit
I was speaking more so to the idea of not having to kill sentient animals for our happiness. Not ethically as in "good for the economy".
ChimataNoKami@reddit
I’m not going to argue against a vegan who doesn’t understand evolution
Abadabadon@reddit
if you don't want to argue then don't enter the conversation with an argument bye
No_Technician7058@reddit
its more like $30/lb optimistically
Abadabadon@reddit
How so? Produce has like one of the lowest profit margins
Antares987@reddit (OP)
I was unaware of this, but hearing that it was even a thing got my senses tingling. The idea of AI, integrated with test parameters and large quantities of neurons, figuring out how to communicate with them -- whether it kills the cells or not -- got my imagination really churning.
Xanchush@reddit
For larger companies there's a slow shift away from tools like Postman and Insomnia due to security concerns. Opting for internal tooling to conduct API calls or testing. I wouldn't be surprised if there's an open source tool similar to swagger/OpenAPI that will be adopted in the future.
Some of these tools masquerade as open source but hide a lot of the functionality behind subscriptions which is unfortunate but understandable.
HTTP404URLNotFound@reddit
C and C++. Considering how much code exists out there including every popular operating system, web browser, and many pieces of core libraries and infrastructure, I don't see C and C++ going out of style without lots of time and money invested in replacing those things. There has been a lot of discussion in various spaces about the need for memory safety with companies investing in Rust, Swift, and other languages, so I think the likely path is more greenfield development happening in those languages and coexisting alongside actively maintained C and C++ codebases.
Antares987@reddit (OP)
I've never been able to wrap my mind around C++, and Linus Torvalds' opinion of it warmed my heart. C is just an absolutely amazing language. I remember one coworker at IBM saying "I love C, it'll warn you that you're doing something wrong, but it will let you do it." I conceptualized a stack-based language and compiler while driving a couple of decades ago, but my attitude is generally "nobody will use it, why bother?" when it comes to creating some boutique language. I don't care to learn someone else's language. If I could find a database problem big enough or could find someone willing to put forth the resources, I believe I could develop a database technology that would absolutely crush throughput on TPC-C.
I agree with your opinion. Tight algorithms written in C with proper input validation are the way to go. There will always be vulnerabilities, and oftentimes those vulnerabilities cannot be prevented just by checking buffers or by the choice of language. I predicted the Thunderbolt and virtualization vulnerabilities years before they made the news because I understand DMA and streaming.
If you're old enough, you'll remember back in Windows 95 and earlier, when you had a CD-ROM there was a checkbox "Enable DMA Access". At the hardware level, DMA fucks. To explain in simple terms, processors run with a clock and you've got pins that connect to memory. While modern RAM writes in large blocks, let's think of it as standard address and data buses. The easiest way to think of it is as two implementations of memcpy. One iterates through an array and copies byte by byte, using CPU cycles in the loop; the other gives the hardware an instruction to copy X bytes starting at address Y to address Z, utilizing the clock and hardware to do the copy in the background while your program runs. Then, when you need to access it, you just block until the memory copy operation has completed.
However, if you have some form of virtualization running that can execute this instruction and the hardware doesn't fully support protecting the virtualization, you can get around checks in the virtualization engine and overwrite (or read) contents of memory you shouldn't have access to. Where this is super valuable is you could have, say, a camera sensor and RAM and have the clock that's reading the camera sensor memory pipeline the data directly to RAM without having to use costly CPU instructions to perform the operation.
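As a loose analogy to the two memcpy styles described above (this is just Python, not DMA): an explicit byte-by-byte loop burns cycles in the caller, while a single request describing the whole block hands the copy off to optimized machinery underneath.

```python
# Loose analogy only: per-byte copy vs. one request for the whole block.
src = bytearray(b"camera frame data" * 1000)
dst = bytearray(len(src))

# "CPU copy": the caller touches every byte itself.
for i in range(len(src)):
    dst[i] = src[i]

# "Block transfer": one operation describing the whole copy.
dst[:] = src
```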
MagicianSuspicious@reddit
SQL databases were here when I started my first job in the early 90s, and they'll be here long after I'm gone.
I think git is likely to be around for a long time -- it does pretty much everything that a source control system needs to do, and it's generally good for the commons to have a de facto standard dvcs. Younger folks would be horrified by the source control systems we used back in the day.
Re "agile development methodologies" and "NoSQL"... I don't even know what exactly either of these terms means anymore. Agile is almost 24 years old, and has had a profound impact, but it's now just a buzzword providing cover for all manner of project management decisions, including those that are obviously in conflict with the manifesto.
I think docker is here to stay for a while. Not sure about k8s.
cortex-@reddit
SQL will never go away. Never ever ever not in a million years. Even if RDBMS are completely rewritten SQL as a notation is going to exist 100 years from now.
OverEggplant3405@reddit
I'm still not convinced that tailwind is all that. I get that it's nice for teams where nobody can write good CSS (and what is good CSS, anyway?). I've seen some horrendous 20k+ line stylesheets.
It's just that people have tried inline styles plenty of times. Bootstrap did the utility-class thing, and it ends up looking like inline styles anyway.
I'm backend now. So, not my circus or monkey. I just don't get the point of tailwind, even after trying it out a few times.
im-a-guy-like-me@reddit
I think js meta frameworks are probably a fad.
"Separation of concerns? No, we don't do that here."
kani_kani_katoa@reddit
Tech moves in cycles. People build things using one style (say, monolithic JS frameworks), then after doing that for long enough they discover the drawbacks, then some smart young cookie develops a solution that everyone excitedly switches to, starting the process again.
Unfortunately management techniques seem to only get worse, so I'm sure Agile will be replaced with something else well-intentioned but just as useless at delivering working software.
bluetista1988@reddit
"Agile" as in the Agile Manifesto has been more or less lost in modern development, IMO.
"Agile" as in the project management frameworks branded as Agile will continue to exist until they decide they don't want to use the word Agile anymore. SAFe (Scaled Agile Framework) seems to be the one I hear about the most now, but there's also Nexus and LeSS.
kani_kani_katoa@reddit
Yeah, the manifesto isn't a thing you can sell, more a set of principles. Which is a shame because they're good principles
Antares987@reddit (OP)
Agile is an effective way to justify the cost to the emperor for his new clothes.