Which present-day technologies are here to stay because they're good, which will stay because of switching costs, and which will go?
Posted by Antares987@reddit | ExperiencedDevs | 25 comments
This is tangentially related to this topic: https://www.reddit.com/r/ExperiencedDevs/comments/1gqh5yk/what_are_some_past_fad_fields_of_computer_science/
We've all seen things come and go. Sometimes the best technologies don't last, as it is natural to regress toward the mean -- things that are awesome often miss adoption, their developers lack the ability to market or support them, and better ideas are met with opposition due to the culture of the organization. The old accountants I know talk about how superior Lotus 1-2-3 was to Excel. Delphi was clearly superior to Visual Basic, and we are fortunate that Microsoft sniped Anders Hejlsberg from Borland. Years ago I spent a few days on transition training in one of my airplanes with Geoff Lee, the Lee in Britton Lee. Sybase came out of his team's work, and Sybase's SQL Server is what became Microsoft SQL Server.
The Mythical Man-Month was every bit as true in 1975 as it is today. The IBM studies from the 1970s cited in this 1985 advertisement for the SGI IRIS workstation hold every bit as true today as they did back then: https://www.youtube.com/watch?v=9EEY87HAHzk . And Jim Clark could have been what Elon Musk is if he had worked a little harder, though in those days he might have become another John DeLorean, as flying too close to the sun back then was a real problem. Fortunately, the people supporting the sun of those days are now retired or dead.
Richard Feynman was one of the first to work with computers at Los Alamos, and what he describes in his lecture "Los Alamos from Below" holds every bit as true today as when he observed it in the 1940s. https://www.youtube.com/watch?v=uY-u1qyRM5w
I remember reading an anecdote from (I believe) Bill Atkinson where he talked about Apple engineers drawing networks as point-to-point meshes, and felt like he missed the boat on what Clark picked up on when he left SGI to found Netscape with Marc Andreessen: the idea of backbones and trunks, like Sun Microsystems used, was not how they thought. He was on the right track with HyperCard, and the HyperCard home page evolved into what the iPhone home screen is today.
[HyperCard in 1986]
The "Phone" "stack" (stacks were like iPhone apps are today) would open up a dialer and would use the modem where your desk phone would be plugged into it and pass through to your phone line. Some of us still had PULSE dialing back then for rotary dial phones. If unfamiliar with the pain of rotary phones and rolodexes, here's Louis CK explaining what it was like to actually use a rotary dial phone: https://www.youtube.com/watch?v=WQyF099TKM8
We all hear about how Xerox gave up the kitchen sink when it allowed the team from Apple to tour its facility, but perhaps it all would have gone nowhere otherwise, as happens all too often. The engineers from Xerox should not have felt like they gave away the keys to the castle; rather, they could have found a way to be happy seeing what they seeded.
Jack Slocum changed the world with ExtJS, and that brought about the monolithic javascript frameworks that are now pervasive. I had lunch with him a few years ago, and when I told him what I didn't like about the newer versions, he disclosed that that was when he left the company. I could not have given a developer a better compliment than to have called out his work versus that of those who followed -- like in The Boondock Saints: "Good shooting, shitty shooting."
The topic I linked at the start of this post called out NoSQL.
As of right now, I'm thinking that silicon-based AI is going to be short-lived. Florida has banned lab-grown meat, which suggests it was at, or approaching, commercial viability -- and had been for long enough to make it through the legislative process here in Florida. The question I ask is: if we can grow meat, we are growing cells, and so we can grow neurons. The thermodynamic efficiency of neurons is on the order of 10^9 times that of silicon. And the original story of The Matrix gave "Fate, it seems, is not without a sense of irony" the meaning that we were used not for energy, but for computational power -- though "a penny saved is a penny earned", and the net result of efficient biological computing would be the same as using us for energy.
But on the more common things: I believe there's a lot of hoopla around cloud-based microservice "architecture" (I hate the terms architecture and engineering when it comes to tech). What happened is that it worked for Amazon, and they started letting other organizations use it. Microsoft saw the cash cow that it is and got in on it. It appealed to developers and large organizations because it could be used to remove the political barriers created by network admin departments, and the "microservice" nature of things also seems to match the Unix paradigm of small, simple applications.
The "tell" for this is that when I got my SQL Server certifications decades ago, those who made the exams put a lot of effort into ensuring that people know how to efficiently load data and tune indexes for performance. BULK INSERT and bcp still scream, and there are tricks to really make them perform. Most bright younger developers I interact with have never heard of these techniques and know little about how data is stored. MS pushes SSIS. Why? Well, when running shit on the cloud, if it runs like shit, customers simply scale up infrastructure, often at great expense.
So, now, with all that prefaced, here's my take: Agile development methodologies, Docker, Kubernetes, NoSQL (as mentioned earlier), Redis, RabbitMQ, large javascript frameworks, et cetera, are not long for this world.
Fast_NotSo_Furious@reddit
AI as we know it today is unsustainable in its use of fresh water. With the growing climate crisis, unless these large corporations plan on building their own desalination plants, I wouldn't be surprised if in the next 2 years it's done for, due to government intervention and cost.
Impbyte@reddit
You do know that data centers don't consume/destroy water, right? They use water to cool things, but it's a closed loop; the water isn't contaminated or anything like that.
I have no idea where you heard that we are running out of water, either; everything in your post is either ignorance or a result of misinformation.
I really want to drive home the point: water is never created or destroyed, only moved.
Fresh water is not scarce and never will be.
Water scarcity is per geographical region and fluctuates due to weather patterns.
Fast_NotSo_Furious@reddit
Sure man, you continue believing that and I have a bridge to sell you.
CVisionIsMyJam@reddit
This is super nitty, but it's not even close to viable. The growth medium alone costs almost $20/kilo. At the moment it's not clear it will ever be viable.
Abadabadon@reddit
Less than $10/lb for ethically sourced meat doesn't sound bad.
ChimataNoKami@reddit
It's not ethically anything. The resources to produce the growth medium, plus all the energy inputs, far exceed the strain of traditional farming practices.
Abadabadon@reddit
I was speaking more so to the idea of not having to kill sentient animals for our happiness. Not ethically as in "good for the economy".
ChimataNoKami@reddit
I’m not going to argue against a vegan who doesn’t understand evolution
Abadabadon@reddit
If you don't want to argue, then don't enter the conversation with an argument. Bye.
No_Technician7058@reddit
it's more like $30/lb, optimistically
Abadabadon@reddit
How so? Produce has like one of the lowest profit margins
iamiamwhoami@reddit
Also we may be close to being able to grow neurons, but being able to program neurons is still in its infancy.
Antares987@reddit (OP)
I was unaware of this, but hearing that it was even a thing got my senses tingling. The idea of AI, combined with test parameters and large quantities of neurons, figuring out how to communicate with them -- whether it kills the cells or not -- got my imagination really churning.
CVisionIsMyJam@reddit
here are my opinions, taking the OP's list in order.
Agile: some version of this is here to stay, but "scrum master" as a full-time position that people are hired for, doing nothing besides facilitating agile, is gone forever.
Docker: linux namespaces & docker are here forever. way too much money tied up in this now for it ever to go away. containerization is a useful tool that likely hasn't been fully exploited yet; dev-containers are still a relatively new thing, for example.
Kubernetes: here forever. it's an IaaS-agnostic deployment target. anyone building an enterprise version of their SaaS product inevitably ends up shipping docker container(s) & helm chart(s) as the point of hand-off.
NoSQL is a very broad term, so it's hard to address. MongoDB-like NoSQL will remain popular among tutorial writers as well as for niche use-cases, but I haven't heard people talk about using these kinds of db as primary data-stores since 2018 or so.
NoSQL stores which focus on performant full-text search, graph queries, etc., will continue to be used where an RDBMS is not sufficient for one reason or another.
Redis: here forever; it's simple, easy to use and does the trick for a lot of simple caching use-cases (a sketch of the usual cache-aside pattern is below this list). I think if they were to make a shim that let clients treat redis as if it were a kafka cluster, it could supplant kafka as well; "redis streams" isn't quite there.
RabbitMQ: could see this dying off. it's not at the top of any benchmarks and IMO there are better options these days; redpanda & redis, to name two.
Large javascript frameworks: here forever. there are a large number of programmers who do not know any kind of development outside the context of large javascript frameworks.
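To make the caching point concrete, here's a minimal cache-aside sketch using the node-redis client, run as an ES module; `fetchUser` is a hypothetical stand-in for whatever slow lookup you're fronting:

```typescript
// Minimal cache-aside sketch with node-redis: try the cache first,
// fall back to the source of truth, then populate the cache with a TTL.
import { createClient } from "redis";

const client = createClient({ url: "redis://localhost:6379" });
await client.connect();

async function fetchUser(id: string): Promise<string> {
  return JSON.stringify({ id, name: "example" }); // pretend DB hit
}

async function getUser(id: string): Promise<string> {
  const cached = await client.get(`user:${id}`);
  if (cached !== null) return cached;                 // cache hit
  const fresh = await fetchUser(id);                  // cache miss: load...
  await client.set(`user:${id}`, fresh, { EX: 300 }); // ...and cache 5 min
  return fresh;
}

console.log(await getUser("42"));
```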
Antares987@reddit (OP)
Until Blazor -- with the exception of the now-defunct Silverlight -- I hadn't been impressed with a web framework since the rendering model of WebForms, which Microsoft got really right. The event model and how they integrated it with Visual Studio ruined it, in my opinion. I can defend my position if someone disagrees, and I believe I can change others' opinions on this (it's also a blueprint for how to move from WebForms to a newer stack). And with Blazor we got WebAssembly and Blazor Server, which I think will pull us away from the large javascript frameworks -- and not even in the .NET sense of WASM, but WASM in general. Yes, I agree that those javascript frameworks are not going anywhere, but not because they're better -- no more than the workarounds were better for those of us who did web development when Netscape, IE5 and Chrome existed simultaneously and we dealt with rendering issues.
I'm not knocking Docker / Kubernetes, but I see them getting out of hand frequently. Probably my disdain goes hand-in-hand with premature abstraction and the use of multiple containers where one custom container would make more sense in many cases -- in the Elon Musk sense: "delete, delete, delete". And perhaps the issues are more related to there not being a strong push for antidotes to the toxic psychology that causes such environment creep. In Feynman's "Los Alamos from Below", which I posted in the OP, he talks about the "disease" that afflicts people in computing, which I find all too relevant today.
That also brings me to Reddit. Reddit is very much like USENET, which I dearly miss.
And, for a little more nostalgia: two products from the past whose quality I still reminisce about from the early 90s were "FirstClass" and "TeleFinder": https://en.wikipedia.org/wiki/FirstClass and https://en.wikipedia.org/wiki/TeleFinder . In the days of dial-up and Legend of the Red Dragon, both allowed simultaneous file transfers while doing browsing-type activities.
john_rood@reddit
If you compare frameworks by performance, you'll find that Blazor actually does far worse than most: https://krausest.github.io/js-framework-benchmark/2024/table_chrome_130.0.6723.58.html . Have you tried any of the smaller frameworks like Svelte or Solid? If you've been in the .NET world for a while, you might recall Knockout.js, which used fine-grained reactive updates. Solid and Svelte are similar in that regard.
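For a feel of what fine-grained reactivity means, here's a minimal Solid sketch; `createSignal`/`createEffect`/`createRoot` are Solid's actual primitives, and the rest is illustrative:

```typescript
// Fine-grained reactivity in Solid: no virtual-DOM diffing, just
// subscriptions between signals and the effects that read them.
import { createSignal, createEffect, createRoot } from "solid-js";

const [count, setCount] = createSignal(0);

createRoot(() => {
  // Runs once on creation, then re-runs only when count() changes.
  createEffect(() => {
    console.log(`count is now ${count()}`);
  });
});

// Later updates re-execute only the effects that read this signal --
// nothing else on the page is touched.
setCount(1);
```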
Antares987@reddit (OP)
Thank you for this. I bookmarked the page. I've only worked with Blazor Server for any sort of production application, with the exception of hosting WASM within flash storage on the ESP32. The thing I like about Blazor Server is that the SignalR/WebSocket wrapper "just works", and there's efficiency without the headers we have with RESTful services.
I feel a bit sad to see some of my comments downvoted, and I tend to get a lot of hate from what I perceive to be developers who did not pay their dues in the same way. At one point I fought like hell with Joe Celko on USENET, until I realized he was correct.
Speaking of performance of browser-based technologies: there were no SPA web applications in 2000; the closest thing to the state of the art back then is roughly what eBay is today. I would use hidden IFRAMEs to send requests and javascript to talk to the containing page. Here's an IE5.5 technology demonstrator I wrote in December of 2000, at 20 years old. The site is still live, and has been since 2002, but unfortunately newer Edge browsers won't run it, even in compatibility mode, which is why I posted the YouTube video. If someone really wanted to see it run, they could use a VM running XP or maybe an early build of Windows 7 -- but it shows that it was possible, and it performed on an early Pentium machine. https://www.youtube.com/watch?v=CnH9TVoHc1o
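For anyone who never saw the pre-XMLHttpRequest era, a rough sketch of the hidden-IFRAME trick; the endpoint and callback names are made up, and the original would have been plain IE5-era script rather than TypeScript:

```typescript
// The hidden-IFRAME transport: navigating an invisible frame issues the
// request; the server replies with a page whose script calls the parent.
function sendViaHiddenFrame(query: string): void {
  let frame = document.getElementById("rpc-frame") as HTMLIFrameElement | null;
  if (!frame) {
    frame = document.createElement("iframe");
    frame.id = "rpc-frame";
    frame.style.display = "none"; // invisible transport channel
    document.body.appendChild(frame);
  }
  frame.src = `/lookup?q=${encodeURIComponent(query)}`; // hypothetical endpoint
}

// The response page would contain something like:
//   <script>parent.handleResult("...data...");</script>
(window as any).handleResult = (data: string) => {
  console.log("received:", data); // update the page without a full reload
};
```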
HowTheStoryEnds@reddit
I sort of miss IRC and old fashioned forums. I really dislike the discordisation of technical communities. There's not that much difference to it though so I just ascribe it to me growing old and fat and old and...
kani_kani_katoa@reddit
Tech moves in cycles. People build things using one style (say, monolithic JS frameworks), then after doing that for long enough they discover the drawbacks, then some smart young cookie develops a solution that everyone excitedly switches to, starting the process again.
Unfortunately management techniques seem to only get worse, so I'm sure Agile will be replaced with something else well-intentioned but just as useless at delivering working software.
bluetista1988@reddit
"Agile" as in the Agile Mainfesto, has been more or less lost in modern development IMO.
"Agile" as in the project management frameworks branded as Agile will continue to exist until they decide they don't want to use the word Agile anymore. SAFe (Scaled Agile Framework) seems to be the one I hear about the most now, but there's also Nexus and LeSS.
kani_kani_katoa@reddit
Yeah, the manifesto isn't a thing you can sell, more a set of principles. Which is a shame because they're good principles
Antares987@reddit (OP)
Agile is an effective way to justify the cost to the emperor for his new clothes.
hitchdev@reddit
I'm pretty sure unit testing frameworks will die off at some point and be supplanted by something that does separation of concerns right. Currently the state of automated testing is a bit like pre-framework PHP: a mess of duplication and poor abstractions.
yojimbo_beta@reddit
That's an interesting take.
One of the ideas bouncing around my head is a programming language where hexagonal architecture and TDD are front and center. Basically, if you model your module system to make injecting dependencies a first-class citizen, and your type system encourages separating effects from logic, pretty soon you nudge your users into very testable code.
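A minimal sketch of the flavor I mean, in plain TypeScript rather than a hypothetical language; the `Clock` port and `greet` function are made-up examples:

```typescript
// Ports-and-adapters with first-class injection: effects arrive as
// injected interfaces, so the logic stays pure and trivially testable.
interface Clock {
  now(): Date;
}

// Pure logic: no hidden global state, no mocking framework needed.
function greet(clock: Clock, name: string): string {
  const hour = clock.now().getHours();
  return hour < 12 ? `Good morning, ${name}` : `Good afternoon, ${name}`;
}

// Production adapter uses the real system clock...
const systemClock: Clock = { now: () => new Date() };

// ...while a test adapter pins the time.
const tenAm: Clock = { now: () => new Date(2024, 0, 1, 10) };
console.assert(greet(tenAm, "Ada") === "Good morning, Ada");

console.log(greet(systemClock, "Ada"));
```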
What do you mean by separation of concerns, though, with regards to testing? Do you mean the very blurry lines between different levels of tests and where developers draw the boxes? Or something else?
Antares987@reddit (OP)
One thing I've noticed in younger developers, many of whom are absolutely brilliant, is testability driving design. On the surface it sounds great, but what I see in reality is that dependency injection and inversion of control come at the cost of encapsulation, and it turns into code bases that reach a level of complexity where the developers code themselves into a corner, unable to find anyone capable of taking over the project because it requires too much intimate knowledge across the entire system.
I believe strongly in automated testing for component-level functionality where there are a large number of components that match a particular interface (see the sketch below). An example would be something that does tax calculation for shipping from one address to another: does the shipping address require taxes to be collected? What about the seller's address, the recipient's address, or the address of the buyer? This is a narrow example, but when someone is writing an image processor that performs color correction and needs to do it efficiently, or a Sudoku solver, or Fourier transforms on market data, the complexity, development time and costs go up exponentially.
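A rough sketch of that pattern: many components sharing one interface, all exercised by the same table of test cases. The names and tax rules here are hypothetical, not real tax logic:

```typescript
// Many implementations of one interface, one shared test table.
interface TaxCalculator {
  // Returns the tax owed on a shipment between two jurisdictions.
  taxFor(sellerState: string, buyerState: string, amount: number): number;
}

// Two of potentially many implementations.
const originBased: TaxCalculator = {
  taxFor: (seller, _buyer, amount) => (seller === "FL" ? amount * 0.06 : 0),
};
const destinationBased: TaxCalculator = {
  taxFor: (_seller, buyer, amount) => (buyer === "FL" ? amount * 0.06 : 0),
};

// The same cases run against every implementation of the interface.
const cases = [
  { seller: "FL", buyer: "GA", amount: 100 },
  { seller: "GA", buyer: "FL", amount: 100 },
];
for (const calc of [originBased, destinationBased]) {
  for (const c of cases) {
    const tax = calc.taxFor(c.seller, c.buyer, c.amount);
    console.assert(tax >= 0 && tax <= c.amount, "tax in sane range");
  }
}
```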
I frequently talk about John Carmack, whom I imagine spending late nights hand-tuning his 3D rendering routines in assembly to get Wolfenstein 3D to run on the PC processors of the day. I cannot imagine what life would have felt like for him had he been hired as a corporate developer in today's environment.