Which present-day technologies are here to stay because they're good, which will stay because of switching costs, and which will go?

Posted by Antares987@reddit | ExperiencedDevs | View on Reddit | 25 comments

This is tangentially related to this topic: https://www.reddit.com/r/ExperiencedDevs/comments/1gqh5yk/what_are_some_past_fad_fields_of_computer_science/

We've all seen things come and go. Sometimes the best technologies don't last, because it's natural to revert toward the mean -- things that are awesome often miss adoption, their developers lack the ability to market or support them, and better ideas meet opposition from the culture of the organization. The old accountants I know talk about how superior Lotus 1-2-3 was to Excel. Delphi was clearly superior to Visual Basic, and we're fortunate that Microsoft sniped Anders Hejlsberg from Borland. Years ago I spent a few days with Geoff Lee -- the Lee in Britton Lee -- doing transition training in one of my airplanes. Sybase came out of his team's work, and Sybase is what became SQL Server.

The Mythical Man-Month was every bit as true in 1975 as it is today. The IBM studies from the 1970s cited in this 1985 advertisement for the SGI IRIS workstation hold up just as well now as they did back then: https://www.youtube.com/watch?v=9EEY87HAHzk . And Jim Clark could have been what Elon Musk is today if he had worked a little harder, though in those days he might have become another John DeLorean, as flying too close to the sun back then was a real problem. Fortunately, the people supporting the sun of those days are now retired or dead.

Richard Feynman was one of the first people to work with computers at Los Alamos, and what he talks about in his lecture "Los Alamos from Below" holds every bit as true today as it did when he observed it in the 1940s: https://www.youtube.com/watch?v=uY-u1qyRM5w

I remember reading an anecdote from (I believe) Bill Atkinson where he talked about Apple engineers drawing networks as point-to-point meshes, and how he felt he'd missed the boat on what Jim Clark picked up on when Clark left SGI to found Netscape with Marc Andreessen. Thinking in terms of backbones and trunks, the way Sun Microsystems did, just wasn't how they thought. He was on the right track with HyperCard, though -- the HyperCard home page evolved into what the iPhone home screen is today.

[HyperCard in 1986]

The "Phone" "stack" (stacks were like iPhone apps are today) would open up a dialer and would use the modem where your desk phone would be plugged into it and pass through to your phone line. Some of us still had PULSE dialing back then for rotary dial phones. If unfamiliar with the pain of rotary phones and rolodexes, here's Louis CK explaining what it was like to actually use a rotary dial phone: https://www.youtube.com/watch?v=WQyF099TKM8

We all hear about how Xerox gave away the kitchen sink when it let the team from Apple tour its facility, but perhaps it would have gone nowhere at Xerox, as happens all too often. Those Xerox engineers shouldn't have felt like they gave up the keys to the castle; rather, they could have found a way to be happy seeing what they seeded.

Jack Slocum changed the world with ExtJS, which brought about the monolithic JavaScript frameworks that are now pervasive. I had lunch with him a few years ago, and when I told him what I didn't like about the newer versions, he disclosed that that was when he left the company. I couldn't have paid a developer a better compliment than to call out his work versus what followed -- like in The Boondock Saints: "Good shooting, shitty shooting."

The thread I linked at the start of this post called out NoSQL.

As of right now, I'm thinking silicon-based AI is going to be short-lived. Florida has banned lab-grown meat, which tells me it was at or approaching commercial viability, and had been for long enough to make it through the legislative process here in Florida. The question I ask is: if we can grow meat, we're growing cells, and that means we can grow neurons. The thermodynamic efficiency of neurons is on the order of 10^9 times that of silicon. And the original story of The Matrix gave "Fate, it seems, is not without a sense of irony" its meaning by having humans used not for energy but for computational power -- though "a penny saved is a penny earned," and the net result of efficient biological computing would be the same as using us for energy.

But on to more common things: I believe there's a lot of hoopla around cloud-based microservice "architecture" (I hate the terms architecture and engineering when it comes to tech). What happened, I feel, is that it worked for Amazon, and Amazon started letting other organizations use it. Microsoft saw the cash cow it is and got in on it. It appealed to developers and large organizations because it could be used to route around the political barriers created by network admin departments, and the "microservice" nature of things also seems to echo the Unix paradigm of small, simple programs.

The "tell" for this is that when I got my SQL Server certifications decades ago, those who made the exams put a lot of effort into ensuring that people know how to efficiently load data and tune indexes for performance. BULK INSERT and bcp still scream, and there are tricks to really make them perform. Most bright younger developers I interact with have never heard of these techniques and know little about how data is stored. MS pushes SSIS. Why? Well, when running shit on the cloud, if it runs like shit, customers simply scale up infrastructure, often at great expense.

So, with all of that as preface, here's my take: Agile development methodologies, Docker, Kubernetes, NoSQL (as mentioned earlier), Redis, RabbitMQ, large JavaScript frameworks, et cetera, are not long for this world.