Busy-Breadfruit-1514@reddit
I love the idea that we shouldn't just learn languages, but learn 'Ur-languages.' It explains why switching from Java to Python is easy, but switching to Haskell feels like learning to walk again. Has anyone else felt this 'brain reset' when moving between these families?
programming-ModTeam@reddit
No content written mostly by an LLM. If you don't want to write it, we don't want to read it.
TOGoS@reddit
Yeah. When I was in high school I was very into science fiction for the same reason I got into learning lots of different--preferably very different--programming languages.
Anything that doesn't change the way I think just feels not that interesting.
That's a good way to think if your goal is to absorb as many concepts as you can. Now that I'm 42 with two children (not yet old enough to really explain all this to) and not much time for, uh, psytrance festivals, I find myself desperately bored much of the time. D:
I think it's kind of an ADHD thing. Or an AuDHD thing.
Dean_Roddey@reddit
The original UR programming language was "Proto-Indo Basic", which dates back to around 5000BC. It was a binary language where 1 meant "hit it with a rock" and 0 meant "don't hit it with a rock." It was the dominant language for a couple thousand years, targeting mainly stylus based Clay Tablet devices, and peaked with the development of the Clay Super-Computer.
thmprover@reddit
I thought "CaML" referred to ML targeting CAM [Categorical Abstract Machine] bytecode, not "Cambridge ML".
I am also surprised Fortran (created 1956) was inspired by ALGOL (created 1958).
igouy@reddit
Yes, that is strange --
"Fortran: The world’s first programming language standard opened the door to modern computing"
iIoveoof@reddit
What about MUMPS?
recycled_ideas@reddit
I don't know if you'd consider it an ur-language. It's definitely different, but it doesn't have any descendants and only exists in one specific market segment. It's effectively a proprietary language as far as meaningful use.
LaconicLacedaemonian@reddit
It actually does an interesting job of abstracting away sharding and was doing NoSQL horizontal scaling before it was cool.
The problem is solved now though with the modern cloud, and the existing users, primarily healthcare (Epic), my expectation is it's too costly to be worth replacing.
recycled_ideas@reddit
The problem was irrelevant for any medical system in the 1970s: patient databases just aren't that big, and most of what size there is is scanned documents, which this doesn't help with.
The reason Epic uses it is that the people who created MUMPS also founded Meditech, which is the great-granddaddy of medical software and still an extant EHR. Meditech used MUMPS, and so basically all of the subsequent systems do.
LaconicLacedaemonian@reddit
It had genuine benefits for the large customers. Epic only caters to large organizations, and has fewer than a thousand customers.
In the late 90's and early 2000's, large healthcare organizations such as Kaiser had to make do with eventually consistent systems on an RDBMS, which is a patient safety issue if the synchronization fails, so you had slow and buggy systems.
I see it in the same way I see MongoDB: It solved a real problem too early, but requires a paradigm shift in how you build the software, and you can get locked in despite better solutions existing now.
Xipher@reddit
It's always funny for me to see MUMPS, because one of my professors used it in his information storage and retrieval class. I've actually had someone on reddit tell me they used his book as an independent reference for the language.
https://www.cs.uni.edu/~okane/
recycled_ideas@reddit
MUMPS is a weird language.
Conceptually the way it manages data storage was groundbreaking in the context of its development. I can totally understand why your professor used it in that class.
The problem is that the context in which it existed no longer exists and without that context it's one of the worst languages that have actual production code in current usage.
It exists because the guy who invented it was also the pioneer of Clinical Information Systems, and so most of them use it.
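To make the data-storage point concrete: MUMPS persists data in "globals", sparse hierarchical arrays addressed by chains of subscripts, e.g. `^Patient(123,"name")="Ada"`. A rough Python sketch of the idea (the class and all names here are invented for illustration, and a real global is transparently persistent, which a dict is not):

```python
# Rough sketch of MUMPS-style "globals": sparse trees addressed by
# tuples of subscripts. A nested key space stands in for the
# automatically persisted global; traversal mimics $ORDER.

class Global:
    """A sparse tree addressed by tuples of subscripts."""

    def __init__(self):
        self._data = {}

    def set(self, subscripts, value):
        self._data[tuple(subscripts)] = value

    def get(self, subscripts, default=None):
        return self._data.get(tuple(subscripts), default)

    def children(self, prefix):
        """Immediate child subscripts under a prefix, akin to $ORDER."""
        prefix = tuple(prefix)
        seen = set()
        for key in self._data:
            if key[:len(prefix)] == prefix and len(key) > len(prefix):
                seen.add(key[len(prefix)])
        return sorted(seen, key=str)

patients = Global()
patients.set((123, "name"), "Ada")
patients.set((123, "dob"), "1990-01-01")
patients.set((456, "name"), "Grace")
print(patients.get((123, "name")))  # Ada
print(patients.children(()))        # [123, 456]
```

Because the key space is just ordered tuples, partitioning it across machines by leading subscript is straightforward, which is the sharding-friendliness mentioned above.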
transfire@reddit
What about it?
timoffex@reddit
Great post! I didn’t expect SQL to be in the Prolog family and am struggling to see how it fits. I was expecting to see Lean in that list, but I guess I haven’t written Prolog, and maybe Lean fits better in the ML family?
EntroperZero@reddit
In Prolog, you state rules and facts, and then you can query for some information based on what you've previously stated. SQL is pretty much the same, your rules are your table definitions and foreign keys, and your facts are the rows that you insert into the tables.
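The shared shape of the two languages can be shown with a toy fact store (all names here are invented for illustration): facts are rows, and a query is a pattern that plays the role of both a Prolog goal and a SQL `SELECT ... WHERE`.

```python
# Toy illustration of the facts-plus-query idea shared by Prolog and SQL.
# Facts are rows (tuples); a query is a pattern where None is a wildcard,
# like the variable X in `parent(alice, X)` or a column left unconstrained
# in a WHERE clause.

facts = [
    ("parent", "alice", "bob"),
    ("parent", "bob", "carol"),
    ("parent", "alice", "dave"),
]

def query(pattern):
    """Yield every fact matching the pattern (None matches anything)."""
    for fact in facts:
        if len(fact) == len(pattern) and all(
            p is None or p == f for p, f in zip(pattern, fact)
        ):
            yield fact

# Who are alice's children?
#   Prolog: ?- parent(alice, X).
#   SQL:    SELECT child FROM parent WHERE parent = 'alice';
children = [f[2] for f in query(("parent", "alice", None))]
print(children)  # ['bob', 'dave']
```

What this sketch leaves out is exactly what distinguishes the two: Prolog adds rules and backtracking on top of the facts, while SQL adds joins and aggregation.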
fragbot2@reddit
Teaching myself Prolog was something I tried to do during the pandemic. I only wrote four small utilities, but they all felt like writing report generators on top of an RDBMS.
jonathancast@reddit
I don't think it's quite right. They're both centered around n-place predicates, but Prolog is about finding specific values satisfying a list of propositions, while SQL is about finding all values satisfying a propositional expression. Kind of.
Like he says, definitely learn both.
Lean is kind of like ML? It's based on the lambda calculus, but without unrestricted recursion, and with the type system pushed to become much more powerful so that programs can be useful proofs. Really, I don't think proof assistants are functional programming languages any more than Lisp is, but I don't think they've been mainstream enough to make it into a blog post like this. In 10 years they probably will.
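The "programs as proofs" point in miniature (a hedged Lean 4 sketch; the theorem name is invented, and `Nat.add_comm` is the standard library lemma):

```lean
-- A proof is just a term whose type is the proposition it proves.
-- Here the "program" Nat.add_comm a b has type a + b = b + a.
theorem my_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

The same term/type machinery that checks ordinary programs is what checks the proof, which is why the language has to rule out unrestricted recursion: a nonterminating term could "prove" anything.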
LolThatsNotTrue@reddit
Lean doesn’t have recursion? How’s that possible?
remy_porter@reddit
Both Prolog and SQL are 4GLs where you specify the result you want and leave the execution to the underlying engine.
TinStingray@reddit
They definitely have some similarities. For one, they are declarative in that instead of describing the steps to getting what you want like most languages, you describe what the result you want looks like and the figuring it out is done for you.
transfire@reddit
What is fascinating is how many of them follow directly from their chosen core data structure.
- Lisp: List
- Forth: Stack
- APL: Array
- Prolog: Tuples + Conditions
- ML: Functions (Erlang: Functions + Messages)
- Self: Messages* + (Typed) Maps
- Algol: Pointers (general data structures)

*(Most descendants are Functions + Typed Maps and still Algol-like in many respects.)
seanluke@reddit
While Lisp is well known for the singly linked list, this is not at all the defining feature of Lisp, nor was it since the beginning.
beebeeep@reddit
Erlang is better than ur lang!
UnmaintainedDonkey@reddit
UrMoM!
Full-Spectral@reddit
No, u.r.
fgorina@reddit
I like the article. And the recommendation of an Algol-like language and SQL is sound.
rlbond86@reddit
Forth is the most confusing language I've ever encountered. My brain just doesn't understand it.
fragbot2@reddit
Implement a small one and it’ll make more sense. I’ve done it twice: once in a single .hpp file (<1000 LoC for a little over 50 words) and a minimal one in C (~400 LoC for a smaller set of primitives) that used gcc’s computed goto to do a threaded interpreter.
It was fun.
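In the same spirit, the core loop fits in a few dozen lines of Python. This is a hedged sketch of the idea, not any standard Forth: a data stack, a dictionary of words, and a reader that treats everything else as a number.

```python
# A minimal Forth-like interpreter: a data stack plus a dictionary of
# words. Enough to see why implementing one demystifies the language.
# (Simplification: words defined with : ... ; can only use built-ins.)

def forth(source, stack=None):
    stack = [] if stack is None else stack
    words = {
        "+":    lambda: stack.append(stack.pop() + stack.pop()),
        "*":    lambda: stack.append(stack.pop() * stack.pop()),
        "dup":  lambda: stack.append(stack[-1]),
        "swap": lambda: stack.extend([stack.pop(), stack.pop()]),
        "drop": lambda: stack.pop(),
    }
    tokens = iter(source.split())
    for tok in tokens:
        if tok == ":":                 # definition: : name body ;
            name = next(tokens)
            body = []
            for t in tokens:
                if t == ";":
                    break
                body.append(t)
            # run the recorded body against the same stack when invoked
            words[name] = lambda b=body: forth(" ".join(b), stack)
        elif tok in words:
            words[tok]()
        else:
            stack.append(int(tok))     # everything else is a number
    return stack

print(forth(": square dup * ; 3 square 4 square +"))  # [25]
```

Once you've written the dispatch loop yourself, "the program is the stack trace" stops being confusing and starts being the whole point.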
Complete_Instance_18@reddit
Interesting topic! Thinking about these foundational languages makes me wonder
clintp@reddit
I'd argue that RPG (especially RPG and RPG II) represents a class as well. It has a very distinct "program cycle" that represents a pattern seen in a lot of modern systems: the event loop. The event loop wrapped around some fancy filtering and you've got a language that's not ALGOL or Prolog-like at all.
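A hedged sketch of what that cycle looks like, with the RPG phase names as comments (the function and record fields are invented for illustration): the runtime drives the loop, and your program only supplies the per-record logic.

```python
# Sketch of the RPG-style "program cycle": the engine, not your code,
# drives the loop -- read a record, test the selection filters, run the
# calculation step, emit output. All names here are invented.

def program_cycle(records, selector, calculation):
    """Implicit loop: the engine iterates; you supply per-record logic."""
    output = []
    for record in records:                  # READ phase
        if not selector(record):            # record selection / indicators
            continue
        output.append(calculation(record))  # CALC + OUTPUT phases
    return output

sales = [
    {"region": "east", "amount": 120},
    {"region": "west", "amount": 80},
    {"region": "east", "amount": 50},
]
east_totals = program_cycle(
    sales,
    selector=lambda r: r["region"] == "east",
    calculation=lambda r: r["amount"],
)
print(sum(east_totals))  # 170
```

Squint and it's the same inversion of control as a GUI event loop: the framework calls you.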
Gullible-Recipe4484@reddit
This is actually interesting—feels like a faster way to move from idea → prototype, especially for early validation before committing to full design/dev.
LIGHTNINGBOLT23@reddit
Expanding on the footnote mentioning Forth: I would not bother beginning by learning a particular Forth implementation like gForth, or even caring about ANS Forth. The beauty of stack-based languages is their absolute simplicity. The easiest and most entertaining first step for anyone interested in Forth is to create their own Forth-like language.
Frolo_NA@reddit
I would have picked Smalltalk over, or in addition to, Self because of the class system.
T_D_K@reddit
Can someone help me figure out the reason APL is its own category? Seems very close to Forth, besides the notation being ordered in the opposite direction. Also shares some similarities with Lisp.
Valuable_Leopard_799@reddit
They're very different...
Hearing that one operates on stacks and the other on arrays or lists might make them sound somewhat similar. Perhaps simple examples look similar. But the truth is, those paradigms lead to very very different languages, each with their own uniquely different ways to look at problems.
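The difference shows up even in trivially small programs. Here is sum-of-squares written in both mindsets, sketched in Python since neither real language fits in this thread (real Forth and APL are far more distinct than this suggests):

```python
# The same tiny problem, sum of squares, in the two mindsets.
# Forth-ish: juggle an explicit stack with postfix operations.
# APL-ish: whole-array operations, no element-at-a-time bookkeeping.

def sum_of_squares_stackwise(numbers):
    stack = []
    for n in numbers:
        stack.append(n)                          # push n
        stack.append(stack[-1])                  # dup
        stack.append(stack.pop() * stack.pop())  # *
    total = 0
    while stack:                                 # fold the stack with +
        total += stack.pop()
    return total

def sum_of_squares_arraywise(numbers):
    # APL would write +/v*v : multiply the array by itself, then reduce.
    squared = [n * n for n in numbers]
    return sum(squared)

print(sum_of_squares_stackwise([1, 2, 3]))  # 14
print(sum_of_squares_arraywise([1, 2, 3]))  # 14
```

One style makes you think about the order things sit on the stack; the other makes you think about which whole-collection transformation to apply next.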
AutomaticBuy2168@reddit
I can see the reasoning behind ML and Lisp being different families, but honestly I feel like they're so similar that you could just pair them together.
Documented LISP without many macros can end up looking like ML with just more parentheses. ML feels at times like a statically typed LISP with fewer parens.
I guess they could be separated, since the typing of the languages usually makes a programmer think differently, but the way I was taught LISP was writing it in ML pattern-matching style, with comments to outline data definitions.
seanluke@reddit
Self is a prototype-style OO language. The article doesn't have a class-style OO language, and these are big differences. Additionally, prototype-style languages are trivially and nicely implemented in Lisp, so I would describe Lisp as the ur-language for Self. [That's not unexpected, as it's definitely the ur-language for Smalltalk].
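The prototype idea really is tiny: an object is a bag of slots that delegates unknown lookups to a parent object, with no classes anywhere. A hedged sketch in Python (the `Proto` class and all slot names are invented; Self's real semantics, e.g. copy-down and multiple parents, are richer):

```python
# Prototype-style OO in a few lines: an object is a dict of slots that
# delegates failed lookups to its parent object, as in Self. The same
# shape is easy to build in Lisp out of alists and closures.

class Proto:
    def __init__(self, parent=None, **slots):
        self.parent = parent
        self.slots = dict(slots)

    def lookup(self, name):
        if name in self.slots:
            return self.slots[name]
        if self.parent is not None:
            return self.parent.lookup(name)  # delegation, not class lookup
        raise AttributeError(name)

    def send(self, name, *args):
        return self.lookup(name)(self, *args)

point = Proto(
    x=1, y=2,
    describe=lambda self: f"({self.lookup('x')}, {self.lookup('y')})",
)
point3d = Proto(parent=point, z=3)   # a "child" that extends its prototype
print(point3d.send("describe"))      # (1, 2) -- x, y found via delegation
```

Note there is no class hierarchy to design up front: you make an object, then make more objects that delegate to it.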
RealitySwitch@reddit
The section on ML incorrectly states it was originally developed in Cambridge, England, but it was actually Edinburgh, in Scotland.