The Quiet Colossus — On Ada, Its Design, and the Language That Built the Languages
Posted by SpecialistLady@reddit | programming | View on Reddit | 60 comments
graydon2@reddit
Saying Rust ignored or somehow wasn't directly influenced by Ada is very silly. I literally have (and had back when bringing up rustboot) a 1979 original copy of Ichbiah's rationale on the bookshelf behind me (along with the reference manual). I studied it extensively! And have publicly acknowledged this over and over (along with lots of other good languages). The early Rust team even had Tucker Taft come by the office once to advise us. Ada was absolutely one of our role-model languages.
iOCTAGRAM@reddit
Strange that it did not manifest in any visible way. No clear separation into specification and body. No calligraphic Ada syntax derived from Pascal.
Range checks do not raise exceptions, and catching exceptions is not ordinary programming style in Rust. Ada was criticized for decades over the Ariane 5 fault, and on Ariane 5 exceptions were effectively replaced by a panic. Rust commits the crime of reintroducing panic and gets away with it. Almost every time we stumbled on the Internet upon somebody not familiar with Ada, they were telling us "oh, your Ariane 5 blew up". How can I ask all those gentlemen to spend at least an equal amount of effort attacking Rust for blowing up like Ariane 5? At least until Rust introduces normally accessible exceptions and stops blowing up.
Pascal set the base minimum for systems programming languages. In Pascal it is possible to declare an enumerated type and use it wherever an ordinal type is accepted: to index an array, or as a for-loop variable. Ada inherited this base minimum and enhanced it with records with discriminants. In Rust I can see nothing but a mess. A Rust enumeration type can be used neither as an array index nor in a for loop. Also, Rust for some unknown reason tangled records with discriminants and custom enumeration types together in a way that cannot be untangled. Yes, a record with a discriminant is a popular use of an enumeration type, but everything becomes stupid if they come in an inseparable pack. Rust does not deliver the base minimum. Wirth's minimum.
We have a hard time seeing Ada's influence in Rust.
graydon2@reddit
Rust's error system basically shipped incomplete; it wasn't what I wanted. The enumeration / ordinal distinction is by choice -- there are very few implicit coercions in Rust.
I didn't mean to say Rust copied a lot from Ada! Just that I did study it in a fair amount of detail. Rust is its own language and it's definitely not _very_ Ada-like in its current form, despite trying to compete in some similar domains. I guess I mean: I knew Ada, I liked Ada, I always wanted Rust to be able to offer some of the things Ada offers (safety, resource control, good defaults) along with all the other elements blended into the project. I talked to people who worked _with_ Ada before starting Rust and I talked to people who worked _on_ Ada during it. Ada was never far from my thoughts!
The biggest influences might be in places you're not looking: the pragma system for example, or limited types, or the `in out` mode and parameter modes in general, or integrated tasking and rendezvous. Note that a lot of these were later removed; Rust went through a _lot_ of revision and redesign after my initial implementation. You have to look at the early versions to see the similarity (but it's literally in the notes, see eg. https://github.com/graydon/rust-prehistory/blob/master/doc/notes/types.txt#L9-L19).
Also, in general, I found the Ada rationale book extremely lucid and balanced, one of the best documents of its kind. Finally, I was inspired by the way the Ada spec and conformance testsuite were put together and kept trying to organize our team to work that way. Eventually -- years later! -- it seems some combination of efforts by Ferrous and AdaCore did actually put together a proper spec that, I like to think, has some of the Ada spec's fingerprints on it.
iOCTAGRAM@reddit
What do you mean by "coercion"? I was not talking about coercions.
Well, maybe. That needs checking. On conventional platforms the difference is subtle. The difference can be seen on WebAssembly: an in out parameter can become a combination of an in parameter and part of a multi-value result, so as to use only the non-addressable WebAssembly stack. When I see a concatenation operator (&) replacing "in out", it looks like another family of languages, languages that enforce addressability.
Dean_Roddey@reddit
Iteration in Rust is based on an iterator trait. If your enum implements that trait, you can iterate over it; you can iterate over anything that implements the iterator trait. Rust is fundamentally trait-based in terms of the interface between the compiler and user types with respect to core functionality.
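A minimal sketch of the trait-based iteration described here, under the assumption of an illustrative `Level` enum (standing in for `log::Level`) and a hand-written iterator; crates such as strum can derive this machinery instead:

```rust
// Illustrative enum, not the real log::Level.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Level { Error, Warn, Info, Debug, Trace }

// Hand-written iterator over the variants. Implementing the Iterator
// trait is exactly what makes `for level in ...` work.
struct Levels(Option<Level>);

impl Iterator for Levels {
    type Item = Level;
    fn next(&mut self) -> Option<Level> {
        let current = self.0?;
        self.0 = match current {
            Level::Error => Some(Level::Warn),
            Level::Warn => Some(Level::Info),
            Level::Info => Some(Level::Debug),
            Level::Debug => Some(Level::Trace),
            Level::Trace => None,
        };
        Some(current)
    }
}

fn main() {
    // Works because Levels implements Iterator (hence IntoIterator).
    for level in Levels(Some(Level::Error)) {
        println!("{level:?}");
    }
}
```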
iOCTAGRAM@reddit
Since I have been closer to Delphi in recent years, and Delphi also delivers the base minimum, I will speak in terms of Delphi Spring. It has logging and a TLogLevel enumeration type. Rust has the similar log::Level.
How do we count each level separately? In Delphi I would declare
And then I do
for every event. In the end I will need to print how many there were of each event, and I will make a for loop over a TLogLevel index variable to print every log level's counter.
How would I do the same in Rust? You say
I dug through the log::Level documentation. I cannot see an Enumerable trait, or an Iterable trait. I have found something, though: a function "iter" that is introduced not by an Enumerable trait or an Iterable trait, as I would expect, but by Level itself. OK, a local strangeness. Tolerable.
So now we can iterate over log levels. How do we make an array indexed by log::Level? You say
I cannot see an Index trait in log::Level. Does it mean
So counting different log levels is not intended by the log library's designer? I don't recall such a thing ever being a problem in Delphi or Ada. I can count everything.
It does not matter how smart or not smart it would be to count action list states; it can be done. This is the base minimum.
ts826848@reddit
You just need to tell the compiler how to interpret your enum as an index by implementing the appropriate traits:
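A hedged sketch of what such an implementation could look like, assuming an illustrative `Level` enum and `LevelCounts` wrapper (not the real log crate types):

```rust
use std::ops::{Index, IndexMut};

// Illustrative stand-in for log::Level.
#[derive(Clone, Copy)]
enum Level { Error, Warn, Info, Debug, Trace }

// Counter array that can only be indexed by Level, never by a raw integer.
struct LevelCounts([u32; 5]);

impl Index<Level> for LevelCounts {
    type Output = u32;
    fn index(&self, level: Level) -> &u32 {
        &self.0[level as usize]
    }
}

impl IndexMut<Level> for LevelCounts {
    fn index_mut(&mut self, level: Level) -> &mut u32 {
        &mut self.0[level as usize]
    }
}

fn main() {
    let mut counts = LevelCounts([0; 5]);
    counts[Level::Warn] += 1;
    counts[Level::Warn] += 1;
    println!("{}", counts[Level::Warn]); // prints 2
    // counts[1] += 1; // would not compile: only Level is accepted as index
}
```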
iOCTAGRAM@reddit
Is it protected against accidentally passing a raw integer as an index?
ts826848@reddit
Yes for that particular `inc` function, since integers aren't implicitly convertible to enums. No for the built-in array type in general, as far as I know. You'd need to wrap the array type if you want to allow just the enum type to be used as an index. Though if you're going through the trouble of wrapping an array, making `inc` a method lets you skip implementing `Index`/`IndexMut`, assuming that's something you want, of course.
Dean_Roddey@reddit
I said YOUR enums. The creator of a library decides whether these capabilities are available to client code; they may choose to provide them or not.
iOCTAGRAM@reddit
If log::Level is my enum, how do I count them? In Delphi it is array[TLogLevel] of Integer and Inc(Counter[Event.Level]); what is the equivalent in Rust?
Dean_Roddey@reddit
Well, if it's your enum, you have lots of options. Enums are first class citizens in Rust, so you can implement methods for them just like any other type. You just implement an Inc method that returns Some(nextval) until it hits the max and then returns None.
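A sketch of that successor-method idea under illustrative assumptions (the name `inc` and the `Level` variants are stand-ins, not from any crate):

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
enum Level { Error, Warn, Info, Debug, Trace }

impl Level {
    // Returns Some(next variant) until the last one, then None.
    fn inc(self) -> Option<Level> {
        match self {
            Level::Error => Some(Level::Warn),
            Level::Warn => Some(Level::Info),
            Level::Info => Some(Level::Debug),
            Level::Debug => Some(Level::Trace),
            Level::Trace => None,
        }
    }
}

fn main() {
    assert_eq!(Level::Debug.inc(), Some(Level::Trace));
    assert_eq!(Level::Trace.inc(), None);
    println!("ok");
}
```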
I have my own code generator that generates a lot of magical enum support, including bit set and array type support. But, if you wanted to generically create an array of such things, it would apparently be something like this, to create an array of SomeType's with one slot per log level.
I didn't actually try it to make sure it works, since I don't need it. But apparently that's it.
Of course, since enums are first class citizens, you could just declare a public const value for the enum that maps to that 'count of values' value, which would then become:
or some such.
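A hedged sketch of the "public const for the count of values" approach; the names (`LogLevel`, `COUNT`, `ALL`) are illustrative, not from the log crate:

```rust
#[derive(Clone, Copy, Debug)]
enum LogLevel { Error, Warn, Info, Debug, Trace }

impl LogLevel {
    // Count of variants, usable as an array length.
    pub const COUNT: usize = 5;
    // Listing the variants once also gives a Pascal-style full-range loop.
    pub const ALL: [LogLevel; Self::COUNT] =
        [LogLevel::Error, LogLevel::Warn, LogLevel::Info,
         LogLevel::Debug, LogLevel::Trace];
}

fn main() {
    // One counter slot per level, sized by the const.
    let mut counters = [0u32; LogLevel::COUNT];
    counters[LogLevel::Info as usize] += 1;
    for level in LogLevel::ALL {
        println!("{level:?}: {}", counters[level as usize]);
    }
}
```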
iOCTAGRAM@reddit
I do not quite get the part where a Rust array starts accepting an enumeration value as an index.
Last time I checked Rust, its arrays were retarded, something from before the Pascal era, something from before 1971. Rust arrays accepted only integers, and only 0-based. Something that makes one wonder where exactly Ada influenced Rust. I cannot pinpoint a single good thing from Ada that came to Rust.
Dean_Roddey@reddit
Someone beat me to it above. As usual, traits...
FreeHare7411@reddit
Ada? Never used it professionally. But I messed around with it in college a bit. It's kinda verbose isn't it? Felt like I was writing a novel to get anything done. Hear it's rock solid for safety critical stuff tho. Always thought that was pretty cool.
Quick-Frog3019@reddit
Ada's kinda wild. Used it briefly for a safety-critical system. It's verbose, sure, but you kinda appreciate the strictness when you're dealing with stuff where mistakes are bad. Don't see it much outside that niche tho, which is a shame I guess.
daidoji70@reddit
What a good article. I have never met an Ada programmer in the wild though. Do they exist somewhere?
gwern@reddit
Maybe. Personally, I saw 'quiet' in the title, and immediately plugged it into Pangram without bothering to read; yes, 100% AI.
Scroph@reddit
Yes, it has many of the typical tell-tale signs of the AI writing style.
dagbrown@reddit
lol look at this idiot using a clanker to do his thinking for him
programming-ModTeam@reddit
Your post or comment was removed for the following reason or reasons:
Your post or comment was overly uncivil.
drakythe@reddit
Even if there aren’t that many dedicated Ada developers, the article seems to be making the argument that Ada influenced most modern languages and the evolution of others. It’s right there in the subhead: “On Ada, the language that the Department of Defense built, the industry ignored, and every modern language quietly became”
I find the design process that the DoD engaged in to be the most interesting part of the article. It’s wild to think of spending five years slowly ironing out the requirements for a programming language and then having someone build the damn thing. I know that reaction is a symptom of the current “move fast and break things” world we live in, but I really want to know how the process played out in practice.
KagakuNinja@reddit
The only time in my life I have ever experienced true waterfall project management was in the military. Our small team had 2 people who mostly just handled the bureaucratic bullshit.
The first problem with the Ada specification was that it was extremely difficult to create a working compiler; maybe modern technology makes this much easier. As a result, compilers were proprietary and expensive. If I remember correctly, the Ada license for our MicroVAX running VMS was $40,000 in 1988, or about $10,000 per programmer.
The article casts shade on C, but that is the classic example of "worse is better" design, or if you prefer, "cathedral vs bazaar". C was built quickly by a small team that accomplished amazing things; they just needed a language so they could create UNIX. Ada was an unwieldy monolith designed by committee which took years just to create a working compiler.
To paraphrase Alan Kay, one of the problems of the "cathedral" design philosophy is that it takes so long to build a complete and correct system, by the time you finish, Moore's law has rendered the underlying computing model of the system obsolete. This may be part of the reason Ada never thrived outside of certain niche industries.
jodonoghue@reddit
I agree. The cost of compilers (and the machines to run them) made Ada prohibitively expensive for most projects until Gnat matured in the late 1990s.
By that time, for better or (most likely) worse, C and C++ were fully established as the languages to use for low-level work, and Linux was an entirely usable OS for many.
mpyne@reddit
No, it was for the better. We can disagree with the logic all we like but having an imperfect-but-present thing now is almost always better than being perennially just 5 years away from perfection.
Fusion power had been going to solve everything within just a couple of decades for six decades now. Now it's going to be made completely redundant by solar/wind + batteries, which, even if theoretically less awesome, are at least available now.
jodonoghue@reddit
In 1995, the answer was unambiguously “better”.
30 years later we have a massive legacy of C and C++ code riddled with vulnerabilities that the next generation of offensive LLMs is going to rip apart like a shark feeding frenzy. It won’t be pretty.
Arguably we should have started fixing things in the early 2000s, when better alternatives started to be freely (or cheaply) available. However…
…Technical debt, that lovely phrase humans use to justify doing what they already know is the wrong thing.
mpyne@reddit
And yet that's still better than the alternative of a 90s still organized by file cabinets rather than software, which was the actual alternative to having C, C++, Windows, UNIX.
If the LLM-driven security flaws get bad enough, we can go back to paper. You sort of hint in your lamenting that this would be a very bad outcome; all I can say is that I agree, which is why I'm glad we had something to get us away from paper a few decades ago.
Again, it's not even always wrong to ship something acceptable and then refactor it to make it better. In fact it's often better (even for the utilitarian model of "better"). An engineer's perceived opinion of the software quality is not the only relevant factor for whether it is valuable to its users overall. It's not a negligible factor either, ensuring tech debt remains low enough is how a software product ensures its long-term viability in the face of ever-changing use contexts, but there are multiple stakeholders who have a say in that, not just development engineering.
Like, this debate is being carried over a great deal of computers and network routers that are riddled with those insecure C and C++ libraries. Personally I'm glad for it, rather than waiting until I'm nearly dead in 30 years to be having the same discussion in whatever the future perfect Internet might have looked like.
KagakuNinja@reddit
One has to remember that back in the late 80s, the cost of many of those Ada language features was decreased performance.
Now we ship 1GB+ electron apps on mobile phones, no one cares. It mattered back then.
Using a proprietary compiler meant vendor lock-in, high cost, fewer options, and less innovation, compared to open languages like C, where there has always been a vibrant open source community.
buzmeg@reddit
Batteries and solar also got way, way, way, more research and development funding than fusion ever did.
Note the funding levels (aka "Fusion Never"): https://upload.wikimedia.org/wikipedia/commons/thumb/a/ab/U.S._historical_fusion_budget_vs._1976_ERDA_plan.png/1280px-U.S._historical_fusion_budget_vs._1976_ERDA_plan.png
mpyne@reddit
Hmm, now why might that be?
Is it potentially related to the fact that I could actually do something with both of those at scales microscopically smaller than grid-level energy? Again, this is just another example of why having something working now helps you have something better later.
You can trace the investment into solar technology all the way back to Bell Labs and find practical applications from hand calculators to satellites and now to grid-level applications that will literally change the climate of the entire planet.
Meanwhile fusion isn't going to get lots more money unless it can show some beginner or intermediate-level real application, especially now that its original use case has been well and truly addressed by competitors that managed to start tackling the problem decades ago.
_disengage_@reddit
"Move fast and break things" was always a terrible idea, pushed by intensely greedy, careless, and shortsighted people.
drakythe@reddit
Agreed. It has its place when prototyping, but that place is pretty specific, and far away from anything that involves human lives.
daidoji70@reddit
Oh man, that's par for the course in the Pentagon. There's a reason our MIC is both really great from a historical perspective and also costs a shit ton. They do things like this all the time.
Like Ada, though, I don't think those things leak outside the MIC into the commercial sphere.
SpiderJerusalem42@reddit
I had a professor who sat on the board that set the language standards for Ada.
louis_etn@reddit
Actual Ada developer here, using it for all my personal projects and also using it professionally, at a French aerospace startup.
marcodave@reddit
That explains everything.
There are two types of programming languages: the popular ones and the ones used by French people.
tomkeus@reddit
France has a massive aerospace technology sector, probably the biggest in the world when adjusted for country size, and Ada is still the go-to language for most mission- and safety-critical systems in aerospace.
yaurn@reddit
I am! Train industry.
Lemina@reddit
They do! There are definitely some in the DoD and at DoD contractors, and I believe in the automotive industry as well. I also know that NVIDIA has been using SPARK (formal verification subset of Ada) for cyber security purposes: https://www.adacore.com/case-studies/nvidia-adoption-of-spark-new-era-in-security-critical-software-development
OllyTrolly@reddit
In the aerospace industry there is use of Ada SPARK too, although it's something I think is generally being left behind, so it's interesting to hear of Nvidia adopting it!
BassKitty305017@reddit
In the late 20th Century, the University of Washington required 2 Ada classes before you could apply to their Computer Science department. That damn compiler would call everything a syntax error. But if you could actually get a clean build, then by God it would run. Once in the department, a lot of classes used C. That compiler would accept anything then you’d get, “segfault, lol. Add more print statements or fire up the debugger, bitch!”
panopticchaos@reddit
I’m not a dedicated Ada programmer but I’ve used it at work (did aerospace for awhile)
One thing I’ll call out was that it was very easy to pick up, especially since it was so similar to Pascal which was such a common teaching language back then. There were nuances that would trip people up but the ramp up for new SWEs for the team was shockingly fast. More akin to new Golang devs today rather than new Rust devs today. Again a big piece of this was that ‘everyone’ knew Pascal back then.
cmsj@reddit
Nominally I was, in that one of my years at university was based around Ada. I hated it. Having said that though, I now super appreciate all of the compiler guaranteed safety in Swift.
hoijarvi@reddit
I wrote some Ada code for Nokia's MPS 10 minicomputer as an intern in 1984. It was definitely the best language that I had tried.
Full-Spectral@reddit
I did some in the 80s. I worked at a place doing military projects. I was just a lowly tape-backup dude, but I would teach myself stuff at night. The guys doing the project argued that they couldn't do it in Ada because they couldn't access the graphics stuff, so they were doing it in Fortran.
I sat down one night and worked out how to do the graphics calls from Ada (just needed to set up some external function and type definitions) and did a little mockup of one of the UIs of their project. My boss took it to his next meeting and probably got a lot of satisfaction from throwing that at them.
daidoji70@reddit
Cool
I_hate_posting_here@reddit
I am an Ada programmer, and I'm wild.
boredcircuits@reddit
There's dozens of us!
oldfartMikey@reddit
Me too: Ada 83, between 30 and 40 years ago. 😁
daidoji70@reddit
https://media1.giphy.com/media/v1.Y2lkPTc5MGI3NjExZmZyOHhtcDk5MzFsOHZzMndhdzFra3d5amVhNHFyZXRxbGZ1enAxNyZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw/LNgtUZn56dtRu/giphy.gif
Every-Progress-1117@reddit
Had to learn Ada in university in the early 90s as the second language after Standard ML. Some of the staff were involved in the GNAT Ada compiler project at the time, so in a small way (mainly debugging) I've contributed to that.
Ada, however, for all its faults, is a great language. Verbose and strict, but made for good software engineering discipline (only beaten by Eiffel, IMHO).
A few years ago I started working with Go, and it reminds me very much of Ada (and Pascal).
BassKitty305017@reddit
University of Washington?
Ahri@reddit
It's admittedly a while since I wrote any Ada, but I'm very confused at why Go would remind you of it - why do you find them similar?
Snoo23482@reddit
But Go is another "worse is better" language. It is closer to C than to Ada.
jacobb11@reddit
An interesting but flawed article.
Some of the features claimed to be original to Ada already existed, if poorly adopted, in mainstream languages. I don't care to do the research to provide examples, cough cough, Lisp.
The article provides few dates for when various Ada features were actually available. It's all very well and good to specify amazing features, but when did Ada implement them?
I am also skeptical that Ada was ignored by language designers. It was part of my university curriculum (if not a large part).
Ontological_Gap@reddit
Ada is extremely cool, and the article is great, but this is simply false: "Ada's type system was, in 1983, unlike anything else in production use... The distinction that organises it is between a type and a subtype — not in the object-oriented sense of a type that extends another, but in the mathematical sense of a constrained set."
It does predate Common Lisp v1, but various other Lisps had been using constrained types for a long time by that point.
davidalayachew@reddit
The ability to bound numerics to a certain value was one of the things that stuck with me when learning Ada. I never got much further because the language was too verbose for my liking. But I kind of feel like it would be perfect for some heavily specified project where you know ahead of time all the constraints, but want to have the compiler confirm that they don't contradict. The only other language I know that makes that doable is Haskell, but that's a different story.
Signal-Woodpecker691@reddit
Ah, I used to do Ada for a few years, lovely language. Our instructor told us that once you had your types defined, the rest of the code practically wrote itself.
TheRealDocHawk@reddit
Ada's Dream, a recent well-regarded board game, has player colors of red, green, blue, and yellow. Neat.
unitedbsd@reddit
Some of you might also like this https://ironclad-os.org/