Ada Lovelace and the First Computer Algorithm
Posted by Successful_Bowl2564@reddit | programming | View on Reddit | 18 comments
davidalayachew@reddit
Related, but I always love reading about how Ada Lovelace was the world's first programmer ever.
Seeing how her code ran on Charles Babbage's hardware was what made me decide to turn programming from a hobby into a profession. The reason she is one of my biggest inspirations is that her code ran on a non-electric computer: just gears, levers, and wires, no electricity.
Understanding that electricity wasn't necessary, and was strictly a performance enhancement on an underlying concept was what allowed me to put my feet on the ground and really see the core of programming. It truly is just Set Theory and Boolean Arithmetic all the way down.
Once you can read and write data, it's just a matter of making it persistent, reading from that persistence layer, and feeding it to something that can consume it.
Once I finally understood the concept of computing from end-to-end like that, THAT is when this field finally became viable to me as an actual profession.
Ameisen@reddit
Given that the Analytical Engine was never built, it never ran. We can emulate it (and have, as the link shows) but she never ran it.
davidalayachew@reddit
"In Babbage's 1864 memoir he discusses the creation of the various notes with Lovelace, including note "G". He provided the mathematical formulas for the calculation of the Bernoulli numbers, which Ada converted into a step-by-step table of instructions for the analytical engine"
So Charles made the spec, and Ada implemented it, making her the programmer in this scenario.
Correct, but Note G is considered to be the first program because of its difficulty and significance.
I am sure that Charles wrote the calculator equivalent of "Hello World" and "2+3" before Ada, but calculating Bernoulli Numbers is what makes Note G worthy of earning her credit as the first programmer.
My college professor showed us a video of the code running on a machine, tracing each instruction and showing how it got the final answer. That's what I was referring to in my original comment. Maybe the machine I saw was a recreation?
Ameisen@reddit
The actual quote from Babbage's memoir does not match Wikipedia's interpretation of it:
For this one, Babbage himself worked it out in algebraic form (not quite instructions, as Babbage never created an explicit set of instructions - programs were described in his work as sets of states), which Lovelace found an error in and corrected.
For other illustrations, Lovelace worked out the forms herself, but not for Note G.
I'm legitimately unsure how the Wikipedia editor interpreted this quote the way that they did.
As said in his own memoir, he worked out the algebraic form specifically in Note G - Lovelace amended it with a fix for an error.
Why are you under the impression that he would design (on paper) a machine capable of arithmetic and control flow without having considered or conceptualized what those would be useful for?
The Analytical Engine has never been built. Portions of it have, but not the entire machine. Perhaps it was a video of the Difference Engine?
Matthew94@reddit
You love reading about her but somehow missed all the facts about her not actually being the first programmer?
davidalayachew@reddit
Are you talking about the ongoing debate about whether or not she is the first programmer? I am well aware of it, and was introduced to it almost as quickly as I was introduced to her.
Or maybe you are talking about Charles Babbage having made several other "programs" prior to her writing Note G?
"Historians like Allan G. Bromley have noted several dozen sample programs prepared by Babbage between 1837 and 1840 (all substantially predating the illustrative notes), though they were never published and were substantially simpler, which has led to, in popular culture, Note G being generally considered to be the first algorithm specifically for a computer, and Lovelace is considered as the first computer programmer."
So sure, Charles wrote the first "1+1" on the machine, but because of the significance and difficulty of Note G, Ada gets the credit as first programmer.
djnz-1999@reddit
His tables had some mathematical complexity, but what matters is what sort of machine they were programs for. Since the early AE design did not support conditional branching, the programs were "do this fixed list of arithmetic operations".
When you look at the machines actually built in the 20th century, the 1945 ENIAC is generally considered the first computer to have run, not the 1938 Zuse Z1, because the ENIAC had conditional branches and the Z1 did not. Likewise, IBM/Harvard's 1944 Automatic Sequence Controlled Calculator, which explicitly drew inspiration from Babbage, did not have conditional branches.
Babbage's first program, from 1837, was this:

```
v7   = v5v1
v1'  = v5v3
v3'  = v2v4
v2'  = v2v6
v4'  = v2' - v1'
v1'' = v7 - v3'
v3'' = v4' / v1''
```
Which he described in a comment at the top as: "Elimination between two equations of the first degree which is the equivalent to calculating the value of x from the eqn x = (bc' - b'c) / (b'a - ba') & y = (a'c - ac') / (b'a - ba')"
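Read that way, the table is Cramer's-rule elimination done one register at a time. A minimal Python sketch, assuming juxtaposition (e.g. v5v1) denotes multiplication, assuming the register assignment v1=a, v2=b, v3=c, v4=a', v5=b', v6=c', and reading the source equations in the form ax + by + c = 0 and a'x + b'y + c' = 0 (that is the one assignment that makes every intermediate line consistent with the final quotient):

```python
# Hypothetical reading of Babbage's 1837 table: juxtaposition means
# multiplication, and registers v1..v6 hold a, b, c, a', b', c' for
# the equations  ax + by + c = 0  and  a'x + b'y + c' = 0.
def eliminate_x(a, b, c, a_p, b_p, c_p):
    v1, v2, v3, v4, v5, v6 = a, b, c, a_p, b_p, c_p
    v7   = v5 * v1     # b'a
    v1p  = v5 * v3     # b'c          (v1')
    v3p  = v2 * v4     # ba'          (v3')
    v2p  = v2 * v6     # bc'          (v2')
    v4p  = v2p - v1p   # bc' - b'c    (v4')
    v1pp = v7 - v3p    # b'a - ba'    (v1'')
    return v4p / v1pp  # x = (bc' - b'c) / (b'a - ba')

# x + y - 5 = 0 and x - y + 1 = 0 have the solution x = 2, y = 3:
print(eliminate_x(1, 1, -5, 1, -1, 1))  # → 2.0
```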
It's fascinating, but it and his other early complete tables were for an early design, when the AE was still just an automatic calculator, only capable of running a fixed series of operations.
Calculating the Bernoulli numbers required a fully Turing-complete computer. And if you read Lovelace's Notes, she analyzes the nested loops in her program to show the different number of operations performed for different iterations.
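To get a feel for what those nested loops do, here is a minimal Python sketch, not Lovelace's actual table (which uses a different variable layout and sign convention), computing Bernoulli numbers via the standard recurrence B_n = -1/(n+1) · Σ_{k=0}^{n-1} C(n+1, k) · B_k:

```python
# Minimal sketch of Bernoulli numbers via nested loops; this is NOT
# Lovelace's exact program, just the same recurrence structure.
from fractions import Fraction
from math import comb

def bernoulli(n):
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):    # outer loop: compute B_1 .. B_n in turn
        acc = Fraction(0)
        for k in range(m):       # inner loop: accumulate earlier terms
            acc += comb(m + 1, k) * B[k]
        B[m] = -acc / (m + 1)
    return B

print(bernoulli(6))  # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```

Note how the inner loop grows with each outer iteration, which is exactly the kind of iteration-dependent operation count Lovelace analyzes in her Notes.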
Babbage did design a mechanism for conditional branches; he just never wrote a full program that used it. I think that's because he felt Lovelace's program using it was enough to prove its usefulness.
classy_barbarian@reddit
Yeah, I totally agree. When I first understood this I found it absolutely mind-blowing. When you realize the first true computers ever designed were completely mechanical, no electricity involved, in the mid-1800s, and yet it's still a frickin Turing-complete architecture... it really puts into perspective how all this shit actually works.
Regarding Ada Lovelace: she was absolutely a genius and her work advanced computer science by decades, but I've come to see the specific claim that she was the first programmer ever as a bit odd, given that Babbage actually invented the programming language she was using to do the programming. It's a funny, nuanced situation, because Lovelace was actually a better programmer than Babbage. Their contributions are inseparable; it's entirely possible that if she had not written her Notes on the Analytical Engine in 1843, Babbage would not be nearly as famous as he is now. And the specific revelation that computer programs could model any simulation came from Lovelace. But I think most people here would agree that inventing a programming language is enough to make someone qualify as a programmer, and technically speaking that would make Babbage the first programmer and Lovelace the second, even if Lovelace's contribution to programming as its own field (separate from computer engineering) was larger than Babbage's.
So it's a complex situation, and I've come to think it's hard to attribute being the first programmer to either Lovelace or Babbage alone. Stephen Wolfram (the Wolfram Alpha guy) wrote an article about Ada Lovelace that covers all of this extensively; that's where I personally learned much of it. You can read it here:
https://writings.stephenwolfram.com/2015/12/untangling-the-tale-of-ada-lovelace/
davidalayachew@reddit
I consider her the first programmer because she wrote Note G. It is considered to be the first algorithm, and I think that there is a meaningful distinction between calculating Bernoulli numbers vs doing the calculator equivalent of "Hello World".
Matthew94@reddit
Care to substantiate these claims?
Ameisen@reddit
According to Babbage's memoir, he had written Note G - he had shared it in correspondence with Ada Lovelace who submitted a fix for a bug as an addendum.
So, Babbage's original program had two bugs.
serviscope_minor@reddit
It's interesting because Babbage didn't realise the monster he'd created. He was envisioning a machine for computing tables of numbers better than doing it by hand; Lovelace realised it was something rather more general than that.
Rattle22@reddit
I have a framed print of Ada because I consider her revelation of generalized computing so powerful.
SirDale@reddit
I’ve programmed in both Ada and Babbage. I’d have to say Ada is a lot better known!
Ameisen@reddit
Note G was written by Babbage himself and then shared with Lovelace - he makes this clear in his memoir. Lovelace submitted an addendum fixing a bug in it.
Babbage had also written dozens of other algorithms prior to it.
This pop culture fascination and misunderstanding frustrates me - she was very intelligent and remarkable, but I really dislike the twisting of historical facts.
Matthew94@reddit
It’s amazing how people think Babbage would go to the effort of designing a machine and then not consider how it could be used at all, instead relying on some random socialite to tell him how his own invention could be used.
djnz-1999@reddit
Weird. Babbage, in July 1843, commented on Lovelace's Notes while she was revising them: "the more I read your Notes the more surprised I am at them and regret not having earlier explored so rich a vein of the noblest metal"
Certainly if you wanted to know the specifics of exactly how the individual operations were to be constructed, Babbage would have been the one to talk to. But Babbage, himself, considered Lovelace's commentary on the future of software to go beyond his thoughts.
djnz-1999@reddit
For an in-depth look at Babbage's earlier tables and comparison to Lovelace's Note G table, I encourage anyone wondering about the first program/programmer to read this: https://pairdebuggingwithlovelace.hashnode.dev/lovelaces-program-part-9-the-first-computer-program
A few of Babbage's earlier tables were complete enough that they could be run, but those were all for an early design of the AE that was not Turing complete (no conditional branching). Non-Turing-complete machines are generally considered automatic calculators or some other term, not "computers". Lovelace's program was for an anticipated Turing-complete AE design that Babbage was working on in 1843, so it was a "computer program" by most reasonable definitions. Unless you require that it was actually run on a real computer, in which case you'd have to look 100+ years ahead.
Babbage's tables are fascinating nonetheless, and worth reading; check out the link.