Dual-Processor ALPHA CPU Module from DEC / DIGITAL from the AltaVista era - a piece of art!
Posted by Laser_Krypton7000@reddit | vintagecomputing | 50 comments
This is a dual 300 MHz / 4 MB cache EV5 21164 Alpha CPU module from the AlphaServer 8400 generation, the same family used in the AltaVista search engine.
lolerwoman@reddit
Everything made by DEC was a piece of art. It was such a shame that Compaq had to buy them. And then it was a shame that HP had to buy Compaq.
Adromedae@reddit
Well. DEC was pretty much about to go out of business by the time Compaq bought them.
NamelessVegetable@reddit
That's a common myth. By the time Compaq bought them, DEC was actually profitable. Their prospects might not have been good, but that was solely because Robert Palmer was a feckless CEO. For example, Google's founders offered to sell Google's technology to DEC, because AltaVista was the best search engine at the time. They didn't know that DEC had a ban on acquisitions because they were preparing to sell to Compaq. If DEC's management had realized that search engines would one day create entities like Alphabet, things could have been very different. One almost wishes that a Silicon Valley tech bro was the CEO (for the confidence and initiative, not the bullshit).
Adromedae@reddit
DEC was far from profitable in '98. They had to spin off a big chunk of the company just to stay afloat. It was a sinking ship; by the time they got taken over, DEC was producing half the sales revenue of Compaq with 60% more employees.
The main thing that made DEC attractive for acquisition was their services and support division, which sadly had the least to do with the classic old DEC.
Alpha was, ironically, one of the things that basically ended up choking DEC to death.
NamelessVegetable@reddit
I don't believe there were full-year results for DEC's FY 1998, because the Compaq acquisition closed before the end of the FY. The first Google result I clicked on says that for FY 1997, DEC made $140.9 million on revenue of $13.0 billion; FY 1996 brought a loss of $111.8 million on revenue of $14.6 billion. These are not catastrophic results. DEC wasn't a Toshiba.
I would say it was their marketing (or lack thereof).
Adromedae@reddit
Nah. Alpha and their fabs really did them in.
A lot of people in these sorts of subs don't tend to understand the design costs involved. By the time DEC implemented their out-of-order uArch, the costs had skyrocketed to the point that there was simply no way they could keep going any further with it. The market size for Alpha was not growing anywhere near as fast as design and fabrication costs. So those sunk costs and obligations started to make the slim profits basically meaningless, in terms of allowing an organization of that size to continue existing.
This ended up being a similar trend for many RISC/UNIX vendors of that era. DEC was just the canary in the coal mine, in keeping with their pioneering tradition ;-)
Companies tend to live by the sword and die by the sword. DEC lived by the mini and died by the mini. They simply lacked the company culture to go for economies of scale and truly embrace that computing had become commoditized by the mid 90s.
There are lots of interviews with Ken Olsen where it is clear that, as brilliant an engineer as he may have been, he simply did not understand the aforementioned point.
Marketing being also a part of said company culture.
We're seeing another iteration of this cycle with Intel, for example. They lived by the micro/PC and will likely be zombified by the micro/PC. They simply did not get the memo regarding mobile and AI, because their company culture prevented them from seeing these trends.
And now you start to see the mobile SoC guys taking over. They have much larger economies of scale, and thus they can invest more in higher levels of integration. To the point that now we have phone CPUs taking desktop cores to the woodshed.
NamelessVegetable@reddit
Fabs? DEC only built one modern CMOS fab for Alpha, in Hudson (an existing site), in the early 1990s. Its cost was several hundred million. The only other fab DEC had, in Ayr, Scotland, was an existing one that was sold off in the early 1990s, so it isn't really relevant to the question at hand. The cumulative losses from the Hudson fab over its short operating life (it was transferred to Intel in 1997) are surely dwarfed by the losses incurred by Ken Olsen's pet project, the VAX 9000, which cost a couple of billion (the figure I recall is 4; it's probably wrong) and made only a few hundred million in revenue.
The design costs for Alpha were not that substantial. The Alpha 21264 that you refer to used a difficult logic technology (footless domino logic, I think), but was designed by a relatively small team, suffered only a minor schedule slip of a couple of months, and the first silicon more or less worked as expected. I wouldn't characterize its development as particularly strained. Intel is going through worse today with delays and cancellations. The 21264 was then ported to IBM and Samsung process technologies (contrary to some reports, Intel @ Hudson wasn't actually the primary supplier of Alphas after 1997). Simultaneous with those efforts were the Alpha 21364 and the EV8. This is the picture of an effort that, while disrupted (DEC had planned future Alphas to use their own technology, but didn't acquire the equipment suitable for 0.18 micron and smaller, and so ended their own technology development at 0.25 micron), was more or less healthy (and actually better than what was going on at Sun Microsystems, a highly profitable dot-com darling at the time).
DEC's silly matrix business structure and internal acquisition model, and Ken Olsen's fondness for the VAX/VMS platform, had been identified within DEC as serious problems in the 1980s. The situation, though, IMO was not as ossified as you make it out to be. DEC did several RISC studies in the mid 1980s, the PRISM VAX successor almost got out the door, and a team on the West Coast had the foresight to put out a RISC workstation/server (DECstation/server) IBM PC-style (design first, "apologize" later) while Olsen was still fretting about whether to stick with the VAX or go with something else.
BK went on an acquisition spree and tried to expand into everything (FPGAs, GPUs, AI, self-driving cars, drones, etc.). He did fail miserably on most of these things, but I don't think you can say Intel didn't see those trends. They saw all the trends, but couldn't execute.
I'm not seeing mobile as having higher levels of integration from where I'm sitting. The cutting edge is in the server and accelerator spaces. There are many large designs, chiplets, interposers, 3D integration for logic/memory and DRAM (HBM), etc. Advanced networking ASICs with the interfaces to match, etc. This space is driving the very best in semiconductor technology, and seems to be motivating much development in computer architecture and organization. It seems to me that there has been a split, with the server/accelerator space pulling ahead of the client space.
I also suspect that the claim of mobile processors having parity with desktop processors is something one only actually sees on a very limited subset of client applications. I don't often look at the organization of those processors, but what I've seen does not lead me to think that they have performance parity with the likes of Xeon Scalables or EPYCs.
A similar situation existed back in the day of the Alphas. An x86 processor might come close to an Alpha (or some other RISC processor) on SPEC, but would actually be several times worse on a demanding real-world application.
Adromedae@reddit
"The design costs for Alpha were not that substantial. The Alpha 21264 that you refer to used a difficult logic technology (footless domino logic, I think), but was designed by a relatively small team, only suffered only a minor schedule slip of a couple of months, and the first silicon more or less worked as expected. I wouldn't characterize its development as particularly strained. "
EV6 and EV7 were close to $600 million to design and validate alone.
NamelessVegetable@reddit
The EV6 started when? Before the EV5 was near or at completion? Took about 4 to 5 years to finish? Was one of the world's most advanced microprocessors when it was introduced in 1997?
The EV7 was c. 2000? It was a SoC with the L2 cache, memory controllers, and mesh network logic all on-die; all the parts needed for a high-end enterprise server. Logic, which if not integrated, would have required a sophisticated ASIC which would probably have required substantial funding to design anyway.
Your $600 million figure is spread over several years. If it's correct, that's roughly $100 million per year, which is what I'd expect for high-end, leading-edge mid-to-late-1990s microprocessor development. This is on an annual R&D budget of over $1 billion, for a product segment (Alpha-based computers) which, in the 1996 to 1998 timeframe, was a third of DEC's HW revenues (themselves roughly half of the $13 to 14 billion total during that timeframe), with a profit margin of roughly 30%. It seems completely reasonable to me.
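To make the arithmetic explicit, here's a quick back-of-envelope sketch in Python (the dollar figures are the ones quoted in this thread; the six-year amortization window and the use of the revenue midpoint are my own assumptions):

```python
# Back-of-envelope check of the figures discussed above.
# Assumption: the ~$600M EV6+EV7 design/validation cost is spread over ~6 years.
design_cost_total = 600e6   # EV6 + EV7 design and validation, per the parent comment
years = 6                   # assumed amortization window
design_cost_per_year = design_cost_total / years          # ~$100M/year

dec_revenue = 13.5e9        # midpoint of the $13-14B FY1996-1998 revenue range
hw_share = 0.5              # hardware was roughly half of total revenue
alpha_share = 1 / 3         # Alpha systems were roughly a third of HW revenue
margin = 0.30               # quoted profit margin on Alpha-based systems

alpha_revenue = dec_revenue * hw_share * alpha_share      # ~$2.25B/year
alpha_profit = alpha_revenue * margin                     # ~$675M/year

print(f"Design spend:  ${design_cost_per_year / 1e6:.0f}M/year")
print(f"Alpha revenue: ${alpha_revenue / 1e9:.2f}B/year")
print(f"Alpha profit:  ${alpha_profit / 1e6:.0f}M/year")
```

On those assumptions, ~$100M/year of design spend against ~$675M/year of segment profit is why the investment looks reasonable to me.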
Adromedae@reddit
No. That $600+ million was for design and validation alone. By the time the chip was manufactured and brought up, it was over $1 billion.
For the EV8, design and validation alone was costing as much as EV6/EV7 combined. Which was the issue: design costs were growing almost exponentially.
Tru64 and OpenVMS revenues were not growing anywhere near fast enough to make the investment even remotely reasonable.
By the late 90s DEC wasn't really making much in terms of margins. They were in and out of the red from '91 on.
NamelessVegetable@reddit
Is that annual or cumulative? There's absolutely no way DEC was spending hundreds of millions on design and validation alone per year when its annual R&D spend was around a billion. Even Compaq, which actually increased investment in the Alpha platform (e.g. the new R&D center in Barcelona for the EV8), was only spending several hundred million per year on R&D when they gave up in 2001.
May I ask where you get your figures from? The ones from my previous post were gleaned from DEC's own annual reports.
It wasn't so much Windows because Alpha never really targeted that segment of the market despite noises from DEC and Microsoft (Windows NT on Alpha). Alpha was always positioned at the enterprise market, and it was the Itanium hype that did it in. Compaq cited Itanium surpassing Alpha in performance as one of two reasons for discontinuing it in 2001 (the other was cost, as you mentioned). Even though in reality, Itanium proved to be a profound commercial, performance, and computer architecture theory disaster, especially for HP.
Adromedae@reddit
I get my figures from having been in the industry and having been part of the managing team for several architectures.
A lot of people on these subs musing about CPUs really are not aware of the figures involved in developing a competitive high-performance design.
FWIW IA64 didn't kill AXP. Period.
AXP was pretty much dead by the time Compaq bought DEC and saw the financials.
Same thing happened with SGI/MIPS, and eventually SUN/SPARC.
They were architectures whose markets were not growing fast enough to make the ever-increasing development costs worth it. Especially when competitors started to pass them by in performance, at a fraction of the cost.
The entire industry was well aware of these increasing costs as far back as the late 80s, when it was obvious that superscalar and out-of-order execution were going to be really expensive in terms of development.
MIPS almost went under trying to get their out-of-order arch out the door; they had to be rescued by SGI, and neither org ever really recovered.
Sun was literally never able to push an out-of-order design out the door, and they basically died trying to make Rock (only Fujitsu was able to execute in that regard when it came to SPARC).
IBM had to leverage PPC and their mainframes, reusing as much as they could, to keep POWER a thing, but they are starting to think twice nowadays.
HP also saw that, which is why they figured they couldn't go past PA-RISC 2.0 on their own, so they chose to partner with Intel. For better or worse, they decided on PA 3.0 being an explicitly wide design (initially called PA-WIDE), which eventually became IA-64.
The point is that everybody in the industry knew the economics involved, and it was a given that most proprietary architectures were going to be unable to be developed further. So under that lens, Itanium made some sort of sense as a path to a 64-bit architecture where somebody else (Intel) could eat most of the development costs, while giving the platform a collective economy of scale.
What happened was that a lot of those vendors did not see AMD's K8 coming, which ended up being a much better alternative.
Nowadays we're starting to see similar dynamics, with AMD and Intel having to compete against ARM cores that are starting to be more performant while accessing larger economies of scale (and thus being able to tap more development investment).
NamelessVegetable@reddit
Are those figures proprietary and not in the public domain? I'm not necessarily doubting you here, I'm certainly aware that my knowledge of DEC's inner workings is that of an outsider, I'm just curious if I can add to my knowledge.
I don't really get that impression. People here mainly focus on the technology.
The problem wasn't so much the TAM as it was the fact that the RISC vendors all competed with each other instead of with x86. The Microprocessor Report noted this phenomenon back then. Sun even went around killing off SPARC clone makers during the first half of the 1990s, at the same time they were promoting SPARC as an open platform suitable for cloning!
MIPS wasn't the sole factor. SGI went on a wild adventure in the mid-1990s with the acquisition of Cray Research. They spent hundreds of millions on that, and many millions more trying to combine aspects of both Cray and SGI systems. And this is just the money, to say nothing of the time and effort.
Sun's record of processor development didn't inspire confidence for reasons other than superscalar execution and dynamic instruction scheduling. Almost every single one of their processors suffered from severe problems; only the UltraSPARC I and MicroSPARCs (a simple pipelined scalar design) did not.
Only relative to the microprocessors of the 1980s, when the state-of-the-art technology was pipelining. I dare say a significant portion of the "high" cost was due to the 1990s and early 2000s penchant for full-custom dynamic logic. IBM went with standard cells and spent the money on process technology instead, achieving a respectable lead in some aspects over everyone else (including AMD & Intel); SOI and low-k dielectrics, for example, were featured in IBM technologies a year or so before Intel and AMD got them (AMD actually did joint development with IBM in the late 1990s).
From what I've seen, that only started to happen in the mid-2000s when the POWER6 and z196 (the model might be wrong) started sharing more low-level IP. In the 1990s POWER/PowerPC and mainframe microprocessors were very divergent in their design and designers. Little was shared beyond process technology, standard cells, and the design methodology/tools. Mainframe microprocessors didn't even get superscalar execution and dynamic instruction scheduling until 2003 with the z990 (the name might be wrong).
That was more of a technology & fab issue, I think. HP didn't want the expense of developing their own proprietary CMOS technology and continually upgrading their fabs to chase smaller feature sizes. It should be noted they continued to design the first-generation Itanium (Merced) during the second half of the 1990s, until that team was transferred to Intel.
Except Itanium made no sense whatsoever economically and technically. This was known in some technical circles back when the vast majority of analysts were insisting that it could deliver what was claimed at a lower cost. Itanium merely required more complex and expensive compiler development while making barely any reductions in design cost. Itanium was as "hard" to design for as any non-VLIW/EPIC architecture. They ran into the same logic and circuit challenges as everyone else. The first Itaniums used plenty of the same expensive full-custom dynamic logic design as x86 (these had different logic styles though) and AMD (I believe the K8 et al. used dynamic logic; I could be wrong here).
From what I've seen, it's hard to compare the cost of AMD and Intel processors with ARM cores, because most ARM cores are delivered as RTL, not macros. ARM licensees will do their own thing with the lower-level implementation/realization. That's not what AMD and Intel do. There will be lots of redundancy in R&D expenditure for ARM: ARM designs the architecture, organization, RTL, and maybe the physical design. Licensees will realize an ARM core from the RTL, or they might base their physical design on ARM's and make extensive modifications themselves.
Adromedae@reddit
By all means, feel free to go out of your way to ignore the basic point and not gain an education from the source. Cheers.
lolerwoman@reddit
Perfection isn’t cheap.
Adromedae@reddit
DEC was far from perfect. Ergo, they went out of business.
lolerwoman@reddit
The engineering was perfect. The decisions made by the executives were not.
Adromedae@reddit
Businesses are about making money, not technical perfection (which DEC was far from achieving anyway).
A lot of engineers seem to miss that point.
lolerwoman@reddit
I agree with you. But remember to stay on topic: the hardware was a piece of art; perfect.
Adromedae@reddit
LOL. No it wasn't.
ElevatorGuy85@reddit
It felt like DEC couldn’t market themselves out of a wet paper bag ….
Adromedae@reddit
It was more the simple dynamics of design costs.
That is why so many RISC architectures of the 90s ended up going belly up.
Design costs increase way faster than revenue if you don't have access to economies of scale. Eventually you're out of business, because your target market is nowhere near large enough to justify the design investment in terms of the margins you'll end up getting.
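A purely illustrative sketch of that dynamic, with hypothetical numbers chosen only to show the shape of the problem (not actual figures for any vendor):

```python
# Hypothetical toy model: design cost roughly doubling per generation
# vs. a niche market's profit growing only ~10% per generation.
design_cost = 100e6   # generation-0 design cost (hypothetical)
profit = 400e6        # generation-0 profit from the niche market (hypothetical)

for gen in range(6):
    status = "viable" if profit > design_cost else "UNDERWATER"
    print(f"gen {gen}: design ${design_cost / 1e6:>6.0f}M | "
          f"profit ${profit / 1e6:>6.0f}M | {status}")
    design_cost *= 2.0   # design costs roughly double each generation
    profit *= 1.10       # niche market profit grows only ~10% per generation
```

Under these made-up numbers the design bill overtakes the segment's profit by the fourth generation (gen 3); the only ways out are a bigger market (economies of scale) or someone else footing the design bill, which is exactly the Itanium pitch described elsewhere in this thread.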
phasefournow@reddit
The final nail in the coffin for DEC was the Rainbow, DEC's belated attempt at a PC, running CP/M while Windows 95 was wowing the world. They made a bunch of them and later had to pay almost as much to dispose of all the unsold units.
Adromedae@reddit
LOL. What? The Rainbow was from the early 80s, way before Windows 95.
phasefournow@reddit
You are correct. I deleted my comment.
PAPPP@reddit
DEC's management didn't understand that Microcomputers were going to eat the Minicomputer market until they were already doomed. The resulting diaspora turned out to be pretty consequential.
Many of their engineers understood. Dave Cutler, who was the architect of the RSX-11 and VMS operating systems for them, was off leading a team developing future high-end microcomputers in Seattle by the early 80s, and they produced a fleshed-out PRISM architecture and MICA operating system by the mid 1980s, contemporaneous with the first-generation high-performance RISC designs.
Then in 1988, DEC canceled the whole project, largely because they didn't want high-end workstations competing with their VAX minicomputer cash cow. Cutler got pissed and decamped for Microsoft (who were then embroiled in OS development hell with IBM on OS/2), taking 20 members of his former DEC team with him, and there they designed most of Windows NT. As the joke goes, it may not be a coincidence that WNT is VMS incremented.
Then DEC realized they needed their own high-performance microcomputer design and came out with the Alpha in the early 90s, and it is ...similar... to PRISM, to the point that folks half-jokingly suggest that the "AXP" part name for Alphas stood for "Almost eXactly PRISM", because they just reheated the design.
Later, in 2001, when Alpha development was being killed in favor of Itanium (we all know how well things went with the Itanic), many of the Alpha designers were hired by AMD, where they exerted a ton of design influence, so much so that the original K7 Athlons use the bus from the EV6 Alpha under license, and the original K8 (AMD64) parts show an awful lot of design similarity to the published plans for canceled future Alpha designs.
Xenabeatch@reddit
You should write a book. I’d read one just about Cutler.
PAPPP@reddit
It's actually on my long-term TODO list; I'm computer engineering instructional faculty and constantly find myself quoting Alan Kay on "The lack of interest, the disdain for history is what makes computing not-quite-a-field." I'm hoping to offer a computer history course as an elective some semester, but all my time has been soaked up with other things lately.
I want to do a bunch of selected-readings-and-experiences things with the assignments. Make some iPad babies experience obstinate old technology. Read some excerpts from Wichary's "Shift Happens", Emerson's "Reading Writing Interfaces", and/or Kirschenbaum's "Track Changes" and compose some text by answering prompts on a series of devices: manual typewriter, electric typewriter, early word processor, early modern word processor. Take a guided look at an old microcomputer trainer and its documentation, hand-assemble some simple program, and enter it on a hex keypad. Read some passages from "Where Wizards Stay Up Late" and log into an (emulated) TENEX box on a hardcopy terminal. Try to do some tasks on early consumer micros. Etc.
Overview books for computer history are few and flawed; Williams' "A History of Computing Technology" is delightful for early stuff but hasn't been updated since 1997 and stopped early even then, and I'm not overly fond of Ceruzzi's "A History of Modern Computing", so it seems like a thing to work on.
I at least have a couple of articles and long-form pieces in progress. I have a piece about the major OS developments of 1985-2005, partly researched and written, that I've posted pieces of online and that will end up "somewhere, eventually", and a mostly-finished article on the history of terminal multiplexers (think screen and tmux; they're probably younger than you think, and probably younger than GUIs running multiple terminal emulators, unless some never-substantiated claims from emacs enthusiasts are true) that I'll probably stick online soon.
Also, I'm probably not qualified to write it, but I'd read a whole book about Cutler, he's a fascinating dude.
He's probably the #1 UNIX hater, and even as largely a *nix guy myself, I think he makes a great argument.
He gave a very long filmed interview on Dave's Garage about a year ago that is super interesting.
Xenabeatch@reddit
Cheers for the tidbits and link. I have read one account of NT development, but I find Cutler's career as a whole fascinating, including whatever his role in Azure was. Remarkable man, still working at 82! I wasn't aware of the UNIX comparisons - *nix is an entire history in and of itself, so a bit weird. Paid and maintained services vs open source would be a more fertile ground to engage in... you would think.
TelluricSine@reddit
Check out "Showstopper!" by G Zachary. It's all about Cutler and his team developing Windows NT.
Xenabeatch@reddit
Cheers.
BrakkeBama@reddit
For real. That would read as some actual Micro-prose. For the ages.
itsasnowconemachine@reddit
DEC also dumped over a billion dollars into the high-end VAX 9000 with expensive, high-power ECL, when DEC engineers knew that it would be outperformed by CMOS chips very soon. A $1 million+ VAX 9000 ended up not performing much better than an NVAX meant for a VAXstation.
dagelijksestijl@reddit
The Itanic didn't even need to be released to accomplish its goal of killing nearly all high-end RISC architectures but Power(PC).
PAPPP@reddit
The amount of money and effort thrown into Itanium by HP and Intel, and Intel's lack of investment in a competent future for their x86 lines during that era, makes me think they weren't just using it as a bulldozer to kill off all the high-end competition, they actually believed in it.
Intel has a surprisingly bad record on big architecture projects. They also believed in the iAPX 432 (which was frankly a very similar story), and to a lesser degree in the i860, and believed that NetBurst-style many-small-stage-pipeline architectures were going to scale to like 5 GHz in the early 2000s without dying on power or misprediction, and... yeah, none of those worked out.
dagelijksestijl@reddit
NetBurst was their vision of a competent future for x86, they just had the marketing department running the show until not long after Prescott's tapeout.
The marketing department would have been told to zip it much earlier had 130nm/90nm been delayed.
phasefournow@reddit
DEC and the annual DEC-World exhibition had gotten so big that Boston didn't have enough hotel rooms. One year in the mid-90s, DEC chartered the QE-2 to berth in Boston Harbor for the week to provide added rooms. From that peak to out of business in 5 years.
mycall@reddit
Running Windows 2000 beta 3.
NamelessVegetable@reddit
It's beautiful!
GonWaki@reddit
MicroVAX, PDP series, and Alphas served their place. I always liked DEC products, mostly because of their quirks — naming conventions, DECnet, software. Fun while it lasted, but newer and more flexible equipment at a fraction of the cost killed off the compusaur.
There is still a Digital Place in Forest Hills, PA (east of Pittsburgh) where the regional DEC office used to be.
PAPPP@reddit
Let's see, digging through the markings and DEC name equivalence tables: the board is marked E2056, visibly two CPUs, so that's an E2056-DA, a KN7CC-AB LaserBus dual-CPU module carrying a pair of 21164 (EV5) processors?
I think that would have come out of an AlphaServer 8200/8400? which would line up with the early 1996 date codes on most of the chips.
That thing was a capital investment.
Laser_Krypton7000@reddit (OP)
Yes, fully equipped those were in the low seven-digit range moneywise.
I do have fully functioning ones in my collection :-)
TerminalCancerMan@reddit
I sometimes miss my Alpha. Sometimes.
CornerProfessional34@reddit
Found you some RAM for it: https://www.reddit.com/r/vintagecomputing/comments/1d127nb/one_gigabyte_of_ram_dec_alpha_gs60e/
ceojp@reddit
Nice.
SAIYAN48@reddit
Looks similar to the Plexus that Adrian has.
Laser_Krypton7000@reddit (OP)
???
Do you have a pointer?
SAIYAN48@reddit
https://www.youtube.com/watch?v=lBprWU9cHXs
earthforce_1@reddit
That would have been a god machine back in the day
Kellerkind_Fritz@reddit
laserBUS!