Why and how was the 16-bit to 32-bit transition so much smoother and quicker than 32-bit to 64-bit?
Posted by Appropriate_Fig_3516@reddit | vintagecomputing | View on Reddit | 90 comments
Azipcoder@reddit
16 to 32 was not that smooth and quick. Cost of the hardware played a huge role. In the money of the day, let’s say a 286 was $800 USD to build. Well, a 386 might be $2000 or more.
The biggest obvious advantage of the 16 to 32 bit was in audio and video. So, your 286 game would look like garbage next to your 386 game, and your 286 sound board and video card would still look and sound like garbage in a 386. It was like skipping console generations. Backwards compatibility was challenging and forwards compatibility didn’t improve the experience much.
Development was slow and developers were still counting memory usage. The first generation of 32 bit games was basically demo software. It took a while for development to catch up because the majority of machines were still 16 bit.
IMO 32 to 64 was smoother, but I also had more money by then.
Deksor@reddit
Well not really, the 286 performs about as well as the 386 in equivalent tasks. The 386 wasn't a performance-boost generation at all. But it was the root of a software revolution for x86.
No more segmented memory, protected mode (the 286 had one too, but much more limited), virtual 8086, 4gb of address space. This simplified programming a lot and allowed for far more advanced systems (which allowed for windows 9x and NT to appear, do multitasking, etc).
I don't think I can make a console analogy, but I can make a simple one.
286 is like a flathead screwdriver. 386 is like a screwdriver with replaceable bits with a set of bits. You'll be just as fast to unscrew a flat head screw with the flat head screwdriver or the screwdriver with replaceable bits. But the latter is way more versatile and will make your job a lot easier when you encounter a Phillips head screw! But it's not like you got a motorised screwdriver.
2748seiceps@reddit
Doubling the bus width, doubling the instruction pipeline, and moving the MMU on-die did have performance boosts when compilers started optimizing for it.
Sure, a piece of software made for a 286 would run roughly equivalent on a 386, but you could say the same thing about a piece of 486 software running on a Pentium. Then comes along something like Quake and changes the game completely.
redditshreadit@reddit
32-bit operating systems and software were more readily available than 64-bit, during each transition.
thelimeisgreen@reddit
I was there. It wasn't smoother. The big difference is there were far fewer computers in the world, and most non-technical home PC users had very little idea about any of this. Their 16bit Intel '286 couldn't run any new 32bit software, but it was somewhat of a non-issue. A lot of common apps at the time persisted as 16bit and ran just fine through Windows or OS/2 compatibility modes. MS-DOS itself was 16bit, but 32bit extenders and protected mode unlocked 32bit functionality on the newer hardware. So the "compatibility mode" was actually for running the newer 32bit software on the 32bit hardware, which was being crippled by a legacy OS.
32bit transitions for other platforms went a bit differently. For Apple, they went 32bit much earlier when they adopted the Motorola 68000 series CPUs with the Lisa computer. In 1984 the first Macintosh was 32bit for the CPU, but had a 16bit system architecture, as did a lot of early 32bit computers. Like the Atari ST and Commodore Amiga.
32bit was common for a lot of the more expensive workstations and minicomputers of the day, many of them running some Unix variant, which allowed applications to be mostly just recompiled without too much effort. Assuming the code and support were available to do so.
ken_the_boxer@reddit
In my experience the single most important point for the 386 was memory addressing. 640K simply wasn’t enough for anyone anymore.
msthe_student@reddit
fwiw the 640K limit could be busted even in real mode; theoretically you could use nearly the full MB. As for the 286, I think the problem was that once it entered protected mode it had no way to get back to real-mode code short of a CPU reset, so it was an all-or-nothing proposition
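The real-mode arithmetic behind that is small enough to sketch. This is a toy model of 8086 segment:offset addressing (the function name is mine), showing how the top combination already reaches ~64 KB past 1 MB (the HMA, reachable on a 286+ with the A20 line enabled):

```python
def real_mode_addr(segment: int, offset: int) -> int:
    """8086 real-mode physical address: segment * 16 + offset.
    On a 20-bit bus this wraps at 1 MB; with A20 enabled (286+),
    the highest combination lands just past 1 MB."""
    return (segment << 4) + offset

# Conventional memory tops out below the video/ROM area at A000:0000:
print(hex(real_mode_addr(0xA000, 0x0000)))  # 0xa0000 (the 640K line)
# The highest reachable address, FFFF:FFFF, is ~64 KB above 1 MB:
print(hex(real_mode_addr(0xFFFF, 0xFFFF)))  # 0x10ffef (the HMA)
```

So even nominally "16-bit" DOS software could squeeze a bit past the 1 MB mark without ever leaving real mode.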
ken_the_boxer@reddit
Exactly
msthe_student@reddit
32-bit transition for Apple kinda happened multiple times, because the original Macintosh (and presumably Lisa) software wasn't 32-bit clean and could really only access 16 MB of memory (24 bit)
auximines_minotaur@reddit
Who said 16bit to 32bit was smooth? That would have been roughly the 286/8086 age to 386 age right? That transition was marked by probably the most jarring OS change in recent memory (CLI + CGA to Windows). Not to mention a huge winnowing of platforms (death of Amiga, Atari, other minor players and also-rans). And that’s not even touching the commercial market, where minicomputers completely disappeared as a class of computers (yeah yeah yeah we all know COBOL mainframes are alive and well, that’s not what we’re talking about here)
And who said 32bit to 64bit has been rough? Worst continuity has been in the Mac world with the transition to Apple Silicon making things a bit painful for a few years. But that’s pretty much over now and probably was mostly a pain for developers. I’m not sure how much everyday users were even affected by that.
enemyradar@reddit
Yeah, the only really rough part of the transition to 64 bit was Intel trying to do it with the Itanium, dumping x86, which failed. But it failed in a way that barely affected anyone, and then AMD created x86-64 and everyone just got on with it.
Educational_Bee_6245@reddit
Then again, by the time desktop PCs with more than 4GB of RAM were a thing, the CPU would support x86-64 and you could install a 64-bit OS and be happy. The struggles before that only happened in the enterprise world.
arbiterxero@reddit
4gb was a windows issue more than a 32 bit issue.
Was still related to 32 bit, but only because Microsoft sucked.
Albedo101@reddit
What is even more hilarious, the suckage started way way earlier.
Intel released the 386, a full 32bit CPU, in 1986, which in theory could address 4GB of RAM. Meanwhile, MS was gaslighting the world that 640K is enough. Microsoft wasted a decade of OS improvements from 1986 to 1995 with their shenanigans.
CubicleHermit@reddit
Yeah, in the consumer space I feel like the 32-bit to 64-bit transition was a nothingburger. By the time Vista came out, it was just like "oh, a few old bits of hardware won't work, and your 16-bit Windows apps need a VM."
Having been on the front lines of it in software development, the ~4 years prior to that on the enterprise side were the painful ones on the PC server/workstation platforms, and the even longer transition in the Unix world ('85 for Cray, '93 for Alpha) was likely more painful still, but that was already pretty much resolved by the time I got out of college in '99.
VivienM7@reddit
Vista was... perhaps a little early for consumers to make the 32 to 64 bit move. But definitely by 7 it was time.
Enthusiasts are a different story. I remember being scared of 64-bit Vista, then my Windows install got screwed up, and I was like 'oh why not try 64?' And I was pleasantly surprised, except that my then 3GB of RAM didn't cut it. This would probably have been early 2008.
CubicleHermit@reddit
My work was on XP 64-bit the moment Dell started offering machines with it preinstalled, and it had a major issue with drivers being unavailable, but the ability to use a full 4GB of RAM and run a 64-bit JVM (the 32-bit one was, for off-heap address space reasons, limited to a heap size of about 1300MB on Windows) was huge.
So when 64-bit Vista came out, I jumped on it immediately, discovered 4GB would not cut it either, and decided it was my excuse to buy more RAM :)
VivienM7@reddit
It's funny you say that; that aligns with my recollection and why I never tried XP 64-bit, but there have been people on reddit claiming the driver support for XP 64-bit was good...
ScoobyGDSTi@reddit
It was basically just based on the Windows Server 2003 kernel. It had good driver support for enterprise chipsets, NICs, etc., but still wasn't great as a daily-driver OS.
CubicleHermit@reddit
Just depended on what you were running.
On the Dell workstations we got at the office (from memory, the Precision 390) it was great, and "just worked" with the actual Dell hardware.
It did not work usefully with the motherboard I had at home, at least that early, and I'm not sure if NVidia kept up the drivers for newer consumer models like the 8800.
It's possible it got better later in the lifecycle, but the window between its release and Vista's was pretty narrow (about 18 months), and my interest in it for anything that wasn't preinstalled on one of the machines I used or supported at work was pretty low.
Flynn_Kevin@reddit
I tried 64 bit XP but it was problematic with drivers so I rolled back to 32 bit XP. Vista was such a mess when it launched I rolled back to 32-bit XP again until Vista SP1. Vista SP1 was an entirely different experience. It just worked. 7 was the first time Microsoft delivered a relatively polished product on launch day.
paroxybob@reddit
Yeah, 7 was definitely the GOAT of Windows OS’s.
VivienM7@reddit
Windows 95 was relatively polished on launch day, as I believe was Windows 2000.
XP pre-SP1 was a mess (frozen taskbar bug!) but most gamers still ran 98SE so somehow that seems to have been forgotten…
fbman01@reddit
I only made the jump from 32bit to 64bit with windows 7.
When vista came out, most applications were only 32bit, there was no real need. Also my windows 7 machine was the first computer I had with more than 4 gig of ram.
sidusnare@reddit
Between that and PAE, and most software dual-publishing, and 32-bit backwards compatibility, it was really not a big deal.
lweinmunson@reddit
We had a lot of issues switching to x64 around the Windows7/Vista/8 time period. A lot of our enterprise apps would crash on a 64bit OS. And it took even longer to move Office versions to 64 since the Excel plugins weren't updated quickly. I think it's only been in the last 5 or 6 years that we've been able to build pure x64 PCs and deploy them without checking which apps/plugins the users were going to need.
BitRunner64@reddit
AMD64 (x86-64) was such a brilliant move by AMD compared to Intel with their Itanic. You could buy an Athlon64 and just keep using your 32-bit Windows and drivers until you were ready to switch. Then when you finally did switch, 99% of your 32-bit applications and games would just keep working fine on 64-bit Windows.
AppropriateCap8891@reddit
Not to mention we were still deep into what I call the "Operating System Wars".
Those who were not around then or only casual users likely have no idea what that era was like. We had both IBM and MS DOS, OS/2, CP/M, Xenix, PC/IX, DR-DOS (along with multiple variants like Dos Plus), 4DOS, Norton DOS, multiple iterations of BSD, and this is just the primarily text based ones.
We of course also had Windows 1-3, XWindows, GEM, GEOS, and more.
I often laugh when people think back then everything was Microsoft. That was far from the truth, as which OS was still a big consideration depending on what you wanted to do.
RvstiNiall@reddit
I would argue that you didn't actually have OS wars, you had architecture wars. The x86 became dominant due to price, availability, and being 32-bit, while most of the competition was around the same price (or significantly higher), wasn't as readily available, and was 16-bit. The better computers (Unix workstations) were ungodly expensive, and weren't even readily available unless you were a large business, but hey, they were 32bit!
Most of the "other" OSes didn't run on x86. They ran on their own bespoke, incompatible platform. You couldn't run AmigaOS on an Apple computer, or vise-versa despite the same processors. If all of these OSes could have run on comparable systems interchangeably it would have been more equal.
Microsoft strong arming the x86 industry is why Windows won out among the x86 OSes, but there wasn't really any other competition there other than OS/2, which was an IBM thing that pretty much nobody else focused on. And before anyone says that OS/2 WAS available on Dells, HPs, etc: they were special order only, not the stuff you could pick up at the store. Much like PC makers offering Linux these days, you can't just go to Walmart, Best Buy, Office Depot/Max, Staples, Microcenter, etc, and pick up a Linux PC (other than the raspberry pi, and the super expensive nvidia dgx spark)
VivienM7@reddit
Everything became Microsoft for most people by about 1996 on the client/consumer side. OS/2 Warp's failure to catch on was... probably the swan song for non-Microsoft x86 client OSes. Everything else had fizzled before then.
On the server/workstation side, you had the great glory of the commercial *NIXes before that was eaten by Linux and Windows NT, but that was a little earlier.
And there were still things like NetWare, that started fizzling out but didn't get fully replaced into the Win2000/Active Directory days...
Also, 4DOS is not an OS, it's a command interpreter (or as *NIX people would call it, a shell). One that has since been open sourced and is still maintained. Long live Rex Conn and JP Software!
That being said, perhaps I should return to the question and point out one thing - there were two paths left from that world to 32-bit that had a chance: OS/2 and Microsoft. OS/2 Warp fizzled despite a ton of excitement in 1994-1995. Windows 95 went insanely big.
porkchop_d_clown@reddit
It’s a shame about Warp, I really liked it. I also thought it was cool that there was a scripting language (REXX) that worked on mainframes, Warp PCs, and my C= Amiga.
VivienM7@reddit
I never ran Warp, I was... excited... about the idea for a while, but it suffered the same problem as NT to a lesser extent - not really viable on the kind of 486 a 12-year-old has and needs to share with parents (and before anybody says dual-boot... not if you don't have the drive space).
And the second problem it had - very few preload opportunities. The mass market users powering the PC boom had no interest in installing a different OS, chasing drivers, etc. Doesn't help that Microsoft ... creatively... discouraged some OEMs from going down that path.
Once Windows 95 established itself as the successor to the dominant DOS/Win31 ecosystem, and I would argue it probably had done that before August 24, 1995, then it was game over...
(Actually, that reminds me, if I ever manage to get a healthy socket 7 retro system going, OS/2 Warp is one of the things I'd like to try on it. Along with NT4. Two operating systems that I have long admired but never really used.)
auximines_minotaur@reddit
Funny how all this became essentially irrelevant once everything was available on web and mobile.
VivienM7@reddit
It didn't become irrelevant. Chrome just became effectively the new OS.
(And note how there are few HTML rendering engines left on the market. Edge is Chromium, so is Opera. Safari/WebKit share some DNA with Chrome. Firefox is the last main outlier developing their own browser engine).
And then, well, the actual computer OS became irrelevant for as long as it supports Chrome. (This, of course, did wonders for the Intel Mac...)
m-in@reddit
100%. I was going to say just about the same. 32 to 64 was a nothingburger. 16 to 32 was a royal pain.
auximines_minotaur@reddit
With 16 to 32, there’s a good chance you would have changed platforms entirely!
zabolekar@reddit
I don't think it makes sense to say that the transition to Apple Silicon was part of the 32 to 64 bit transition: macOS dropped support for 32-bit hardware in 2011, years before the first ARM Macs, and stopped being able to even run 32-bit apps in 2019, still a year before the first ARM Macs.
Maeglin75@reddit
I agree.
The step from 16-bit to 32-bit was a huge thing and solved a lot of problems computers were already running into at the time. For example, 16-bit systems, usually with a 20-bit address space, limited the continuously usable RAM to only 1MB (of which, in the case of IBM PCs, only the infamous 640KB were directly usable by software). More RAM required cumbersome tricks like bank switching. That alone made 32-bit basically a requirement for any serious multitasking OS. Even later DOS games used their own 32-bit memory managers to break the 1MB barrier.
On the other hand, there were problems caused by the transition. For example, the fact that there was still a lot of 16-bit software around, and that even then-current operating systems like Windows 95 had a lot of 16-bit code remaining, caused the first big flop among Intel's x86 CPUs: the Pentium Pro. The Pentium Pro had a lot of improvements that laid the foundations for modern x86 CPUs, but it was very much optimized only for 32-bit software. The still-ubiquitous 16-bit code, especially in consumer environments, stopped it in its tracks and made it slower than the previous CPU generation.
But all in all the advantages of 32bit were so big, that it overshadowed the problems and everyone was just going with it.
When the transition from 32-bit to 64-bit happened, most consumers really didn't need more than 4GB of RAM. So, for the most part, everyone just remembers the initial problems that came with the new 64-bit systems, not the improvements.
Educational_Bee_6245@reddit
Apple was especially rough: they moved PowerPC to 64 bit with the G5, and then introduced Intel chips that could only do 32 bit, only to start another transition again.
auximines_minotaur@reddit
Agreed. But again, I know and care about this because I’m a developer. How much was the average user even impacted by this?
Educational_Bee_6245@reddit
Well, Apple always pushed developers and users to upgrade their stuff. If that had been Microsoft, stuff like the Classic environment and Rosetta would have been around much longer.
This mainly caused issues for users when a system extension couldn't run on the new os or such situations.
auximines_minotaur@reddit
All I know is it made my docker builds annoying until AWS rolled out ARM images. Since then it’s been smooth sailing.
Oh yeah and also it made my life hell when I briefly had to support a legacy RoR app from hell, but the less is said about that the better 😂
mosca_br@reddit
memory limits and protected mode in x86 made the switch to 32 bits very attractive. 64 bits allowed more than 4GB/process, but nothing outside enterprise software benefited from it. Also, Intel didn't really back AMD's 64-bit extensions, so Microsoft was not in a rush either. XP 64 was a novelty and had driver issues (chicken-and-egg scenario)
CubicleHermit@reddit
Depending on your OS, the usable process memory on 32 bit versions was often smaller than 4GB, with part of the address space reserved - for Windows XP with defaults, that's only 2GB, and for most 32-bit Linux, it was 3GB. Both had non-default settings that could extend that, often with a cost of performance.
Support for more than 4GB on the machine itself (via PAE) was also limited to higher-end hardware and a lot of Intel consumer chipsets prior to that had reserved physical address space, so in practice, 64-bit basically accompanies the transition to supporting 4GB+ system memory.
What was dumb was the Intel 945 chipset, which was sold with Intel's earliest 64-bit mobile chips (the first generation Core 2 Duo) but which only supported about 3.5GB of memory, so if you paid for a full 4GB, you couldn't use it. The 965 (a year later, for mobile) fixed that.
p47guitars@reddit
945 chipset was pretty legendary though.
I remember setting up my first hackintosh with an ECS motherboard with that same 945 chipset. was fucking smooth.
glassmanjones@reddit
There was some flag for link.exe to give you 3GB of address space on windows - that held us together for a while before switching to 64 bit
CubicleHermit@reddit
I thought it was one of the Windows bootstrap parameters, vs. something you could select on the process level, but it has been a terribly long time since I've done anything professionally with Windows that wasn't basically "host a Linux VM (WSL now)" or "run conferencing tools."
glassmanjones@reddit
I looked it up. It was both!
/3GB was for 32-bit Windows
/LARGEADDRESSAWARE told link.exe to set a PE flag to tell windows that this app could use more virtual memory.
p47guitars@reddit
i wouldn't say that. a lot of x64 software from a variety of vendors slowly rolled out, FL Studio, Acid Pro, Vegas, and I wouldn't consider those enterprise software packages. Us creative types really benefitted from x64
brimston3-@reddit
The driver issues were mostly that some 3rd party drivers were 16-bit and couldn't run with the OS in long mode. A lot of which got fixed by Vista forcing the new driver model.
cazzipropri@reddit
It wasn't.
Did you migrate code from Win16 to Win32?
bobj33@reddit
Brand new account and this is the first post.
I assume this is AI slop.
Educational_Bee_6245@reddit
Was it really? On which operating system?
If you think about it, when did the 386 introduce 32 bit mode?
And how long did it take for real 32-bit operating systems to become mainstream? Only when Windows 2000 or XP came around. Windows 95 and 98 had a lot of 16-bit baggage.
CubicleHermit@reddit
I don't think most users needed to care about the 32-bit baggage, and Win32s was a thing on Windows 3.x for a year or so before 95 came out; you had to have it to run Mosaic, and the need for 32-bit internet browsers drove that transition VERY quickly between '94 and '96.
So I'd call it more like '86 to '94 when it was more in limbo, not all the way out to 2001 when Windows XP came out. Windows 2000 was just like the older NT versions and primarily a non-consumer OS.
Educational_Bee_6245@reddit
No, there were 16 Bit Apps in Windows 95 for a very long time. I remember fun times with winsock.dll and winsock32.dll and what needed what. Also you had to go back into dos mode for a lot of games at that time.
CubicleHermit@reddit
That's backwards compatibility though, not lack of 32-bits, and I think you're looking at it the wrong way: "32-bits" doesn't mean "32-bit only."
32-bit Windows could run 16-bit apps all the way out to when it was deprecated last October. I thought they started dropping Virtual DOS Mode earlier than that but Google suggests it was still in 32-bit Windows 10.
"386-enhanced" mode goes all the way back to Windows 3.0.
Educational_Bee_6245@reddit
Sure, you could also run 16 bit apps in Windows NT or OS/2, even though they had pure 32 bit kernels.
Windows 95 and 98 were a strange mix, still built on top of 16bit MS-DOS.
The game changer was the flat memory model on a 32bit CPU; on Windows 3.0 all apps had to be written against the segmented memory model, even in 386 enhanced mode.
A lot of later DOS games were even 32bit, by using DOS/4G or the like. But that worked not because the OS was 32bit but because it was possible to go around the OS.
CubicleHermit@reddit
2000/XP/Vista/7/8.x/10/11 are all just the newer Windows NT releases, as much as I'd like to pretend 8/8.1 aren't part of that list. :)
The Win32s extensions (specific to 3.1/3.11, I'm pretty sure they never backported it to 3.0) did allow individual flat memory model apps on the later Windows 3.x. While other things used it, Mosaic was the killer app for that.
NT 4 and 2000 had a much higher cost at retail, weaker driver support, much higher memory needs, and few manufacturers preinstalled either one - I think all of those had as much to do with the slow uptake as the lack of booting into DOS first.
Educational_Bee_6245@reddit
I don't think Win32s was such a big thing back then. I never ran Mosaic on Windows 3.1, but both Netscape and Internet Explorer had 16bit versions. I remember the map editor for Warcraft II needing Win32s.
But exactly my argument, Windows NT and what followed were proper 32 bit operating systems. And the 32 bit transition was really only done when XP became the mainstream os which was a good 15 years after the 386 came out.
CubicleHermit@reddit
I guess it depends on whether you're looking at the guts of the operating system or the APIs (or ABIs, as we'd say today) available to applications on them.
Windows 95 was already a good 9 years after, or 5 after they started to be semi-affordable. The original Deskpro 386 was a frighteningly expensive machine by regular desktop standards (as were the early 386 PS/2s), even compared to IBM's inflated pricing on the AT and the 286-based PS/2 series.
Educational_Bee_6245@reddit
Well, I don't argue against Windows 95 having 32bit API/ABIs but in general it was part of a transition from 16bit to 32bit.
Of course you could have a real 32bit OS in 1993, be it Windows NT, Linux or OS/2.
MrKrueger666@reddit
They were both smooth and both took their sweet time.
It was smooth because we have backwards compatibility. A modern Intel Core or AMD Ryzen can still run code for the very first x86 CPUs. You can just boot DOS on the machine you bought an hour ago.
The 16 to 32bit switch started with the 386. That was in 1985. It took until 2001 for consumer Windows to become fully 32bit. That was Windows XP. All the previous versions were riddled with 16bit code.
And yes, we did get Windows NT which was 32bit, but that was aimed at companies and power users. Also, they didn't always run older software.
Intel tried to get to 64bit with their Itanium processors, but those were horrible and nobody outside very large corporations bought them. They weren't really backwards compatible either.
Eventually, AMD came out with AMD64 in 2003, which is x86 extended to 64bit. It retained full backwards compatibility.
It has taken until Windows 11 to completely switch to 64bit. Windows 10 still has a 32bit version. Sure, we had Windows XP x64 Edition, but that never really took off. It started to move with Vista, which most people hated, and it really took off with Windows 7 starting in 2009.
It's all been really smooth, really. It took ages in both cases, but since we had backwards compatibility, it hasn't been much of an issue for most people.
glwillia@reddit
i ran windows xp x64 edition. it wasn’t really windows xp at all, it was windows server 2003 running on the desktop with a windows xp skin, and it definitely wasn’t ready for most users (drivers? yeah, good luck). vista got a bad rep, but if you bought a new computer with sufficiently beefy specs it ran fine (although many new 64-bit capable PCs of the time still came with 32-bit Vista).
MrKrueger666@reddit
Yeah, I ran XPx64 too. Getting drivers was sometimes an issue indeed.
And, I had software complain that it wasn't meant to run on server OSes. I believe it was a consumer version of O&O Defrag that had such a check and would not run.
brimston3-@reddit
Most modern Intel motherboards (ie. those conformant with UEFI class 3) should not be DOS bootable. They do not have CSM support, which you "need" for classic BIOS boot. Supposedly, that is everything post gen 12, but I've seen boards with CSM for even core ultra processors (H810).
There are some EFI shim programs that can run a BIOS/MBR loader, but I wouldn't say that is "can just boot" as now you're talking about having a bootloader on GPT that starts a bootloader on MBR.
MrKrueger666@reddit
Fair enough. UEFI does prevent booting DOS if CSM is disabled or not available.
And yeah UEFI has been pushing to eliminate legacy booting. Didn't know there's systems that are finally unable to boot DOS. TIL.
ElevatorGuy85@reddit
The iAPX label was used on more than just the ‘432. It was also given to members of the x86 family from the 8088 and 8086 through to the 386.
https://en.wikipedia.org/wiki/IAPX
I have several Intel hardcopy databooks for those non-432 processors with the iAPX designation on them. Here’s one (not mine!) from Bitsavers that has been scanned into PDF.
https://www.bitsavers.org/components/intel/8086/1981_iAPX_86_88_Users_Manual.pdf
syrtran@reddit
Yes, the renowned 432. Like Itanium, more expensive than, completely incompatible with, and slower under load than the x86 processor it was intended to replace.
You'd have thought Intel had learned their lesson.
sneesnoosnake@reddit
The same reason there is no push to 128-bit. A bit of diminishing returns. 16 to 32 was massive, 32 to 64 was important to handle more memory, 64 to 128 doesn’t have a compelling general benefit
CubicleHermit@reddit
Storage systems are getting very close-ish to where 64-bit addressing will become a problem. The largest hyperscalers may already be there.
RAM has a ways to go, although we're probably less than a decade away from where that would be an issue if the largest clusters used a single memory space. Fortunately, they don't.
Laptops just barely hit 128GB (2^37 bytes) in the past couple years, and the densest 2U servers can hit 8TB (2^43), so even if a Moore's-law-style annual doubling applied, we'd be looking at about 20 years before a 64-bit limit applied to single-system RAM.
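The ~20-year figure is just the gap in doublings between 2^43 and 2^64, i.e. plain arithmetic under the stated (and generous) one-doubling-per-year assumption:

```python
import math

ram_now = 8 * 2**40   # 8 TiB = 2^43 bytes, today's densest 2U servers
limit = 2**64         # a flat 64-bit byte-addressable space
doublings = math.log2(limit / ram_now)
print(doublings)      # 21.0 -> about two decades even at one doubling per year
```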
Where 128-bit addressing may become interesting again is with security if there's ever a renewed interest in capability-based addressing which has been around since the 1970s but has never broken through to the mainstream except on one series of IBM machines (System/38, AS/400, Series i).
It's also already been around for ages for vector processing; SSE has been around since 1999 and uses 128-bit vectors, all the way up to avx512.
Deksor@reddit
64 bits can address up to 16 exabytes, which means 16,000 petabytes, or 16 million terabytes. But this is for RAM; I don't see how storage has any relevance here. Even if we somehow had a giant cluster storing 32 exabytes, all that storage isn't mapped into addressable space at all times
CubicleHermit@reddit
For storage, I suppose I should have said "would have" because in practice, the systems that are likely to need it soon have already gone to composite identifiers.
64 bit LBA is required for anything over 2TB because of the older 512B sector standard.
If you treat an exabyte-scale storage system as a single device with LBA, you have (block size in bytes, usually 4K) * 2^64 sectors before that's full.
That sounds like a lot, but Google is already past the 16 exabyte mark and odds are some of the other hyperscalers are as well.
Storage is basically never addressed that way in big clusters, but your effective identifiers are usually sparse enough that in a big clustered file system they're already past 64 bits, just split between the on-device address and the device address (itself usually a composite.)
Deksor@reddit
Yes, but that's not something related to the CPU's architecture; it's software and disk controllers. The 2TB limit is related to how the BIOS handled boot sectors (MBR, etc.). Otherwise LBA itself has been 48 bits since 2002. So for PCs with that issue, you can actually still boot your OS on something with a capacity below 2TB and then have 20TB disks just fine :)
It's like the Unix time issue, which is stored as a signed 32-bit value and will overflow in 2038: your 32-bit PC can still manipulate a 64-bit number fine, only it will be a little bit slower. It's just that it was coded that way originally, and it has become problematic in an age where 32-bit machines are already obsolete.
I have even better proof of this for storage: 16-bit PCs used 22 bits and later 28 bits for drive capacity (TBF, the calculation was made differently, but the number of bits stored remains the same). Nowadays you can put a card called "XT-IDE" into an IBM 5150 with an 8088 (a really simple card with no intelligence, only glue logic), put in a 128GB flash card, format it with MS-DOS 7.1, and have access to the whole capacity.
It will be very slow (I've seen some doing it, and a DIR command takes ages to return, because it wasn't designed for drives this big lol), but it works ! And afaik xt ide just uses 28bits LBA.
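The ceilings being traded back and forth here all fall out of one multiplication: sector-count width times sector size. A quick sketch (the function name is mine), assuming the classic 512-byte sector:

```python
def lba_capacity(lba_bits: int, sector_bytes: int = 512) -> int:
    """Maximum addressable capacity for a given LBA width."""
    return (2 ** lba_bits) * sector_bytes

print(lba_capacity(28) // 2**30)  # 128 (GiB) - the 28-bit ceiling XT-IDE runs into
print(lba_capacity(32) // 2**40)  # 2   (TiB) - the classic 32-bit MBR limit
print(lba_capacity(48) // 2**50)  # 128 (PiB) - 48-bit LBA, around since 2002
```

Which also explains the 128GB flash card in the 5150 above: 2^28 sectors of 512 bytes is exactly 128 GiB.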
Consistent_Cat7541@reddit
Other people said it with more depth... it was not. It was not smoother and it was not quicker. This is like asking why the Intel to M1 transition was slower than the 68k to PowerPC transition for the Macintosh.
ghostchihuahua@reddit
TL;DR: it took mathematicians ages to go from 16 to 32. The legend says that they had to invent 31 on the way and that it was a very difficult task.
/s
yahbluez@reddit
I think most answers miss the point.
The move from 16 to 32 bit happened so fast because 16 bits (65,536 values) is too little for many things in real life, while 32 bits (4,294,967,296 values) is a range wide enough to cover nearly everything, with precision beyond most needs.
The step to 64 bit (18,446,744,073,709,551,616, that's exa scale) is beyond most needs, almost beyond making sense at all. With the next step, 128 bit, we have 2¹²⁸ (about 3.4 × 10³⁸), a number far beyond any practical counting need.
musingofrandomness@reddit
Aside from the obvious large examples others have provided, the 32 to 64 conversion was likely just more visible since computers were in much more widespread use.
I am no expert, but I have heard that inline assembly code was a big hangup for the 32-to-64 shift, since an x86_64 CPU actually reuses the lower half of each 64-bit register as the 32-bit register, and you can overwrite your stored values unintentionally. The compiler usually avoids this scenario, but it defers to inline assembly when provided, which may not be tracking that nuance. I would not be surprised if the 16-to-32 shift had similar "gotchas" to be worked through.
aitkhole@reddit
I think you can perhaps guess from the fact that the 32 bit registers are called things like EAX, EBX and the 16 bit registers were called things like AX and BX that they were indeed extended in the same way.
musingofrandomness@reddit
I only even know about the registers because a coworker was showing me some assembly programming on a (new at the time) amd64 machine and happened to notice the behavior while he was tinkering.
I still remember that even "hello world" was a lot of work to pull off in assembly since you had to set it up one character at a time. I learned a lot from that coworker, and it gave me a lot of insight, but I still don't consider myself a coder by any means. Barely a scripter on the best of days.
jason-reddit-public@reddit
Pointer size and max unit-of-work sizes have diverged again -- just when I thought 64-bit little-endian was the new king.
The x86 had 16-bit registers but *also* had segment registers which, while a bit cumbersome, worked together to provide a 20-bit address space (a good chunk of which Gates famously said we didn't need). Had Intel used one more pin for I/O, we'd be, checks my notes, *much like we are now*, but maybe a bit (no pun intended) better all along; we might have been able to have a faster memory bus and simpler cache add-on cards.
In C and perhaps nascent C++, there were these things called *far* pointers, which I think took up the space of 32-bit pointers, but as a product of reality instead of a gift.
If my memory is correct, you might get away with one pass through your C code to remove the far (and huge?) pointers and have a good starting point for a 32-bit code base. Since stuff was moving so fast, even with a modern version control system, you would have put your best people in charge of the 32-bit Windows fork and less effort into your MS-DOS version.
C sat pretty comfortably on top of the x86-32 stack (gcc and mscc, maybe even icc). It was also the kernel language behind most 32bit operating systems.
While the transition to 32 bits wasn't easy, the 386 apparently had design effort put into it to make it able to virtualize 16-bit DOS.
Everyone wanted to get to 64 bit first. DEC actually did it best but missed the cost metric and the heat density curve. (If you fit 4x the semiconductors in the same space, but those semiconductors only got 50% more efficient, your heat density goes up by roughly 2x.) This was *very* clear by 2000.
Separate line, but look up Transmeta LongRun and such. LongRun was better than seemed possible because there is a V^2 component; undervolting is also not a linear curve. Had display tech been ready, I think Transmeta could have done better. BTW, most SPEC benchmarks were pretty good on the P95/Crusoe (at least under Linux). Windows 95 threw a curve ball, in that its "segments" were not 32-bit flat like Windows NT's, which required extra instructions. Huh. Almost like they did it on purpose. Strangely, I think that decision meant the Pentium Pro also suffered, but perhaps by less.
Had Intel simply partnered with DEC instead of HP, things would still be the same because no one at Intel could admit that they weren't the smartest folks.
DAN-attag@reddit
It wasn't smooth. The original i386 was released in 1985, while consumer Windows versions didn't get proper full 32-bit support until 1995, and even then plenty of software was still 16-bit.
chickenbarf@reddit
I don't recall it being a breeze; in fact we had to hack the A20 address line with hardware to make it go... with the keyboard controller.
anothercorgi@reddit
Not sure, I thought the 32-to-64-bit transition was much smoother, but I might be Linux-biased. A 64-bit OS supported 32-bit binaries just fine, as both used protected mode; 32-bit OSes all had issues running 16-bit code because 16-bit meant real mode and did not work with protection.
I'm more surprised of the 68k -> PPC -> X86 -> ARM that Apple did. That would have pissed me off, luckily I never bought Apple hardware.
BCProgramming@reddit
Well, I'd say Windows 95 was probably the first mainstream consumer OS that really started to get the "masses" over to a 32-bit (or at least approximately 32-bit) architecture. And that's a good 8 years after the 386 brought us 32-bit protected mode. Of course, other systems like Windows NT and OS/2 existed that provided "32-bit computing" environments, but they weren't as popular overall. Even up through '94, most systems were running DOS and Windows 3.x; the biggest user of 32-bit protected mode was probably DOS games using extenders.
By the time 64-bit rolled around, though, a lot of the hardware was abstracted away from the software being written. Rewriting a DOS program written for real mode to work in 32-bit protected mode was an undertaking; switching your 32-bit Windows program to 64-bit was often just a compiler flag.
Key-Employee3584@reddit
It only really happened in the personal PC world. In the commercial world, it was a much more managed transition. For the small IBM business systems, they went from a 16 bit architecture like the System 36 to the AS400 which was a 48 bit architecture. And then the AS400 went from 48 bit CISC architecture to 64 bit RISC in the mid '90s. All of that installed software base needed to have special IBM software tools to either inspect existing source for non-compliant code to be updated and/or recompile to the new standards. It was fairly straightforward although somewhat time-consuming depending on the situation. IBM did a pretty good job in getting customers to get their source up to speed. But then again, that's the commercial world. The personal world was far more chaotic.
cch123@reddit
It wasn't.
ModularWhiteGuy@reddit
Wow, I thunk a lot about this at the time.
It wasn't all that smooth.
R-ten-K@reddit
Narrator's voice: It wasn't.
It also depends on which tier and time period you're focusing the transitions on: the 32-bit transition in mainframes in the 60s, minis in the 70s, or micros in the 80s; or the 64-bit transition in high-end RISC in the 90s, or commodity x86 in the 00s, etc...
wrosecrans@reddit
There is literally still weird old Win16 era vestigial cruft in the modern Windows headers today. So let me know when we actually finish the migration away from 16 bit, and I'll tell you how quick it was.
VivienM7@reddit
On what OS? Windows?
I would note one thing lurking in the background. Steven Sinofsky points it out in his book Hardcore Software. Most of the people/businesses buying Win95/98 computers were buying their first computers. Unlike in the early 1990s or the late 1980s, where WordImperfect didn't want to offend the person who had spent a fortune on an AT and therefore continued to support old DOS machines in order to get his $495+ upgrade price, Microsoft did not care about accommodating older machines with the first wave of 32-bit desktop apps. Some third-party developers did care: Corel supported WP on 3.1 into 7.0, I believe.
Transition to 64-bit came in the wake of the Vista debacle which revealed people's frustration at constantly buying new computer stuff. 32-bit apps worked fine on 64-bit. 32-bit operating systems work fine (my dad was issued a laptop with a 32-bit OS in 2010, a nice Sandy Bridge one at that, etc) and have some more backwards compatibility for weird legacy stuff.
I would actually say 32-to-64 was smoother, 16-32 was quicker... unless, of course, you measure from the day the 386 protected mode shipped in 1987. In that case the fact that it took 8 years to get a practical OS that really took advantage of it...tells you something.