"PlayStation 5 Pro Teardown: An inside look at the most advanced PlayStation console to date"
Posted by Dakhil@reddit | hardware | View on Reddit | 49 comments
shugthedug3@reddit
Remember when consoles got cheaper just a few years after their release?
Two years after the release of the original PlayStation, it was selling for two-thirds of its launch price.
What happened?
TheAgentOfTheNine@reddit
different philosophy back then. Today it's "the next model will be so much better"; back then it was a more competitive market and the idea was "the next model will be so much smaller, lighter, and cheaper".
I mean, look at the ps2 slim model compared to the original one and to the ps3 slim, too.
Sony was going for mass adoption among everyone who still didn't have one of the original models. Nowadays they seem happy selling fewer units at bigger margins, because nobody is going to enter their market and take a big chunk away from them.
grumble11@reddit
Sony made a lot of money in the PS4 era: a really meaningful process shrink let them make a more powerful console fairly affordably, so they did. With the PS5 they're trying to repeat that, since the PS4 Pro was a successful effort.
Personally I agree, these half-steps are a mess, but clearly there are people out there who want an enhanced PS5 experience and who will pay for it (and buy a few high-margin games along the way).
Fast_Passenger_2890@reddit
I always find the internals interesting to look at. The engineering behind it all fascinates me.
reddit_equals_censor@reddit
it is even more relevant than ever, because pc gaming sadly is now getting pushed forward by sony's consoles.
the ps5 broke the vram requirements in games massively, which is a good thing btw.
so the ps5 is helping pc gamers.
and the ps6 is expected to do so again.
it wasn't so in the past. but it is now.
so the engineering behind the future of pc gaming in a console is interesting af.
Johnny_Oro@reddit
Good for rich PC gamers who can buy high-end parts, maybe. Pushing hardware requirements up is never a good thing. Devs will do anything they can to avoid optimization work while chasing high-fidelity graphics at the same time, so we end up with games that drop down to 60 fps on an RTX 5090.
Strazdas1@reddit
Pushing technological prowess is always a good thing, and if someone cannot afford the best GPU, they should consider that they are not entitled to the most demanding games either.
Johnny_Oro@reddit
Pushing hardware requirements doesn't necessarily equal pushing technological prowess. Back in the day there were games that could only comfortably be played on powerful hardware, such as Frontier: Elite II. It did things no game had done before: simulations never seen until then, and unparalleled graphics. But modern AAA games don't do simulation much differently than they did a decade ago, except for graphics, with the popularization of real-time raytracing, mesh LODs, and such.
Also, you don't need the largest amount of VRAM, the highest bandwidth, or the most powerful GPU compute cores to implement this tech. It can be implemented on GPUs as affordable as the Intel Arc A310, which has RT acceleration cores, as well as on Arc and RDNA 2+ iGPUs. What's really happening is that these companies want to implement the latest tech without compromising on visual quality, development time, or budget.
And more importantly, without compatibility with widely available hardware, the economics just couldn't scale. That's how terrible "F2P" games that are really cash sinks for the players end up on low-end devices and become so much more popular than actual well-made games.
kwirky88@reddit
Smooth motion helps clarity far more than ever-so-slightly more accurate shading. I've had a 3090 for a long while now and honestly, I turn ray tracing off. It's just too heavy a hit to framerate, and I actually enjoy my games more when they hit high framerates, 160 and up.
Manufacturers don’t want to compete against past sold items and there’s a marketing term known as a buy hole: where an item is sold and thrown into a bottomless hole to never come out on the resale market. Nvidia tries to throw GPUs into the buy hole by introducing new features in newer lines, with the hopes people won’t buy used cards. Honestly, used 3070 through 3090 cards are perfect these days for most any game on a 1440p or 1080p monitor. Beyond that it’s diminishing returns because you’ll need an incredibly expensive monitor to take advantage of a higher end gpu.
Strazdas1@reddit
I agree. That's why I said that pushing technological prowess is good. Requirements are a result of that, though. If you are increasing requirements without increasing technology, that is not a good thing.
It's complicated, but the simple version is that consoles had low memory, so developers dropped simulation and went all in on GPU utilization with minimal system memory usage. Just another case of consoles ruining it for everyone.
They scaled just fine in the past. I guess you haven't been gaming for long. I remember when a new AAA release meant your last-gen GPU would not launch the game at all, at any settings.
They are also more popular because 1) they are free and 2) they use predatory psychological tricks to keep you coming back.
Johnny_Oro@reddit
While requiring hardware with the latest features will be necessary to utilize the latest tech, what I'm saying is that pushing hardware requirements into mid-to-high-end territory isn't actually necessary, because even low-end hardware from the same generation will often have those features. And by enabling graphics customization that lets you disable rendering pipelines, swap geometry and texture data, and so on, you could even let those games run on older hardware. The original Oblivion is a good example of that.
Consoles have flexible memory allocation. Even back in the Xbox One/PS4 era they had 8GB, which is plenty for most kinds of simulations. Actually most of the memory space in modern games would be occupied by textures and other assets rather than game data.
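The point about textures dominating memory is easy to put rough numbers on. A minimal sketch; the 4K resolution, RGBA8 format, and 1-byte-per-texel block compression below are illustrative assumptions, not measurements from any real game:

```python
# Rough VRAM footprint of a single texture, including its full mip chain.
# A mip chain adds about 1/3 on top of the base level
# (1 + 1/4 + 1/16 + ... converges to 4/3).
def texture_bytes(width, height, bytes_per_texel, mips=True):
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mips else base

# A 4096x4096 RGBA8 texture, uncompressed: 4 bytes per texel.
uncompressed = texture_bytes(4096, 4096, 4)

# The same texture block-compressed at 1 byte per texel (e.g. BC7-class).
compressed = texture_bytes(4096, 4096, 1)

print(uncompressed / 2**20)  # ~85 MiB
print(compressed / 2**20)    # ~21 MiB
```

Even compressed, a few hundred unique 4K textures fill multiple gigabytes, which is why asset data rather than simulation state dominates a modern game's memory budget.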
Games also cost a lot less to develop back then than they do today, so they didn't need to sell as many copies; 2 million copies was considered a huge success in the early 2000s. And GPUs were still struggling to standardize specs.
Yeah but they're not as popular in the PC space where there are plenty of affordable quality games available for everyone.
Strazdas1@reddit
The issue with that is that quite often the different render pipelines are not developed at all in the first place, so there is no option to switch. If you develop RT properly, traditional lighting is not needed at all, which saves you months of development time.
I'm talking PS3/Xbox 360 era, with their famously abhorrent memory. It got better in the PS4/Xbone era, but developers had stopped caring by then.
They are also the most popular games in PC space. See LoL as an example.
Johnny_Oro@reddit
That's the same problem with Oblivion. The lighting system is highly taxing, but there's no alternative lighting system. The solution? Simply allow the engine to render flat-shaded textures with no lighting whatsoever. Faraway objects are highly taxing, but there's no alternative LOD system. The solution? Allow the engine to simply not render faraway objects. Both can be enabled through the ini file.
Yeah that was certainly limiting compared to PCs at the time, but it was a time when RAM was pretty expensive, and GDDR3 was only 32MB per module.
But not as popular as they are in the smartphone market, where they practically take up 100% of the market.
Strazdas1@reddit
I don't think those solutions would fly with a modern audience. Have you seen what modern games look like with the lighting pass disabled? Many would be straight up unplayable.
Oh yeah, the smartphone market is totally ruled by those predatory games.
reddit_equals_censor@reddit
NO.
that is not at all what is going on.
this is about all of gaming and particularly about low end gaming.
nvidia has been trying for generations now to freeze/degrade the low end.
and amd mostly tries to follow them.
if things were going nvidia's way, there wouldn't be any 12 or 16 GB cards below 500 euros/us dollars.
and this has basically nothing to do with optimizations.
you don't expect games to run on a 10 year old graphics card perfectly fine right?
yet we got 8 GB vram being the mainstream amount for 9 years now.
complete stagnation.
and vram usage can't get optimized away.
9 years of vram stagnation got broken through massively with the ps5.
and IF nvidia releases super cards with vastly more vram, then that is because of the ps5.
we should by now have 24-32 GB as standard, with 16 GB being the 250 us dollar card at the bottom.
___
and in regards to devs and optimizations. devs have been begging graphics card makers and especially nvidia to put a proper amount of vram on cards for years and years now.
but they never did. they literally removed vram. a 3060 12 GB is still a useable card right now,
but nvidia is right now trying to sell a 5060 ti 8 GB, and a 5060 8 GB is coming.
again you don't expect a 9 year old graphics card to run games well or in lots of cases at all anymore right?
yet somehow they are still trying to sell graphics card with the vram amount from 9 years ago...
___
so how you should think of things:
nvidia: "you will NEVER EVER get more vram"
ps5 releases...
nvidia: "well i guess we sadly have to start releasing cards with more vram eventually".
__
and it is also important to point out, that having enough vram on a very slow card is always beneficial/required.
so you got a not-very-powerful 7600 xt, for example, with 16 GB vram.
the card is not very fast, but it has a working amount of vram, so you can always MAX OUT texture quality at 0 performance cost. so you get vastly better visuals than with some broken-ass 8 GB card that may need to run vastly lower texture quality.
and yes you read that right. texture quality has 0 or near 0 performance impact, as long as you got enough vram.
and we want better texture quality and lots of other features, that take up inherent vram.
so you want more vram. you should be getting more vram at any price point.
and cheaper 16 GB vram options are becoming available based on the ps5 forcing it.
so that is a good thing.
this is not about optimizations.
this is what 8 GB vram gaming looks like in 2025:
https://www.youtube.com/watch?v=AdZoa6Gzl6s
and it has nothing to do with optimizations. it is purely 9 years of vram stagnation. anti consumer bs.
chefchef97@reddit
I really liked their official PS5 teardown, I loved seeing the thoughts behind each internal layout choice
In the end everyone hated it because it was hot and loud, but man I still like that video
GenZia@reddit
Aside from looking incredibly cheap, the heatsink looks rather small for the amount of heat it's supposed to dissipate with that solo centrifugal blower, especially after the inevitable dust build-up.
But I suppose it's 'acceptable' for consoles to operate near boiling point whereas I try to keep my hardware in the low-mid 60s, max.
Strazdas1@reddit
It's acceptable for all modern chips to operate near boiling point. They are designed for that.
anival024@reddit
Why? Your hardware is designed to operate 20-30 degrees hotter than you're running it. You're just wasting energy if you're cooling it that much (or you're drastically throttling performance).
GenZia@reddit
Well, I won’t get into the endless debate of 'why.'
Suffice it to say, there’s a reason water cooling exists.
wankthisway@reddit
You sound like those guys that still tell you to let your car warm up for a minute before driving off.
GenZia@reddit
So... running your hardware at 100C is perfectly safe and healthy because it's unnecessary to warm up your car in the morning?
Ploddit@reddit
That reason is silence and efficiency, not keeping your CPU 30 deg below thermal design limits.
Dey_EatDaPooPoo@reddit
You're not wrong about the hardware being designed to operate at 80-90C but you do actually save a small amount of power, even accounting for higher fan speed, running in the 60s due to lower semiconductor leakage at lower temps. Once you get into the 90s it actually starts making a pretty significant difference in power use.
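The leakage point above can be sketched with a toy model: static (leakage) power grows roughly exponentially with die temperature. The base power, reference temperature, and doubling interval below are illustrative assumptions, not measured values for any real chip:

```python
# Toy leakage model: leakage power doubles every `doubling_c` degrees
# above a reference temperature. All parameters are illustrative.
def leakage_watts(temp_c, base_watts=10.0, ref_temp_c=60.0, doubling_c=30.0):
    return base_watts * 2 ** ((temp_c - ref_temp_c) / doubling_c)

print(leakage_watts(60))  # 10.0 W at the reference temperature
print(leakage_watts(90))  # 20.0 W -- doubled at +30C
```

Under a model like this, the difference between running in the 60s and the 90s is a meaningful chunk of extra power that does no useful work, which is the trade-off against fan power mentioned above.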
Eclipsetube@reddit
This thing outputs 230W of heat. Not nothing, but your GPU alone probably uses more power than the whole PS5 Pro. So I'm pretty sure it's not near boiling point.
GenZia@reddit
What makes you so sure?
Now, the size of heatsink isn't everything as there's another factor at play: Airflow.
But airflow comes at the cost of noise, and the PS5 Pro, at 45-50 dB, is well into reference GTX 580 territory (a card with a notoriously loud blower).
For comparison, my 4070S (a 220W card) peaks at ~70C (stock) with the dual-axial fans running at just under 30 dB. With a minor undervolt, it basically stays in the high 50s to low 60s.
Eclipsetube@reddit
Have you ever seen how thin those fans on your GPU are? They're barely moving any air. The fan of the PS5 Pro is probably 3, maybe even 4, of those GPU fans stacked on top of each other in terms of volume.
GenZia@reddit
Which should explain the (much) larger heatsink on my GPU.
Eclipsetube@reddit
Fair, but a heatsink's job is basically just soaking up the heat. So if the fan is powerful enough, the heatsink wouldn't have to be so large.
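The trade-off being argued here can be framed with the usual steady-state model: die temperature is ambient plus power times total thermal resistance, and both a bigger heatsink and more airflow lower that resistance. The wattage, ambient, and resistance values below are illustrative assumptions:

```python
# Steady-state thermal model: every watt raises the die temperature
# r_th degrees (C/W) above ambient. Values are illustrative only.
def die_temp_c(power_w, ambient_c, r_th_c_per_w):
    return ambient_c + power_w * r_th_c_per_w

# ~230 W console-class load, 30C ambient inside a TV cabinet (assumed).
print(die_temp_c(230, 30, 0.25))  # 87.5C with one assumed cooler
print(die_temp_c(230, 30, 0.20))  # 76.0C -- more airflow or fin area lowers r_th
```

In steady state the heatsink isn't "soaking" heat so much as conducting it to the air, so at a fixed power draw, the only levers are ambient temperature and total thermal resistance, which is why both fin area and airflow matter.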
Vb_33@reddit
I wonder why it's so power limited. The PS5 used 225W when it launched in 2020; I would have expected the Pro to dial things up.
BlackenedGem@reddit
The PS5 was already a humongous console when it launched, larger than a PS3 and pretty much at the limit of what consumers would tolerate. Increasing power would require more silicon and cooling which dials the price up even more.
Remember that these devices often go underneath a TV, and there needs to be allowance for a substandard airflow setup.
exomachina@reddit
Good thing consoles aren't made of water.
GenZia@reddit
They still run on silicon chips, unfortunately.
reddit_equals_censor@reddit
what???
do they have issues with application or what is going on?
grooves of course mean more metal before the heatpipes and should mean worse cooling performance.
so why did they actually add them?
application issues that left parts without liquid metal sometimes?
the general wisdom is the lower the height of a thermal interface material, the better.
so it would be really interesting if we could hear the real reason for the change instead of the sony propaganda bs.
Asgard033@reddit
The grooves provide a capillary/wicking action that keeps the liquid metal where it should be.
reddit_equals_censor@reddit
see, now that does sound reasonable, but marketing and legal didn't let them say what is most likely the truth, because it would point to issues on the non-grooved versions or some shit people could think of.
it kind of sucks that possibly good technological improvements, which grooves may be (if they drastically decrease the risk of liquid metal leaving the die or parts of the die going dry, while the added height doesn't impact performance),
don't get honestly talked about, because "oh no, we can't talk about something having been an issue in the past" or whatever.
Asgard033@reddit
The direct quote from the Sony blog post says this:
They didn't say whether it made cooling better or worse, only more stable. They also didn't say anything about the regular PS5's cooler having issues, so whether it's a case of "regular PS5 has issues they're trying to hide" or "regular PS5's cooler is still fine, but PS5 Pro's implementation is better" is anyone's guess. Time will tell.
Captobvious75@reddit
Its to keep the liquid metal from moving where it shouldn’t.
Blackberry-thesecond@reddit
What the fuck that doesn’t look like it does in Astro Bot.
fire2day@reddit
Must be incredibly advanced, considering the price keeps going up.
reddit_equals_censor@reddit
it is advancing over time ;)
continuously.
it could have advanced another 50 us dollars as we speak ;)
BugsMax1@reddit
What?
GenZia@reddit
> Must be incredibly advanced, considering the price keeps going up.
BugsMax1@reddit
The PS5 pro price hasn't changed as far as I'm aware
JuanElMinero@reddit
Downvoted for a correct statement. Never change, /r/hardware.
fire2day@reddit
Oh right, this is the Pro we're talking about. That's my bad. It's the less-impressive one that keeps rising in price, despite being 5 years old.
imKaku@reddit
That was the regular version(s). Prices jacked up to make the Pro seem worth it.
Fun fact, in my local currency the Pro doubled in price from the regular one. Part currency sucking and part stupid pricing of the Pro.
_OccamsChainsaw@reddit
Any insight into why the wifi card sucks in this thing? My base PS5 had no issues. In the exact same spot in the house, this thing goes in and out all the time, plus it freezes on screens at times.
KirillNek0@reddit
Bruh... you're late by 4 months.