[Hardware Canucks] The Best AMD IGPs vs RTX 5050 - Testing on IdeaPad Pro 5
Posted by kikimaru024@reddit | hardware | View on Reddit | 81 comments
shadowtheimpure@reddit
The 860M is frankly not a great example of the 800 series of iGPU in terms of gaming performance. The 880M and 890M, as shown in the chart, beat it quite handily and reach playable 1200p framerates.
Rekt3y@reddit
And let's not forget the 8060S, which will beat the pants off this thing. Any laptop with the 8060S will most likely be quite a bit more expensive, though.
max1001@reddit
A bit? More like 2x. These are going into $800 basic gaming laptops.
ResponsibleJudge3172@reddit
That iGPU is larger than the 5050 by quite a bit if you actually look at the specs. Of course it will be in more expensive laptops than the 5050.
shadowtheimpure@reddit
I was focusing on staying in the same product stack for the sake of comparison. It's not exactly fair to compare across APU generations.
Raikaru@reddit
If by "quite a bit more expensive" you mean like 2.5x the price, yeah.
VenditatioDelendaEst@reddit
Bizarre that the battery life is so much worse than the same platform without the dGPU. More than 2 hours less in web browsing, and 4 hours lost in local video playback!
Is this a firmware bug? Is Nvidia's driver team sniffing glue? Is Microsoft sniffing glue? No sensible system architect would supply power to the dGPU at all in either of these situations.
Malygos_Spellweaver@reddit
My guess is that the GPU is still there in some kind of "standby" mode. But I agree, it makes zero sense to have the dGPU taking so much energy; this should be a 1 to 5% battery life difference max.
VenditatioDelendaEst@reddit
1 to 5%? Pshaw! It should be zero.
Malygos_Spellweaver@reddit
I wish, but I would not complain if it only sipped a little more.
Hytht@reddit
This has been a problem for the past 15 or so years, since the introduction of muxless hybrid graphics: the dGPU is connected to the iGPU through a PCIe link and has to stay on all the time. dGPU laptops always have worse battery life than iGPU-only laptops. It's not like dGPUs in laptops are modular and upgradeable, so I would rather have big iGPUs like Apple instead of dGPUs; much longer battery life.
empty_branch437@reddit
My cheap Vivobook from years ago with a 6200U and MX110 doesn't have this problem.
Hytht@reddit
Laptops from that era had inefficient CPUs which already drained the battery in 4 hours, so you would not notice the impact of a (weak, low-power MX-series) dGPU consuming power in standby.
VenditatioDelendaEst@reddit
Eh, IIRC CPUs from Haswell onward have been pretty decent for idle power.
Strazdas1@reddit
what do you think is powering windows desktop manager for that video playback?
VenditatioDelendaEst@reddit
The IGP on the CPU SoC should be, but it appears that someone had other ideas.
Strazdas1@reddit
Should, yes. But does it? Probably not in this case. They are probably using dGPU hardware acceleration for video decoding, so the GPU is powered on regardless.
VenditatioDelendaEst@reddit
The IGP also has hardware acceleration for video decoding.
Even if it didn't (an imaginary scenario just to demonstrate the magnitude of silliness at work here), I'd bet that for all but the most demanding codecs/bitrates, you could use less energy decoding on the CPU with really good DVFS (largest decode buffer that fits in L3 + slow-changing CPU frequency for minimum excessive voltage at 90%+ utilization). This is definitely the case on big-boy desktop dGPUs that won't even get out of bed for less than 10W.
Remember how the RX 6500 was criticized for its sparse video codec support and only 2 display outputs? That die was originally intended as a laptop product. In a laptop, display-driving-related features in a dGPU are vestigial.
I'm a bit of a radical, and I say they're mostly vestigial on desktop too. Only HEDT parts and those -F wastes-of-sand lack integrated graphics, and HEDT is PCIe-lane rich enough, and a small enough market, to subsist on the existing supply of used RX 6600s, RTX 4060s and whatnot, until Matrox can ramp up.
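Back-of-the-envelope sketch of that DVFS argument, with made-up operating points and cycle counts (purely illustrative, not measured on this laptop):

```python
# Toy DVFS model with made-up numbers -- purely illustrative, not measurements.
# Dynamic energy per cycle scales roughly with V^2, and a lower clock lets you
# run at a lower voltage, so "slow and steady at high utilization" can beat
# "boost, finish fast, then idle" on total energy for the same decode work.

work_cycles = 3.6e12  # hypothetical cycles needed to software-decode the clip

# hypothetical operating points: (frequency in Hz, core voltage in V)
points = {
    "boost-and-idle":  (4.5e9, 1.20),
    "slow-and-steady": (1.5e9, 0.75),
}

def dynamic_energy_j(cycles, volts, k=1e-9):
    # E ~ k * cycles * V^2, with k standing in for switched capacitance
    return k * cycles * volts ** 2

for name, (freq_hz, volts) in points.items():
    secs = work_cycles / freq_hz
    print(f"{name}: ~{dynamic_energy_j(work_cycles, volts):,.0f} J over {secs:,.0f} s")
```

Same work, roughly 2.5x less dynamic energy at the lower-voltage point; real chips also have static/uncore power and race-to-idle effects that this toy model ignores, but the direction holds whenever the dGPU's ~10W floor would otherwise stay lit.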
Strazdas1@reddit
I think you are assuming that the OEM tuned everything for maximum power efficiency. Usually that's not the case. The OEM puts a dGPU inside, so it shifts all jobs to the dGPU and just ignores the IGP, or leaves it to power display passthrough. They will absolutely not care about decoding on the CPU just to put the GPU into a sleep state.
VenditatioDelendaEst@reddit
I forgot to include "OEM is sniffing glue" in my original list of possibilities. What you are suggesting is pretty deep "Nvidia is sniffing glue" too, though. If "jobs" like this are shifted to the dGPU, Optimus is basically not working.
RazingsIsNotHomeNow@reddit
Does the dGPU laptop have a mux? It's a 5050, so I doubt it, since it's a budget-class laptop. If it doesn't have one, then anytime the screen is on, the 5050 is always in charge of rendering the screen and can't be turned off. If so, this isn't a surprise at all. The 5050 will always be less efficient in light-workload tasks.
VenditatioDelendaEst@reddit
Driving the internal display from the dGPU would be a very poor design decision, IMO. Even Windows has single-copy hybrid graphics these days.
WJMazepas@reddit
That looks like a great laptop. I have an Asus Vivobook with an i5-12450H and a 3050 4GB, and the battery lasts less than 2 hours no matter what I'm doing. I don't know why.
I don't play AAA games on my laptop, so a 5050 8GB would already be a great improvement, and the battery life would make my life so much easier.
The problem is that I live in Brazil, and Lenovo does bring most of their products here, but usually only the Intel versions.
YKS_Gaming@reddit
So long as you are using Windows and x86, you will always have 2 hours of battery life without doing anything.
manek101@reddit
There are windows laptops on x86 that give 10+ hours of screen time.
shugthedug3@reddit
Yeah mine does 12 hours no problem and it's from 2020... with a dGPU and an H CPU.
YKS_Gaming@reddit
10+ hours only when playing a single video on loop, without actually interacting with the system.
opelit@reddit
You either only use gaming laptops that have no optimization from the OEM (most I have seen run the GPU even without load, causing an additional 15W+ power draw), or have some shitty 15-year-old laptop.
The last laptop I had, with a Ryzen 4500U, got 7-8 hours of battery life while browsing the web, listening to Spotify, and doing office work at the same time, with barely a 55Wh battery.
YKS_Gaming@reddit
The 4500U draws ~5W just idling due to power draw outside the cores (I/O, IMC, and others), no matter how low the clock speed.
A single Windows update will make your chip draw 15W for the duration of the update.
Healthy-Doughnut4939@reddit
You clearly haven't used Lunar Lake before, because its idle power draw and low-power performance are much better than all prior Intel and AMD chips.
opelit@reddit
In fact, a new CPU can completely shut down unused cores, and the 4500U (4xxxU series) was the first gen able to do that on a mobile AMD SoC. That's why it had a huge efficiency jump over the previous 3xxxU series.
quildtide@reddit
That explains a lot, I observed really acceptable power usage (both battery life and heat) on both of the AMD laptops I used with Zen 3 and 4. Both are (unfortunately) also on Windows 11.
They're not pulling crazy efficiency benchmarks like Lunar Lake, but they felt pretty good with everyday use.
opelit@reddit
The whole point of a mobile SoC is short bursts. They have a few different max power-draw levels: so-called sustained draw, burst, and short burst. On Intel, a super-thin laptop can have a max TDP of 80W, but only for 1-2s, then 55W limited by thermals for around 60s, and around 30W of sustained power that can again be throttled by temperature.
The whole idea is that when your beloved Windows Update comes, the update is done in a few seconds, and even an 80W spike over 2s is barely (... let me do the math ... 80W x 2s / 3600 ...) a freaking ~44mWh... XD about 44/1000 of 1Wh.
Also, if your Windows updates take longer than a few seconds, or up to 30s (with a restart), then you should not block them and let 200 of them accumulate at once.
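Spelling that out (assuming the 80W/2s spike above and the 55Wh battery mentioned earlier):

```python
# The update-spike math spelled out (80 W for 2 s, 55 Wh battery assumed)
power_w, duration_s, battery_wh = 80, 2, 55

energy_wh = power_w * duration_s / 3600                 # watt-seconds -> watt-hours
print(f"~{energy_wh * 1000:.0f} mWh")                   # ~44 mWh
print(f"~{energy_wh / battery_wh:.3%} of the battery")  # ~0.081%
```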
manek101@reddit
"2 hours battery life without doing anything".
That was your idiotic quote; now you're moving the goalposts and want interactions?
Both my Meteor Lake (gaming) and Zen 2 (work) laptops give me 5-7 hours of web browsing, and neither has top-end battery life.
Newer x86 SoCs like Lunar Lake/Strix Point can give 10+ hours of light web browsing and 7-9 hours of medium-heavy browsing.
Qsand0@reddit
What sort of misinformation is this? Is Lunar Lake ARM? Or are you on shrooms?
YKS_Gaming@reddit
Windows will always boost unnecessarily, and run updates in the background while on battery and during Windows Modern Standby.
empty_branch437@reddit
Saying "always" immediately proves you're providing misinformation.
YKS_Gaming@reddit
Windows 11 always boosts unnecessarily when you interact with the system.
Most of the reskinned UI, including the right-click context menu, is written with XAML and calls for graphical acceleration every single time it is used, which on Windows involves a lot of fuckery that leads to a lag of ~10 frames and your CPU boosting unnecessarily to initialize the graphical stack and call the GPU - just to make it look a little prettier.
and that's without going into advertising, usage data collection, and windows update.
I believe that qualifies as "always".
empty_branch437@reddit
Why did you delete your comment? Don't you stand by your opinion?
YKS_Gaming@reddit
I didn't?
WJMazepas@reddit
The video posted in this very topic showed much better battery life than my laptop gets.
And it was reaching 10+ hours of battery.
shugthedug3@reddit
Does it have a very small battery or something?
The 3050 dGPU should be in a low power state if it isn't being used.
shugthedug3@reddit
5050 will obviously crush any iGPU. We're talking 1650-ish performance for the absolute best of any of them, as far as I know.
himemaouyuki@reddit
8060S iGPU: u sure about that?
shugthedug3@reddit
Is it better? I'm probably out of date, I thought I heard AMD's latest are very good and around 1650 level.
himemaouyuki@reddit
1650 level is the 780M, which is a 2023 iGPU.
The 880M/890M is 2024, which is at 2060 ~ low 3050 level.
The 8060S is on par with a 4060~4070, with unified RAM letting you use up to 96GB as VRAM (out of 128GB of RAM on the system).
Qsand0@reddit
780m is 1650 level??
Buahahahaha...
780m is like a 1050ti at best wtf
himemaouyuki@reddit
Please, we're not talking about desktop GPUs here. Everything I'm referencing is a laptop GPU, both iGPU and dGPU.
Qsand0@reddit
And I'm referring to laptop GPUs. You clearly don't game on laptops, else you'd know laptop dGPUs still shit all over iGPUs.
himemaouyuki@reddit
I did own both a Legion Y540 (9750H + 1650) and a ThinkBook G6+ (8845HS) to play games; that's how I can say that the 780M is on par with the mobile 1650, bruh.
Qsand0@reddit
Except they're not. Literally go compare them on notebookcheck rn.
Siats@reddit
According to Computerbase's testing a 1650 is still 40% ahead of the 780m and the 890M is only a ~10% improvement over the 780M.
Qsand0@reddit
The amount of misinformation on this thread is insane.
How can someone boldly claim an 890M is 2060 or 3050 level? Are you maddd?
780M - 1050/1050 Ti; 890M - 1650 Max-Q
conquer69@reddit
The 890m is only like 10% faster than the 780m. The main benefit is improved performance at lower wattages.
The 8060s is like 10% slower than the 4060. It's nowhere near the 4070.
shugthedug3@reddit
Ahh yeah, that's Strix Halo right? I forgot about that. Has that been used in a laptop yet?
himemaouyuki@reddit
Only the Asus ROG Flow Z13 and HP ZBook Ultra G1a atm.
The rest? Mini PCs, one Framework Desktop, some workstations, and a prototype GPD handheld.
ResponsibleJudge3172@reddit
The 8060S is literally a 9060 XT-sized iGPU with much less performance.
Strazdas1@reddit
8060s is 3 times the price.
himemaouyuki@reddit
True, laptops with the 395 are way too overpriced. You can also go down to the mini-PC ones; e.g. it's about $1500 for the 64GB version of the GMK Evo X2 ($2000 for the 128GB version).
Beautiful_Ninja@reddit
And now you know why Nvidia is bothering with the "waste of silicon" known as the 5050. It still completely stomps any integrated solution outside of the prohibitively expensive Strix Halo iGPUs. They are also cheap enough that we'll see them in the discount laptop sales for the upcoming school season, where Nvidia scoops up a massive amount of their market share.
JustBeLikeAndre@reddit
To be fair, these comments were aimed at the desktop version, which doesn't make sense considering that you can just get a 4060 or even a used 3070 for cheaper and better performance. I haven't seen the reviews yet but I wouldn't be surprised to see better offerings from Intel or AMD. On the laptop side though, they make more sense.
BarKnight@reddit
On top of that the 16GB AMD card is $100 more. Which doesn't make it much of a budget card.
tukatu0@reddit
Well, that and the B580 are the real reason. The RTX 4050 is also a thing, but that never came to desktop. I do find it a bit odd Steve has been so angry lately; if the 5050 shouldn't exist, that logic applies to the whole lineup.
So thank AMD and Intel for even existing, I guess.
NeroClaudius199907@reddit
Doesn't Arc have less than a 0.15% share? The B580 is selling worse than the 3050, 3060, 6600, 6600 XT, and 4060 at similar prices.
tukatu0@reddit
Yeah. Probably not even 100k units. Do remember, though, that the Arc B60 exists (dual B580s), so it's probably worth something to Intel.
The same statistic probably applies to the Radeon 6500 XT. So no 7500 XT came out, and as a result no RTX 4050 either.
Nvidia doesn't even hide doing the bare minimum. Easier to just let randoms on the internet make excuses for you, I guess. Basically free.
opelit@reddit
LOL, now limit the 5050 to what the iGPU pulls (around 30W). It's 60W for the whole APU vs 150W (CPU+GPU). Why even bother doing the tests?
kikimaru024@reddit (OP)
Because most people will use the laptop/PC as-configured.
Lifealert_@reddit
But are these two laptops really the right comparison to make? What's the price difference?
JustBeLikeAndre@reddit
I can't think of a better comparison to make since these are essentially the same laptop, with the GPU being the only difference. We don't know the prices yet, but considering the current price of 5060 and 4050 laptops, the price difference should be around 20-30%, with sales happening quite often. In return you are getting over 100% extra performance in AAA games and a 300% performance boost in 3D rendering, with the same chassis and everything, and very good battery life.
opelit@reddit
Exactly this. It's like comparing a truck and a heavy SUV. Both might have the same power, but one burns 30L/100km of fuel, the other 10L/100km.
Healthy-Doughnut4939@reddit
Wonder how Panther Lake with a 12-Xe3-core iGPU will compare with the 5050?
Panther Lake will be a beast for handheld gaming and ultrabooks, especially with native XeSS 2 support, when it releases in Q4 2025.
Unfortunately, AMD's upcoming Gorgon Point and Medusa Halo APUs still use RDNA 3.5, which means no FSR4 upscaling.
Gorgon Point's RDNA 3.5 iGPU will not be able to compete with Xe3, since the same iGPU in Strix Point trades blows with Lunar Lake's 8-Xe2-core iGPU.
RumbleversePlayer@reddit
Assuming the leap is like Xe to Xe2 (probably a little higher due to 4 more GPU cores), no, it will not beat the 5050.
Best case is 1660 Ti mobile in raster (or a higher-TDP 3050).
Kursem_v2@reddit
For what it's worth, the only drawback of Intel GPUs that I can think of is the lack of proper driver support for older video games, but this is still a work in progress at Intel.
Meaning, games from the early-to-mid 2010s sometimes have bugs when played on Intel GPUs, or they run poorly even though in theory the performance capability is there; the drivers just didn't utilize the iGPU fully. But we're in luck, since Intel didn't abandon those games: their software engineers have worked so that older video games run as expected on their hardware.
Which is something that I'd applaud.
Ghostsonplanets@reddit
It's obvious that a 5050 is much faster than an 890M. We're talking about a 3050 class iGP against a GPU that is around 1.5 - 2x faster.
Still, that doesn't take away the fact iGPs have come a long way. And they're more than enough for a lot of people.
ThankGodImBipolar@reddit
What’s not obvious is that an 890M is a 3050 class iGPU. The idea of an iGPU performing in line with a dGPU that’s not really that old would have been laughable a fairly short while ago.
Ghostsonplanets@reddit
Really? The Steam Deck iGP was already at 1050 performance. The 680M and 780M were already matching the 1650. The 890M, with a wider design and access to higher memory speed, reaching the 3050M (which is 20-40% faster on average than the 1650) isn't something to be shocked about, in my opinion.
Kalmer1@reddit
The Steam Deck released in early 2022, the 1000 series in the middle of 2016; that's almost 6 years.
The 3000 series released in late 2020, the 890M in the middle of 2024: slightly less than 4 years.
We're 2 years ahead of that.
ThankGodImBipolar@reddit
Steam Deck was the first device with an RDNA iGPU, no? That’s kind of what I meant by “a short while ago.” Even then, computers with RDNA iGPUs were pretty difficult to get before Phoenix came out 2 years ago.
Because of the way that handheld PCs have been branded, I also think that the average person doesn’t immediately see the Z1 Extreme and think “this is a laptop iGPU.” Maybe the massive performance boost over the older Vega 7 or Tiger Lake-era iGPUs that people would have been used to at the time has something to do with that.
TheNiebuhr@reddit
It's not 3050M class. It's slower than the lowest 35W version. Around 1650 level.
rawwhhhhh@reddit
Go to NotebookCheck and compare the 890M and the laptop 3050 4GB. The 3050 is 33.3% faster than the 890M.
Qsand0@reddit
An 890M is NOT as fast as a 3050 GPU. Maybe the lowest-wattage 35W one, but the gap between the 35W and 95W variants is so massive that it can't simply be lumped together as just a '3050' and needs to be specified.