Techspot - Intel claims Core Ultra 200 patches improve gaming performance by up to 26%
Posted by Antonis_32@reddit | hardware | 163 comments
ibeerianhamhock@reddit
Even if this is true, it's still slower than the 9800x3d.
ieatdownvotes4food@reddit
8 vs 24 cores.. no brainer win for the 285k.
996forever@reddit
Did you run the same logic during Zen 1-3 when ryzen had more cores?
ieatdownvotes4food@reddit
hah, nah just when i bought the core 285k. only upgrade every 10 years.. then i dip out
Singul4r@reddit
how is the 285 doing? I'm asking Ultra's owners since it could be very good for programming and gaming. Are u happy with the proc?
ieatdownvotes4food@reddit
loving it! .. great compile speeds, super fast with local llms, imagegen, and transformer related tasks, incredible with davinci resolve and 8k footage, and is going to pair perfectly with a 5090 soon for 4k gaming.
the whole single core 1080p frame-rate benchmark debate doesn't take into account a multicore transformer 4k future.. people don't yet understand what's coming and this tech is future-based. gl on your purchase whatever it may be
only_r3ad_the_titl3@reddit
lol people here did.
Strazdas1@reddit
by that logic bulldozer was great?
polyzp@reddit
At 2x the power
ElementII5@reddit
FYI 9800X3D is about 24% faster than the 285k.
DktheDarkKnight@reddit
It's up to 26%. I doubt the average performance uplift will be greater than 5%
Strazdas1@reddit
26% is claimed specifically for cyberpunk. The rest is journalists inventing things.
Drifter_Mothership@reddit
"up to" is the key phrase here.
Strazdas1@reddit
"Up to" was invented by the journalist, Intel didn't say it.
Ryujin_707@reddit
And 0% faster in 4K, where anyone with a brain would use a 4090 or 5090 with a CPU in that price range.
xylopyrography@reddit
Only about 10% of people game in 4K and only maybe 1% have a 4090.
What you've said really only applies to 2024 AAA style photorealistic games, it isn't true for CPU-bound games for which there are a tonne of players wanting more performance--these just aren't often touched by reviewers because they're very hard to benchmark.
It also won't apply to 2025 games as much, and by 2027, the 9800X3D will be absolutely crushing the 285K in 4K gaming.
And what you said still isn't true in its raw form today for AAA photorealistic games. The 9800X3D is still actually faster in 4K gaming with a 4090, in some games by as much as 15%.
Ryujin_707@reddit
Only 0.01% of people with a gaming PC got a 9800X3D. Wow, I can do statistics too.
CPU bound games my ass. If you buy a 9800X3D and pair it with a midrange GPU you are brainless. And if you get a 9800X3D and a 4090 and don't play at 4K you are out of your mind.
Lol, in 2027 there will be new CPUs and the 9800X3D won't matter.
Just name me 1 game where the 9800X3D is faster than a 14900K or 285K in 4K by more than 5%. I will wait.
"Oh yeah it's 77% when playing 720p low." Give me a break.
Just let me spend $600 on a CPU, play at 1080p, bottleneck the shit out of a 4090 and then say "CPU bound". Nah, it's brain bound.
Odd_Cauliflower_8004@reddit
I have a 7900 XTX and a 7800X3D. When the time comes, I can buy the 7090 (if AMD keeps FSR4 to RDNA4, it will be the last AMD card I'll buy) and I know that I won't be CPU limited much. And if I think I'm limited I'll just buy a Zen 6 X3D (which, believe me, will be a massive step up from Zen 5).
Good luck doing the same buying a midrange Intel CPU.
Ryujin_707@reddit
Oh yeah, I'm gonna get a 13700K for $200 and enjoy the amazing productivity advantage and Quick Sync.
And lose zero performance in 4K with my GPU. Because I've got the common sense not to waste an extra $400 on a nonsense cash grab.
Odd_Cauliflower_8004@reddit
Good luck with the stuttering from the E-cores for your productivity. Throughput is only one aspect, but sadly it's the only one that we can measure clearly.
Ryujin_707@reddit
E-core stuttering in productivity. Huh. What the fuck does that even mean?
Odd_Cauliflower_8004@reddit
have fun working with windows going slow as hell to open up explorer cause it decides to assign that workload to the e-cores. i have to use one of those cpus on the work laptop and the only thing that makes it bearable is that i installed linux on it, cause windows was a complete laggy mess to even just use the browser.
LOL would run fine, but doing any actual work was insanely frustrating on windows, and it's still not ideal on linux when i hit compiling or developing workloads that require more than the p-cores to work.
both the two-years-older G14 with a 4800HS (Zen 2) and the desktop are significantly faster in day-to-day operations.
Ryujin_707@reddit
If you knew what you were talking about you would try basic stuff like turning E-cores and hyperthreading off.
But you spew BS and cite issues that I've never encountered or seen. I would admit that E-core stuttering was a thing in gaming. But windows and productivity? Absolute BS.
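For anyone who wants to test the E-core scheduling theory without rebooting into the BIOS, a minimal sketch of pinning a workload to the P-cores (Linux only; the P-core logical CPU IDs below are an assumption, so check `lscpu --extended` for your own topology first):

```python
import os
import subprocess

# Hypothetical P-core logical CPU IDs; the exact numbering varies by SKU
# and by whether hyperthreading is enabled, so verify with `lscpu --extended`.
P_CORE_CPUS = {0, 1, 2, 3, 4, 5, 6, 7}

# Pin the current process to the P-cores only; child processes inherit
# the mask, so the scheduler cannot move the workload onto the E-cores.
os.sched_setaffinity(0, P_CORE_CPUS)

# Launch the workload under test.
subprocess.run(["python3", "-c", "print('running on P-cores only')"])
```

Running the same workload with and without the affinity mask is a rough way to check whether E-core scheduling is actually the source of the stutter, without disabling the cores outright.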
Odd_Cauliflower_8004@reddit
Productivity, again, is not just about how fast you can encode a video or compile a program.
Ryujin_707@reddit
Again, Intel is snappy in navigation with zero issues. I don't know what the heck you're talking about.
In fact Intel is better at this because of its much superior idle power draw and task management.
Strazdas1@reddit
If you think CPU does not matter you never played strategy/sim genre.
PTSD-gamer@reddit
Escape from Tarkov Video Here. It is more than 5%, more like 10-14%. Tarkov is a game where the GPU doesn't matter much. A midrange GPU and a high-end CPU is best.
Ryujin_707@reddit
The heck are you talking about? The video even shows 200fps vs 199fps avg. The other sample is 144fps vs 142fps.
Lol it's the same.
PTSD-gamer@reddit
Streets map online at 4K is 127 FPS for the X3D vs 111 FPS for the i9… but yes, on the offline maps and most other things they are equal.
Strazdas1@reddit
1.16% and growing, according to a Steam survey that is already biased toward low-end GPUs.
Darrelc@reddit
Is the UPS gain that good? One of the reasons I got a 9800X3D was knowing it would improve my huge Factorio map's performance, so it's exciting reading that comment
Sopel97@reddit
it's very large above like 60 ups, then slowly diminishes. My last base ended up at around 12-13 ups on a 7800x3d, someone with a 12400k ran it at 9-10. Had it been running 120 ups the difference would probably be 2x.
xylopyrography@reddit
It does depend on the type of factory you have.
You can create CPU-stressed factories in which Intel is competitive.
But AMD's vCache chips are always competitive in CPU-stressed factories (as they have good general purpose performance), and dominates in cache-stressed factories.
FactorioBox has some benchmarks.
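Factorio ships a headless benchmark mode, so UPS comparisons like the ones on FactorioBox can be reproduced locally. A minimal sketch, assuming a Linux install; the binary and save paths are placeholders, and `--benchmark` / `--benchmark-ticks` are the documented headless benchmark flags:

```python
import subprocess

# Placeholder paths; point these at your own install and save file.
FACTORIO_BIN = "/opt/factorio/bin/x64/factorio"
SAVE_FILE = "/home/me/factorio/saves/megabase.zip"

# Load the save, simulate 1000 ticks as fast as possible, and print the
# per-tick timing statistics that Factorio writes to stdout.
result = subprocess.run(
    [FACTORIO_BIN, "--benchmark", SAVE_FILE, "--benchmark-ticks", "1000"],
    capture_output=True,
    text=True,
)
print(result.stdout)
```

The reported average ms per tick converts to UPS as 1000 / ms_per_tick, which is the figure being compared in this subthread.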
teutorix_aleria@reddit
It's also probably 0% faster in I/O-bound workloads, but how is that relevant to the discussion?
Ryujin_707@reddit
You're acting smart by ignoring how important 4K actually is? I'm impressed.
teutorix_aleria@reddit
I've been gaming in 4K for years, try again. Some people prefer gaming at high refresh rates at lower resolutions, which is where the differences do matter.
Ryujin_707@reddit
Yeah, I'm gonna waste 30% of my GPU power to play at a lower res and lose visual fidelity.
Sopel97@reddit
do you really not understand that it's not about you
Ryujin_707@reddit
I shared my opinion why. 9800X3D vs 285K doesn't matter in 4K.
How is that not a valid discussion point?
I hate when I see crap statistics like 50% faster or some shit when it's actually talking about 1080p low.
Anyone can be pleased by what they buy and I respect that choice. But I'm not gonna fall for a cash grab when a $200 CPU can do the same job when playing at 4K.
Maleficent-Salad3197@reddit
Bravo. Intel's continuous delays, lies, and even the real need for economy cores is a joke. The economy cores help keep the thermals down at the cost of making you disable them in some games. AMD is Epyc.
RogueIsCrap@reddit
CPU power at 4K still matters depending on what you play.
For example, Hogwarts Legacy has many parts that are heavily CPU bottlenecked even when using 4K DLAA with all max settings.
Ryujin_707@reddit
DLAA is rendering at a res higher than 4K.
So again, not native 4K res. You guys don't understand, do you.
RogueIsCrap@reddit
DLAA makes the GPU load somehow even bigger, yet it's still being CPU bottlenecked.
You sure that you understand?
Ryujin_707@reddit
DLAA puts a load on the CPU because of the downscaling processing.
Do you really understand, or are you trolling?
RogueIsCrap@reddit
Are you confused about the difference between DLAA and DLSS?
But then again you just said that "DLAA is rendering in res higher than 4K" so you must know that DLAA puts more load on the GPU and makes CPU bottleneck less likely. So you're probably just trolling.
Maleficent-Salad3197@reddit
Nice for CAD, but for gaming 1440p will do on anything smaller than 65".
996forever@reddit
Flashback to when AMD compared the 1800X vs the 7700K in 4K gaming
HatefulAbandon@reddit
TIL I have no brain.
conquer69@reddit
People use upscaling by default now which lowers the rendering resolution. That cpu bottleneck will be felt for sure if you have a 5090.
Even the 4090 was cpu bottlenecked at 4K native with a 7800x3d in some games.
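To put rough numbers on "upscaling lowers the rendering resolution", a quick sketch of the internal render resolution at a 4K output for the commonly cited DLSS scale factors (approximate values; individual games and DLSS versions can differ):

```python
# Commonly cited per-axis scale factors (approximate).
MODES = {
    "DLAA":        1.0,    # anti-aliasing only, renders at native resolution
    "Quality":     2 / 3,
    "Balanced":    0.58,
    "Performance": 0.5,
}

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K output

for mode, scale in MODES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{mode:12s} -> internal render {w}x{h}")
```

At 4K Performance mode the GPU is only rendering about 1920x1080 internally, which is why the CPU limits people dismiss as "1080p low nonsense" can reappear even on a 4K display.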
Ryujin_707@reddit
Don't care about upscaling. I'm talking about native 4K.
Maleficent-Salad3197@reddit
More AI frame gen isn't faster. Adding extrapolated frames is not as good as not using it and using raw power to increase fps. That would be a step up.
Cute-Pomegranate-966@reddit
It isn't 45% faster in Cyberpunk anymore. Maybe half that. They did fix that lol.
Exist50@reddit
That's the point. They fixed a negative outlier, but that won't change the median much.
scrndude@reddit
They keep making claims about huge improvements in patches, and then the actual improvement is 1-3%. I'll believe it when I see it.
jaaval@reddit
They didn’t make the claim in the title. They said cyberpunk will have 26% improved performance. Which is easy to believe since it was completely broken.
only_r3ad_the_titl3@reddit
you expect critical thinking in r/hardware ??
TheComradeCommissar@reddit
But... Intel bad?
I find it rather tiresome how some people are so quick to disparage Intel without proper justification. Yes, they've had their missteps in the past, but that hardly means everything they do is deserving of such criticism. Can we all remember the Pentium 4 situation and the rapid improvement afterward with the Core microarchitecture?
And to be perfectly clear, this perspective comes from someone who presently uses an AMD-powered laptop, desktop, and even a home server. I have no allegiance to this "team red, blue, or green" tribalism; it's all rather ridiculous, if I'm honest.
nanonan@reddit
Hey look, it's a regression in that title. Trusting Intel when it comes to marketing promises of performance is like trusting any of these companies, indeed a bad idea.
PaulTheMerc@reddit
"Up to", read: in that one niche case it was 26%.
Renard4@reddit
Usually with weird settings in a game that nobody plays.
Velgus@reddit
And then "niche case" ends up being running RT at sub-30 FPS, as if it's an actual use case for anyone.
MiloIsTheBest@reddit
"We were originally getting 4 fps in this game, now it's at least 5! Up to 26% increase!"
qazedezaq@reddit
Is this the same Intel that claimed multiple times that there was no permanent degradation of their 13th and 14th gen CPUs, but then one year later released a patch to decrease the speed at which those CPUs permanently degrade?
I'm taking their claims with a grain, or a handful, of salt, and I'll be waiting for third-party reviews of the patches.
ConsistencyWelder@reddit
Even worse, people in the industry knew about Intel's stability/degradation problem. Shops saw abnormally high RMA rates and game devs knew about it too, but no one dared say it out loud out of fear of Intel cutting them off. Until someone did.
But Intel kept denying it up until they just couldn't any more.
I have no faith in them any more.
qazedezaq@reddit
It was truly wild. I really hate how quickly the tech industry and news media moved past that, like Intel didn't just knowingly screw over hundreds of thousands of people and companies.
FloundersEdition@reddit
probably even 10 million or more. the degradation might be extremely fast on the 13700K and above, but I seriously doubt mobile, the 13600K and others are safe. it's probably a mixture of voltage paired with heat that speeds up the degradation, and mobile and OEMs have shitty cooling solutions. voltage should be lower on these SKUs, so it's happening slower. but reports said some Raptor Lakes showed it after not even a single quarter.
it might take 4-5 years instead of one, or even a couple of quarters. but CPUs usually last 8-10 years.
experiencednowhack@reddit
Waste of sand
Vellanne_@reddit
Seriously. As someone who bought into that platform, it was really grating to see the focus change to their strange new naming scheme. The naming is odd, but knowingly selling faulty products is far more important. I'll never buy or recommend an Intel chip again.
NewRedditIsVeryUgly@reddit
Reddit blows things out of proportion while making no sense.
Those CPUs were covered by a warranty. I have not seen any cases of organizations having a warranty claim denied for that issue. Only anonymous reddit users claimed Intel didn't honor a warranty.
That means it's an inconvenience to customers and really damaging to Intel's financials. Pretending it's some big Intel scheme to screw everybody is just dumb.
ConsistencyWelder@reddit
Doesn't the warranty only cover the boxed version? I thought people that bought the tray version were out of luck.
VenditatioDelendaEst@reddit
They probably knew it would be fixable with a microcode patch as soon as they got a failed one in the lab. And I expect that they already had a failed one in the lab when they realized the failure rate was statistically abnormal.
ConsistencyWelder@reddit
They said for the longest time that there wasn't an issue though. And they also said they didn't know the cause, so how could they know it was fixable? We still don't know if it is, tbh; we won't know if the latest fixes work until about a year from now.
VenditatioDelendaEst@reddit
Because you can always whack a few hundred MHz off the top, or even lock the chip to static frequency, static voltage, at the base clock.
Once you know it's cumulative electrical damage and not some contaminant present in the manufacturing process, the only question is how big the performance hit is going to be.
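As a concrete illustration of "whack a few hundred MHz off the top", a minimal sketch of capping the maximum frequency through the Linux cpufreq interface (root required; the 5.3 GHz value is an arbitrary example, not anything Intel shipped):

```python
import glob

# Arbitrary example cap in kHz (5.3 GHz); chosen only for illustration.
MAX_FREQ_KHZ = "5300000"

# Each online logical CPU exposes a scaling_max_freq knob in sysfs;
# writing a lower value caps that CPU's turbo ceiling (root required).
for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_max_freq"):
    with open(path, "w") as f:
        f.write(MAX_FREQ_KHZ)
```

A microcode or BIOS mitigation can do the same thing more surgically, trimming only the voltage/frequency points believed to cause the damage rather than the whole turbo range.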
ConsistencyWelder@reddit
No one knew if the cause was that the chip was driven too hard. We still don't really know, we just assume it.
VenditatioDelendaEst@reddit
https://www.youtube.com/watch?v=hVSSOs9Z-uY
Intel would've had failed articles decapped and running under probes. They'd know exactly which functional blocks were mis-computing, and why. They'd know whether or not there were any chemical elements present that weren't supposed to be, or not present that were.
You don't have to identify the root cause to limit the possibilities to the things that can't possibly happen with all the turbo and power saving features turned off.
NewRedditIsVeryUgly@reddit
https://www.intel.com/content/www/us/en/support/articles/000024255/processors.html
Looking at local retailers, they provide a 3-year warranty for the Intel tray version. The link I sent shows Intel gave a 2-year extended warranty for 13th/14th gen, so I don't think they were trying to scam anyone.
ConsistencyWelder@reddit
What is a direct customer in this context? Shops? And why does the warranty differ from the boxed versions?
VenditatioDelendaEst@reddit
A warranty does not pay you for the labor required to identify the problem and replace the part, nor does it compensate the lost use of your computer while it was less than 100% working.
Warranties are mainly useful as a signal that the manufacturer is confident they have built a reliable product, in situations where you can logically assume that otherwise there would be a huge number of warranty claims (so, technically sophisticated buyers).
If you actually have to use the warranty, you have lost.
bargu@reddit
The only surprising thing is you having faith in Intel.
Gwennifer@reddit
No, it's the same Intel that completely changed how Thermal Velocity Boost worked in a microcode update a full year before the degradation issue blew up, because Thermal Velocity Boost was 'accidentally' built as a 'kill-CPU' function and Intel is just hoping nobody sues them over it.
UGH-ThatsAJackdaw@reddit
Do you mean the same Intel that claimed "mission accomplished" when they released a microcode patch that didn't work?
FloundersEdition@reddit
the same Intel that claimed 10nm was on track. and PVC. and Sapphire Rapids. and most of Arrow Lake on Intel nodes.
the Intel that claimed a "roadmap to leadership", "5 nodes in 4 years", and then shipped no client product on Intel 3 (I think server has something on it tho). claimed 20A was "just a little later", then cancelled 20A because "18A is ahead of schedule". the one that will now only release 18A products late this year and ramp next year, if they even manage that. and who knows in what volume and at what yield and PPA.
Ladies and gentlemen: 5 nodes in 4 years, and the fastest gaming client is still on the frying Intel 7. lackluster Meteor Lake on Intel 4 and external. lackluster Arrow Lake on external only. Lunar Lake external. GPU external.
good job boys!
UGH-ThatsAJackdaw@reddit
"But what if we fumble HARDER?"
littleemp@reddit
Even if performance were patched across the board to competitive levels, you are not going to patch most reviews or the perception that people have of those CPUs.
Zednot123@reddit
That's why you launch new SKUs with marginal frequency increases. It creates a whole new review cycle.
295K incoming!
littleemp@reddit
Please go work at AMD and show them how to actually peddle their goods. They need someone like you instead of morons like Azor.
DYMAXIONman@reddit
They should have rebranded the 6800s that they had in stock as the 7800 and called it a day.
ProperCollar-@reddit
Gross. Miss me on the 390(X) BS.
DYMAXIONman@reddit
Hey the 390 was a better card than the GTX 970 though.
conquer69@reddit
Was it? The 970 was 150W and came out 9 months earlier. The 390 was 275W and performed the same.
The_Soldiet@reddit
The 390X was a rebranded 290X with double the VRAM. Not a bad card. Bought a 290X at release, and when I killed that thing frankensteining it with too much OC, I got a free 390X Sapphire Tri-X. That card held on until I swapped it for a 5700 XT in early 2020. Best card I ever had, and I never had any problems with memory bottlenecks or bad performance.
ProperCollar-@reddit
It had 8GB of VRAM compared to the 970's 3.5/4GB, which quickly proved to be very problematic.
I have a GTX 970 in my basement. I wish it was a 390, given the community driver support.
Strazdas1@reddit
except for the vast majority of 970 owners, the VRAM was never an issue they noticed at all.
ProperCollar-@reddit
If they were on a short upgrade cycle maybe.
I remember mine starting to show its age back when the 20 series was out. I even got my $20 cheque or whatever it was.
BleaaelBa@reddit
now we have 575w gpus, and nobody cares.
996forever@reddit
Comparing 575w with unquestioned performance leadership vs 100w more than competitor’s third best chip isn’t the flex you think it is lol
BleaaelBa@reddit
oh, so unquestioned performance leadership suddenly removes heat from your room and doesn't cost extra on your energy bill? and that 100W extra did? got it.
it's funny seeing people who called the 390/X a nuclear reactor now being completely fine with 550-575W GPUs.
996forever@reddit
It does not, but what it does is provide a greater incentive for the tradeoff.
A Toyota Prius getting mediocre gas mileage should be a much bigger issue than a Ferrari 458 getting even worse gas mileage.
It’s not that complex of a concept to grasp. Hope I was of help.
Qesa@reddit
It's not even the Ferrari gets worse mileage. It's like the Ferrari goes 3x as fast as the Prius and uses 3x as much fuel.
BleaaelBa@reddit
You don't get it, do you?
It is stupid if a Prius guy cries about mileage and then goes ahead and buys a Ferrari cuz he gets more top speed. mileage was never an issue then, it was top speed he cared about, but he pretended to care about mileage. hope you get it now.
996forever@reddit
The Ferrari buyer isn’t caring about gas mileage nor is he pretending to.
There was never any cross shopping between the Ferrari and the Prius.
Having poorer gas mileage than the competition is a bigger deal at the Prius level than at the Ferrari level.
BleaaelBa@reddit
oh there definitely was. a lot (more so in OG Titan vs 290X than 970 vs 390/X)
996forever@reddit
The 290x didn’t have the 8GB VRAM (well one exceedingly rare model did but it was not the norm) that the 390x had, so the landscape was different. 290x (after it came out) was certainly better than any Kepler in retrospect, but the 980 and 980Ti came out too soon after.
conquer69@reddit
If AMD had a 5090 competitor for 330w, it would be a huge deal.
BleaaelBa@reddit
that's not the point, i'm specifically talking about heat: if people cannot take 330-350W from a GPU, how the hell is 575W acceptable to them? having more performance doesn't magically remove the extra heat dumped into the room.
unless it was never really a problem, but people pretended it was anyway.
ProperCollar-@reddit
That's what they've done with XT CPUs.
Until the new naming debacle AMD was really good at naming CPUs (mobile was meh).
GPUs well... Since I started paying attention we had:
HD XXXX
RX XXX
Vega
Radeon VII
RX XXXX XT
RX XNvidiaNameX XT
only_r3ad_the_titl3@reddit
amd is also selling Zen 2 CPUs under the 7000 series name. Insane that they have not been called out on it
ProperCollar-@reddit
They have been. I was more referring to the AI HX nonsense but that's why I called out mobile as meh and referenced the debacle.
jnf005@reddit
The RX XXX was also split into the R5/R7/R9 200/300 series and the RX 400/500 series, and then there's also the R9 Nano, Fury and Fury X as the 300 series' top end. What a freakin mess.
ProperCollar-@reddit
I remember the R numbers. I just didn't have a succinct way to include them in my naming scheme.
Polaris was actually a dope architecture that got overshadowed by the GCN naming bullshit.
HOW MANY TIMES DID THE HD 79XX GET RELEASED FFS
vegetable__lasagne@reddit
They don't need to relaunch stuff, they already have their non-K CPUs coming, which will essentially be a re-review of the architecture.
Zednot123@reddit
Doesn't get nearly the same coverage as new top end models.
bizude@reddit
You joke, but now I have to wonder if this will happen and we'll see a Core Ultra 9 290K or 295K released for this very reason.
Scrimps@reddit
It loses to the X3D CPUs in some games by 100-140 percent. Keep going.
only_r3ad_the_titl3@reddit
source?
tonyleungnl@reddit
Benchmark numbers: from 10 FPS to 12.6 FPS! The numbers go up, and maybe they don't have anything to do with the CPU itself, but they go UP!
OverlyOptimisticNerd@reddit
“Up to,” usually means one obscure edge case with everything else being 1-2%.
Saw this a lot during Polaris/Vega on the AMD boards, where people were making up phantom gains with driver updates, only for HUB to test and refute them, rinse and repeat in an endless cycle.
Strazdas1@reddit
Intel said Cyberpunk. Journalists interpreted that as "gaming".
moschles@reddit
I believe Microsoft rolled a firmware update into a recent Windows 11 update that does this 'patch'.
Former_Barber1629@reddit
Intel's days are numbered. They chose to die on this sword, and I sadly believed they would reign supreme, but the reality is, they are struggling.
wusurspaghettipolicy@reddit
If a patch makes the product work as intended then you released a broken product.
Astigi@reddit
Intel can't be trusted
1AMA-CAT-AMA@reddit
If there's this much low-hanging fruit that performance can be increased this much, then what the fuck were they doing before launch
SmashStrider@reddit
Wait for 3rd party reviews.
animeman59@reddit
No, Intel. No one wants your shitty processor.
KenTrotts@reddit
If it wasn't for the fact that these processors were on par with or slower than the last gen, I would have said it's a good strategy by Intel to keep the updates going and keep it in the conversation. As it stands now, anyone who's buying a processor for gaming isn't going to buy Intel.
Automatic_Beyond2194@reddit
I’m legit about to buy one if this is true.
996forever@reddit
What’s your use case?
HunkerDownDawgs@reddit
"up to" likely doing a metric ton of heavy lifting here tbh
KenTrotts@reddit
I certainly hope these gains are true. I just upgraded my PC to an arrow lake processor, but I did so because productivity benchmarks ranked it above all others for the program I use. Having great gaming 1% lows or whatever the updates will help with would be a nice bonus.
SherbertExisting3509@reddit
I'll believe it when I see it.
Hopefully not another arrow to the knee patch
ReviveHiveCola@reddit
Semisonic - Closing Time
someshooter@reddit
Crazy how both Intel and AMD have tried to do a relaunch this time around.
errdayimshuffln@reddit
I bet on average, across a wide selection of games, the performance improvement will be a single-digit percentage.
frostygrin@reddit
It's not uniformly bad in the first place.
Exist50@reddit
It is though. The median is worse than Raptor Lake.
frostygrin@reddit
It can be tolerable if it's a little worse in a majority of games. But when you have outliers that are ~20% slower, it leaves a different impression.
Exist50@reddit
Sure, but it's not just the outliers that sunk ARL in reviews.
Felixthefriendlycat@reddit
Man I hope this turns out to be true for Intel. It’s going to be bad if they can’t compete anymore
lxs0713@reddit
Assuming they're on an upward trajectory, this new launch could be like their Ryzen 1000 series moment. A much needed fresh start that could really start coming into its own in a couple generations.
Exist50@reddit
What new launch? They have nothing for desktop till Nova Lake.
Lifealert_@reddit
They seriously need to lower their prices in the meantime if they want anyone to buy into their platform.
jenesuispasbavard@reddit
Lol I went into Microcenter a few weeks ago thinking about buying a 9800x3d but AMD's mATX motherboard selection there is abysmal, and the 265k was less than half the price for significantly better non-gaming performance.
I'm loving it so far. It's pretty power-efficient, and I'm not gonna notice a difference between 149 and 162 fps. And it's nice to have a Thunderbolt port on my motherboard.
semidegenerate@reddit
Intel was often the smart budget buy even with 13th and 14th gen, degradation issues notwithstanding. But as far as raw performance per dollar goes, CPUs in the 13400F to 14600K range were very attractive. The 13100 and 14100 were even quite performant despite the 4c/8t configuration, and very affordable.
doug1349@reddit
Have you looked at non 3d chips? Lmfao.
pianobench007@reddit
It'll be fine. AMD was in Intel's shadow for years and is okay now. Smartphones ushered in an opportunity for MediaTek, Samsung, Qualcomm, and Apple to succeed there.
Intel business laptops and workstations are still very good and cost-effective. They just need to execute on the manufacturing fabs. Only Taiwan is pushing the leading edge on EUV. China is trying but struggling too. So are Samsung and Intel. So it is not easy.
But Intel still has good engineers. The GPU team is doing great work. The fab team is getting there. Intel 3 is promising and an important node to have at high volume manufacturing.
They just need 18A to be at high volume and everything else will be an afterthought.
You can make all phones on Intel 3 for years and be fine. 24-hour battery life is nearly perfect. I don't imagine 48 or 72 hours as a target metric anytime soon.
They can make a 72 or 100 hour battery life device today, but it'll be slow as hell and with a dim screen.
NewRedditIsVeryUgly@reddit
Two months late, but at least they kept their word and addressed the issues.
This is a summary of the fixes, with the latest one at the bottom: https://www.techspot.com/images2/news/bigimage/2025/01/2025-01-17-image-7.jpg
2Kappa@reddit
Even if it's true, it's a massive indictment of Intel that they left so much performance on the table and let the chip get raked over the coals in the initial reviews instead of delaying it by a few months. Hopefully TPU can release some numbers by Monday, so we can see.
PotentialAstronaut39@reddit
Hasn't there been third party testing showing the difference in performance was marginal and even sometimes detrimental?
Have I dreamed that?
Zednot123@reddit
The Intel ME update that is required for some of the hotfixes added to beta BIOSes to actually do something hasn't been rolled out yet.
All the tests done with said BIOSes released late last year are at best incomplete and missing data.
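Since the hotfixes ship as a combination of CPU microcode and ME firmware, anyone re-testing needs to confirm what is actually loaded. A minimal sketch for the microcode half on Linux (the ME firmware version has to be read from the BIOS or vendor tools, which this does not cover):

```python
# Print the microcode revision the kernel reports for each logical CPU;
# /proc/cpuinfo exposes it as a "microcode" field on x86 Linux systems.
revisions = set()
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("microcode"):
            revisions.add(line.split(":")[1].strip())

print("Loaded microcode revision(s):", ", ".join(sorted(revisions)))
```

If the revision does not match what the BIOS release notes claim, the benchmark run is measuring the old firmware, which is exactly the incomplete-data problem described above.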
FloundersEdition@reddit
first Intel made consumers alpha testers for chip reliability with Raptor Lake, now we are even alpha testers of basic BIOS features.
bobbie434343@reddit
Where's HUB Steve when you need him ?
Temporala@reddit
Up to 26%, 0.1% on average? Sorry, but I'm dead tired of the "up to" nonsense smoke blowing out of marketers' mouths.
Capable-Silver-7436@reddit
something is fundamentally broken with these chips; they saw big gains by turning off the P-cores and running everything on the E-cores. unless it's just that the shared 4MB L2 cache on each cluster of 4 E-cores helps THAT much with latency
Responsible-Juice397@reddit
And fuck intel. I don’t want to buy a mobo every year.
teutorix_aleria@reddit
Why are you buying a CPU every year? I'm 32 and can count the number of CPUs I've had on my fingers.
ConsistencyWelder@reddit
Even if it ends up being true this time, that only fixes the few games that were obviously broken. It doesn't necessarily lift the general gaming performance to a competitive level.
PeakBrave8235@reddit
If you're consistently releasing products that only gain performance in updates, then you're intentionally releasing half-baked crap. Intel, Oculus Quest, etc.
There is also a major difference between fully baking a product and trying to squeeze more performance, and products like this, which are intentionally released before they should be.
DYMAXIONman@reddit
The issue with Arrow Lake at the end of the day is price. If they were priced competitively, no one would care that AMD has a better gaming CPU.
OGShakey@reddit
A little too late. I'm sure a lot of people like me that were on Intel all their lives and needed an upgrade went AMD recently because there really wasn't a choice anymore. I've loved every minute of my 9800x3d and not going back unless Intel really blows me away
legit_flyer@reddit
Unfortunately, "Intel claims" has lost a lot of its weight as of late, so let's just wait for third-party results.
imaginary_num6er@reddit
The Cyberpunk improvements were mainly from game and driver updates; the Far Cry 6 and other benchmarks need to be reproduced and not only provided by Intel.