AMD dominates chip sales on Amazon — top ten best selling CPUs all come from Team Red, Intel’s highest entry sits at 11th place
Posted by gurugabrielpradipaka@reddit | hardware | View on Reddit | 205 comments
Impossible_Okra@reddit
Meanwhile Intel:
Wait, people don't want to spend money on a new platform that might only get one generation of CPUs, where said CPUs perform worse than the previous generation? Who would have thought?
COMPUTER1313@reddit
The 285K being priced higher than the Ryzen 9950X, on top of requiring 8000+ MT/s DDR5 CUDIMMs (to avoid losing even more performance) and a new board, makes its cost efficiency questionable even in productivity workloads.
tukatu0@reddit
Intel is about to learn the hard way why for every 100 gamer monitors there is only 1 productivity monitor. 5k resolution says hi.
LesserPuggles@reddit
The fun thing is that the shitty productivity-centered Dell one is the one that sells by the thousands to businesses… hence why Intel doesn’t really care about the gaming segment lol
tukatu0@reddit
Do businesses actually need 285Ks and 265Ks though? I'm not so sure the medium and smaller builders like Puget Systems are going to be recommending those. Well, doesn't really matter since contracts.
COMPUTER1313@reddit
Get two of them and it will be the perfect setup for those who say CPU performance doesn't matter. /s
tukatu0@reddit
You joke but i was unironically thinking of pairing two 5k displays vertical. It would end up with a 4:3 aspect ratio resolution 5760 × 4200p or actual 8k (4320p) clarity. Since the vertical resolution is what determines your actual clarity. Decided I can't afford such a novelty. At least it's not worth it with IPS quality and a giant border in the middle.
Well i know you mean 10k by 2880p but still. Might actually get 30fps on a 4090 for most games. Anything ray tracing can just use dlss ultra performance for 3000× 960p. Basically same fps as 1440p dlss quality... Probably
On the other hand, I just found out 8k TVs are actually not even 8k. They have some angular subpixel structure, so you might end up with an actual resolution of 5500×3000 or something odd. No wonder that guy from Digital Foundry thought 8k is not worth it. Turns out he doesn't even have it.
The only true 8k displays are some 32 inch monitors.
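The stacking math above can be sketched roughly (a back-of-napkin calc assuming the standard 5K panel resolution of 5120×2880; the exact figures in the comment differ):

```python
# Combined resolutions from pairing two 5K (5120x2880) panels.
PANEL_W, PANEL_H = 5120, 2880

# Two panels stacked vertically in landscape orientation:
stacked = (PANEL_W, 2 * PANEL_H)        # 5120 x 5760

# Two panels rotated to portrait, placed side by side:
portrait_pair = (2 * PANEL_H, PANEL_W)  # 5760 x 5120

# Vertical pixel count vs. 8K UHD (7680x4320), since vertical
# resolution is what the comment treats as "actual clarity":
print(stacked, stacked[1] >= 4320)      # (5120, 5760) True
```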
Strazdas1@reddit
If you need a reminder of how insane TV marketing is, just remember TV marketing made people believe 720p is HD and 1440p is 4k.
JapariParkRanger@reddit
720p is HD.
Strazdas1@reddit
It is not. HD is 1080p. That you responded like this shows how perverse TB manufacturers propaganda is.
JapariParkRanger@reddit
Any resolution standard displaying higher than standard definition is a form of high definition, but 720p is literally named HD. 1080p is Full HD. 1440p is 4x 720p, and named Quad HD. Tuberculosis has nothing to do with it.
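The "4x 720p" claim above is literal pixel count, as a quick sanity check shows:

```python
# Quad HD (2560x1440) really is four times the pixels of HD (1280x720),
# which is where the "Quad" in the name comes from.
hd_pixels = 1280 * 720    # 921,600
qhd_pixels = 2560 * 1440  # 3,686,400
print(qhd_pixels // hd_pixels)  # 4
```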
Strazdas1@reddit
No, 720p is not named HD. 1080p is named HD.
It was supposed to be TV not TB. Missed the letter.
animealt46@reddit
720p is HD and always has been. 1080p came later with no new marketing so just became HD but better.
Strazdas1@reddit
720p has never been HD. It's a lie TV manufacturers sold you. They wanted to sell you crap 720p TVs but advertise them as "HD Ready".
Joshposh70@reddit
720p is sold as "HD Ready" and 1080p was "Full HD" - at least that's how it went down in the TV market over in the UK.
COMPUTER1313@reddit
TV marketing is insane. A decade ago I remember seeing a TV marketing claim that it supported 240 Hz. It actually only supported 60 Hz.
Tontors@reddit
I was in a Best Buy several years ago and it had tons of TVs with huge 4K ^^compatible signs on them (not 4K capable). They were all 1080p TVs. Don't think I have been back since.
MumrikDK@reddit
And before that you had massive HD branding on 720P and (even more fucked up) 1366x768 TVs.
There's just forever going to be that practice of marketing pushing the top end of the spec and the spec allowing some bottom-feeder shit into it that completely undermines it.
Lycanthoss@reddit
Of course they are 4K compatible. ^Just ^ignore ^the ^downscaling
tukatu0@reddit
Yeah. Something about only inputting 60Hz but actually outputting 120Hz for proper movie frame pacing. The 240Hz number was probably some stuff about flickering. It's not exclusive to TVs either. Monitors and their lying asses about 1ms GtG, when in reality until 2021 most monitors had 10ms full response times. Meaning your 1080p 144Hz/165Hz monitor is more like a 1080p 100Hz display.
I know the Gigabyte M27U has like 7ms full response times, so it can actually show 150fps fully. Probably a lot of the popular displays from 2021 and forward are fine too. Any Fast IPS 180Hz monitor should also be capable of true 150Hz easily.
But yeah, it's the reason why some 360Hz panels have the same coherence as 240Hz OLED. Their actual response time is like 6ms instead of the 3ms needed. Of course incoherent 360Hz is better than 240Hz, but something to know if you should upgrade or not.
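The response-time argument above can be put in numbers (a rough rule of thumb, not an official metric: assume a pixel transition must finish within one frame time to count as coherent):

```python
# Rough model: a panel only "fully shows" frame rates whose frame time
# is at least as long as the panel's full pixel response time.
def max_coherent_hz(response_ms: float) -> float:
    """Highest refresh rate whose frame time fits the full pixel response."""
    return 1000.0 / response_ms

print(round(max_coherent_hz(10)))  # 100 -> a 10 ms panel acts like ~100 Hz
print(round(max_coherent_hz(7)))   # 143 -> close to the ~150 fps figure cited
print(round(max_coherent_hz(3)))   # 333 -> even 3 ms only just covers 333 Hz
```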
ExtensionThin635@reddit
That’s IF you can OC the RAM on the mobo in the first place to reach that, and it ain’t gonna be in 4 sticks.
demonstar55@reddit
idk, I've never really done an in socket upgrade. Always just used my computer well past the time the platform was dead /shrug.
Kougar@reddit
Probably because it wasn't an option before now. Intel's sockets changed too regularly, and AMD's sockets also had to change to chase the newer bandwidth standards. Or they died prematurely like AM3 due to the awful FX chips. AM4 became a true unicorn: six years of processors, with the 5800X3D to cap it off. I am optimistic AM5 will repeat it. Six years is a pretty good gap between upgrades.
DDR6 is nowhere to be seen yet, and AMD doesn't need to change the socket to support CUDIMMs on DDR5. Hell, AMD is still using its first-generation DDR5 controller, and infamously, first-gen IMCs have always been on the weak side for both processor companies. So by all rights Zen 6 has room to grow on AM5, even if AMD won't hard-confirm it yet. Presuming it does, people that bought into AM5 in 2022 could snatch a high-end X3D Zen 6 chip in 2026, then use it for another 4-6 years. So at that point, not only would they be using the same system they had for a decade, but they wouldn't be sacrificing top-tier performance to do it.
Strazdas1@reddit
It was an option with AM4 and still no one did it. People generally don't upgrade their CPUs for at least 5 years.
Kougar@reddit
AM4 got six years of new architectures, and very many people upgraded to the 5800X3D and are still sitting on that chip today.
Strazdas1@reddit
and the people who upgraded to a 5800X3D didn't do so from existing AM4 boards, and will not be upgrading on AM4 next time.
yflhx@reddit
They did. It made no sense to buy a 5800X3D basically from the moment Zen 4 launched - the 7600 was cheaper and just as fast. And yet despite this, it was still selling basically until they stopped making it this year, and its slightly down-clocked version - the 5700X3D - is also still selling.
Strazdas1@reddit
They didn't. You severely overestimate how many people are hardware enthusiasts and update CPUs frequently enough for AM4 to matter.
It's no wonder the X3D is outselling the 7600. The DIY market is dominated by gamers.
yflhx@reddit
It just isn't. https://www.techspot.com/review/2602-amd-ryzen-7600-7700-7900/
And because it isn't, nobody sane bought a 5800X3D once the 7600 launched (okay, maybe for a few months until mobo/RAM prices went down a bit).
So among people buying the 5800X3D since spring last year, probably 95% already owned AM4. Because there was no reason to get a 5800X3D otherwise.
Strazdas1@reddit
They don't even tell you what games they tested, and the 3 they show aren't games where the CPU matters. This looks like the usual "we tested the wrong thing and came to the wrong conclusions".
yflhx@reddit
This review was made by Steve from Hardware Unboxed. I'm sorry mate, but to claim it's fake you'll need more than that. Like a reputable review confirming that the "5800X3D is significantly better at videogames than the 7600".
You've called BS so many times that I won't be looking for other reviews to convince you, because it doesn't seem possible. So either you find one or this conversation makes no further sense.
Strazdas1@reddit
Ah, then it's guaranteed they didn't test CPU-intensive games.
It's not fake. It's just not valuable to the demographic of people that buy X3D CPUs.
yflhx@reddit
Just admit you're wrong.
Or find a review "valuable to the demographic of people that buy X3D CPUs". You claimed that; I won't find it for you. Oh wait, you can't - it doesn't exist.
Strazdas1@reddit
That's the issue: all those big reviewers aren't testing the CPUs in ways that are relevant to the people buying them. Therefore we reach the wrong conclusions.
poopyheadthrowaway@reddit
AM4 might've been the only time it was worth it, at least in recent times. Maybe aside from those who managed to hack their Z170 boards to accept Coffee Lake.
yjgfikl@reddit
Count me as one of those! Went from a 6600k to a 9700k on the same board, and still using it today. Huge upgrade and completely worth it.
Strazdas1@reddit
AM4 is the only time they even supported it for more than 2 generations. And even then, "supported" is doing very heavy lifting here.
COMPUTER1313@reddit
Ryzen 1600 to a 5600 for me. $110 in total (after selling the 1600) for 2x CPU performance.
I know someone else who went from an APU (purchased during the COVID pandemic and crypto mining craze) to a 5800X3D + actual GPU.
timorous1234567890@reddit
I went from A 2200G to a 5800X3D. Will get a GPU upgrade at some point and then pass the rig down to my eldest as it still has plenty of life in it.
Impossible_Okra@reddit
It's nice when the platform is long done and upgrades are dirt cheap. I had an i3 dual core on socket 1150, and I got a quad-core Xeon for like $20-30 on eBay.
Strazdas1@reddit
99% of people will never benefit from AMD's long platform support because they don't update often enough for it to matter. Why does this terrible argument keep coming up in this sub?
COMPUTER1313@reddit
Have you ever tried removing the USB3 internal header cable from the board for a board replacement?
50/50 chance of the internal port being ripped out of the board in the process. Replacing just the CPU for a 50-100% performance increase and selling the old CPU to recoup the upgrade costs is a lot less effort.
Strazdas1@reddit
You completely missed the point: 99% of people will never have the option to replace just the CPU, because they don't upgrade often enough for AM4 longevity to matter.
g1aiz@reddit
The biggest benefit for people is that vendors can't just increase prices for new motherboards if you can go out and buy the ones that have been on the market for 2 years.
Strazdas1@reddit
But you can't go out and buy the ones that have been on the market, because you still need a new motherboard each time you change CPUs.
deleted_by_reddit@reddit
[removed]
AutoModerator@reddit
Hey No-External-1122, your comment has been removed because it is not a trustworthy benchmark website. Consider using another website instead.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
Helpdesk_Guy@reddit
The worst part is that Intel has to grin and bear it, since ARL is supposed to hold out for Team Blue for the next 2 years!
The real kicker is that (as many have already coined it) ARL really seems to be Intel's Bulldozer moment, yet even worse than that.
While Bulldozer also came with a regression in absolute performance (compared to its forerunner, Phenom II), ARL does so while being backed by a HUGE technological node jump: it bridged the gap from Intel's not-so-stellar own process (a comparable situation, manufacturing-wise, to AMD against Intel back in the Bulldozer days - only now it's Intel chasing TSMC) to being literally fabbed on TSMC's N3, the industry's single best process. And yet Arrow Lake still doesn't outperform Intel's own years-old designs and direct forerunners, never mind beating any competition.
Makes you really wonder what Intel was actually thinking, releasing this mess for the next two years! Incredible.
Which also strongly hints at the state of Intel 20A, and why ARL was shifted over to TSMC - had ARL been released in that crappy condition on 20A instead of TSMC's N3, the shareholders and investors would've called for Gelsinger's head the day after.
COMPUTER1313@reddit
They certainly couldn't pull another Skylake-refresh moment with Raptor Lake after running headlong into the "oops, too much voltage" issue.
Helpdesk_Guy@reddit
It's all so effed up, really. I mean, we already had core-count regression with Rocket Lake, and now absolute performance regression with ARL - at what point do they stop this nonsense and wake up?
YNWA_1213@reddit
I’m now really curious as to what Raptor Lake on N3 would look like, maybe with a cache increase? E.g., whether there was any chance of an efficiency gain by porting over the architecture.
Helpdesk_Guy@reddit
The more interesting question is why ARL is so slow and weirdly limited, when they haven't really changed any fundamentals for Arrow Lake except for dropping Hyper-Threading. I mean, we can all agree that TSMC's N3 is the best right now…
crazy_goat@reddit
"Let's stay the course." - Intel
NeonBellyGlowngVomit@reddit
"Let's remove one pin from the socket just for shits and giggles." - Also Intel
COMPUTER1313@reddit
Aliexpress merchants: "LOL YOU WANT TO RUN COMET LAKE CPUS ON Z170 BOARDS? GOT THEM RIGHT HERE!"
https://www.aliexpress.us/item/3256804563061974.html?gatewayAdapt=glo2usa4itemAdapt
https://www.aliexpress.us/item/3256801670651633.html?gatewayAdapt=glo2usa4itemAdapt
PastaPandaSimon@reddit
The Skylake platform bs was why I dropped Intel.
Planted-Fish@reddit
Went from a 6600k to a Ryzen 7 1800X with an Asus Crosshair X370 motherboard. When the 3000-series Ryzens came out I bought a 3700X. Last year I bought a 5800X3D, and I'm still on the same motherboard from 2017.
COMPUTER1313@reddit
I've seen the excuses of "well, the new sockets support higher power usage".
I oftentimes hit back with "I'd rather trust a Z270 board over an H310 for running a 9900K, and CPUs can always throttle to avoid overloading weak VRMs".
gold_rush_doom@reddit
Well, yeah. Intel did it to please the motherboard vendors. Kind of like a masked profit sharing.
Top-Tie9959@reddit
Intel also makes the chipsets that go into the motherboards so it is in their interest to have motherboard churn.
aminorityofone@reddit
nobody was fired for buying IBM
frumply@reddit
Up to 2013 I was upgrading every few years anyway, so the mobo issue was kind of moot. I got myself a decent Haswell system at that point tho; we had kids, and I didn't have the time/money to devote to systems, let alone use them properly... and last year, Ryzen seemed like the only choice that made sense. At that point it was near price parity for worse performance from a 13th or 12th gen (which would be difficult to upgrade down the road), vs a 5700X3D which was a decent deal, vs a 7700X that frequently had combo offers from Newegg and supposedly had years of upgrades down the pipeline.
I'm still far from "needing" any sort of upgrade but a 9800X3D could already get me 50% gains, and whatever comes next is likely to squeeze a bit more performance out. If I can pick up one of those at ~$200 new or used several years down the road I should be in great shape.
RippiHunti@reddit
Basically a repeat of AMD's Bulldozer release, but reversed. This time Intel is the one releasing a new generation with more cores than the competition, but worse performance than last gen.
SEX-HAVER-420@reddit
I think Intel can fix this... the CEO just needs to post more prayers on twitter.
Zomunieo@reddit
Intel hired Jim Keller, lead designer of Zen for AMD, but even he couldn’t save them. What makes them think God could help?
hardware2win@reddit
Keller was Uncle of Zen, not Father
Strazdas1@reddit
He's been posting weekly prayers since before he was CEO. It's just something he does, unrelated to Intel's performance.
ExtensionThin635@reddit
At this point he can continue to crash; a single well-timed donation and he gets a bailout or whatever he wants.
dmaare@reddit
I don't understand why Intel even released a failed generation like this?? It's only going to bury them
Jordan_Jackson@reddit
Not only that, but they put out one promising generation in the 12th gen and then proceeded to shoot their feet off. They had a good thing going there, but then proceeded to make chips that eventually fry themselves and didn't disclose anything about it until it became public news.
Then they made the current gen, and it is clear that they are basically back on the course they took from Devil's Canyon until the 11th gen.
Earthborn92@reddit
12th gen really had me thinking that Intel could respond and innovate - E-cores were new, single-threaded performance was good as usual, and Intel had an answer for the multithreaded performance deficit against AMD's chiplet approach.
But yeah, they flew too close to the sun with 13th and 14th gen. X3D flipped the tables on them. And E-cores didn't have the flexibility that chiplets did - they couldn't answer them in servers.
LordAlfredo@reddit
Interesting, 6/10 are AM4, 5900X is the only Ryzen 9 on the list, and the X3D chips are unsurprisingly 9800X3D, 7800X3D...and surprisingly, 5700X3D.
Zhiong_Xena@reddit
AM4 just refuses to die lmao. Best socket ever made.
Also I'd like to add, the 5800X3D is the greatest gaming CPU of all time.
R3xz@reddit
I've often heard it remarked as the 1080Ti of CPU haha, absolutely deserved!
Decent-Reach-9831@reddit
Definitely not; the 1080 Ti had longevity. The AM4 platform is what was excellent. The 5800X3D is good but extremely overrated by the community.
R3xz@reddit
RemindMe! 8 years
R3xz@reddit
Guess we'll wait 6-8 more years and see how well it can still handle gaming.
Helpdesk_Guy@reddit
You have that backwards. The 3D V-Cache equipped 5800X3D of GPUs, were AMD's HD-series, especially the last HD 7xxx.
You got it for fairly decent initial price-tags, had it for several years in a row with satisfactory performance, then it got gifted a bunch of high-energy drivers from AMD to happily ReLive their past in a Crimson-tinted Adrenalin-rush and outdo other's ShadowPlays, than it got thrown a life-line when some geniuses on a distance Battlefield™ way up north with already FrostBite-Engines dug through the trenched to get to the core of it for disMantle additional power like a stunning erupting Vulkan, for getting slapped a fine “You outdid yourself this time!”-badge and Medal of Honor to get honorable discharged into retirement, only to finally gather your magically doubled or even tripled pension-fund payment, while happily seeing others tried breaking their back in some wild mining craze …
No doubt about it, but the GTX 1080 Ti was 'just' a stellar long-lasting GPU, comparable to the GTX 970 – AMD's HD 7xxx-series however were graphics-cards, which outright refused to retire for several years, despite being laid off on purpose, only for it to come back the very next day with a statement like …
»You want to dismiss me?! F—k you! — You know what? I am self-employed now, here!
And I work for free, and I'll bring my own money just because. Oh, and just so you know, from this day onwards, I'm working a second shift on overtime to show it to you!«
R3xz@reddit
Are you ok dude? Even the first part doesn't make any sense to me lol - the 5800x3d is a CPU, not a GPU.
Calm down and take your meds.
Helpdesk_Guy@reddit
Why do you get personal here? It was said that the 5800X3D is literally pictured as the 1080 Ti of CPUs, thus what the GTX 1080 Ti (arguably representing things like longevity and performance for many) was for GPUs, the 5800X3D is for CPUs.
Of course the GTX 1080 Ti is a GPU, while the 5800X3D is a CPU - are you really that slow on the uptake, to NOT actually get the fairly accurate analogy here? Did you take your meds?! YOU are in the wrong here, not me, since you don't even get it.
Just because you can't grasp the analogy, nor a silly-meant and (when actually understood) quite humorous writing style, doesn't mean that it isn't funny writing. My oh my, that I even have to explain the obvious to you… -.-
R3xz@reddit
I don't need to take my meds, but I guess I need more coffee lol. Thanks for the laugh, and I do find your erratic writing style kinda silly/funny. I don't fully agree with it though, but I can understand from the perspective written by an AMD stan haha :P
Helpdesk_Guy@reddit
Pardon me for being upset at first and thank you for taking it light-hearted this time around! ♥
Yeah… Of course the writing style at first comes off a bit silly, chaotic or erratic (until it suddenly clicks), as it was mostly unavoidable to make the case – I tried to incorporate as many crucial bits/terms into it what mattered and what made AMD's HD 7000-series graphics-cards as long-lasting as they ended up being.
The HD 5–7xxx-series in fact got quite a sudden, unexpected revival in performance at what was mostly their already well-assumed end of life (AMD's ReLive/Crimson/Adrenalin driver releases). Then the cards got retroactively granted AMD's Mantle™ (which was jointly co-developed with EA's DICE¹ [EA's Northern outpost in Sweden], with their Frostbite engine technical director Johan Andersson personally pushing it [Battlefield 4], until AMD gifted Mantle's source code and handed it over to the Khronos Group free of charge [no license, open source], at which point it evolved into the Vulkan API), and all that heavily pushed the pursuit of DirectX 12 at Microsoft. That being said, in a roundabout way, we have to thank AMD for DirectX 12 …
Then Vulkan brought incredible performance uplifts and FPS increases for all GCN-architecture v1/v2 AMD cards using the Vulkan API, which made these cards even outdo newer nVidia cards (which couldn't run Vulkan). And finally, you could sell the cards after all this to some miners on eBay for obscene price-tags, and they bought them happily. When already used for years!
It wasn't written by a stan, but that's mostly the truth – No other generation of cards lasted that long and was even revived software-wise to such incredible levels (when Nvidia instead refused older last-gen cards their Vulkan-drivers), and could be sold after all of that for more than the initial price-tag due to their way higher hash-rates for mining than any other Nvidia-card – No other graphics-card, neither from AMD or Nvidia.
For instance, a friend of mine sold his age-old HD 7850 - bought in ~2013 and used for years - during the mining craze for I think 420 USD. He thought the buyer was totally whack, yet many could sell their age-old cards for such price-tags half a decade later. I know, crazy right?
¹ Bummer, forgot about dropping something like "rolling the DICE of fortune on the card's future" – Mistakes were made, I guess! xD
Zhiong_Xena@reddit
Definitely lol. Pivotal moment in gaming hardware.
Also, 1080ti was almost 8 years ago. Let that sink in
Matthijsvdweerd@reddit
That's half of my life
Zhiong_Xena@reddit
Wow. You youngsters have to give me a moment to let that sink in. You go ahead, I'll catch up.
Matthijsvdweerd@reddit
Still a beast of a card. My brother had one. Sadly it died so he upgraded to a 7800xt.
Zhiong_Xena@reddit
Excellent choice he made there. Radeon for gamers is a boon.
Decent-Reach-9831@reddit
No, the greatest gaming CPU of all time is the 9800X3D
Godwinson_@reddit
I recently had to replace my old i5 9400. Got an R5 5500 for a real good price- $70, but was dubious about its performance…
It’s amazing for my use cases. Such a cheap little chip that can get ~11000 R23 score and max out at 80° when manually OC’d with the cheapest air cooler I could find.
Zhiong_Xena@reddit
Could have gotten a 5600 for not much more, but the 5500 is also good.
Onceforlife@reddit
Reminds me of the Phenom days, it was so good
noiserr@reddit
Phenom wasn't really that popular. Core2Duo was all the rage.
LordAlfredo@reddit
Phenom to Bulldozer was a very different story compared to AM4 to AM5.
Bonafideago@reddit
I plan on running my 5800X3D well until after AM5 runs its course.
Zhiong_Xena@reddit
AMD users: one generation behind all the time.
Still on better hardware than Intel XD
Bonafideago@reddit
I'm now two generations behind, and yet this CPU is among the top 5 in every benchmark for gaming.
Zhiong_Xena@reddit
Like that other dude said. 1080ti of CPUs
Jordan_Jackson@reddit
Man, I’m surprised that the 5900X is still selling so well.
sascharobi@reddit
It’s cheap.
weyermannx@reddit
To me it's like the only "productivity" CPU on the list. It's probably the best AM4 productivity chip for the money. I guess the 5900XT would be better, but I think it's still much more expensive.
I have one; I bought it before AM5 was released. Probably didn't actually need 24 threads as a software developer, but sometimes it comes in handy.
It's a solid productivity choice even now if you have an AM4 platform, because of the price.
Creative_Ad_4513@reddit
't' CPUs are just marginally faster than non-'t' CPUs. You're paying for the letter only. Just up the PBO limits a touch on the non-'t' one and they are basically identical.
FinancialRip2008@reddit
this is the best r/confidentlyincorrect i've ever seen in the wild. extra funny saying it to a dev who bought one
Creative_Ad_4513@reddit
well then, care to correct me
FinancialRip2008@reddit
5900xt is a downclocked 16 core part
Creative_Ad_4513@reddit
whoops, mixed up 5800x/5800xt with that one.
FinancialRip2008@reddit
iirc they did the same bullshit with zen2. i was just funnin' ya; amd fuckin' their own branding with this nonsense
Allan_Viltihimmelen@reddit
And ridiculously power efficient. A no-brainer for home-pro level workloads.
ConsistencyWelder@reddit
Yeah, it's cheap for what some could consider a beginner workstation CPU.
The 9800X3D's multicore performance beats the 5900X's though, so 8 Zen 5 cores beat 12 Zen 3 cores. Not too shabby for a gaming CPU to also have what used to be considered workstation-like productivity performance.
Jordan_Jackson@reddit
I have a 5900X and it has been a very good CPU. It pretty much still does everything I want it to, and fast. I would just think that someone buying a chip now would go for the 7000 series. Though I understand if you can find a 5900X for like $250-300 - that is cheap for all those cores.
VitoD24@reddit
A few months ago I saw a video from a Spanish YouTuber who makes videos for the GTA 5 LSPDFR mod and some other games, in which he showed his newly built system: Ryzen 9 5900 + Gigabyte Aorus AIO (I think it's 240 or 280mm) + RTX 3090 + 4 sticks of RAM, all on a Gigabyte Aorus Elite B550 mobo. He didn't mention the RAM and storage capacities he used. Prior to that, according to his video's description, he was on a PC with an Intel i7-4790, GTX 1070 Ti and 16 GB of RAM. I don't know whether he bought all this brand new or went to the second-hand/used market. He wrote in the video that he got the Zotac RTX 3090 for like 699 Euros. I was really wondering why he went for such an AM4 system, since for that money he could get something around a Ryzen 7700, but now I understand his point better. AM4 still offers a great experience. I was considering getting a Ryzen 5600 or 5700 or its X3D variant for my first setup, but the lack of integrated graphics is the only thing I don't like about this platform. Having integrated graphics as a backup is very important to me.
gomurifle@reddit
So basically it's almost only gamers and PC enthusiasts buying processors online.
996forever@reddit
Yeah, everybody else is either on a prebuilt desktop or, more likely, a laptop
SilentHuntah@reddit
...you just described the vast majority of PC DIY'ers.
LordAlfredo@reddit
Which shouldn't come as much of a surprise. System integrator CPU figures are probably very different, let alone business/enterprise customers.
ExtensionThin635@reddit
Which makes sense, because businesses go through providers like CD to buy in volume and bulk.
alex-tech1@reddit
i can't be on Intel's side anymore; buying AMD from now on. My brother's 14900K is basically a rock
FinancialRip2008@reddit
all cpus are basically fancy rocks when you get down to it
shalol@reddit
magic runes inscribed onto fancy rocks
FinancialRip2008@reddit
you are much more fun than i am. this is my new headcanon
acc_agg@reddit
I can't be on AMD's side any more; I always root for the underdog.
Aggrokid@reddit
The DIY segment that buys these CPUs à la carte from Amazon is relatively small compared to prebuilts. Intel still dominates the latter.
TritiumNZlol@reddit
For sure. Although the longer this performance deficiency lasts, the more the sound advice of going AMD will filter through to the general prebuilt-buying public.
nekogami87@reddit
The main issue is production capacity; pretty sure AMD has no way to provide enough for SI builders (at least for gaming).
They already struggle with laptops, I don't see the DIY market being better
Xaendeau@reddit
What? They provided a quarter billion chips for Xbox and PlayStation. If they have demand, they can figure out supply. Their server business is their best profit right now, CPU-wise. EPYC chips are more valuable than Ryzen from a business perspective.
SI OEM client PCs are low-hanging fruit. Intel has most of it, and it's such a high-volume, low-margin segment that it doesn't do much for them. E.g. there are serious rumors of somebody acquiring Intel due to their poor financial health.
I'd like to see more AM5 PCs at Costco. While office PC buyers probably don't care about parts, Ryzen can be a serious influence on gaming PC builds, which are around 15-20% of the market share... split between desktops and laptops.
animealt46@reddit
Nobody knows why AMD can't, or more accurately won't, provide a consistent supply of parts to PC OEMs. TSMC says they are ready, consoles prove it's possible, but AMD won't. The OEMs that do go AMD for marketing advantages constantly suffer SKUs that are sold out and on backorder for months at a time.
noiserr@reddit
Because the OEMs are fickle. In 2022 the OEMs were double and triple ordering, and when the COVID lockdowns ended, the OEMs realized they had way too much inventory. So they cancelled orders, leaving AMD stuck with warehouses full of chips. It took them a year to sell down the indigestion.
Meanwhile they know exactly how many chips Sony needs. And Sony doesn't mess around with ordering more than they need.
nekogami87@reddit
If we don't know why, we can't say that they won't.
nekogami87@reddit
Yes, but that's a quarter billion on a production line that's been ordered to run for nearly 5 years. There are costs to running stuff.
nekogami87@reddit
I mean, if we are talking about low margins, AMD should focus on datacenter SKUs then and not SI/OEM. That's not even a question at that point.
hackenclaw@reddit
Pretty much meaningless.
For laptops, it's like 8 of them are Intel and only 2 are AMD.
And for the same performance, Intel is actually cheaper sometimes. lol
KermitFrayer@reddit
This will be my first build using AMD. I saw no reason to stick with Intel this go-round; between the chips frying themselves and the horsepower per dollar, it just isn't worth it.
Alive_Wedding@reddit
Good for AMD. Intel has basically given up on desktop for this generation. Arrow Lake is clearly more suitable for mobile platforms.
animealt46@reddit
Arrow Lake is suitable for nothing; it's Lunar Lake on mobile. Arrow Lake is just a mistake Intel made with their first mass-market Foveros CPU that they'll learn from for the future.
gumol@reddit
I wonder if AMD will break through 50% marketshare in desktop computers
teutorix_aleria@reddit
In self-built and SI-made gaming systems, sure. In global desktop marketshare, not until enterprise and OEMs get off the Intel train.
It's like way back in the mainframe days: "nobody ever gets fired for buying IBM". That's how Intel is perceived in the business world today.
animealt46@reddit
Intel will dominate SI too as long as AMD continues to be utterly unreliable junk for SIs to deal with. It's not even about product superiority or reputation, AMD simply cannot promise a constant supply of parts.
imaginary_num6er@reddit
Yeah enterprise needs "vPro" since Pat claimed all the vPro computers were able to get back up and running faster than those without it after the CrowdStrike catastrophe
Xlxlredditor@reddit
When it depended on the IT team's responsiveness lmao
Xlxlredditor@reddit
Unless vPro is some sort of KVM software solution on the CPU itself, I believe customers, as they say, need to boot the physical machine into Windows/BitLocker recovery. Unless I'm missing something, there's no difference between AMD and Intel here, unless the vPro is basically drugs for the IT team
Mammoth-Main-3750@reddit
>Unless vPro is some sort of KVM software solution on the CPU itself
Yes, vPro includes such functionality through AMT. It always runs (unless you physically remove the power source), regardless of whether the system is powered on or off, requires no software installation, works regardless of OS, and is essentially as good as sitting in front of the physical computer yourself. You can even fiddle with BIOS settings through it if you'd like. I've used MeshCommander in the past to manage some PCs with AMT enabled and it's a handy little tool to have around. I believe AMD has an equivalent, but I've never used it, and a quick Google search leads me to believe that it's not quite as fully baked as Intel's solution.
If you'd like to learn more Intel has some articles about their technology here and here.
Xlxlredditor@reddit
Interesting! But is it open-source? As a tech enthusiast I would find that a bit of a security hole if it is not
Mammoth-Main-3750@reddit
Some of the management software is open source but as far as I know the part that's embedded in the CPU is closed. If you're concerned about security there's the me_cleaner project on GitHub that allows you to sort of disable it. It does have a big list of caveats though due to how deeply embedded the management engine is into the CPU.
Strazdas1@reddit
It doesn't help that AMD simply does not have the chips. They frequently cannot provide OEMs with as many chips as they want.
COMPUTER1313@reddit
The only notable exception is Broadcom, which successfully pissed off almost all of their VMware customers, such as AT&T filing a lawsuit because they didn't like having a 1000% price increase forced down their throat.
sturmeh@reddit
Does that count the computers that were built more than 10 years ago running ancient software out of fear of introducing security vulnerabilities or bugs?
YellowMathematician@reddit
Does it mean "if you buy Intel products for your companies and they fail, it is Intel's fault. If you buy AMD products and they fail, it is your fault"?
Earthborn92@reddit
It's more the fact that most mundane enterprise server and productivity applications don't exactly need more performance than Intel is able to provide.
So why change and risk your neck as a CIO?
Vooshka@reddit
This is the case in a lot of industries besides IT/tech.
No one wants to stick their neck out to introduce a new supplier unless it relates to a critical KPI (sometimes not even then).
If anything goes wrong with the incumbent, it's the incumbent company's fault. If someone changes suppliers and it goes South, that person is on the hook for introducing the problematic item.
Even if there's a cost cutting KPI, most will get the incumbent to lower their price as much as possible, and justify the less-than-expected savings by staying with the incumbent.
teutorix_aleria@reddit
Something like that yeah.
ExGavalonnj@reddit
AMD doesn't have the capacity for that, not having their own fabs. It's why OEMs go Intel: they can always get product.
No-External-1122@reddit
And yet AMD consistently secures contracts for consoles.
It's not a silicon-production problem.
Earthborn92@reddit
ARL is TSMC. LNL is TSMC.
Nvidia is all TSMC.
This fab excuse is not the reason.
RealisticMost@reddit
The older chips are Intel and most of the sales are older chips.
Azzcrakbandit@reddit
At this pace they might. Depends on oem contracts/sales.
NoHopeNoLifeJustPain@reddit
And yet all new laptops the company I work for is buying are still Intel.
Earthborn92@reddit
If your IT supplier is Dell, that's not going to change anytime soon.
996forever@reddit
Funny enough, the XPS line which for the longest time everybody thought was THE poster child for Intel, immediately offered Qualcomm options.
Earthborn92@reddit
My guess is a combination of FOMO on potentially missing out on Windows on Arm and Microsoft pressure.
For x86, their vendor is Intel. Hell, if Intel made an Arm chip, I'm not sure Dell would have gone with Qualcomm.
bphase@reddit
Ours have been mostly AMD for the past couple of years. My current and previous laptop were AMD. It's slowly happening.
braiam@reddit
The title and your observations don't contradict each other.
Raxor@reddit
i still work with coworkers that say 'intel is reliable'
braiam@reddit
Which is the thing: humans put too much stock in their individual experiences. That's why anecdotes aren't proof of anything.
FDrybob@reddit
Exactly. Take any Biopsychology class and you will quickly see just how overconfident everyone is in their own brain.
reveil@reddit
Honestly, if you look at Lunar Lake laptops, especially the thin-and-light kind, Intel is not in a bad spot in that segment. Battery life to rival Qualcomm's Elite chips while going toe to toe in performance. Top it off with a 50% stronger iGPU and no hassle with Arm compatibility problems. On the desktop, though, buying either Raptor Lake or Arrow Lake is almost always the worse option.
Dope2TheDrop@reddit
Genuinely horrible, the way things are going we'll have exactly red/green as the only viable options for CPU/GPU respectively.
We all know how that will reflect back on the consumers…
p3n3tr4t0r@reddit
Lol, maybe we've all been underestimating the general consumer. The Intel debacle being this mainstream seems odd to me.
GrumpySummoner@reddit
Meanwhile, on European Amazon stores, the 9800X3D page is still not showing up in search, and is only accessible by a direct link
deleted_by_reddit@reddit
[removed]
AutoModerator@reddit
Hey grilledcheez_samich, your comment has been removed because it is not a trustworthy benchmark website. Consider using another website instead.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
grilledcheez_samich@reddit
Bad bot! I was making fun of Userb3nchm@rk.
3G6A5W338E@reddit
We prefer to pretend it does not exist, as to not unnecessarily give it any attention.
grilledcheez_samich@reddit
That's great, but wouldn't it be better to educate people about its flaws? It's like one of the first links that shows up when searching benchmarks on Google.
3G6A5W338E@reddit
Traditionally, Google has given more weight to pages that are linked more often.
And that's one of many reasons we try and not link that site.
grilledcheez_samich@reddit
Alright, fair enough. I didn't actually link it, I just said its unholy name.
hannes0000@reddit
Went to AMD this week too, from an i7 10700K. That's my first AMD CPU since I started messing with PCs in 2008.
AlexIsPlaying@reddit
it's Steve's fault :P (joke!)
ThotSlayerK@reddit
On a kinda unrelated note, how does AMD afford to sell 3D chips? Like is the demand high enough to justify the cost of an entirely new lineup? The DIY market is the most influential, but OEMs are mostly buying Intel and non-3D AMD chips.
I'm glad they are doing it but this is a wild guess on my part: AMD probably operates on very thin margins for the 3D lineup and is mainly building reputation, which is good imo if they want more market share. When a non-tech-savvy person hears that AMD offers the fastest gaming CPUs and it's not even close, they might start considering AMD for their next PC—even if gaming isn’t their focus, or if they’ve historically stuck with Intel because “it just worked.” I might be all wrong tho and I want some other opinions.
12318532110@reddit
3D V-Cache was originally made for Epycs like Milan-X and Genoa-X. The consumer version is just an offshoot of that.
ThotSlayerK@reddit
Oh that makes sense then. Thank you for the info.
KoldPurchase@reddit
It's ok. The gaming consumer market is insignificant according to Intel.
We'll see how Intel rebounds.
996forever@reddit
Their bread and butter (client-wise) is OEM office desktops and laptops, and they ain't losing that anytime soon. The only thing they need is good supply.
KoldPurchase@reddit
True, but the US govt is worried about Intel's financial stability because of their technical problems.
Intel needs to develop new high-end processors that can be good for gaming and productivity. Completely ignoring the consumer market just because they aren't selling units now is shortsighted. They could be selling if they were making good products.
NeroClaudius199907@reddit
No, client is doing very well vs the other divisions. It's DC and foundry that are dragging them down. Intel just coasts off the fact they can supply more processors than AMD. Like 50M to 8M.
cocobello@reddit
That means the product is too cheap.
Zapafaz@reddit
-guy that's only read the first paragraph of an 8th grade economics textbook
cocobello@reddit
..don't be so judgy. I was just writing in the position of AMD and it got me downvoted to oblivion, haha. Well, cheers!
randomkidlol@reddit
nvidia's stuff is way overpriced for what it offers over last gen, yet they're outselling amd by an even bigger margin.
imaginary_num6er@reddit
Nvidia is not overpriced depending on workload and only becomes overpriced when a new Nvidia generation is released for the same workload.
randomkidlol@reddit
im fairly certain 99% of consumer workloads involve powering up to 3 monitors and playing video games. yes, it's beyond overpriced.
Strazdas1@reddit
Then you have a poor grasp of consumer workloads. A great many of them are mixed use, where the same GPU is used for work and play.
nuenoxnyx@reddit
Dude, a $1,699 MSRP card (which already costs more than most PCs) is selling for $1,960 to $2,600
ResponsibleJudge3172@reddit
RTX 4090 is the only talking point most of the time despite everyone saying that the market for such is insignificant
psydroid@reddit
If there is a better ROI on that hardware people will still buy it to run their tasks. Nvidia is largely on the server in the enterprise these days, whereas Intel makes most of its money on the client side.
So Nvidia doesn't need to price its GPUs low to generate more revenue, as every GPU could also be sold in the enterprise at much higher prices. The real issue here is that neither AMD nor Intel is able to compete in the graphics space.
And with the release of the Nvidia/Mediatek APU with Nvidia graphics next year I expect AMD and Intel to gradually lose marketshare due to having a less compelling product compared to the Nvidia/Mediatek APU.
imaginary_num6er@reddit
So those who bought at $1,699 are holding something that's appreciating in value, not something overpriced
Impossible_Okra@reddit
Nvidia is just money printer go brrrrrrrr
dern_the_hermit@reddit
The more you something something the more you something something else.
Strazdas1@reddit
The more you post on reddit, the more insane you get?
Prince_Uncharming@reddit
It can’t both be overpriced and also outselling by a larger margin.
People are choosing it over anything AMD has to offer. By definition it can’t be overpriced, clearly people see the value in that choice.
obiwanshinobi87@reddit
Say it louder for the people in the back please.
fatong1@reddit
Or, Intel is too expensive with lackluster gaming performance, which is all anyone seems to care about.
DaDibbel@reddit
A lot of people have lost faith in Intel, especially after the 13th and 14th gen fiasco.
COMPUTER1313@reddit
"Trust me, we're going to put out an update to fix Arrow Lake's issues."
After how Intel handled that "oops too much voltage" drama? I'm deeply skeptical.
Azzcrakbandit@reddit
When you offer a new gen that performs worse (either due to a rushed release or poor hardware), what can be expected? Beating a dead horse has never felt more real in my lifetime. I was born in 1999, so my memory isn't great before the Intel i7-3770.
Trey_An7722@reddit
Which shows that gamers niche doesn't mean that much to AMD.
Grown-up crybabies that were screaming "Zen 5 is cr*p" at the top of their lungs are irrelevant.