CNBC: Nvidia passes Apple as world’s most valuable company
Posted by SmashStrider@reddit | hardware | View on Reddit | 102 comments
https://www.cnbc.com/2024/11/05/nvidia-passes-apple-as-worlds-most-valuable-company-.html
auradragon1@reddit
I think I know why r/hardware wants AI to die. Like all things, it usually goes back to gaming.
r/hardware attracts gamers because of its focus on chips
Gamers, above all else, desire more performance for less money.
AI soaks up Nvidia's attention, increases wafer and other component costs due to AI demand
Gamers think they're getting less performance for the money because of AI
Am I right?
yflhx@reddit
Some people (including myself) don't want it to die, but still believe it's vastly overvalued. As a business model right now, it just makes very little sense. OpenAI is losing money, and their biggest source of revenue is end customers, who are price-sensitive. They haven't been able to convince many companies to use it (perhaps yet). And the big question is: will it start to make sense before they run out of VC money?
It obviously doesn't help that most companies try to ride the hype and release "AI" products that shouldn't be released or rebrand products that have nothing to do with AI. Many people look at this and think that all "AI" is fake. And I'm putting "AI" in quotes here because ironically it's not even actual 'artificial intelligence'.
DeliciousPangolin@reddit
I think generative AI is about 20% real, 80% hype. The biggest problem is that it doesn't really correspond to the tech industry's standard business model, where developing the product is expensive but the incremental cost per customer is near zero. Like, it costs almost nothing for MS to sell a license of Windows, or Facebook to serve content to a user. AI is expensive. Even inference is way too expensive to give it away for free, much less training models.
In addition, a lot of the cases where AI is genuinely useful fall into the category of "that's a feature, not a product". Like, context-aware fill in Photoshop is an excellent feature, but you can't sell it separately from Photoshop. It doesn't bring Adobe any more revenue.
auradragon1@reddit
Back in the 90s, it used to cost a lot of money and a lot of servers just to serve HTML pages. Then hardware got way better. Server chips went from 1 core to 256 cores. Hyperscalers brought efficiency. Now it costs next to nothing to serve 1 million HTML pages to someone.
Why do you think this pattern won't repeat for inferencing?
It does. If there is more value add, Adobe can increase the price of their products.
NeroClaudius199907@reddit
A lot of people are still skeptical about AI, so they'll downplay everything AI-related.
Top_Independence5434@reddit
I think it's more of a marketing problem than an "AI" problem. "AI" is a catch-all term that PR uses so the uninitiated layman knows it's something more special than usual. If you ask the actual designers/researchers of the product, they will give you its actual multi-word jargon name, the kind you'd have to devote hours of reading papers, articles, and other media to grasp the basics of.
2FastHaste@reddit
More often than not, the "multi-word jargon name" is actually technically AI.
It's just that the "uninitiated layman" thinks AI means HAL 9000.
basseng@reddit
The same reason people here hated crypto. The difference is that machine learning provides actual value to humanity, while crypto is now just a massive greater-fool con.
auradragon1@reddit
Agreed. Crypto, I totally get. I'm also anti-crypto. It's 99.999% scams, rugpulls, and unregistered securities.
But LLMs? I'm all on board.
ProfessionalPrincipa@reddit
Interesting then that some of the crypto conmen got into the business of AI.
prestigious-raven@reddit
Conmen are in every business, especially sectors that feature emerging technologies. It happened with the dot-com bubble, and it will happen with the current AI cycle.
auradragon1@reddit
True, the only difference is that all crypto people are conmen.
MumrikDK@reddit
It couldn't be about the environmental and job market impacts?
SagittaryX@reddit
Partially, but also AI is a boom that will bust at some point soon. Not the useful features, of course, but lots of companies are going all in on AI, focusing their products on features that will not end up being used.
jaaval@reddit
For me it's just mildly infuriating to watch the hype train go choo-choo. It's like when Musk talked about hyperloop and people cheered while engineers in the corner were trying to say the idea was stupid from beginning to end.
There will be useful applications for the AI stuff that is currently being developed, but it will mainly be in making some tools better, not transforming anything. Yet the companies sell it with stupid sci-fi-sounding ideas. AI has also been somewhat useful in uncovering inefficiencies in how we work, by showing which tasks people try to replace with AI.
The problem for the AI hype companies is that the stuff is expensive and the applications are not yet useful enough that people are willing to pay big prices for them.
auradragon1@reddit
I think there is a clear distinction between hype for companies like OpenAI, Anthropic, Nvidia and opportunistic companies saying their thermal paste has AI.
The hype is for companies trying to scale Transformers.
jaaval@reddit
I was only talking about the big players, not at all about all the non AI products they market as AI.
auradragon1@reddit
I work in software and I believe people are severely underestimating how transformative (no pun intended) LLMs will get in the next few years.
jaaval@reddit
I work in software and I seriously doubt they will do much.
auradragon1@reddit
What software do you do?
jaaval@reddit
I used to do mostly data analysis code, and before that I did sensor algorithms. Now I'm going back to doing general C++ software dev and SLAM algorithms.
auradragon1@reddit
I took a question about Debian that you asked here 4 years ago and ran it through OpenAI's o1 model:
https://chatgpt.com/share/672b6ab0-0fac-8000-9272-8b8b15ecf85b
Do you think that answer would have helped you?
bexamous@reddit
That answer doesn't seem that great. I'm not sure what would be a great answer to that question.
All the stuff it suggests is just checking for obvious stuff... which in theory could be useful, but assuming it's not something obvious, it's really been no help.
auradragon1@reddit
I'm now getting downvoted to hell but you can just ask the AI to do this for you.
bexamous@reddit
You can ask, but it won't actually work. Not far from it, but definitely not there right now.
auradragon1@reddit
How do you feel about Google announcing that 25% of their new code is written by an LLM?
bexamous@reddit
This is likely super misleading... if your autocomplete finishes a line, is it now AI-written? I'm not sure that's what I'd consider AI-written. My understanding is that this is the extent of it (based on what people at Google have said).
jaaval@reddit
Those seem to be fairly generic debug steps. I'm not sure, but I think I ended up reinstalling the guest system because just reinstalling the DE wasn't enough.
mycall@reddit
Mamba, xLSTM, BERT and others exist, but Transformers are quite effective and offer a lot more than the GPT/LLM hype.
auradragon1@reddit
Ehh... The world is massively scaling up because of the transformer model. BERT is a type of transformer model, as are all modern GPTs and LLMs.
Zerasad@reddit
My hatred for AI is multi-faceted.
In its current state it's looking like a massive bubble: huge investment with no real useful implementation. At the same time, companies are needlessly inserting AI where it has no use, enshittifying their working products. Nobody can see how AI can be monetized at the moment.
The way AI is currently used is to replace people's jobs and create a shittier user experience. Stakeholders and pencil pushers see AI as a tool to save on creative workers like artists, copywriters, and voice actors, the jobs that shouldn't be replaced by AI, while the menial jobs that AI could actually improve are largely ignored.
AI is used to create a massive amount of slop. At best it's fooling boomers on Facebook; at worst it's used to create misinformation and affect politics and public discourse.
I genuinely cannot see how the current implementation of generative AI has been positive for us as a whole.
auradragon1@reddit
Do you really think LLMs have no use?
ProfessionalPrincipa@reddit
It's great if you want hallucinations.
auradragon1@reddit
I rarely run into hallucination problems using it daily.
RZ_Domain@reddit
Yea, you're right (not sarcasm)
chx_@reddit
If we are talking of dying, I do not want people and democracy to die; that's why I am against AI.
auradragon1@reddit
How do you think AI will cause people/democracy to die?
chx_@reddit
What hasn't killed people yet but damned well could and likely will: foraging books written by AI, and hospital transcription systems inventing shit out of thin air.
What did kill people (well, one person, so far): a chatbot leaning into a teen's insecurities.
What affected democracy: singing Modi.
I hadn't foreseen any of these and I can't tell what else will come, but bad it will be.
Oh and that's the answer I would've written yesterday.
Today? Do you think they will not use AI somehow to pick whom to deport from the US?
kingwhocares@reddit
A lot of it is also due to the fact that exponential growth can't sustain itself, and generally a crash comes before things stabilize. Happened with crypto before as well.
auradragon1@reddit
Cisco was 7x bigger after the dotcom bust (2001) than they were before it (1995).
notafakeaccounnt@reddit
Yes
From-UoM@reddit
Man, Evga left Nvidia at the worst possible time.
Back then, around Sept 2022, Nvidia was valued at about ~$300 billion. The stock had fallen over 50%, earnings had been missed, demand for GPUs was falling, and RTX 40 prices were high. Things weren't looking good.
Fast forward just 2 years and Nvidia is the most valuable company in the world, having more than 10x'ed its market value, making obscene amounts of net profit and still growing. The RTX 40 series has been extremely successful, boosted greatly by the huge AI demand and brand recognition. The RTX 50 series will have massive demand because it's using Blackwell and riding the AI hype.
EVGA messed up big time. They would have made a lot of money and sales, even with low margins, just because of how in demand Nvidia is now.
SpeedDaemon3@reddit
Other than the RTX 5090 there is nothing interesting in the RTX 50 series. And for most 4090 owners there is little reason to go for a 5090 at a 30% improvement. Nvidia makes its money nowadays with professional-grade cards.
bctoy@reddit
I'm not speculating, since last time I was way off-base, so I'll just say that the earliest rumors about the 50xx series were about an update to the shader architecture.
So just because the supposed 5080 looks like barely an increase over the 4080 doesn't mean it'll perform the way the specs suggest. Then RT/DLSS and frame-generation improvements will most likely be there to differentiate it further.
SpeedDaemon3@reddit
My point is the 5080 will likely match the 4090D so it can be sold in China. So no new tech improvement that wasn't available until now, only maybe something considerably cheaper than the current 4090.
Strazdas1@reddit
China limitations are on specific metrics. There are many things a 5080 could improve over a 4090 and still be allowed to be sold in China.
Strazdas1@reddit
We don't know whether those rumours are true or not, but if they are, then the cards are practically incomparable on raw specs and we will have to wait for benchmarks.
ResponsibleJudge3172@reddit
Remember the 4080 had 76 SMs vs 68 on the 3080. An absolutely minuscule jump in SMs, but the performance jump was good.
From-UoM@reddit
Yet it's clear the entire 40 series is selling well.
SpeedDaemon3@reddit
The point is there is no reason to upgrade from the 40 series to the 50 series. The 40 series sold so well because the 4090 was the first genuinely 4K 120Hz capable GPU.
Strazdas1@reddit
The vast majority of people buying the cards aren't upgrading from the 40 series. They are upgrading from the 09/10/20 series.
194277006@reddit
It's not capable of 4K 120Hz without DLSS.
SpeedDaemon3@reddit
Dude, I have the card, don't tell me what it can and can't do. Also, DLSS is really good when you have great starting fps and are upscaling to 4K from 1440p. It's a lot better than 1440p upscaled from 1080p.
dedoha@reddit
The RTX 5000 series hasn't even been officially announced, but I guess people love manufacturing outrage.
timorous1234567890@reddit
Most 4090 owners want the best of the best and will pay for it. That is why they purchased a $1,600 card in the first place.
A 30% performance increase for the 5090 will be enough to get them to buy one and sell or pass on their 4090.
SpeedDaemon3@reddit
Idk man, I have a 4090 and see little to no reason to upgrade; the GPU is still cruising. Also, outside the USA the 4090 was considerably more expensive. $1,600 would be peanuts, but in Europe it was 2,000-2,500 euros. The only noteworthy improvement is the 32 GB for AI generation.
Ilktye@reddit
It's like saying the only interesting cars Ford makes are ones like the GT40, and people are only interested in those cars.
Hot_Cricket_5193@reddit
Nvidia's consumer GPUs really aren't responsible for this growth.
From-UoM@reddit
The AI boom and brand recognition boosted RTX 40 series sales.
Just look at the numbers. Gaming GPUs made $2.9 billion last quarter and will most likely make $3 billion+ this quarter.
They are approaching the sales levels of the peak crypto boom and the COVID shortage.
Hot_Cricket_5193@reddit
Their AI and B2B divisions far outgrow this; everyone knows they don't even care about gamers lol.
From-UoM@reddit
Numbers don't lie. Revenue is growing for Nvidia RTX GPUs in a climate where console and other GPU sales have fallen greatly.
Hot_Cricket_5193@reddit
Most of their revenue comes from AI and data centres...? Just because they are beating AMD doesn't really mean much for their income statement.
From-UoM@reddit
You still are not getting it.
Yes, most of their revenue comes from data centers. This made Nvidia highly valuable and proved how good their cards are for AI.
This has a knock-on effect which made their gaming cards far more desirable, and the revenue figures show that.
Hot_Cricket_5193@reddit
Nvidia isn't sharing money with the partners like you think; take the L.
From-UoM@reddit
Who said anything about sharing more?
More sales at the same margin will get you more money.
It's just simple math:
1% of $100 is $1
1% of $200 is $2
Brostradamus_@reddit
You are assuming the margin is the same. It isn't.
Strazdas1@reddit
The margin on consumer GPUs is either the same or larger. It certainly hasn't decreased for Nvidia.
From-UoM@reddit
Even if it's lower, they make more:
0.6% of $200 is $1.20
Nvidia gaming GPUs just before EVGA left, in Q2 FY23: $1.6 billion
Q2 FY25: $2.9 billion. So sales really did nearly double in 2 years.
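A quick sketch of that arithmetic, with made-up illustrative margins (not Nvidia's or EVGA's actual figures), just to show that absolute profit can rise even when the margin falls:
```python
# Toy margin math: profit = revenue * margin.
# Revenue figures echo the ones above; the margins are purely hypothetical.
def profit(revenue: float, margin: float) -> float:
    return revenue * margin

before = profit(1.6e9, 0.010)  # assumed 1.0% margin on $1.6B
after = profit(2.9e9, 0.006)   # assumed lower 0.6% margin on $2.9B

print(f"before: ${before:,.0f}, after: ${after:,.0f}")  # after still exceeds before
```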
Strazdas1@reddit
While true, even in consumer GPUs Nvidia is showing better results than ever.
jedimindtriks@reddit
EVGA barely made any money on Nvidia cards. That would not have changed no matter how well Nvidia was doing.
NoStructure5034@reddit
They weren't just having low margins, they were apparently losing money on every 3080 or higher card sold.
Strazdas1@reddit
That's because they outsourced 100% of the manufacturing/packaging, so everyone else took their margin. The partners that actually make money with Nvidia chips are doing that stuff in-house.
From-UoM@reddit
Evga 100% would have made more money with the AI Boom.
NeroClaudius199907@reddit
They would've kept a lot of people employed and could possibly be making $90m.
auradragon1@reddit
The problem for EVGA isn't Nvidia. The problem is that they were outcompeted by Taiwanese companies. So on their way out, their idiotic CEO decided to make a scene and blame Nvidia.
danuser8@reddit
An Nvidia a day, keeps the cash away?
Strazdas1@reddit
The more you buy the more you save?
m0rogfar@reddit
An interesting valuation to be sure.
It makes sense if you assume that Nvidia will be able to keep selling AI chips for almost $50,000 each at a >95% profit margin in the volume we're currently seeing, but it seems risky if you consider that AI software companies are very heavily incentivized to get a viable competitor to the CUDA ecosystem going so that they don't have to pay for Nvidia's billions in profit margin.
Tman1677@reddit
This was 100% my stance a year ago, but honestly we're a year later now and the competition is... what? Intel is bailing on dGPUs entirely, AMD's offerings are an utter joke, and there's absolutely no industry consensus around a non-CUDA GPGPU API. Sure, PyTorch and TensorFlow support more backends than they used to, but those are pretty clearly designed for consumers on laptops, not hyperscalers.
Overall I’d say if anything Nvidia is in a better monopolistic position than they were a year ago, not worse
EmergencyCucumber905@reddit
Then how are hyperscalers using PyTorch and TensorFlow on AMD GPUs?
Strazdas1@reddit
With difficulty. That, or the big companies get AMD support to help them do it; I know smaller clients certainly don't.
auradragon1@reddit
Nvidia GPU clusters are now up to 100k GPUs per datacenter. In 2025, it will be up to 300k-500k GPUs hooked up together, spanning multiple data centers. You're digging deep into CUDA code optimization at those scales.
ResponsibleJudge3172@reddit
Ignoring other factors like Nvidia's networking division (which sold as much as AMD's data center segment), Nvidia's data center CPUs, and next year's attempt to do semi-custom for PC, beyond the Switch and the self-driving car stuff.
m0rogfar@reddit
A year isn't that long in terms of hardware development, nor is it that long in terms of justifying Nvidia's current market cap. The current market cap suggests that Nvidia's investors expect that Nvidia can keep posting their current results scaled for inflation for another 65 years, which is a completely different ballgame.
Tman1677@reddit
A year is excruciatingly long in terms of software development, and it’s a software moat that Nvidia has over Google/Microsoft/Amazon. If there was a meaningful software API competitor like an open-sourced DirectML or something we would have heard about it by now. The silence speaks volumes.
I'm not saying that the gravy train will run forever for Nvidia, but it's certainly gonna run for ~5 years. In that time they'll have the money and opportunity to reinvest into literally anything they want. They could invest in AI models, datacenters, or many things we can't even imagine that could justify the valuation once sales eventually die down.
StrictlyTechnical@reddit
More than that, all of Nvidia's biggest clients are building their own AI hardware and are actively working on cutting them out of the supply chain.
Strazdas1@reddit
But they aren't having a lot of luck with it, especially on the training side. If your competition has been designing a product for 5 years and still ends up buying yours because it's better, then maybe you can keep high sales expectations for a few more years.
Also, there's a lot more to the demand than a few big players.
auradragon1@reddit
So are Intel and AMD's biggest CPU clients.
The difference is that ARM is an instruction set and they license their core designs. So ARM customers like Amazon, Meta, Microsoft, Google, and Ampere can easily scale. All you have to do is customize the ARM design to whatever you want, fab it at TSMC, and boom, the chip works with any ARM-compatible software.
For GPUs, it's entirely different. No one is handing you an ISA and design for you to take to TSMC to manufacture. You also have to build the software ecosystem that can compete against CUDA.
Google has managed to do it with their TPUs, but it's not clear if using their internal hardware has hindered their AI development. Google's Gemini has been consistently behind the top models made by OpenAI, Anthropic, and Meta, which are all trained on Nvidia hardware.
DigitalAkita@reddit
Designing your own processor, even based off Arm's licensed designs, is an extremely niche task: quite difficult, expensive, and time-consuming. Also, Arm doesn't license to just anyone willing to buy.
anival024@reddit
Yes, but you can go from nothing to ARM license + proof of concept / minimum viable product very quickly. TSMC will help you as well if you're waving fat stacks of cash around.
You don't have to have a product that competes with Nvidia's. You just have to have one that competes with Nvidia's value proposition.
auradragon1@reddit
You still need to be a big company to do it.
But so far we have Amazon, Meta, Google, Microsoft, Nvidia, Tencent, Alibaba, and Ampere. Those are just off the top of my head.
StrictlyTechnical@reddit
I am intricately familiar with Microsoft's Maia so that's what I'll focus on:
You don't. Microsoft uses custom IP blocks for large parts of the chip (the control processor is just Tensilica cores), with the accelerator cores being the thing they actually designed themselves.
Maia compiles down to their assembly through LLVM and Clang.
Third-party IP.
I'm sure Google is in a similar situation to Microsoft.
I can guarantee their models are trained on Nvidia's hardware. These custom chips aren't even used for training; it's all about inference. Training makes up a small part, and neither Microsoft nor Google minds buying Nvidia's hardware for that. Even AMD isn't trying to compete on that front: they literally put no effort at all into training performance, because all the money is in inference.
auradragon1@reddit
Nope.
https://cloud.google.com/blog/products/ai-machine-learning/introducing-cloud-tpu-v5p-and-ai-hypercomputer
StrictlyTechnical@reddit
Lmao, my bad, I assumed they'd be in a similar position to Microsoft.
It's what I was told by an AMD engineer who specifically works on DC GPUs; that is AMD's position right now.
Nvidia themselves claim 40% of their DC revenue is from inference.
What did I contradict? Your claim was that they need to design an entire chip; my claim was that most of it is third-party IP and they only designed the accelerator cores.
It has to be competitive in the sense that manufacturing it and running it has to be cheaper than buying and running GPUs from Nvidia. Perf/W was disappointing, but overall afaik the solution is still cheaper than Nvidia.
Triton uses LLVM, and Maia has an LLVM backend...
What do you think "assembly" is?
auradragon1@reddit
That's an idiotic take. AMD puts a ton of effort into training performance but no one is building 100k, 300k, 500k GPU clusters relying on AMD GPUs.
Right now all the money is in training. That's where Nvidia is completely dominant and why they're the most valuable company in the world. When the market matures, inference will be larger.
Furthermore, after using H100s to train, what do you think those GPUs are doing? Sitting there idle and never to be used again? No. They're re-used for fine tuning and inference.
Strazdas1@reddit
I mean, they will for at least a few years; after that we will see.
madewithgarageband@reddit
I'm long-term short on Nvidia. I think the applicability of AI is more limited than we currently realize, given its energy usage and hardware requirements, and I don't think Nvidia's moats are that strong given they don't even manufacture their own chips.
pandaSmore@reddit
I thought that already happened a month or so ago.
SmashStrider@reddit (OP)
Yeah, it happened again.
signed7@reddit
Apple, Microsoft, and Nvidia have kept switching places on top for the last couple of months now.
hurtfulproduct@reddit
Lol, yeah; we'll see if it lasts... Apple has traded back and forth with Exxon Mobil, Microsoft, and now Nvidia for the title of most valuable company...
iwentouttogetfags@reddit
Oh, have they only got 14 5090s to sell to consumers then? /s