Building a new pc. Is the 3090 still relevant for a new build?
Posted by AndrehChamber@reddit | LocalLLaMA | View on Reddit | 33 comments
In terms of budget I could afford a second-hand 3090 or a brand new 4070 Ti Super. I am afraid of buying a 3090 and it becoming a huge bottleneck in 2-3 years. Would it be better to perhaps save more money and invest on something like a second-hand 4090 in a few months? I am a casual gamer, by the way.
Such_Advantage_6949@reddit
I have one 4090 and 3 4090. The 4090 is good for game, but in AI stuff, it is not worth the premium
Pedalnomica@reddit
I'm guessing you meant 3x3090?
Such_Advantage_6949@reddit
Yeah, 1x4090 and 3x3090. The key limiting factor for LLMs at the moment is VRAM bandwidth, and if you check the specs, the 4090 is about the same as the 3090
crantob@reddit
How is this possible
Such_Advantage_6949@reddit
The 4090 is faster at compute than the 3090. However, currently they both compute faster than they can read the weights out of VRAM (VRAM bandwidth). The 5090 specs look like they will have quite a bit faster VRAM bandwidth than the 4090
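A rough sketch of that claim using the commonly published spec numbers (figures are approximate, not measurements):

```python
# Comparing 3090 vs 4090 on the two axes that matter for LLM decoding.
# Spec numbers below are approximate published figures, not benchmarks.
specs = {
    "RTX 3090": {"bandwidth_gb_s": 936, "fp16_tflops": 35.6},
    "RTX 4090": {"bandwidth_gb_s": 1008, "fp16_tflops": 82.6},
}

for name, s in specs.items():
    # Single-stream decoding streams every weight from VRAM once per token,
    # so throughput is capped by bandwidth, not TFLOPS.
    print(f"{name}: {s['bandwidth_gb_s']} GB/s, {s['fp16_tflops']} TFLOPS")

ratio_bw = specs["RTX 4090"]["bandwidth_gb_s"] / specs["RTX 3090"]["bandwidth_gb_s"]
print(f"4090/3090 bandwidth ratio: {ratio_bw:.2f}x")  # ~1.08x
```

So the 4090 has roughly double the compute but only ~8% more bandwidth, which is why inference speed ends up similar.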
KY_electrophoresis@reddit
You are overthinking this massively. A 3090 will be fine and still worth good money resale in 3-5 years time. Don't wait, buy the best deal now and then sell and hop to the next one down the line. Buying cutting edge is like driving a new car off the forecourt. It feels great for a few weeks, but after that it's the same shit as if you bought 'nearly new', but you just overspent.
spiky_sugar@reddit
This is true only while the 'old car' has the same properties. When the 5090 comes out with 32GB of memory, instead of the 24GB on the 3090 and 4090, the public repos published and trained by research groups will within a few months start targeting the 32GB mark, as they do for 24GB today...
PyroRampage@reddit
Yeah but I can get 2 3090s and have 48GB, sure I have less compute but it may be worth it.
mellowanon@reddit
people think the 5090 is going to be priced at $1999 and up, though. You'd be able to buy three 3090s for that price. And I'd rather have 72GB vs 32GB any day.
PyroRampage@reddit
Yep, well, I said 2 to be safe. Only two could be NVLinked anyway.
kmouratidis@reddit
There was a joke that a new car loses half its value the moment it leaves the dealer's lot. Having been through the GPU scalping years, the complete opposite was true. If you're not in the US, there's a high chance 3090s still go for 800-900... and even some weird dudes on local sites asking the original price 😅
AndrehChamber@reddit (OP)
Loved the analogy. Thanks for that.
VulpineFPV@reddit
3090 is also the generational powerhouse that can run FSR and DLSS together in some cases. It’s also great for AI use cases. Great pick.
3090 is a favorite for lots of AI enthusiasts.
PyroRampage@reddit
It also has NVLink which the 4090 and beyond do not.
SwordsAndElectrons@reddit
Since you're asking here, presumably also a casual LLM user? Not so casual?
An argument can be made for the 4070 Ti Super for gaming, as long as the titles you play and the resolutions you play them at don't bump up against its 16GB of VRAM.
For LLMs, 24GB of VRAM makes the 3090 the clear winner.
AndrehChamber@reddit (OP)
Not so casual LLM user. I am a software engineer with 16+ years of experience. I have worked with ML but not with my own hardware. The games I play are not very demanding.
_Erilaz@reddit
If you're working from home on your pet projects and this could make you money, then I guess a 4090 might be justified. The same can be true for image generation workflows, since they're heavier on the compute. It also applies to low-latency scenarios. But if none of that is an issue, spamming 3090s won't be a bad idea for a LOOONG time, especially for LLMs, where it is widely supported. To put things in perspective, people are still building monstrous P40 LLM farms.
ArsNeph@reddit
In AI, VRAM is king. The more VRAM the better. As a gaming card, the 3090 is roughly on par with a 4070 Super, so it's perfectly capable of handling 4k gaming. Unless you're dying for frame generation, you're overall better off with a 3090
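On the "VRAM is king" point, a quick back-of-the-envelope for what fits in 24GB (illustrative only; real usage adds KV cache and framework overhead on top of the weights):

```python
def model_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate VRAM needed just for the weights, ignoring KV cache/overhead."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1024**3

# A 70B model at 4-bit quantization needs roughly:
print(round(model_vram_gb(70, 4), 1), "GB")  # ~32.6 GB -> spills past one 24GB card
# A 13B model at 8-bit fits comfortably on a single 24GB card:
print(round(model_vram_gb(13, 8), 1), "GB")  # ~12.1 GB
```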
FullstackSensei@reddit
Bottleneck for what? If it's gaming, maybe, but you say you're a casual gamer. If you're thinking about LLMs, then the 4090 won't be any better. Both have the same amount of VRAM running at practically the same speed. So for inference, the 4090 won't be much better, if at all.
LLM inference is bottlenecked by memory bandwidth, not compute power.
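A sketch of why bandwidth dominates: each decoded token has to stream the full set of weights from VRAM, so an upper bound on tokens/s is bandwidth divided by model size (spec numbers approximate; ignores KV cache and batching):

```python
def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Crude ceiling: one full pass over the weights per generated token."""
    return bandwidth_gb_s / model_size_gb

# e.g. a ~13GB model (13B params at 8-bit):
for card, bw in [("RTX 3090", 936), ("RTX 4090", 1008)]:
    print(f"{card}: ~{max_tokens_per_sec(bw, 13):.0f} tok/s ceiling")
```

The two ceilings land within a few tokens/s of each other, which matches the "won't be much better" point above.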
AndrehChamber@reddit (OP)
I was referring to bottleneck for LLM in some shape or form. You made a good point. So, in theory, if a 5090 comes with plenty more VRAM (32GB+), it would likely be better, but that would be way more expensive and it's probably at least 6 months away. I think a 3090 is probably the way to go.
FullstackSensei@reddit
Even if the 5090 has 32GB, methinks the 3090 will still make more sense. You can probably get three 3090s for the price of one 5090. You'd end up with 72GB of VRAM and ~2.8TB/s of aggregate memory bandwidth for roughly 1.5× the power consumption of the 5090.
I doubt we'll see any breakthrough in the coming 2-3 years that will increase compute substantially while keeping memory (and by extension memory bandwidth) the same.
The best part is: with how nvidia/AMD/intel are focusing on the datacenter, 3090s will stay at the same price level (if not slightly go up) at least for the next year. I know I can sell the three 3090s I bought last year and early this for a profit now.
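The multi-card arithmetic above, sketched out (per-card spec figures approximate; prices and the 5090 comparison are assumptions, not quotes):

```python
# Rough totals for three used 3090s. Note: bandwidth is aggregate across
# cards, not pooled for a single tensor unless you run tensor parallelism.
cards = 3
vram_gb = cards * 24         # 24 GB per 3090
bandwidth_gb_s = cards * 936 # ~936 GB/s per 3090
power_w = cards * 350        # ~350 W board power per 3090

print(f"{vram_gb} GB VRAM, {bandwidth_gb_s} GB/s aggregate, ~{power_w} W")
```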
synw_@reddit
it's not enough. We all need more cheap vram
shroddy@reddit
The 3090 has more Vram (24 instead of 16) and more memory bandwidth, which are the two most important factors for LLMs right now.
g33khub@reddit
The main advantage of the 4070 Ti S is efficiency and low power/heat management requirements. But I would not trade that for 8GB less VRAM. Make sure you have a case and PSU that are appropriate for the 3090. I had to change several things to accommodate the 3090 and manage the VRAM temps on the back.
AndrehChamber@reddit (OP)
Yup, it's a new build from scratch. All these will definitely be taken into consideration.
Over_Description5978@reddit
I think you have no choice. At most, you can add multiple cards as your budget allows.
dookymagnet@reddit
Is there a sub to see everyone’s local LLM machines? 👀
AndrehChamber@reddit (OP)
Keen
carnyzzle@reddit
3090 is the new 1080 Ti
Strong-Inflation5090@reddit
I have a 4070ti super, it's decent but most of the times I wish for more memory so 3090 would be a great option.
shokuninstudio@reddit
The 3090 is the best-value 24GB card you can get, and I’m glad I waited. I picked up an open-box, unused 3090 FE for £650.
MixtureOfAmateurs@reddit
A 3090 is best, a 4090 will give you faster inference but anything you can fit on a 3090 will be plenty fast. A 4070 ti super has less memory so is less useful - VRAM is king