How Much VRAM Is Good for an AI Beginner?
Posted by Gravaid@reddit | buildapc | 15 comments
So yeah, I am going to start doing AI stuff once I get my GPU. This is my build currently:
CPU: Ryzen 7 5700G
MOTHERBOARD: B550
POWER SUPPLY: 500 W (I know, I will upgrade the PSU)
RAM: 16GB 3600MHz
STORAGE: 256GB NVMe M.2, 128GB SSD, 512GB HDD
I want to start learning about AI and other things, and was wondering how much VRAM is required for it. Honestly, my budget for a GPU is around the price of an RTX 3060.
AMD is probably a no-go because it doesn't support CUDA etc.
Is 8GB good enough for beginners, so I could go for an RTX 3060 Ti, or should I just buy an RTX 3060? The problem is also that the country I am living in has outrageous prices for GPUs; for example, an RTX 3060 goes for around 350 USD used, and a new one is out of my budget.
Should I just go for the 3060, or will the 3060 Ti work for a few years?
tcisme@reddit
The answers are "it depends on what you're doing" and "you'll want as much vram as possible."
I'd recommend a 3060 over a 3060 Ti in your position, since the extra VRAM will let you explore more workflows. You could also experiment with renting GPU boxes, which would give you firsthand experience with the capability of various GPUs for your work (and would also be good value if GPUs are expensive where you are).
Gravaid@reddit (OP)
As you said, this is a starter card so I can explore and choose what's best for me, and if things get overwhelming for the 3060, I'll probably switch or do some kind of in-between of local and cloud.
tcisme@reddit
Good luck. Consider keeping your monitors plugged into the integrated graphics outputs to free up more VRAM.
Born_Bad_1294@reddit
You wanna locally host AI?
If yes, then 8GB is shit and won't run many models. The minimum would be 16GB to run decent models.
You can also run models via the cloud etc.; maybe try looking into that.
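For a rough sense of why 8GB is considered limiting, a common back-of-envelope estimate is weights plus KV cache plus runtime overhead. This is a sketch with hypothetical overhead numbers (real usage varies with runtime, context length, and quantization):

```python
# Rough VRAM estimate for running an LLM locally.
# kv_cache_gb and overhead_gb are illustrative guesses, not measured values.

def vram_gb(params_b, bits_per_weight, kv_cache_gb=1.0, overhead_gb=0.5):
    """Approximate VRAM needed: weights + KV cache + runtime overhead."""
    weights_gb = params_b * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + kv_cache_gb + overhead_gb

# A 7B model at FP16 (16 bits/weight) vs 4-bit quantization:
print(round(vram_gb(7, 16), 1))  # ~15.5 GB: too big for an 8 GB card
print(round(vram_gb(7, 4), 1))   # ~5.0 GB: fits on 8 GB, comfortable on 12 GB
```

So on 8GB you are mostly limited to small or heavily quantized models, which is where the 12GB vs 16GB debate below comes from.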
Gravaid@reddit (OP)
Sadly that's out of my budget, so I guess I'll have to make do with the 3060 12GB.
ReasonableMortgage11@reddit
What kind of AI do you want to get into? T2I, TTS, or LLMs? I just upgraded my 3060 12GB to a 3090 Ti... A 12GB card is a perfectly valid starting point. Avoid AMD at all costs; it's absolute dogshit and a nightmare to get things running on a non-CUDA card...
Gravaid@reddit (OP)
Yeah, as you said, it's my starting card, so I'll probably explore from gen AI and then stretch out from there. For the future I am uncertain, but it's highly likely I'll delve deeper into LLMs.
Born_Bad_1294@reddit
No use, mate.
If you wanna play games, yeah, sure.
But 12GB is nowhere near enough for locally running decent models. I myself own an RTX 5070 Ti 16GB, and even I can only run a select few models.
Better to try cloud.
ReasonableMortgage11@reddit
You clearly have no idea what you are talking about
Born_Bad_1294@reddit
Ohh yeah?
How so?
ReasonableMortgage11@reddit
Also, "even I" can run a select few... Like, what is special about your 5070 Ti? It's a 16GB card; what do you expect? It is a marginal upgrade over 12GB for AI, and it also has a mid-range bus, which is absolutely suboptimal for partial offloading.
ReasonableMortgage11@reddit
You didn't even ask the man what type of models he wants to run, and you want to sell him 16GB, which in most cases is zero upgrade over 12GB in terms of which models it can run, because 12GB and 24GB are the big steps open-weight releases aim at. And if a model can run on 16GB, then a Q6 quant of it can run on 12GB just fine with minimal loss, at less than half the price of a 16GB GPU...
CtrlAltDesolate@reddit
You'll want 16GB of VRAM, 100%.
I tried on a 3060 Ti a while back and it was very underwhelming, unless you want a really basic local bot that answers stuff Copilot could do faster and more accurately.
Gravaid@reddit (OP)
Yes, that's why I was also leaning towards the 3060. I am at entry level, so I probably won't go that far into it; I just need something to get started with.