What GPU should I get for my PC
Posted by T-90_Soviet@reddit | buildapc
I have been using my system for around 1.5 years now and I can tell that I need a GPU. My current setup is:
CPU: Ryzen 7 5700G
RAM: 16 GB (need to upgrade this too)
GPU: None T-T
And other things, but this is the relevant part.
My setup barely keeps up without a GPU. I want one so I can earn for myself and my family and start my game development, but I don't know which GPU to pick. I can barely afford a 4060, which is what I was considering, since I want to create games and learn AI. I'm broke and don't have much money, but I still want to get started with these things.
Low-Blackberry-9065@reddit
You probably want an Nvidia GPU for AI (better support, AMD support is improving but idk if it's close to parity yet).
Consider a 3060 if it's cheaper than the 4060.
T-90_Soviet@reddit (OP)
But I heard the 4060 is about 10% faster than the 3060, which is why I was considering it. Why does it only have 8GB of VRAM, though?
SeniorCaution@reddit
If you go for the RTX 3060, definitely go for the 12GB version. I have an RTX 3060 12GB with 32 GB of RAM that I use for AI workloads, and it's exceeded my expectations with LLMs and image generation. I've even been able to run some 70B LLMs on it with quantization. Would recommend for a budget AI build.
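For a rough sense of why VRAM is the deciding factor here, a model's weight footprint is roughly parameter count × bits per weight ÷ 8. A minimal sketch of that arithmetic (the helper name and numbers are illustrative, not from this thread):

```python
def approx_weight_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Weight-only memory estimate; real usage adds KV cache and framework overhead."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# 7B model at 4-bit: ~3.5 GB -> fits comfortably in 12 GB of VRAM
print(approx_weight_vram_gb(7, 4))

# 70B model at 4-bit: ~35 GB -> only runs on a 12 GB card by offloading most
# layers to system RAM (hence the 32 GB of RAM above), at much lower speed
print(approx_weight_vram_gb(70, 4))
```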
T-90_Soviet@reddit (OP)
Both the RTX 4060 and 3060 are priced the same
SeniorCaution@reddit
Yeah, the 4060 8GB and the 3060 12GB are about the same, both around $290 USD. It comes down to the use case. If you mainly plan to do AI, VRAM is typically the biggest constraint, and you should go with the 12GB 3060. If you plan to do just game development, then the 4060 8GB is good.
My brother uses the 4060 8GB. Gets more FPS in games, but he simply wouldn't be able to load the same AI models that I run on my rig.
If you can fit a 4060 Ti 16GB in your budget, that would be the best case (~$450 USD).
Low-Blackberry-9065@reddit
It is faster but also more expensive. Since your main concern is budget, getting the card that's cheaper and slower in compute but has more VRAM (for AI models) could make sense.
If you have more budget you can look at the 4060 Ti 16GB: poor value for gaming, but the cheapest Nvidia GPU with that much VRAM, so it can fit even larger models.
Wajid-H-Wajid@reddit
RTX 4060 is a solid pick, but if you're tight on cash, the RX 7600 or a used RTX 3060 Ti are great alternatives.
Forward_External_536@reddit
You should get a 3050
opensrcdev@reddit
Agreed with the folks saying to get an RTX 3060 12GB. That is a good GPU and will work perfectly for development purposes and AI learning.
SilverKnightOfMagic@reddit
For this situation, just buy what you can. Look into the used market so your money goes a bit further.
Sounds like you just need something to get you started. So get something that gets you started, then upgrade when you can.
UnderstandingSea2127@reddit
Get what you can afford. Check the system requirements for the tools you want to use.
Any discrete GPU will be better than integrated graphics.
For creative workloads you'll want an Nvidia RTX GPU with CUDA support.
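As a quick sanity check once a card is installed, a minimal sketch (assuming PyTorch is installed) to confirm CUDA sees the GPU and how much VRAM it reports:

```python
import torch

# Confirm the GPU is visible to CUDA-based tools (PyTorch here)
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GiB")
else:
    print("No CUDA device found; AI frameworks will fall back to the CPU")
```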