4070 Super (12GB) vs 5070 Ti (16GB)
Posted by rabbany05@reddit | LocalLLaMA | 11 comments
My friend is selling his ~1 year old 4070S for $600 CAD. I was initially planning on buying the 5070 Ti, which would cost me around $1200 CAD.
Is the 4070S a good deal compared to the 5070 Ti, considering future-proofing and being able to run decent models on the smaller 12 GB of VRAM?
I already have a 9950X and 64 GB of RAM.
jacek2023@reddit
What's the reason to buy these cards instead of a 3090?
fasti-au@reddit
VRAM is key. 12 GB means you're running smaller models, but small models today are much better than small models a year ago, so the real question is whether 12 GB is enough for you. Most people aim for 24, 32, or 48 GB so they can run the bigger "small" models.
You can always rent GPUs if your usage is only occasional.
an80sPWNstar@reddit
I'm running the new Qwen3 VL 8B at Q8 fully on my old 1080 Ti (11 GB) with the context set to around 16k. It's fast and does a damn good job. It all just depends on what you want to do.
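A back-of-envelope check of why that fits: quantized weights take roughly `params × bits / 8` bytes, plus a KV cache that grows with context length. This is a rough sketch; the layer/head numbers below are illustrative assumptions, not the exact Qwen3 VL 8B config.

```python
# Rough check that an 8B model at Q8 plus a 16k KV cache fits in 11 GB.
# Layer count, KV heads, and head dim are assumed values for illustration.

GB = 1024 ** 3

def weights_gb(params_billions, bits_per_weight):
    # Quantized weights: params * (bits / 8) bytes.
    return params_billions * 1e9 * bits_per_weight / 8 / GB

def kv_cache_gb(layers, kv_heads, head_dim, context, bytes_per_elem=2):
    # K and V tensors per layer, fp16 cache:
    # 2 (K and V) * layers * context * kv_heads * head_dim * bytes.
    return 2 * layers * context * kv_heads * head_dim * bytes_per_elem / GB

w = weights_gb(8, 8)                  # ~7.5 GB of weights at Q8
kv = kv_cache_gb(36, 8, 128, 16_384)  # ~2.2 GB of KV cache at 16k context
print(round(w + kv, 1), "GB total")
```

Under those assumptions the total lands just under 10 GB, which is consistent with it squeezing onto an 11 GB card.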
Long_comment_san@reddit
I have a 4070 with 12 GB of VRAM and it is a completely useless card for local AI use. 16 GB is the same kind of useless, you just get more context and slightly more speed. Honestly, if your aim is AI use, get a second-hand 3090 Ti, provided it fits in your case. It has everything and plays games about the same. You only lose native 4-bit support and have to tolerate post-mining-era cards.

But realistically, 12 and 16 GB are both dead ends. If we're talking LLMs, assume 1-2 GB for the system and about 6 GB for context; even on a 16 GB card you only get about 8 GB of truly free VRAM for the model itself. What can you run with that? Something like a 13B at Q5 to fit entirely.

Same with image generation. Flux 2 came out yesterday. What do you need for it as a baseline? Yup, a 3090 Ti with 24 GB.
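The budgeting rule in that comment reduces to simple arithmetic: usable VRAM for the weights is the card total minus runtime/system overhead minus the context (KV cache) reservation. A minimal sketch, where the overhead figures are the commenter's rough assumptions and will vary by runtime and quantization:

```python
# Usable VRAM for model weights = total - system overhead - context reservation.
# The default overheads mirror the comment's rough figures; real numbers
# depend on the runtime (llama.cpp, vLLM, etc.) and cache precision.

def usable_vram_gb(total_gb, system_gb=1.5, context_gb=6.0):
    return total_gb - system_gb - context_gb

for total in (12, 16, 24):
    print(f"{total} GB card -> {usable_vram_gb(total)} GB free for weights")
```

With these assumptions a 12 GB card leaves only ~4.5 GB for weights, a 16 GB card ~8.5 GB, and a 24 GB card ~16.5 GB, which is why 24 GB is the usual recommendation in the thread.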
datfalloutboi@reddit
The 4070 Ti Super is the best deal: same VRAM as the 5070 Ti and great value. While it lacks the Blackwell architecture the 50-series cards run on, that's a very small tradeoff.
rabbany05@reddit (OP)
Sorry to say, it's the non-Ti 12 GB version.
Dontdoitagain69@reddit
If you are playing games at 1600p, a 4070 is enough unless you want to go low settings at 4K. I have a laptop with a 4070 and it gives me 165 fps. If you are not playing games, get a pro card; they run at lower power and have more VRAM. 16 vs 12 GB won't give you any noticeable difference. For LLMs I start at 24 GB, because even though that's still poverty spec, you'll start to see a difference, even if it's last gen. If I were you, I'd save money, get the 4070, and upgrade system RAM instead. You will spill models into system RAM anyway, so at least you'll have some headroom.
DedsPhil@reddit
2 used 3090
etherd0t@reddit
Buy, or save up for, the new one, preferably the 5090 (top of the line), no matter what you use it for.
calivision@reddit
Imo the new GPU is not twice as good but costs twice as much
andy_potato@reddit
Generally nothing beats VRAM. Can never have enough of it.
16 GB is enough to run most current image generation models at decent quality, but it will severely limit you with local video generation models like Wan. Seeing how even image generation models like Flux 2 are exploding in size, I feel like 16 GB won't last you very long. For LLMs you should be fine though, as you have a decent CPU and a good amount of RAM.
If you're on a budget, and depending on what you want to do, maybe start out with a cheaper 16 GB 5060 Ti first and add the (hopefully soon-to-be-announced) 24 GB 5070 Ti Super next year.