Best upgrade for ai from a 4070
Posted by Narrow_Awareness2830@reddit | LocalLLaMA | View on Reddit | 4 comments
I’ve been experimenting lately with the new Gemma models (sorry for my spelling) and when I try to run the 31b model it works, but it’s very slow. What is the cheapest upgrade I can get?
Easy_Kitchen7819@reddit
Rtx 5090
Narrow_Awareness2830@reddit (OP)
Sorry but I can not afford that
Weird_Linux_Nerd_07@reddit
I have "upgraded" 4060 16Gb by adding 5060 16Gb. I use fine tuned llama.cpp configs to split model across gpus.
MushroomCharacter411@reddit
Define "very slow".
The way things are going in terms of prices, it might be cheaper to add another 12 GB VRAM card (like another 4070, or even a 3060) and split the model across them. That wouldn't be the cheapest if you were starting from nothing, but since you already *have* one 4070, it might make sense to consider.
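The VRAM math behind this suggestion can be sketched with rough numbers. The bits-per-weight figure below is an assumption (roughly what a Q4-class GGUF quant uses); real file sizes vary by quant type and architecture.

```python
# Rough VRAM estimate for a quantized model.
# bits_per_weight is an assumption (~4.5 for a Q4-class GGUF quant).
def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    # params * (bits / 8) bytes each, expressed in decimal GB
    return params_billion * bits_per_weight / 8

# A ~31B model at ~4.5 bits/weight:
weights = model_size_gb(31, 4.5)
print(f"{weights:.1f} GB")  # ~17.4 GB for the weights alone
```

KV cache and compute buffers add several more GB on top of the weights, so a single 12 GB card has to offload layers to system RAM, which explains the slowdown; two 12 GB cards (~24 GB total) would fit the whole model on GPU.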