RAM prices are exploding, should I grab old stock now for RAG?
Posted by Working_Opposite4167@reddit | LocalLLaMA | 17 comments
I need some advice.
I have 32GB RAM in my PC right now, but since it’s my work machine I usually have around 10GB free. I’m also running an RTX 3090.
I want to build RAG setups for two AI projects. I found a 96GB DDR5 6000MHz kit that's still being sold for the old price (~1690), and the store told me RAM prices are about to spike because the market is going crazy.
The idea is that if I buy the 96GB, I’ll probably sell my current 32GB kit.
My dilemma:
- I can rely on the OpenAI API and avoid running big models locally.
- But I’m scared the API costs will pile up over time and end up costing more than just buying the RAM once.
- On the other hand, maybe I don’t even need so much RAM if I mostly stick to OpenAI.
So I’m torn:
Should I buy the 96GB now before prices jump?
Or skip it and just rely on the API, even though long-term costs worry me?
If anyone has experience running local models or using OpenAI heavily, your advice would help a lot. Thanks!
UnreasonableEconomy@reddit
NAND supply is artificially constrained. On top of that, we're in the equivalent of the bitcoin farming boom for GPUs, applied to RAM.
Personally, I'd rather buy another 3090 for $800 right now than 96GB for $520, and then wait for this whole thing to crash (might be 2-5 years out, might be before the end of the year, who knows lol)
Current_Finding_4066@reddit
If AI keeps sucking up enough supply, it will take time to resolve. I suspect demand is going to crash too.
UnreasonableEconomy@reddit
there is no AI demand.
Nepherpitu@reddit
HOW MUCH?! ~$1690? Do you mean USD? Not Nigerian dollars or German marks? In Russia these kits sell for ~$800. I'm pretty sure you can find it much cheaper than ~$1700.
Sufficient-Past-9722@reddit
Sounds like it's time to ask the bank for a loan, then buy a bunch of RAM and take a quick trip to Zurich.
Working_Opposite4167@reddit (OP)
No, sorry, it's $520 for 96GB.
a_beautiful_rhind@reddit
A few months ago I bought 32GB 2400 MT/s sticks for $22 each, and now it's $130 a stick. I don't think I could buy any at the moment even if I tried.
Cergorach@reddit
I'm seeing 96GB DDR5 6000MHz (2 module kits) starting at €377 around here (in stock). And that's with prices already doubled compared to early this year.
Before buying anything, look at your motherboard: is that memory supported? Are all the memory slots already full, and is that configuration supported?
You'll probably get better results with OpenAI than with the local models that fit in a 3090. You can also look at cheaper services like DeepSeek (if that does what you want). How much are you sending to the API, and how often? Do the math first!
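As a rough sketch of that math (every price and token count below is a placeholder assumption, not a quote):

```python
# Back-of-the-envelope comparison: one-time RAM purchase vs ongoing API spend.
# All numbers are placeholder assumptions - plug in your own prices and volumes.

ram_cost_once = 520.0          # assumed one-time cost of the 96GB kit, USD
resale_of_old_kit = 60.0       # assumed resale value of the current 32GB kit, USD

api_price_per_m_tokens = 2.50  # assumed blended $/1M tokens for the API model
tokens_per_month = 20_000_000  # assumed monthly token volume across both RAG projects

monthly_api_cost = api_price_per_m_tokens * tokens_per_month / 1_000_000
net_ram_cost = ram_cost_once - resale_of_old_kit

breakeven_months = net_ram_cost / monthly_api_cost
print(f"API spend: ~${monthly_api_cost:.2f}/month")
print(f"RAM pays for itself in ~{breakeven_months:.1f} months (ignoring electricity and your time)")
```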
Current_Finding_4066@reddit
If you do not need it, do not buy it. Prices are out of control at the moment and no one knows what next year will bring. But such shortages take time to resolve.
Aggressive-Bother470@reddit
You should be more scared that the proprietary API starts generating useless bollocks while the rest of the world keeps claiming how amazing it is.
pmttyji@reddit
Prices have already gone up since September.
Special_Cup_6533@reddit
Depends completely on your workload and how much RAM you plan to actually use. I have 192GB DDR5 and I'm often sitting at 120+ GB used when I have containers, DBs, local models, browser trash and everything else going. So big RAM absolutely can get used if you really try.
Even jumping to 64GB would feel a lot smoother for dev work, RAG services, Docker, and random background junk. But if you plan to offload larger models to system RAM, 64GB can quickly become limiting.
DeltaSqueezer@reddit
If you can get RAM for the old price, I'd grab it now. Where I am, prices have more than doubled in the last few months. SSD prices are shooting up too.
ttkciar@reddit
I would totally buy more RAM now (and did! Hopefully enough to tide me over through 2027).
hieuphamduy@reddit
My suggestion is to buy now, especially given that the RAM shortage is projected to last until 2028. 96GB is enough to run gpt-oss-120b at around 20 t/s, which is good enough for my use cases.
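If you go that route, here is a minimal sketch of partial offload with llama-cpp-python; the GGUF file name, layer split, and thread count are placeholders you would tune for your own hardware:

```python
# Minimal sketch: keep most weights in system RAM and offload only some layers
# to the 3090. Model path and n_gpu_layers are placeholders - raise n_gpu_layers
# until you run out of the 24GB of VRAM.
from llama_cpp import Llama

llm = Llama(
    model_path="gpt-oss-120b-Q4_K_M.gguf",  # hypothetical GGUF quant on disk
    n_gpu_layers=20,                         # assumed split that fits in 24GB VRAM
    n_ctx=8192,                              # context window for RAG chunks
    n_threads=12,                            # roughly match your physical CPU cores
)

out = llm("Summarize the retrieved context:\n...", max_tokens=256)
print(out["choices"][0]["text"])
```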
hieuphamduy@reddit
Also, try eBay. I'm certain you can get the same kit cheaper at auction. I recently upgraded mine to 128GB for around $500.
334578theo@reddit
IME RAG doesn't need a powerful model if your retrieval and prompting are good - you could likely use a smaller OSS model through OpenRouter for <$1/M tokens and get way better latency.
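For example, a minimal sketch of calling a cheap hosted model through OpenRouter's OpenAI-compatible endpoint; the model slug and the retrieval step are placeholders, not recommendations:

```python
# Point the standard OpenAI client at OpenRouter and stuff retrieved passages
# into the prompt. Swap the model slug for whatever cheap OSS model works for you.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

retrieved_chunks = "...top-k passages from your vector store..."  # placeholder retrieval
resp = client.chat.completions.create(
    model="qwen/qwen-2.5-72b-instruct",  # placeholder model slug
    messages=[
        {"role": "system", "content": "Answer only from the provided context."},
        {"role": "user", "content": f"Context:\n{retrieved_chunks}\n\nQuestion: ..."},
    ],
)
print(resp.choices[0].message.content)
```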