Seems like used 3090 price is up near $850/$900?
Posted by Synaps3@reddit | LocalLLaMA | View on Reddit | 118 comments
I'm looking for a bit of a sanity check here: used 3090s on eBay seem to be up from around $650-$700 two weeks ago to $850-$1000 depending on the model, after the disappointing 5090 announcement. Is this still a decent value proposition for an inference box? I'm about to pull the trigger on an H12SSL-i, but I'm on the fence about whether to wait for a potentially non-existent price drop on 3090s once 5090s are actually available and people try to flip their current cards. Short-term goal is a 70B Q4 inference server, plus NVLink for training non-language models. Any thoughts from secondhand GPU purchasing veterans?
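For context on the 70B Q4 goal, here's a rough back-of-the-envelope VRAM estimate (my numbers, not OP's; real usage varies with quant format, context length, and runtime):

```python
# Rough VRAM estimate for 70B Q4 inference.
# A 4-bit quant is ~0.5 bytes/param for the weights alone; KV cache and
# runtime overhead come on top, so treat this as a lower bound.

def q4_weight_gib(n_params: float, bytes_per_param: float = 0.5) -> float:
    """Approximate weight memory in GiB for a 4-bit quantized model."""
    return n_params * bytes_per_param / 2**30

weights = q4_weight_gib(70e9)   # ~32.6 GiB of weights alone
budget = 2 * 24                 # two 24GB 3090s

print(f"weights ~= {weights:.1f} GiB of {budget} GB total")
```

So a 70B Q4 model doesn't fit on one 24GB card, but two 3090s leave roughly 15GB headroom for KV cache and overhead, which is why the 2x 3090 build keeps coming up in this thread.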
TheImmigrantEngineer@reddit
I'd say wait for 4090 to reach that price range. 4090 is a significant improvement over 3090. 3090 is not really worth it in 2025.
koalfied-coder@reddit
For LLMs and inference the 4090 is only like 10% faster and uses more power tho.
TheImmigrantEngineer@reddit
4090 is a lot faster for training or fine tuning models.
koalfied-coder@reddit
Any idea on a percentage? I'm interested to see some figures. To be cost effective it would have to be significant when A series cards are also available.
FullOf_Bad_Ideas@reddit
About twice as fast. I was doing a lot of finetuning on 8x 3090 / 8x 4090 clusters recently. That's the speedup you get when your model fits comfortably within 24GB and you use DDP just to scale the run across the 8 cards. Batched inference on smaller models is also about 2x faster on the 4090.
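The reason DDP scales near-linearly when the model fits on one card: every GPU holds a full replica, processes its own shard of the batch, and the per-shard gradients are averaged. A toy pure-Python sketch of that averaging (a stand-in for what `torch.nn.parallel.DistributedDataParallel` does with an all-reduce, not the actual library):

```python
# Toy sketch: averaging gradients over equal-sized shards reproduces the
# full-batch gradient exactly, so N replicas give ~N times the throughput.

def grad(w, xs, ys):
    """d/dw of mean squared error for the model y ~= w*x."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w = 0.3
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

full = grad(w, xs, ys)  # full-batch gradient on one "GPU"

# "DDP": two equal shards, each computes a local gradient, then all-reduce (mean).
shards = [(xs[:2], ys[:2]), (xs[2:], ys[2:])]
local = [grad(w, sx, sy) for sx, sy in shards]
averaged = sum(local) / len(local)

assert abs(full - averaged) < 1e-9  # identical update, computed in parallel
```

The update is mathematically the same, so the 3090-vs-4090 comparison reduces to single-card throughput, which is the 2x claim above.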
koalfied-coder@reddit
Hmm, did some research and some quick testing between my 3090 and 4090, albeit with single cards. I got around a 40% increase, but at twice the cost. While significant, I don't see getting 4090s for training when proper workstation cards exist.
FullOf_Bad_Ideas@reddit
Doing some training now, and I moved from a local 3090 Ti to a cloud 4090 to finish the run quicker. ETA went from around 9 hours to around 4 hours 10 mins. I can't replicate your results of those small speed boosts. Right now it's a ViT rather than an LLM, but I've seen similar boosts with LLMs.
Synaps3@reddit (OP)
The trouble is NVLink support; 3090s are the last affordable option (I'm interested in this for non-LLM training).
polawiaczperel@reddit
3090 is still a beast for a lot of things, including AI purposes.
Orolol@reddit
For inference it's enough, but for training the 4090 is vastly superior to the 3090, not only in terms of raw compute but also because of all the transformer-specific optimizations it has compared to the 3090.
mayo551@reddit
I suppose if you're going to train, sure.
99.5% of people don't touch training.
Euphoric_Apricot_420@reddit
As soon as the AMD AI Max Pro 395+ ultra super duper APU releases and you can have 96GB of VRAM, I suspect the prices of the 24GB cards will go down fast.
Aggravating_Gap_7358@reddit
I bit the bullet and purchased 4 of the TI Blower versions for my Gigabyte 292 server.
From eBay. All 4 were sealed in the original package and hadn't been opened. I was shocked. These were from CHINA. Someone didn't get the chance to use them; they were perfectly clean and unused.
I'm still trying to learn and figure out what to do with the server.
Synaps3@reddit (OP)
Whoa, great find!! I'm super jealous!! Would you mind sharing the seller? But I suppose they won't have any more left at this point...
You'll have a lot of fun and lots to learn! Enjoy!
Aggravating_Gap_7358@reddit
They had 55 of them when I got mine. The seller was https://www.ebay.com/usr/yzhan-695. It looks like there are still 40 of them.
Synaps3@reddit (OP)
Thank you!!
cchung261@reddit
I feel like the prices will go up if the tariffs are enacted. They will impact 50x0 pricing as well.
ForsookComparison@reddit
This is a lot of Reddit fear mongering.
In reality I think the eBay supply has just dried up. The people grabbing them for $550-$650 were buying them off of "upgrade every gen" gamers. No prosumers were offloading theirs yet. The supply of cheap used 3090s simply ran out. Nobody is mass-buying 3090s because of tariffs.
BangkokPadang@reddit
I think the increase in 50x pricing prevented the prices for the older cards from dropping because it just made a new higher segment, and people that are buying 90 cards for LLMs aren’t buying them to be graphics cards. They’re buying them to be LLM machines, and there’s only more and more interest in that.
So instead of the 5090 being $1500, knocking the 4090 down to like $1000, and the 3090 down to $500…
We’ve got a $2000 card, a 4090 that will continue to sell for $1400ish, and it’s gonna drive up the price of a 3090 closer to $900/$1k, because there’s no value proposition for an Nvidia LLM machine with that much RAM below that.
dandanua@reddit
NVIDIA's greed will cause people to buy AMD cards; it's not normal that old-generation prices went up after a new series announcement.
BloodSoil1066@reddit
Biden crippling worldwide trade isn't helping. What an idiot move that was
Ragecommie@reddit
LOL, why are you getting downvoted?!
Different_Fix_2217@reddit
Tribalistic redditors who can't admit their precious Biden is anything but perfect
ippa99@reddit
the irony of this comment
random_guy00214@reddit
Join the trump train brother
Ragecommie@reddit
People have abandoned common sense entirely it seems.
Different_Fix_2217@reddit
https://www.youtube.com/watch?v=5Peima-Uw7w
mulletarian@reddit
What did he do, exactly?
Enough-Meringue4745@reddit
Immediately picked a fight with Russia on day 1 because he’s a dumb old man stuck in 1920
ReadyAndSalted@reddit
Is it the West that's picking a fight with Russia? Seems to me like they're the aggressors
Enough-Meringue4745@reddit
The two nations are naturally at odds, as Russia has a self sustaining economy which goes against the US's core function. Russia is a country that doesn't need the US's involvement in any issue whatsoever.
Think about it; The US's power comes from its position of economic control. Russia is completely immune to the US's economic influence. The US is also immune to Russia's economic influence. These are two alpha predators, and are naturally at odds.
Did and does Russia influence elections in the rest of the world? Yes.
Did and does the US influence elections in the rest of the world? Yes.
Did and does Russia invade countries for economic superiority? Yes.
Did and does the US invade countries for economic superiority? Yes.
They're the same damn thing.
Uhhhhh55@reddit
Was this written by an LLM? straight up hallucinations
AIPornCollector@reddit
Ah, the ol' CCP and Kremlin funded disinformation bots. Gotta love it.
Enough-Meringue4745@reddit
Yeah?
CapcomGo@reddit
lol you sure about that?
greentea05@reddit
Well it doesn’t come from voting in comically bad senile old men to run your country
hedonihilistic@reddit
If you stupid fucks don't like it here, why don't you move to Russia??
iamthewhatt@reddit
They are, conservatives like to pin everything on liberals because they have no policymaking abilities of their own.
sedition666@reddit
Russia literally invaded its neighbour. Reagan wouldn't have let the RINO Republicans become a bunch of bitches in the face of Russian aggression. You should all be ashamed.
Enough-Meringue4745@reddit
The USA bombed countless innocent children with drones in Afghanistan and Pakistan; give me an equivalent innocent death toll from Russian invasions.
mankomankomanko69@reddit
You clearly haven't been paying much attention to what's been going on in Ukraine
sedition666@reddit
Ukraine
RevolutionaryLime758@reddit
Things didn’t only start happening when you finally got around to paying attention. Classic ignoramus projecting their idiocy on the rest of the world.
BloodSoil1066@reddit
The Biden administration randomly limited AI chip exports to most countries except for a few US allies.
Not sure whether he's dementedly signing nonsense or he's burning it all down before he goes
mulletarian@reddit
Was it really random with no reason given?
BloodSoil1066@reddit
It's a vague attempt to split everyone into US or China camps, which is why India was dropped in there too: while otherwise favourable to the West, they keep sitting on the fence and undermining the Russia sanctions. It's not good geopolitics to punish other countries before they do anything.
greysourcecode@reddit
There are many reasons. The first is national security; I'm sure some here realize how AI and LLMs can be weaponized. The second is domestic advantage in AI infrastructure. The US is hoping to gain an edge in this area, both in trained models as commodities and in training as a service. A lot of the intellectual property and software is produced in the US, but almost all of it is manufactured overseas.
BloodSoil1066@reddit
Does it look like Qwen is falling behind here?
No, so it's pointless.
As China open-sources its models, other countries are going to look at US policies and start wondering if they too will get limited if they make too much progress.
greysourcecode@reddit
Of course! We should all just give up and let China control the future of AI! Why didn’t I think of this before! You’re really quite the genius aren’t you! We should all just give up and show our belly because the competition had a successful product.
BloodSoil1066@reddit
That's not the point, it's counter productive if it forces other countries to reassess their global relationships
I could decide to not share a cheese crumb with the cat because it's in the best interests of his diet and my liking for cheese, but then he might also decide to take a dump on the outside of the litter tray
False_Grit@reddit
I think it's a complicated question without clear answers. I don't know that I like the idea of our AI overlords being Chinese.
I don't know that the trade limitations are a good idea though, or if they are even effective.
I also don't know if I like the idea of our AI overlords being American.
If it were up to me, I think I would focus on increasing domestic chip fab production, so that TSMC isn't the focus of the world the way it is now.
Would I also limit exports of high-end GPUs...? Actually... I guess so? Or at least place contingencies. If China were willing to do a little less censorship and a little more democracy, I see no reason not to share. I don't think the Chinese people are any better or worse than anyone else, but I don't like dictators in general. Fuck those guys.
Effective or not, I personally believe this is one of those few times in history that could effectively change everything. It's definitely a dick move to limit GPU shipments - but it also has important implications in a world where essentially our entire financial and warfare systems rely on electronics.
moldyjellybean@reddit
Don’t worry, I’ve been through the GPU wars of 2012/13, 2016/17, 2021, etc.
The prices all crater, and at least with those I could merge-mine Doge/LTC in 2013/14, and mine ETH in 2017 and 2021, to get back some of the costs.
I’m not getting any of these costs back, except to learn and have fun. The prices on these will also fall fast.
Plus, I used to get GPUs dumped as e-waste for super cheap. That also slowly pushes prices down for everything.
enkafan@reddit
I just bought a new car and told the guy we were holding onto the old one. The sales guy kinda sheepishly said, "I mean, I voted for the guy, but, umm, with what they are saying... well, I'd buy new now and sell used later too."
Secure_Reflection409@reddit
Just wait until speculative decoding gains are fully realised this year and they'll all become worthless overnight, I suspect.
Xandrmoro@reddit
How are these things connected? If anything, speculative decoding makes you want even more VRAM to trade for inference speed.
Secure_Reflection409@reddit
I would assume nobody wants to pay for even a single extra watt of power more than they have to?
Why run two cards when one will do?
This is assuming UMbreLLa is the first taste of what's to come.
Who knows, though.
Xandrmoro@reddit
But one will not do? Speculative decoding or not, you still have to load the weights somewhere. And it's not like you can magically use lower quants or something.
Secure_Reflection409@reddit
You load them in RAM, and 1.5 tokens/sec becomes 12 when the draft is sitting on an Ada card, allegedly.
Xandrmoro@reddit
That means the draft is as capable as the non-draft model, with virtually no rejections: all 8 draft tokens end up being used. It's either an extreme edge case with temperature 0 (which by itself rules out a lot of use cases), or the draft model is as smart as the non-draft one, which makes the big model unnecessary to begin with.
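A rough way to quantify this skepticism, assuming (as a simplification) that each draft token is accepted independently with probability p: the expected number of tokens committed per big-model forward pass bounds the speedup.

```python
def expected_tokens_per_pass(p: float, k: int) -> float:
    """Expected tokens committed per big-model pass with a k-token draft,
    if each draft token is accepted independently with probability p
    (simplified model; the target model always contributes one token)."""
    return 1 + sum(p**i for i in range(1, k + 1))

for p in (0.5, 0.8, 0.95):
    print(p, round(expected_tokens_per_pass(p, 8), 2))
```

Even at 80% per-token acceptance the expected yield is only ~4.3 tokens per pass; getting close to 8x (the 1.5 -> 12 tokens/sec claim) really does require nearly every draft token to be accepted.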
Lower-Possibility-93@reddit
LLMs are definitely propping up the graphics card market for the foreseeable future. High end older generations are more valuable right now even though they are older.
trackpap@reddit
eBay isn't a good place to gauge real prices for hardware, as the sellers there are actual vendor stores.
Places like FB Marketplace will be your better bet for acquiring these cards.
Now, I expect the typical gamer (who doesn't care about mining or AI) is going to upgrade, and currently all you need is 16GB of VRAM tops, and 3090s came out 5 years ago. I'm baffled that some sellers on eBay and Amazon are still trying to sell the same card brand new at $1400-2k; hope no one buys from them.
If you have patience and check often, I've found that in the States you can find plenty of good cards, and a variety of cards, in very good condition at very good prices ($400-650). And if I may, Southern areas like Florida, Atlanta, and the like have better average deals (I bought 2 from these areas and had people bring them over).
I've had plenty of luck getting cards, but I attribute it to luck. If you have friends who can properly explain how to measure the life of a card, lean on them, and please don't get scammed: $500 or $800 in your pocket is better than $0 and a sour feeling. So if something feels off, pull the plug. Be shameless; sellers get many offers, don't worry.
Synaps3@reddit (OP)
Thanks for the helpful advice. I've had some luck on eBay with smaller purchases, but the price fluctuations on GPUs make all of this much more intimidating. I'm thinking about getting a few 3060 12GBs, since these are still more reasonably priced and the stakes of each purchase are lower.
FB marketplace is a good idea. I was leaning towards ebay because of the return policy if the card is busted and I can stress test it on my own time, but I should look into this. I've read about people bringing eGPU enclosures and testing the GPU at the meetup, but I don't have all this equipment...
Do you have any FB marketplace GPU hunting tips?
hicamist@reddit
I didn't do research on anything below 24GB of VRAM, except that the A2000-A4500 cards are good on eBay.
I've chatted with others who've run into sellers who don't like testing the cards in front of you, but my general procedure is: be polite (if he's a prick, leave it there), ask for more detailed pictures of the card, then ask for a video of your name plus FurMark (this is why I think I've been lucky; people say it's not a great stress tester). Then if it checks out, I go out, smell the card, check for corrosion or anything looking blotched, swing it 1-2 times to listen for any rattle, and that's good enough for me. Hope I don't get crazy criticism (again, I've been lucky; I'm on my way to buy an XC3 3090 right now as well).
Commercial-Chest-992@reddit
Yes, this semi-mythical $600 3090 I see referenced all the time feels more and more like Bigfoot: often reported, seldom verified.
Xandrmoro@reddit
I bought two for $1150 total a couple months ago (out of a mining rig, but in good condition). Now I'm beating myself up for not buying all four they were selling; it's up to $800+ now -_-
L3Niflheim@reddit
Just personal experience but seems like the secondhand 3090 supply dried up a little over the last few months which has probably pushed the prices up. Guessing people are holding out for 5090s or cheaper 4090s.
MachineZer0@reddit
Reminds me of when P40 data center decommissioned stock dried up. I personally sold my P40 since the spread was narrowing in price. The proceeds were used to buy 3090s. I don’t think I was alone.
AppearanceHeavy6724@reddit
3090s will go back to $650 once the market is flooded with 50xx cards, as the 40xx will be the new 30xx.
iamthewhatt@reddit
I honestly doubt it for the 24G+ cards. 80 series and lower perhaps, but that extra VRAM is incredibly important for AI, and demand will stay strong.
AmericanNewt8@reddit
That's assuming that 24GB Battlemage doesn't materialize.
iamthewhatt@reddit
Battlemage won't change anything because Intel won't be able to use that 24GB for AI like Nvidia can.
SexyAlienHotTubWater@reddit
They won't be able to beat the performance of a 3090? I don't think that's true.
iamthewhatt@reddit
No way Intel can compete with a 3090 Tensor core, especially not with the CUDA ecosystem.
SexyAlienHotTubWater@reddit
Why not? They have XMX AI units, which do the same job as a tensor core, and the B580 already has half the bandwidth of a 3090 and nearly comparable f16 FLOPs.
iamthewhatt@reddit
It isn't about hardware, it's about software.
SexyAlienHotTubWater@reddit
Ok but I want to point out that your position seems to have changed from "they can't match the hardware" to, "they may be able to match the hardware, but they can't match the software."
Why can't they match the software? They already did it for their CPUs with MKL.
iamthewhatt@reddit
I am not changing that position; their hardware is inferior as well, it's just made much worse by the software. I would love to see a chart where the XMX units outperform a 3000-series Tensor core.
SexyAlienHotTubWater@reddit
Come on man, I feel like you're deliberately ignoring what I said.
iamthewhatt@reddit
What part in particular? Intel's XMX core cannot compete with a 3090 Tensor core in any CUDA program because they can't use CUDA, and it's worse in Blender OptiX as well. I am not sure in what way the XMX unit can compete anywhere with a 3090 currently. The hardware is intriguing, sure, but so is AMD's, and they can't compete either.
SexyAlienHotTubWater@reddit
> currently
We aren't arguing about whether their hardware is inferior or superior currently. It's obviously inferior, the B580 has less FLOPs than the 3090, even FP16, and likely on Tensor operations. We're arguing about whether they're capable of building equivalent hardware to a 4-year-old Nvidia card, which I think they probably are.
> The hardware is intriguing sure, but so is AMD's, and they can't compete either.
The 7900 XTX has better performance than a 3090 for roughly the same price. Tinygrad is getting better performance from the 7900 XTX than a 4090 on 4 GPU LLaMA:
https://x.com/__tinygrad__/status/1879034330284192050
AppearanceHeavy6724@reddit
Probably. Time will tell.
koalfied-coder@reddit
Us tariffs have entered the chat
Zeddi2892@reddit
But be very careful buying GPUs via private sellers. The market is absolutely filled up with scammers.
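One concrete way to vet a private seller (a hypothetical sketch; the check function and sample output here are mine, not from the thread): ask them to run `nvidia-smi --query-gpu=name,memory.total --format=csv` and compare what they send back against the card you expect.

```python
# Hypothetical sanity check for a used-GPU listing: parse the CSV output of
#   nvidia-smi --query-gpu=name,memory.total --format=csv
# and verify the reported card matches a 3090 (24576 MiB of VRAM).

def looks_like_3090(smi_csv: str) -> bool:
    lines = [line.strip() for line in smi_csv.strip().splitlines()]
    # skip the CSV header row, then check every reported GPU
    for row in lines[1:]:
        name, mem = (field.strip() for field in row.split(","))
        if "3090" not in name or not mem.startswith("24576"):
            return False
    return bool(lines[1:])

sample = """name, memory.total [MiB]
NVIDIA GeForce RTX 3090, 24576 MiB"""
print(looks_like_3090(sample))  # True for a genuine-looking report
```

It's not scam-proof (output can be faked), but it filters the lazy relabeled-card listings, and pairs well with the photo-plus-username request below.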
Synaps3@reddit (OP)
Not sure why the downvotes, this seems like pretty sane and helpful advice. Thank you!
Is it too much to ask for a photo verifying
nvidia-smi
output with the card plugged in next to the screen, with the username written down? Feels like I'm overly paranoid haha
a_beautiful_rhind@reddit
Heh, yea. I saw them at $400-500 back in like January of 2023. I bought a stupid Pascal instead because of driver support.
I bought a few since then, each time for just a little more, coming up to $700 post-tax. Surely it will get cheaper, I thought.
Dos-Commas@reddit
You sure those are real and not scammers? You can find plenty of $500 4090s from scammers.
PDXSonic@reddit
I picked up an EVGA 3090 in December ‘22 for $800, and there were Dell/Alienware OEM ones for a bit cheaper (700ish), but if they were $400-500 I would have gladly picked up two.
a_beautiful_rhind@reddit
Yea, that was before and they were completed listings.
NEEDMOREVRAM@reddit
Great time to sell some 3090s!
a_beautiful_rhind@reddit
Unfortunately I'm in the buy-more-3090s camp. At this point, though, I want something Ada for what's coming down the pipe.
AmericanNewt8@reddit
FP8 finally seems to be becoming big.
BloodSoil1066@reddit
I was telling people to buy 3090s before the launch, because yes, the 50** cards are going to be great as gaming GPUs, but they were going to be terrible value for AI training because the RAM is so low.
The prices will come down because people will find a way to rationalise buying a 5090 anyway, but it's going to take time to get their prosthetic leg and arm fitted properly.
In comparison, my local prices are much the same, but then I doubt if anyone local to me is buying them for AI
Wonderful_Gap1374@reddit
I mean, I was going to buy a 4080 Super and call it a day. But the 5080 is $999 and the 4080 Supers are going for $1200+ US right now.
I feel like it's cheaper to get the 5080 than the current-gen card. Hell, even the 5080 is cheaper than the 3090, which is going for just a few bucks less than, or the same as, the 5080.
segmond@reddit
The 5090 is not a terrible value, but a different kind of value. I'm going to try to get a 5090. Why would I do so instead of 3x 3090s? Performance. I figure it's going to run 4x as fast as a 3090. If you're doing just chat, basic prompting, and a few image gens here and there, you can live with 3090s quite fine. If you're running agents, then it's endless inference. That means things that take me 1 hour can drop to 15 minutes, or things that take 20 seconds drop to 5 seconds. I value my time, so I plan on having a combo of 5090s and 3090s.
BuildAQuad@reddit
But if you can get a 5090 for 2k vs 2x 3090s for the same price, I'd argue it could be a better choice to go with the 5090.
sedition666@reddit
You can likely get 3x 3090s for close to 5090 pricing
BuildAQuad@reddit
Yea, at that point it's a simple choice lol.
BloodSoil1066@reddit
3090s are £450 near me, so that would be 48GB vs 32GB for ~half the cost?
(Plus a £200 Z690 motherboard.) Also, people might like to upgrade in stages.
I'd agree that most developers/companies could make a business case for a 5090 investment, but there are going to be a whole bunch of students debating whether to rent compute, get a 3090 or possibly have their Uni organise something
Synaps3@reddit (OP)
I missed the boat
KY_electrophoresis@reddit
Interesting, because in the UK 3090 prices have come down from their pre-xmas peak. Got one delivered today!
Kenavru@reddit
Just bought one for the equivalent of $450. A lot available for ~$500 in my country.
prudant@reddit
here 600-659
RebornZA@reddit
It's an eBay issue:
Olx...
Synaps3@reddit (OP)
Is OLX in the US?
RebornZA@reddit
No, however, there has to be more than just eBay stateside. Here there are multiple "OLX-like" sites, Facebook Marketplace, and eBay itself. eBay is often the most trafficked, and as such prices are often higher.
samorollo@reddit
TIL that OLX isn't present only in Poland :o I don't know why I assumed that, but if someone had asked me, I would have bet a lot on it
AppearanceHeavy6724@reddit
OLX is everywhere in the developing world.
samorollo@reddit
Yeah, now I checked, and one Polish site was bought and incorporated by OLX; that's probably what got me confused.
floydfan@reddit
I’ve been watching GPU prices for a month or so and they are all trending upward. I thought it was because of holiday markdowns, but it seems more like a slow and steady increase.
When I started watching, I was seeing used 3090s in the $500-600 range, for example. It's rare now to find one for below $800.
Synaps3@reddit (OP)
I think that's not too bad an idea, but it seems like used prices are between $700-$800 for the 4070 Ti Super; might as well get a 3090 at that point, eh? If you know where to get them cheaper (or new for less than $900), please share!
floydfan@reddit
No, they’re more expensive. I had one in my Amazon cart for $839 last week but by the time I went to check out it got sold out from under me. I got mine for just over $1,000.
NEEDMOREVRAM@reddit
Why H12SSL-i? And why not ROMED8-2T, which comes with 7 PCIe x16 slots?
Synaps3@reddit (OP)
I don't think that I will get 7 cards :) Five already seems like a lot of room to grow.
NEEDMOREVRAM@reddit
Never say never. Imagine how frustrated you'll be when you do fill up that 5th slot and realize you'd like to have two more!
Synaps3@reddit (OP)
You might be right ;)
jacek2023@reddit
There is nothing disappointing about the 5090. People who want to spend money will buy it; people who want a GPU for AI never planned to buy it.
Glittering_Mouse_883@reddit
Just picked one up on eBay and counted myself lucky, as the prices do seem to be trending up. Came in under $800 after shipping and tax, but I get what you're saying. All the other listings seem to be in the range you gave. Facebook Marketplace seems to be the place where you can get one for <$600, but I don't have an account.
segmond@reddit
More and more people are getting interested in AI. Never-ending articles about how AI is going to change the world and take the jobs. The demand exceeds the supply. I'm not sure the prices are going to drop.