I've tested Olares One (5090 with 24GB VRAM)
Posted by caiodelgado@reddit | LocalLLaMA | 8 comments
I made a video review (in PT-BR/EN) of this machine, running Ollama on it.
If there are any questions or tests I should run, I'm happy to hear them.
No-Refrigerator-1672@reddit
Dead on arrival. 14 tok/s for Gemma 4 31b at $4k? Granted, it's Ollama, so real performance will be better, but you can still get this speed for a quarter of the price.
caiodelgado@reddit (OP)
I want to run tests with vLLM and llama.cpp too, but overall, for a pre-built system with all the OS benefits, I found it great.
No-Refrigerator-1672@reddit
It's terrible value. I just looked on Newegg - I can get a laptop with exactly the same CPU and GPU for $500 less, and it'll have a screen, a keyboard, and the ability to run on battery. And the OS benefits are questionable at best: it's an obscure Linux fork, which means unknown update availability and unknown compatibility with anything more advanced than launching an OpenClaw. You'd be better off installing Ubuntu on the laptop I mentioned.
caiodelgado@reddit (OP)
You're free to use whatever Linux you want, but the device is $3,500 at the link in the video. And btw, it has 96GB of DDR5 versus the 32GB of the laptop you mentioned.
Yes, I agree that the laptop has a screen and battery, but those are different use cases :)
If you're using a laptop (in that price range) as a server, something seems off.
No-Refrigerator-1672@reddit
I'm basing my $4k number on the official store. Your link is likely promotional; I'm comparing apples to apples - official price to official price, no promo discounts.
96GB is a benefit, but for what? CPU offloading kills performance immensely; you can't run a 100B model on this thing. You won't use it as a server either - a mobile CPU comes with significant compute restrictions versus a desktop one, so you can't serve a small org off this, yet for a single person 96GB is overkill. I struggle to see why I would need that much RAM paired with that little compute.
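The offloading point can be sanity-checked with rough arithmetic. Below is a sketch; the 4-bit quantization and ~15% runtime overhead are assumptions for illustration, not measurements of this machine:

```python
def weight_gb(params_b: float, bits: float, overhead: float = 1.15) -> float:
    """Rough model footprint: params * bits/8 bytes, plus ~15% assumed
    for KV cache and runtime buffers (a loose rule of thumb)."""
    return params_b * bits / 8 * overhead

VRAM_GB = 24  # the GPU in this box

for params in (27, 70, 100):
    need = weight_gb(params, 4)  # assume 4-bit quantization
    verdict = "fits in VRAM" if need <= VRAM_GB else "spills to system RAM"
    print(f"{params}B @ 4-bit ~ {need:.0f} GB -> {verdict}")
```

By this estimate, a 100B model at 4-bit needs well over 24GB, so a large chunk of it would run from system RAM through CPU offload, which is where the throughput collapses.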
caiodelgado@reddit (OP)
Fair point,
But if you want to run it as a server, you have an extra 54GB of free RAM to use.
It's not bad compute, tbh - it's an Ultra 9 with 24 cores. It's pretty capable for a home server; installing Proxmox and passing through the GPU is chef's kiss.
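For anyone curious, GPU passthrough on Proxmox boils down to enabling the IOMMU, binding the card to vfio-pci, and attaching it to the VM. A rough sketch below; the PCI address `0000:01:00.0`, VM ID `100`, and the `10de:xxxx` device ID are placeholders you'd look up on your own box (e.g. with `lspci -nn`):

```shell
# 1. Enable the IOMMU (Intel CPU in this machine):
#    in /etc/default/grub, add to GRUB_CMDLINE_LINUX_DEFAULT:
#      intel_iommu=on iommu=pt
update-grub && reboot

# 2. Bind the GPU to vfio-pci instead of the host driver
#    (substitute your card's vendor:device ID for 10de:xxxx)
echo "options vfio-pci ids=10de:xxxx" > /etc/modprobe.d/vfio.conf
update-initramfs -u

# 3. Attach the GPU to the VM as a PCIe device
qm set 100 --hostpci0 0000:01:00.0,pcie=1
```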
No-Refrigerator-1672@reddit
It's bad compute for the price, and it's bad compute for the RAM. My personal server hosts around 20 different services, including two instances of vLLM and one of ComfyUI, and I have 28GB allocated on average. For a server, you'd only ever need 96GB if you want to serve an org of ~50 people, or launch desktop VMs for thin clients; but neither of those is possible with a mobile CPU that will throttle if you look at it wrong.
caiodelgado@reddit (OP)
I don't think it will throttle for my homelab usage, but I also don't think I'd be able to convince you, since you've already made up your mind.
But IMHO it's a good device, and for sure better than the laptop you shared at the same price :)