3090s are well over $800 now, is the Arc Pro B50 a good alternative?
Posted by ea_nasir_official_@reddit | LocalLLaMA | View on Reddit | 12 comments
Is the Arc B60/65 a suitable alternative? It doesn't seem half bad for the prices I'm seeing on them. I really want to build an AI machine to save my laptop's battery life. I mostly run Qwen3.5 35B and Gemma 4 26B.
Skyline34rGt@reddit
Intel GPUs are problematic.
Qwen3.5 35B and Gemma 4 26B will run fine on much cheaper setups.
You can run them on a used RTX 3060 12GB (~$200) with the MoE layers offloaded to RAM, and get around 40 tok/s for Qwen and 35 tok/s for Gemma (both Q4_K_M).
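A minimal sketch of that kind of setup with llama.cpp's server, assuming a CUDA build and a local Q4_K_M GGUF file (the model path and context size here are illustrative, not from the comment):

```shell
# Push all layers to the 3060, then override the MoE expert tensors back to
# CPU/system RAM via llama.cpp's --override-tensor (-ot) regex syntax, so the
# dense layers and KV cache stay in the 12GB of VRAM.
./llama-server \
  -m ./models/qwen3.5-35b-Q4_K_M.gguf \
  --n-gpu-layers 99 \
  -ot ".ffn_.*_exps.=CPU" \
  --ctx-size 8192
```

The expert tensors are only sparsely activated per token, which is why offloading them to RAM costs far less throughput than offloading whole layers would.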
Positive-Stock6444@reddit
3060 is the new 3090 :(
hurdurdur7@reddit
I would look at dual Radeon RX 9060 XTs. And I'm fairly sure their software support beats Intel's.
go-llm-proxy@reddit
Dang, I have a box full of them, maybe time to sell them?
I tried an Arc and nothing worked, but it's been a little while. I think a better option is probably a Mac Studio right now, but I'm hoping that Arc support comes along.
equatorbit@reddit
A box, you say?
go-llm-proxy@reddit
Not a huge box, but yeah. I had 6 3090s and 4 3060s racked, but came into enough A6000s and RTX 6000 Pros that I needed the power and rack space, and pulled the Ampere cards last year. They've been run hard: mining back in the day, then ML and LLM work for a couple more years. So I wasn't really planning on selling them, but if they're really going for $800....
Impossible_Style_136@reddit
For running Qwen3.5 35B and Gemma 4 26B, you are going to be severely limited by the VRAM capacity and memory bandwidth of the Arc B-series.
Intel's SYCL backend for llama.cpp has improved significantly, but you will still be dealing with frequent driver regressions and unoptimized kernels compared to the CUDA ecosystem. If you are doing this to save power, your next best action is buying a used RTX 3090 and aggressively power-limiting it via `nvidia-smi -pl 250`. You retain the 24GB of VRAM and the software compatibility, but slash the power consumption.
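A sketch of the power-limiting workflow described above, assuming a single NVIDIA GPU at index 0 (the query step and persistence-mode step are additions for safety, not from the comment):

```shell
# Check the board's supported power-limit range before changing anything
nvidia-smi -i 0 -q -d POWER

# Cap the card at 250 W (needs root; a 3090's default is around 350 W)
sudo nvidia-smi -i 0 -pl 250

# Optionally enable persistence mode so the setting isn't lost when the
# driver unloads; the limit still resets on reboot unless reapplied
sudo nvidia-smi -i 0 -pm 1
```

Power draw falls roughly linearly with the cap while inference throughput degrades much more slowly, since LLM decoding is memory-bandwidth-bound rather than compute-bound.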
Dead_Internet_Theory@reddit
I think if power consumption is the biggest concern, Apple does start to make the most sense.
desexmachina@reddit
Good luck w/ Arc drivers. Let me know when you can run inference.
ea_nasir_official_@reddit (OP)
If I get one I'll update you.
!RemindMe 1 month
RemindMeBot@reddit
I will be messaging you in 1 month on 2026-05-04 23:39:30 UTC to remind you of this link
journalofassociation@reddit
Where are you finding 3090s for $800?