How much will you pay for a PCIe Nvidia B100, B150?
Posted by Ok_Warning2146@reddit | LocalLLaMA | 9 comments
In the beginning, Nvidia planned to release a 96GB B100 PCIe card that was later scrapped (probably because the B200 was selling too well?). Now, with more competition from Amazon and Google, as well as the fact that Meta is developing its own chip, there might be a chance that Nvidia will revive it in the future.
B100 vs RTX 6000 Blackwell:
- HBM3e 4TB/s vs GDDR7 1792GB/s
- 227KB shared memory vs 99KB => lower latency at higher warp counts https://arxiv.org/html/2507.10789v1
- NVLink support is highly likely, as the H100 PCIe also supports NVLink. https://www.nvidia.com/content/dam/en-zz/Solutions/gtcs22/data-center/h100/PB-11133-001_v01.pdf
- Hardware support for the tcgen05 instruction => 18-23% faster for all matrix multiplications https://arxiv.org/html/2507.10789v1
- Decompression Engine - Can save you disk space and GPU wait time for checkpointing during training https://developer.nvidia.com/blog/cut-checkpoint-costs-with-about-30-lines-of-python-and-nvidia-nvcomp/
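The bandwidth gap in the list above is the number that matters most for single-stream LLM decoding, which is typically memory-bandwidth bound. A rough back-of-envelope sketch (the 70GB model size is a hypothetical workload, and real throughput will be lower due to KV cache traffic and kernel overheads):

```python
# Back-of-envelope decode throughput for a memory-bandwidth-bound LLM.
# Assumption: generating one token streams all model weights once, so
# tokens/s <= bandwidth / model size. Treat these as upper bounds only.

def max_tokens_per_s(bandwidth_gb_s: float, model_gb: float) -> float:
    """Theoretical ceiling on single-stream decode speed."""
    return bandwidth_gb_s / model_gb

MODEL_GB = 70  # hypothetical: a ~70B-parameter model at 8-bit weights

b100_hbm3e = max_tokens_per_s(4000, MODEL_GB)     # HBM3e: 4 TB/s
rtx6000_gddr7 = max_tokens_per_s(1792, MODEL_GB)  # GDDR7: 1792 GB/s

print(f"B100 ceiling:     {b100_hbm3e:.1f} tok/s")
print(f"RTX 6000 ceiling: {rtx6000_gddr7:.1f} tok/s")
print(f"Bandwidth ratio:  {4000 / 1792:.2f}x")
```

By this crude measure the hypothetical B100 would cap out at roughly 2.2x the RTX 6000 Blackwell's decode speed on the same model.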
B150 is a hypothetical card: a PCIe single-die version of the B300 with 50% higher tensor core FP4 throughput at the expense of FP64 performance.
The RTX 6000 Blackwell had a launch MSRP of $8565. How much would you pay for a PCIe B100 or B150 if they were launched?
tech_cruncher@reddit
where are you getting $8565 for RTX 6000 Blackwell? that looks like a great price. My suppliers are quoting over $10K citing price increases. B150 will be cool if it's one third of B300 price or lower.
Ok_Warning2146@reddit (OP)
The $8565 launch MSRP was suggested by Gemini AI. I think it is reasonable, as I saw people getting this price in the beginning, and also people getting it for $7k with an edu discount. Of course, the actual price can vary over time and by geographical location.
tech_cruncher@reddit
got it. yes, in the early days I recall this being in the $8K range from a variety of suppliers. I don't use Amazon, eBay, etc. for GPUs, and some official NVIDIA PNY partners had them in the $10-11K range, but I'll check current prices again. thanks
abnormal_human@reddit
H200 NVL is under $30k now. B100 would be less VRAM but more performance. Value tends to follow VRAM. Maybe $15-20k. But realistically no one wants it. If 96GB is good enough, you're running the RTX 6000. The niche of "a faster RTX 6000" is small for LLMs. Really it would be most applicable to media production, but that's a small fraction of overall GPU usage.
Ok_Warning2146@reddit (OP)
Thanks for the tip about the H200 NVL PCIe card. It seems more reasonable than buying two RTX 6000s if you don't need FP4 support (though the RTX 6000 is not fully optimized for FP4 in software for now).
verdooft@reddit
I don't need this, low power consumption is more important for me. I use llama.cpp, acestep.cpp, whisper.cpp, ComfyUI on a laptop without a real GPU.
Ok_Mammoth589@reddit
I think it's clear they're not going to sell consumers 1, 2, or 4. At least for two more generations.
SPoKK1@reddit
I’m not wealthy (about $40k per year), but I’m passionate about Nvidia’s innovations. I would gladly pay roughly $100 per GB, including taxes, for an Nvidia card with CUDA. Although HBM3e is far more expensive than GDDR7, my modest income makes it hard to justify a higher-priced card without tax incentives. Because the card will sit close to me, it needs to be as quiet as possible and avoid thermal problems. Low-VRAM AI performance is quite miserable, so I would love a card with 128 GB or more of memory.
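The "$100 per GB" figure above implies concrete price points for the capacities discussed in this thread; a quick arithmetic check (all figures are illustrative, not quoted prices):

```python
# What a flat $100/GB willingness-to-pay implies for the card sizes
# mentioned in this thread. Hypothetical illustration, not real pricing.

PRICE_PER_GB = 100  # USD, including taxes

for vram_gb in (96, 128):
    total = vram_gb * PRICE_PER_GB
    print(f"{vram_gb} GB -> ${total:,}")
```

So a 96GB B100 would have to land around $9,600, and the wished-for 128GB card around $12,800, to meet that budget.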
fragment_me@reddit
Something something 3090