Intel Arc B70 32GB GDDR6 announced at a price of $949
Posted by New_Mix_2215@reddit | hardware | 125 comments
(yah i don't believe this price will be obtainable for anyone either)
XxBrando6xX@reddit
Finally a perfect card for my plex server!
twice_paramount832@reddit
Even an iGPU is perfect for Plex.
XxBrando6xX@reddit
lol yeah I was kidding. It actually is a really awesome product if you’re a person messing with Local AI though. That much VRAM is a good deal for that price
InadequateUsername@reddit
Can I transcribe 8k on this?
nuclearbananana@reddit
For AI they seem to only have fp8/fp16. No int8, int4 or nvfp4?
AdamConwayIE@reddit
Xe2 seems to support INT4, and INT8 is listed on Intel's product spec page. NVFP4 is Nvidia Blackwell.
Xurbax@reddit
Err, I am pretty sure the performance claims they give are all for int8...
max123246@reddit
Idk what's the market for this card then
nuclearbananana@reddit
I'm sure they have it, there's just no benchmarks
AnyImpression6@reddit
It's an enterprise card.
cdoublejj@reddit
wonder if it supports vGPU for GPU sharing among virtual machines?
xole@reddit
I read that intel is/was adding that on consumer cards. Is there a way for it to be done on consumer amd or nvidia cards, even if on linux? If not, that might sway me to pick up an intel card at some point.
nosurprisespls@reddit
It was possible on consumer nvidia cards. People were doing it during the pandemic with GPU shortage (you can find some youtube tutorials). Nvidia found out and disabled it with a driver update.
cdoublejj@reddit
nvidia will come to your house with a brute squad and beat you till you piss yourself for trying to use extra features on a consumer card
SoilMassive6850@reddit
I believe Turing cards could have their vGPU capabilities unlocked with a firmware hack but in general for AMD and NVIDIA they are limited to very select higher end models and I know at least NVIDIA has some license costs as well.
For Intel you get vGPU support starting at around 400€ with the Pro B50 (2 vfs), I doubt they will add it for non pro cards though. I personally picked one up to run a few desktop VMs with just a 70W power budget.
SoilMassive6850@reddit
B50 does 2 vfs, B60 does 7, not sure how many B70 does yet.
cdoublejj@reddit
why such a drop from 12 to 2? more horsepower per VM vs 12 VMs with lower horsepower?
pdp10@reddit
Intel nerfed it, then?
SoilMassive6850@reddit
No official confirmation on whether it was purely product segmentation or some firmware issues which led to the change, but product segmentation and the initial 12 limit being an accident seems like the more realistic answer. Wendell from Level1Techs has tried to inquire about it but has received no answer according to his forum posts.
Not_Your_cousin113@reddit
It's been radio silence on Intel's community forums as well, I'm gonna assume this is a "management wants to milk more money" situation, some arms gotta be twisted to keep Intel honest. https://community.intel.com/t5/Graphics/Why-did-the-latest-Intel-Arc-ProB50-firmware-nerf-SR-IOV-VFs/td-p/1732703/page/2
jenny_905@reddit
Absolutely it does, SR-IOV is standard on these cards.
red286@reddit
https://github.com/ccpk1/Homelab-Public-Documentation/blob/main/Proxmox%20Virtual%20GPU%20Setup%20with%20Intel%20Arc%20Pro%20B50.md
Appears someone managed to do so with the B50, which should use the same drivers as the B70, so in theory, it should be possible.
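For anyone wondering what that setup roughly involves: once a driver with SR-IOV support is loaded, VFs get enabled through the standard Linux sysfs interface. A minimal sketch (the PCI address is a placeholder, and the exact kernel/driver steps are per the linked guide):

```shell
# Placeholder PCI address -- find the card's real one with:
#   lspci | grep -i -e vga -e display
GPU=0000:03:00.0

# How many VFs the firmware allows (2 on the B50, 7 on the B60
# according to the comments above)
cat /sys/bus/pci/devices/$GPU/sriov_totalvfs

# Enable two virtual functions; they show up as extra PCI devices
# that Proxmox can pass through to individual VMs
echo 2 > /sys/bus/pci/devices/$GPU/sriov_numvfs
lspci | grep -i display
```

The linked guide has more to it (kernel and driver versions, Proxmox VM config), so treat this as the general shape rather than a recipe.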
CassadagaValley@reddit
Kinda seems like the next batch of gaming cards will be Celestial and this is the last use of Battlemage.
Terrh@reddit
I think Celestial is dead.
This was supposed to be the B770 but it's only launching as the Pro B70.
It should have solid gaming performance anyways but the gaming variant is either totally dead or at least massively delayed, and Celestial is MIA for a long time now.
Vb_33@reddit
Everything points to celestial being very much alive.
TheDonnARK@reddit
It's just been the thing to say.
Before the B580 released, they said battlemage is dead. It launched and they said big battlemage in any form was dead and Intel was moving on to celestial. Now that there is a big battlemage launching, they will say they meant a gaming variant the whole time, that's what they meant, and also celestial is now dead.
nanonan@reddit
In igpu form, sure. As a consumer or even a pro card not so sure.
996forever@reddit
I think they specifically refer to the gaming variants
hamatehllama@reddit
Battlemage is only one year old. Celestial will arrive next year together with the competitors from AMD and Nvidia.
Plank_With_A_Nail_In@reddit
It's one year old by release date but only competes with 3-year-old cards from AMD and Nvidia. This card is 4 years too late.
IIlIIlIIlIlIIlIIlIIl@reddit
That's fine and expected; Intel is still new to the game.
m31317015@reddit
Yeah, nowadays people pick Intel because of budget. They kinda learnt their lesson from being the underdog and almost going out of business. When the value of the product is okay and the drivers are okay, they have a bit more of a chance of surviving what is, for them right now, a snowstorm.
AverageDonkey247@reddit
Everybody seems to love to hate on Intel, including their efforts towards making competitive GPUs. But honestly, their budget cards, when you can get them at the actual MSRP in the $200-300 range, are pretty good value even now. Also, as the other guy mentioned, in a few generations, if they don't abandon it because of the Nvidia deal, they should be right there with the other two all the way up and down the product line if they choose to make a top-end SKU. On another note, everyone should be cheering for Intel to at least continue on with their graphics division, if for no other reason than the fact that more competition in the market is ALWAYS a good thing for the consumer. More options, cheaper prices; it's just basic economics, really.
Erikthered00@reddit
It was never supposed to be the B770, all B type cards are Battlemage. A type cards were Alchemist, and when they launch, Celestial cards will be C type cards
comelickmyarmpits@reddit
How is Celestial dead?
Battlemage came in Dec 2024, and if we follow the 2-year cycle that all vendors follow, then Celestial makes sense for Dec 2026.
Isn't it obvious that for now all we would get is leaks of this Pro B70?
FinancialRip2008@reddit
i assumed that was because the increased ARC cpu overhead would be more painful with a more powerful card
soggybiscuit93@reddit
Or just the die size costs.
Slap clamshell VRAM on it, double the price, and suddenly the die is a lot more worth releasing.
imaginary_num6er@reddit
MLID called this years ago
Schlaefer@reddit
Twice a day.
PastaPandaSimon@reddit
Though it's ridiculous to try to position it against the RTX Pro 4000 with 24GB of VRAM nobody buys, rather than the 5090 that outclasses this card for most of the same enterprise workloads. Intel knew this when establishing the $949 price point for their reference model. They also knew that gamers wouldn't pay as much for it, as the gaming performance would be sub-par at that price. I wonder if you can still game on it though.
soggybiscuit93@reddit
Why is the comparison bad? Cost-wise, its closest Nvidia competition is closer to the Pro 2000 BW.
A Pro 4000 is nearly 2x the price, and a 5090 is more than 3x the price (and not necessarily available in the same enterprise channels).
For the hobbyists chasing VRAM, you can fit 2x B70's in the same size and power budget as a 5090, while still coming in over $1000 lower in cost and having 2X the VRAM.
And for workstations, this seems like the cheapest way to get 128GB
You lose CUDA, but there are still plenty of people tinkering and looking for alternatives if the price is right.
S_A_N_D_@reddit
Seems to me that this would still be a very good value proposition for inference since you can use larger models.
Winter_2017@reddit
5090 is incomparable to these enterprise cards. These cards get certified drivers, which are essential to running certain applications. No one serious is purchasing a non-certified card in projects where a crash can lead to hundreds of dollars wasted in lost work and time.
PastaPandaSimon@reddit
Because if you're the kind of business that needs to run such applications, you're likely getting an Nvidia product adjusted to your needs due to performance. A 5090 gets stable creator drivers if you're a small-medium business wanting a more versatile product. If budget is less of a concern, you're getting a proper Nvidia product, unless you've got a niche use case that doesn't benefit from CUDA.
mrblaze1357@reddit
That's exactly what I was going to say. They also have ECC video memory, and other features. Literally had this exact conversation with an engineer at work who was arguing with me to get him a 5090 over an RTX PRO 6000 for his simulations.
DehydratedButTired@reddit
Gotta get that enterprise cash.
MadOrange64@reddit
But can it run Crimson Desert?
nonaveris@reddit
Yes, as the publisher has allowed it to run.
tofuchrispy@reddit
Well, this only makes sense if you're certain your application works perfectly with Intel. Anyone else interested in large VRAM for AI and other things has to get Nvidia, because it mostly just works on Nvidia.
Brosaver2@reddit
I wonder how these would perform in games. Not that I think they would perform well, but it just made me curious.
HellsPerfectSpawn@reddit
Around a 5070, so not worth it
Free_Moose9611@reddit
I just ordered two of them. If they're good, I'll load up on more. My NVIDIA cards made me think twice about value. Hopefully they work in Proxmox LXCs/VMs. The VRAM/price ratio is just too good to pass up!
AlexGSquadron@reddit
1270 euros in Germany
Fearless-Area-532@reddit
I'm getting a couple for work
StuffProfessional587@reddit
Hopefully comfyui will work with it. Fingers crossed.
New_Mix_2215@reddit (OP)
Looks like it's already supported; Intel Arc is listed. How well though, I don't know; I haven't tried it. I only own an Arc A380 that I use for decoding video.
https://docs.comfy.org/installation/system_requirements
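For reference, getting ComfyUI onto an Arc card currently goes through PyTorch's XPU backend. A rough sketch (the wheel index URL and flags are from memory and may change between releases, so check the linked system-requirements page):

```shell
# Install a PyTorch build with Intel XPU (Arc) support
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/xpu

# Standard ComfyUI install
git clone https://github.com/comfyanonymous/ComfyUI
cd ComfyUI
pip install -r requirements.txt

# On launch it should report an xpu device if the Arc card is detected
python main.py
```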
reddit_equals_censor@reddit
vram for them, but not for us.
___
as a reminder of the vram amounts, that cards should have today.
the ABSOLUTELY MINIMUM today for ANY card, that will be marketed to game AT ALL, is 24 GB vram.
and 24 GB vram today is already assuming a 30 GB ps6 and not a 40 GB ps6.
and a ps6 is coming in 2 years.
16 GB vram is already breaking in at least one game at very very high settings.
and today to match the ps5 you need 3/4 of the ps5's unified memory pool as vram.
so the ps5 has 16 GB unified memory. so to match it vram wise on desktop/laptop you need 12 GB minimum.
so to match a 30 GB ps6 you need 24 GB vram. to match a 40 GB ps6... you need 32 GB vram RIGHT NOW as a minimum to not get shit on in 3 years, when the first ps6-only targeted games come out.
and again i am not saying a 9070 xt should be 32 GB vram.
i am saying, that the lowest gaming card (5050/9060) NEEDS 24 GB vram/32 GB vram to not turn into e-waste and be a terrible/broken experience when ps6 targeted games come.
but instead only pro cards get enough vram apparently and the industry laughs at 8 GB vram cards, that were already broken 3 years ago now... and only have gotten worse since then.
ClickClick_Boom@reddit
Stop buying shit from these companies then.
reddit_equals_censor@reddit
so your suggestion, when ALL OF THE companies TOGETHER are screwing you over, is just not getting a computer then...
you need a graphics card to exist. of the roughly 2.2 companies that make gaming graphics cards, only 2 really make them, and one is on the way out with ties to nvidia for future apu tiles.
so the 2 real companies doing it have been found guilty of price fixing in the past and both are scamming you.
we need graphics cards.
so again your suggestion is to NOT buy products you need...
and instead... ?
why don't we listen to ceos telling us how water should not be a human right again as well? (nestle ceo if you're not aware)
so again your idea DOES NOT WORK in a market where the 2 evil corporations are scamming customers, and on top of that don't care that much about the average customer right now and hey, maybe for at least 2 more years.
so telling people not to buy products they need, from companies that don't really care, is going to see them not change at all, ever.
that will sure as shit work, instead of i don't know...
starting by fining companies for scamming customers with missing vram.
straight up nuking them for 10% of all their income each month if they dare to sell any graphics card below 24 GB vram.
an actual fix, that would actually work, instead of suggesting people to not buy products, that they need and only get made by 2 companies and both are scamming you.
___
and just for funsies and in case you think of it "just buy used" DOES NOT WORK anymore.
because part of the missing vram strategy by nvidia and amd is to destroy the used market and they have been extremely successful.
who in their right mind would find an 8 GB 3070 ti insult desirable? no one of course. it is broken by missing vram, because nvidia wanted to massively reduce its lifetime for the first user and make it worthless on the used market.
___
if you want some more fundamental ideas, that could help. end ALL "intellectual property" legislation.
ALL OF IT.
part of what makes a monopoly possible is patent/ip, which creates an artificial moat to prevent competition, or even governments going "yeah screw that, we fork the hardware and software and improve it".
nvidia and amd couldn't do their evil, if they were REQUIRED to only ship open hardware and open software, that can be forked.
ClickClick_Boom@reddit
I'm not interested in reading your unhinged ramblings.
NeroClaudius199907@reddit
The best part is he's using rx 580
JPLangley@reddit
I'm glad Intel didn't give up on Arc. Now if only it natively worked with Ollama.
badcookies@reddit
Thanks Ollama.
potatoears@reddit
lol
Eire_Banshee@reddit
There is a flag you can switch to enable arc but you have to build from source :(
TheBlueMatt@reddit
Use the vulkan backend with this PR: https://github.com/ggml-org/llama.cpp/pull/20897
hak8or@reddit
It's a perfect excuse to stop using Ollama which makes major questionable trade offs (their horrific model naming scheme for one back when Deepseek v3 was released relative to the distilled version) in the name of "easier for beginners", and constantly kept trying to steal credit from llama.cpp devs via making it appear like they wrote their inference engine.
Just call llama.cpp natively, yes it's a few CLI options, but it already comes with a decent web server built in, and has an OpenAI endpoint that you can use OpenWebUI with and anything else. And it's at the source.
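The "few CLI options" amount to something like this (model path and port are placeholders, and the flags are as I remember current llama.cpp, so double-check against your build):

```shell
# llama-server ships with llama.cpp: a built-in web UI plus an
# OpenAI-compatible HTTP API. Model path and port are placeholders.
llama-server \
  --model ./models/some-model.gguf \
  --host 127.0.0.1 \
  --port 8080 \
  --n-gpu-layers 99

# Then point OpenWebUI (or plain curl) at the OpenAI-style endpoint:
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"hello"}]}'
```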
Schlaefer@reddit
Use ollama's vulkan backend.
ThankGodImBipolar@reddit
MLID doing a victory lap right now lol
cdoublejj@reddit
who is MLID?
igenicoOCE@reddit
"leak" YouTuber who keeps falling for obvious hoaxes and generally has one of the worst track records on being right
DaMan619@reddit
Moore's Law is Dead
iDontSeedMyTorrents@reddit
Damn, fuck you for answering the guy's question, right?
Loose_Skill6641@reddit
why
ThankGodImBipolar@reddit
Because this is Battlemage having a difficult and delayed launch, precisely like what he said in his "Arc is effectively cancelled" video. I think we can be pretty confident that a B770 will never materialize now too, if they're saying this card has a 300W TDP - that's damn near double what the 9060XT draws, for something that should be comparable performance wise.
goldbloodedinthe404@reddit
Gaming performance is literally, not figuratively but literally, irrelevant. This is for AI inference.
ThankGodImBipolar@reddit
"This" is BMG-G31, which is the die that was supposed to go into the B770. This product isn't for gaming, but the die inside of it was meant for that. It's not like the R9700 magically draws a different amount of power than the 9070XT because one is meant for professional workloads and the other isn't.
AkazaAkari@reddit
"Arc is effectively cancelled" and the Arc B70 releasing today are contradictory. There's no reason for anyone to release a new gaming card right now when AI demand means you can multiply your margins by selling to professionals instead of gamers.
steve09089@reddit
Curious how it actually plays games, but yeah price is doubtful.
jenny_905@reddit
Probably quite a lot faster than a B580, has 32 Xe cores compared to 20 on the B580 and a wider memory bus.
Not that it's a gaming card, but presumably it'll be pretty good at it.
ray_fucking_purchase@reddit
At least you gave an answer instead of blindly downvoting the poor person. Also people acting like no one ever gamed or tried to game on Quadro cards by modding drivers back in the day.
jenny_905@reddit
Yeah, very high chance an attainable workstation card will get used for games at times. People will be curious, since this is effectively what the B770 was going to be (with extra VRAM that won't make a difference in gaming), so I've no doubt it will be benchmarked in games.
ray_fucking_purchase@reddit
Indeed, give it a few weeks and someone will have posted some benchmarks.
pythonic_dude@reddit
GN (I know they are not liked here but oh well) always do gaming benchmarks among others on productivity cards, so looking forward to that. Seems like no amount of "thanks Steve" can deter Intel from sending them stuff for reviews, too.
UpsetKoalaBear@reddit
I appreciated your question FWIW.
It’s useful to know, machines containing these things get sold off by the dozen when companies upgrade so budget gamers could get a decent deal.
comelickmyarmpits@reddit
In my country people buy used nvidia p6** cards to game on the cheap
InconspicuousRadish@reddit
It's not a gaming GPU so that's largely irrelevant.
soggybiscuit93@reddit
Plenty of hobbyists interested in this card are interested because of its (relatively) low cost.
So I don't think it's unreasonable to assume many of them would be interested in it going into a dual-use PC that does local AI but can also play games, rather than having 2 PCs.
red286@reddit
It's not a gaming card. You can't even connect a display to it directly. This is an AI inference card.
ResponsibleBeard@reddit
Um, I see 4 DP2.1 ports there, bud.
Alternative-Luck-825@reddit
The sales volume for the B580 and B570 is under 1 million units, with a gross margin of only 10%. In contrast, the B50, B60, and B70 series have sales under 100,000 units, but their gross margins range from 50% to 100%.
Whether Intel abandons gaming GPUs depends entirely on profit margins, as the gaming market remains significantly larger than the professional card market. Unless memory costs are brought under control, Intel may indeed skip releasing dedicated gaming GPUs for the next generation.
However, users could still utilize professional cards like the B70, C50, C60, or C70 for gaming. Intel would provide gaming drivers for these professional lines; they can be treated as gaming cards, though the retail price would be triple that of a standard consumer GPU.
Uptons_BJs@reddit
Ehh, I donno. This card is priced realistically. The Radeon AI PRO R9700 is notably faster, and you can easily get them for around $1300-1400.
If pricing for the B70 floats up a bit more, why wouldn't you just buy the AMD?
pinmux@reddit
The R9700 isn't that much faster. It has better floating point performance numbers but the integer performance is on-par with the B70. R9700 has only a tiny advantage in memory throughput, too.
R9700 at 300W power rating (vs 230W for B70) and $300-400 more than a B70? I'd say competition in the marketplace is working! :)
MDSExpro@reddit
You can powercap R9700 to 230W with one command. You can undervolt it too.
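If it works like older ROCm cards, the one command would be something like the following (flag names are from rocm-smi as I remember them; verify against your ROCm install):

```shell
# Cap board power at 230 W on GPU 0, then verify the new limit
rocm-smi -d 0 --setpoweroverdrive 230
rocm-smi -d 0 --showpower
```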
pinmux@reddit
But then presumably the performance won’t be quite as good as the published specs. So then Int8 falls behind B70 at stock power levels.
Eire_Banshee@reddit
Who cares? You're really buying it for the VRAM headroom anyways.
airfryerfuntime@reddit
What's the performance hit?
feckdespez@reddit
Idk man. After playing with my B50 for the last 6-ish months (bought at release), I think I'd pay for AMD over Intel even. The software ecosystem for Intel is in such a sad state and isn't improving quickly enough.
PastaPandaSimon@reddit
Fast VRAM per dollar. If that's not your priority, you're likely getting an Nvidia card, not AMD.
New_Mix_2215@reddit (OP)
I had honestly forgotten that card. Fair point, I haven't followed ROCm's support for a while, but I assume it also has way better support than oneAPI (which I guess just has to rely on pure Vulkan support). And given that you can buy like 2 or 3 R9700s now for the price of a 5090, they are a somewhat decent deal.
The main reason I believe the B70 won't be available for that price is honestly just that I don't think there will be that many produced. I would also assume the R9700s that are available right now are stock from before the RAM crisis?
ea_man@reddit
For the local LLM folks, probably the cheaper B65 will be of use; buy 2 of those to run an 80B model. Let's see a benchmark.
LuluButterFive@reddit
If the B70 has SR-IOV then it's a much better value
FinBenton@reddit
This and the R9700 have pretty much the same memory bandwidth.
Framed-Photo@reddit
I guess if you literally only need vram it can make sense?
Gippy_@reddit
The extra VRAM won't matter if the raw performance sucks. We've already seen this with 4080 Super vs 7900 XT. The 4080 Super won so overwhelmingly that AMD waved the white flag and didn't even put out another 24GB card.
wickedplayer494@reddit
Vega FE is back on the menu, boys. (Except it's big Battlemage out of Intel, this time.)
deathcom65@reddit
Do we know when it's available in Canada?
TheBlueMatt@reddit
You can literally buy it today on Newegg. Ships today for $999 (ASRock) or ships 4/2 for $949 (Intel self-branded).
thatnitai@reddit
Even intel GPUs abandoned gaming already lol
AverageBrexitEnjoyer@reddit
I don’t think the pro lineup was ever supposed to be gaming oriented
Flimsy_Swordfish_415@reddit
that's tech illiterate /r/pcmasterrace is leaking :D
R-ten-K@reddit
I mean, this entire sub is mostly gamers having a hard time dealing with the harsh reality they are not the center of the tech world...
Ok_Assignment_2127@reddit
That sub somehow manages to be filled with both 10 year olds and angry boomers at the same time
R-ten-K@reddit
Most tech spaces online are like that, “tech enthusiasts” always feel compelled to lead with a kind of reflexive negativity that adds little value.
It is just entrenched cynicism which crowds out any kind of nuanced, constructive conversations that would make these communities worthwhile.
NeroClaudius199907@reddit
Why don't people just buy AMD? An open-box 9070 is like $559 right now
Unlucky_Age4121@reddit
32gb of vram with sriov support is the answer.
NeroClaudius199907@reddit
No, I mean Intel was never going to bring 32GB to a mainstream B770.
While waiting all that time, why not just get a 9070?
kikimaru024@reddit
You don't need 32GB to game.
JackSkelinngtEpstein@reddit
And just like that, I'm glad I didn't wait.
ie-redditor@reddit
LOL way too expensive. Basically the strategy is to have an expensive card so that it looks like "it must be good".
Spright91@reddit
Its not a gaming card. Its an entry level AI card. Just ignore it.
Beefmytaco@reddit
This is a good thing, cause it has a lot of memory. This will steer smaller business/research entities to possibly buy this instead of the far more expensive nvidia options, and could potentially make prices drop a bit.
Doubtful, but one hopes it happens.
forgottenendeavours@reddit
Yes, solid advice to yourself.
Sopel97@reddit
this isn't r/gaming