Why won’t Steam Machine support HDMI 2.1? Digging in on the display standard drama.
Posted by Balance-@reddit | hardware | View on Reddit | 254 comments
Although the upcoming Steam Machine hardware technically supports HDMI 2.1, Valve is currently limited to HDMI 2.0 output due to bureaucratic restrictions preventing open-source Linux drivers from implementing the newer standard. The HDMI Forum has blocked open-source access to HDMI 2.1 specifications, forcing Valve to rely on workarounds like chroma sub-sampling to achieve 4K at 120Hz within the lower bandwidth limits of HDMI 2.0. While Valve is "trying to unblock" the situation, the current software constraints mean users miss out on features like generalized HDMI-VRR (though AMD FreeSync is supported) and uncompressed color data.
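A rough back-of-the-envelope check shows why the chroma sub-sampling workaround is needed; this is a minimal sketch, and the ~10% blanking overhead and usable-payload figures are approximations rather than official spec numbers:

```c
#include <stdio.h>

int main(void) {
    /* 3840x2160 @ 120 Hz, plus roughly 10% extra for blanking intervals (approximate) */
    double pixels_per_sec = 3840.0 * 2160.0 * 120.0 * 1.10;

    double rgb_10bit = pixels_per_sec * 30.0 / 1e9; /* 4:4:4, 10 bits per channel */
    double rgb_8bit  = pixels_per_sec * 24.0 / 1e9; /* 4:4:4,  8 bits per channel */
    double ycc420    = pixels_per_sec * 12.0 / 1e9; /* 4:2:0 8-bit, ~12 bits/pixel average */

    /* HDMI 2.0 carries 18 Gbps raw, roughly 14.4 Gbps of video data after 8b/10b encoding.
       HDMI 2.1 FRL carries 48 Gbps raw, roughly 42 Gbps usable. */
    printf("4:4:4 10-bit: %.1f Gbps\n", rgb_10bit); /* ~33 Gbps -> needs HDMI 2.1 */
    printf("4:4:4  8-bit: %.1f Gbps\n", rgb_8bit);  /* ~26 Gbps -> still too much for 2.0 */
    printf("4:2:0  8-bit: %.1f Gbps\n", ycc420);    /* ~13 Gbps -> squeaks under 14.4 */
    return 0;
}
```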
Corentinrobin29@reddit
TL;DR: the HDMI Forum sucks. Use DisplayPort instead if you're connecting to a monitor. If you want 4K 120Hz HDR VRR with full 10-bit on your TV like me, use the one and only DP -> HDMI adapter that (sometimes) works: the Cable Matters one.
And once again, the HDMI Forum sucks. Pricks.
p1r473@reddit
Wait will this adapter allow 4k 120hz on the steam machine? Any link?
SpecialSauceSal@reddit
This being the adapter in question: https://a.co/d/j9BTuKl
hm9408@reddit
Long link without embedded tracking attributes https://www.amazon.com/dp/B08XFSLWQF
24bitNoColor@reddit
That isn't true for all TVs. LG OLED TVs from 2020 and earlier, at the very least (but likely later sets as well; I just happen to have a 2020 set), do not support VRR over DP-to-HDMI adapters, and with the lack of DSC they also don't support a full 120 Hz at 4K with 10-bit 4:4:4.
ChoMar05@reddit
Here is what valve should do: Design a nice sticker that says "Steam Machine native 4K" and "license" it to TV manufacturers that have a DP on their TV so they can slap it on their boxes.
Used-Rabbit-8517@reddit
TVs don’t have DP
Strazdas1@reddit
Some do, it's just quite rare for TVs.
meodd8@reddit
TVs with DP support are incredibly rare, right?
Strazdas1@reddit
Being able to advertise compatibility with a sticker might improve the chances of them existing.
Cynical_Cyanide@reddit
What do you mean there's only one DP > HDMI adapter?
Corentinrobin29@reddit
There's only one that works reliably, the Cable Matters one. All other DP -> HDMI adapters fail to pass through a 4K 120Hz HDR VRR 10-bit signal even semi-reliably.
Other adapters can do some of the same things, but not all at the same time. For instance, you'll have 4K 120Hz HDR, but VRR will not work. Or you'll have VRR but the HDR won't work. Or you'll have both, but the image will be 8-bit (HDMI 2.0 levels), causing colour issues. Or the image will straight up bug out/break/disconnect.
The Cable Matters is the only adapter the community has found which can do all of the above somewhat reliably. The adapter is at its fucking limit, so sometimes it bugs out, needing to be unplugged and plugged back in, or a restart; but in my experience I get 4K 120Hz HDR with VRR at 10-bit most of the time. I use Bazzite on an LG C1 TV.
Now we wouldn't need that adapter if the HDMI Forum allowed HDMI 2.1 on Linux without proprietary drivers (which AMD do not have). And unfortunately HDMI has a monopoly on TVs, so we're stuck with either HDMI 2.0 (which looks like shit with HDR and VRR enabled due to 8-bit 4:2:0), or using a janky ass adapter.
msalad@reddit
Can you provide a link to the adapter?
Alternative-Wave-185@reddit
It's the Cable Matters 102101 - can confirm that it can really do VRR, depending on your GPU / display. On my 4080 laptop / 6800 XT it did not work; on my 5070 Ti it works with an LG 42C2 and 55C9. Unfortunately, on the 55C9 it causes major image glitches every 5 seconds. On the 42C2 it's fine.
(4K 120 HDR VRR FULL RGB)
Cynical_Cyanide@reddit
Hmm. What's the 'limit' related to exactly? Heat? EMF? It's an active adapter, yeah?
hellomistershifty@reddit
Probably signal integrity because of the sheer amount of data, 4k 120 is like 48 gigabits per second. Anything slightly off in the timing, and the signal drops
Cynical_Cyanide@reddit
Right, but signal integrity is affected by things like conductivity (heat) and interference (EMF). If it's a signal processing chip bandwidth limitation, I'm surprised some premium cable company hasn't just put a more powerful chip in. In fact, I'm surprised a premium cable company hasn't made a short-distance, super thick, monstrously overkill adapter/cable for this purpose.
hellomistershifty@reddit
It'd be nice, but it would require a more powerful chip to exist - it's a pretty specialized thing, and you'd have to make back all of the money spent on designing and fabricating the chips (plus chip manufacturers are pretty slammed these days), and the only real use case I know of for these is connecting TVs to older GPUs, or multiple TVs to GPUs with a single HDMI port.
I'd also have to see if you could draw enough power off of the power pin on the DP port to power anything significantly better
I don't really know, just throwing out some ideas of why they might not be gunning to do that right now
Nihilistic_Mystics@reddit
Bandwidth.
c33v33@reddit
Can you link? I thought Cable Matters explicitly lists it as not VRR compatible
Alternative-Wave-185@reddit
I have this adapter and with my AMD 6900 XT it did not work. However, with my new 5070 Ti it does (VRR really actively working, not just "on" in the driver). While it works fine on my LG 42C2, it causes image errors every 5 seconds on my older 55C9, but in general it's working there too.
wankthisway@reddit
From reading forum posts, it's flaky. So it's a crapshoot regardless.
spazturtle@reddit
This is why the Intel Arc GPUs only support DP and the graphics card has a built in DP to HDMI adapter on the board for the HDMI port. So the driver only needs to support DP.
shroddy@reddit
And Valve should have done the same with their hardware. I assume they have enough control over the final product to do so, and also to verify that it works correctly, including VRR.
Puzzled_Ad604@reddit
I mean...the use-case is a bit different.
I don't think the average person is connecting an Intel Arc GPU to a TV, and an adapter is probably fine there. But the average person is at least expected to have the option to use a Steam Machine as a home theater PC, with little resistance.
shroddy@reddit
Fully agree, and that's why Valve needs to get it right, by making sure their converter chip works reliably with their GPU in their Steam Machine.
Puzzled_Ad604@reddit
Well, if it were easy to make a converter chip that works reliably, then we wouldn't be having this conversation.
It's not like Intel reached this crossroads and was like 'lol let's make it unreliable hehehe'.
TheBraveGallade@reddit
When it comes to VRR over DP-to-HDMI, literally every company has issues with it.
cluberti@reddit
They're still running Linux (unless the user flashes Windows onto the device), and thus are still using the open-source driver that lacks support.
hhkk47@reddit
AMD had to do the same thing. They had open source drivers ready for full HDMI 2.1 support, but they could not release them because the HDMI forum sucks.
DragonSlayerC@reddit
That was only for the A series and they did it because writing good drivers takes time and supporting HDMI on top of DisplayPort would make things more difficult for the driver team, which needed as much help as they could get for the launch of the first cards. The Intel B series cards have true HDMI ports and suffer the same problem as AMD on Linux with HDMI 2.1.
AK-Brian@reddit
It varies from one individual card to another, but Realtek Protocol Converters were indeed used on Alchemist series models to provide (partial) HDMI 2.1 output. Depending on the specific type of color space, bit depth, refresh rate and display mode needed, it got a bit complicated. It's also part of why A- series cards are often a pain in the ass to get working with some TVs or older displays (the other part is poor EDID handshaking). No VESA VRR or Auto Low Latency Mode support, either.
More recent Battlemage cards, however, no longer use a PCON and support native HDMI 2.1a output, avoiding all of the above mess.
HDMI Forum does indeed still suck, regardless.
hishnash@reddit
Apple does the same: the display controllers are all DP display controllers, and if there is an HDMI port it's powered by an active DP-to-HDMI chip on the motherboard. That does lead to some compatibility issues the vendor can't easily fix, as the DP-to-HDMI converter tends not to be something they can flash new firmware onto.
the_dude_that_faps@reddit
The adapter has a high success rate as long as you don't care about VRR. Once you care about VRR, the chances of it working drop a lot and depend heavily on the display you're plugging it into and your GPU model.
WarEagleGo@reddit
:)
frsguy@reddit
But can the DP adapter do HDMI ARC?
RetroEvolute@reddit
ARC is between your TV and your receiver. Assuming you're using the adapter between your PC to your TV, you should be fine. The video card will still send audio.
frsguy@reddit
Yup, from GPU to TV, and the soundbar is hooked up to the ARC HDMI port on the TV. Also, I keep forgetting my soundbar is ARC, not eARC.
RetroEvolute@reddit
Yeah, adapter should work fine. 👍
cluberti@reddit
No, because the underlying spec is DisplayPort, and it's converting to HDMI signaling. It doesn't add features that DisplayPort lacks, unfortunately - DP to HDMI is generally a one-way connection, so anything coming back over HDMI will be lost.
frsguy@reddit
Aw dam that sucks but thanks! Had to use the only hdmi port on my gpu for better hdr on my monitor, guess I'll have to play hot potato when I want to use my pc.
cluberti@reddit
Unfortunately, yes, if you can't use DisplayPort.
Corentinrobin29@reddit
I'll let someone else answer that, because I've only used Arc cards on Linux servers for compute/Quicksync. I've never used the video output on them.
From what I understand about Arc (and what someone seems to have commented under me too), they do not have actual HDMI hardware, just a DP -> HDMI converter built into the card itself. This should bypass the issue entirely.
frsguy@reddit
Sorry my fault I meant eArc not Intel arc cards :p.
My sound system uses eArc so when I connect my tv to my gpu via hdmi it passes the sound to my soundbar/sub
Corentinrobin29@reddit
Ah I see! Honestly no clue since my sound system still uses Toslink (optical)! So for me the audio output is baked into the video feed with HDMI from the PC, and the TV just outputs that over Toslink.
Although I'm interested in the answer too, since I wanted to upgrade to an eARC setup one day.
your_mind_aches@reddit
One of the main reasons to get the Steam Machine is the HDMI-CEC support.
arandomguy111@reddit
I don't think it's that simple.
From what I've read HDMI 2.1 is supported in Linux on Nvidia hardware from both the closed source and "open source" drivers.
The problem seems like it stems from the open source issue as even the Nvidia "open source" driver has closed binary blobs, which they used to support HDMI 2.1
Even the article suggests the hangup is also partly an ideological one -
This suggests they could implement HDMI 2.1 support in the same way.
Salander27@reddit
No it's NOT that straightforward. Nvidia and AMD have very different GPU architectures. Nvidia has the GSP, which offloads a lot of hardware management from the OS, including apparently HDMI-related functionality. The GSP is essentially an entire sub-OS that runs on the GPU and is launched via firmware provided by the OS. This has several benefits: being separated from the OS means the OS-specific driver side can be simpler, with a shared GSP firmware. It can also reduce load on the system CPU, since many tasks are handled directly on the GPU instead of needing to run on the CPU. And, probably most important to Nvidia, it allows them to "lock" features in a way where someone can't hack the driver to unlock them (their consumer GPUs basically have all the same hardware as their enterprise ones, but the consumer ones have severe limits because they want companies to buy the more expensive ones). The cons are that it makes the GPUs slightly more expensive (pennies compared to what they get by forcing companies to buy enterprise GPUs instead) and that the firmware blobs are very large, because they're essentially an entire minimal OS image.
By contrast the AMD GPUs have much simpler firmware components and delegate much of what the GSP does to the driver. This is cheaper for them but it means that they have to write the functionality multiple times for the OSs they support and they have no real mechanism to prevent people from buying their cheaper consumer GPUs instead of their more expensive ones (besides the latter having more memory and being under a support contract).
InflammableAccount@reddit
I'm not terribly knowledgeable on the subject, but this statement does seem to fly in the face of the disparity between AMD and NV GPUs running (in Windows) on low-performance CPUs. Hardware Unboxed did several videos showing that overall driver CPU overhead was notably higher on NV.
Salander27@reddit
Just because the Nvidia driver is offloading more tasks to the GPU does not mean that it uses less CPU than the AMD driver. It just means the Nvidia driver would be using even more CPU if it weren't offloading. At the end of the day the hardware is different and the drivers are completely different implementations. Perhaps the AMD driver is in fact more efficiently written than the Nvidia one, or perhaps the Nvidia driver is doing something else that the AMD driver isn't.
Without actually profiling what the driver is doing while it's scheduled on the CPU you'd have no real way to know the reason and even if you had profiles of both the AMD and Nvidia drivers they'd be so different in terms of codepaths that it would be almost impossible to do a direct comparison.
Corentinrobin29@reddit
The open source Nvidia driver "works" because it is modular. The part that the HDMI Forum refuses to open source is separate and proprietary.
Intel skips the problem entirely by having the DP -> HDMI converter on board the card itself. Intel Arc cards don't actually have HDMI, just a converted Displayport.
So you could use Nvidia on Linux. But then you'd have to deal with the plethora of Nvidia issues on Linux: lower FPS than Windows, bad frame times, low 1% lows, bad Wayland support, no Gamescope support, etc... It's usable and mostly works, but when a 9070XT catches up to a 5090 in averages, and beats it in 1% lows, you know something's wrong.
About your second point, sure it's ideological in the sense that AMD decided decades ago to make their Linux drivers open source and part of the Linux kernel itself. (AMD users on Linux do not update drivers, they just update the kernel; Nvidia users have to install separate drivers, like on Windows.) And they're paying for that (otherwise good) decision today.
But the problem in reality is practical, because it means that AMD would have to rewrite their entire driver architecture for Linux, all because one pesky organisation refuses to give an inch.
It's absurd to blame AMD's driver design, which has always been the best Linux GPU experience, when it's one organisation refusing to budge an inch and asking AMD to reinvent the wheel instead. The worst part is the HDMI 2.1 firmware isn't black magic. There's no trade secret here that risks being exposed, let alone capabilities that DisplayPort doesn't already have. They're literally just being asses about it.
DragonSlayerC@reddit
It's not really about the driver itself being modular with Nvidia, but the fact that the kernel driver doesn't actually do any low level management of the card at all. Instead, the card has a processor on it (called the GSP) which manages all the functionality of the card. That's why the firmware is like 100MB; it's basically an entire driver. The OS kernel driver just tells the GSP what it wants to do and the GSP takes care of the implementation details. AMD and Intel can't do the same thing with their cards and it would require a complete rework of how the cards/GPUs are designed. It's also why Nvidia's Linux kernel driver is now open-source; any trade secrets and implementation details about their cards are part of the firmware, which is closed source.
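Purely to illustrate that split, here is a hypothetical sketch; none of the registers, structs, or functions below are real AMD or Nvidia APIs, it only shows where the HDMI-specific logic would have to live in each model:

```c
/* Hypothetical pseudocode only: these registers, structs, and functions are made up
   for illustration and are not real AMD or Nvidia driver APIs. */
#include <stdio.h>

struct mode { unsigned pixel_clock; unsigned link_rate; };

/* AMD/Intel style: the open kernel driver itself programs the display hardware,
   so every step of HDMI 2.1 link setup would have to appear in GPL'd code. */
static void write_reg(unsigned reg, unsigned val) { printf("reg 0x%x <- %u\n", reg, val); }

static void set_mode_in_driver(const struct mode *m) {
    write_reg(0x100, m->pixel_clock); /* clock setup spelled out in the open driver */
    write_reg(0x104, m->link_rate);   /* FRL/TMDS link configuration would go here too */
}

/* Nvidia style: the kernel driver only forwards a high-level request; the GSP
   firmware blob running on the card does the actual link setup, so the HDMI
   details never show up in the open driver. */
static void gsp_rpc_send(unsigned op, const struct mode *m) {
    (void)m;
    printf("RPC %u: firmware on the card handles clocks and link training\n", op);
}

static void set_mode_via_gsp(const struct mode *m) { gsp_rpc_send(42, m); }

int main(void) {
    struct mode m = { .pixel_clock = 1094, .link_rate = 48 };
    set_mode_in_driver(&m);
    set_mode_via_gsp(&m);
    return 0;
}
```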
steve09089@reddit
Not quite sure of the full details, but essentially it boils down to NVIDIA doing things differently.
They have a closed source firmware blob that handles most low level card functionality unlike AMD and Intel, and thus they can offer HDMI 2.1 functionality by not having that functionality as part of the driver itself, but rather handled by the firmware.
arandomguy111@reddit
Which is why I'm not sure framing this entirely as an issue with the HDMI forum is accurate.
Yes, one might not like the stance from an ideological perspective. But technically speaking, it does seem possible to implement HDMI 2.1 on Linux. There is no hard restriction.
So if AMD or Valve chose not to implement it just based on that, should the consumer who might not share such rigid ideological stances also not pressure AMD or Valve to take a more pragmatic approach?
steve09089@reddit
It’s not a choice between ideology or pragmatism, rather simple technological constraints.
Fundamentally, AMD and Intel GPUs can't use this approach. NVIDIA has had a CPU on their GPUs since Turing, allowing them to run this more powerful firmware. AMD and Intel don't, so they can't do anything remotely close to this.
TopCheddar27@reddit
Here's the kicker: most of the time VRR does not work on that adapter.
jorgesgk@reddit
Can't the driver interface with some proprietary blob that acts as a middleman between the open source driver and the HDMI 2.1?
SeantheWilson@reddit
It absolutely can, just no one has made one yet.
AndreaCicca@reddit
Yes
Gippy_@reddit
Big meh. The HDMI license and the closed-source requirement are the cost of doing business. Valve is a multibillion-dollar company, so this is really just a matter of them being stubborn.
Everyone who uses a credit card pays a 2-3% credit card fee for every single transaction. Hypocritical if you use credit cards but then complain about HDMI.
csakzozo@reddit
It's not Steam paying mate... It's us, the customers....
_barat_@reddit
AFAIK HDMI foundation doesn't want money but a "closed driver" and AMD doesn't want to make a "closed driver" like nv did.
Gippy_@reddit
Nvidia drivers are closed and Nvidia has 92% of the market.
Open-source isn't the panacea for everything.
puffz0r@reddit
It doesn't even need hdmi 2.1
csakzozo@reddit
Not what the specs say.
frissonaut@reddit
Will anything even happen with Steam Machines at the current price of RAM?
csakzozo@reddit
Yes
csakzozo@reddit
If there already is an open source driver to add the HDMI 2.1 functionality, can't they just leak the Linux driver and let us add these features "unofficially"?
Cheerful_Champion@reddit
Honestly HDMI Forum is terrible, I wish manufacturers would start phasing out HDMI
youreblockingmyshot@reddit
The amount of HDMI out in the world pretty much means that won’t happen.
advester@reddit
Then just reject hdmi 2.1 and use DP for modern features instead.
reticulate@reddit
DP has no replacement for eARC
Strazdas1@reddit
how commonly do you need a return channel for audio though?
bondinspace@reddit
It's basically how every mainstream soundbar setup functions these days
Strazdas1@reddit
Ah, I've never used a soundbar, nor do I know anyone who does.
bondinspace@reddit
They've gotten really cheap these days so have grown a lot in popularity
FranciumGoesBoom@reddit
Soundbars are the most popular way of getting better audio out of TVs. And relatively cheap/simple compared to a full home audio setup.
reticulate@reddit
A lot of the "just use DP, not sure why you'd need HDMI" comments on this subject feel like they're coming from people who just watch everything on their computer. The concept of a living room with a TV and attached soundbar is foreign to them. Not judging but it's definitely something I've noticed, you see it all the time on the various linux subs too.
your_mind_aches@reddit
HDMI-CEC
Fabulous_Comb1830@reddit
Not going to be replaced in the TV segment without their say.
Bannedwith1milKarma@reddit
It's a shame TVs didn't continue with Display Port like they used to with VGA.
ClickClick_Boom@reddit
It's dumb that they don't, because it's an open standard, at least on more premium TVs. But of course it all comes down to what most people are familiar with, which is HDMI, and to money: it's cheaper not to include it.
Ok-Contest-4565@reddit
A compliant open source driver was written, submitted, and rejected because it is contrary to the interests of some of the funding parties (Microsoft).
cheesecaker000@reddit
One of the reasons is that DisplayPort doesn't handle audio the way HDMI does.
eARC is one of the main ways people connect their TVs to their surround sound systems or soundbars.
Pyryara@reddit
Well nobody is saying "ONLY support DP". You only need a single HDMI port with eARC, rest could be DP.
keesbeemsterkaas@reddit
What's the background of that? Is sound more like a USB device on DisplayPort? Somehow I've never had a problem playing audio over DisplayPort in the last 15 years.
nothingtoseehr@reddit
Audio via HDMI supports return channels, DP doesn't. HDMI also simply has a lot more investment going into it
kasakka1@reddit
Afaik DP does support a functional equivalent to audio return channel. Not sure if anything supports it tho.
mrturret@reddit
It's also because TV manufacturers make money from HDMI licensing
FinalBase7@reddit
Brother, TV manufacturers lose money from HDMI licensing lol, they have to pay for that shit, cable manufacturers too.
With that said, despite DisplayPort being free, a DP cable isn't actually cheaper than a roughly equivalent HDMI cable; often it's more expensive.
alphaformayo@reddit
Who are they paying though? Like seriously, I've never actually thought about this and just assumed they just paid. And while they do as HDMI does have a licensing cost, where does that money actually go? All the major manufacturers are members of HDMI Forum.
FinalBase7@reddit
It goes to the HDMI forum, being a member of the forum means you get a say in the spec and features of HDMI, you still have to pay licensing fees to use the port and to use certain branding words.
Area51_Spurs@reddit
Very few TVs had VGA. Only really high end models.
ComplexEntertainer13@reddit
No idea what you are talking about. It was super common in the early days of LCD TVs.
waitmarks@reddit
As a Linux user I have been following this drama since HDMI 2.1's release. Hopefully Valve, with their larger influence, can convince the HDMI Forum to change their minds on allowing an open-source driver implementation.
I am worried though that the HDMI forum will grant some sort of special license to valve and the steam machine will become the only linux device to support 2.1
tajetaje@reddit
I really doubt it as it would require a custom AMDGPU driver patch
RealModeX86@reddit
Yeah, this is the crux of the issue
amdgpu is fully open-source. The HDMI Forum refuses to allow AMD to put support there because of their approach to their "intellectual property" of how HDMI 2.1 works. Theoretically, a binary-only module could include support, but that's not a good approach either.
If one were to make hardware-specific (GabeCube/SteamDeck only) support in software, it would still expose the implementation details, and would be trivial to bypass.
As I understand it, Intel ARC has HDMI2.1 in Linux by implementing it in hardware, so if anything, Valve could maybe take that approach with a built-in DP->HDMI converter for instance.
Strazdas1@reddit
AMD has no trouble using binary blobs elsewhere, so why not here?
RealModeX86@reddit
Directly in the kernel vs in firmware. You cannot put binary blobs in kernel space, but you can use binary blobs on the hardware itself. That only helps if they can put all of the implementation details of HDMI2.1 into the firmware they load on the card. Maybe possible in future generations, but the current designs might not allow it to be fully handled in firmware.
jocnews@reddit
OTOH, the refusal to compromise is also on the open source projects' side. It's their choice, and it is the HDMI authors' choice whether to allow their IP in open-source code or to demand it be provided in binary form only.
Since the IP is likely interconnected with DRM schemes, if they made open saucers happy (for a moment at least, until the next outrage), they would piss off content providers.
RealModeX86@reddit
Generally speaking, no, you can't put binary blobs in kernel modules even if you want to, because they effectively become GPLv2 themselves when linked into the kernel.
Nvidia does some questionable things with their binary module to try to get around this by calling out to the binary from the kernel (which I've heard referred to as a "legal prophylactic" lol), but even this is questionable on whether it follows the licence text. Mainly, that hasn't been tested in court, and may or may not stand up to lawyering.
Firmware is a different story, since the kernel would generally be loading the firmware into the device to make it work, rather than that code being linked against the kernel. So, a future design from AMD could place the HDMI handling in the firmware module and work fine on Linux.
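For anyone curious how that firmware path looks, here is a minimal sketch using the kernel's request_firmware() interface. The firmware filename and the example_upload_to_gpu() helper are made up for illustration and are not taken from any actual driver:

```c
#include <linux/firmware.h>
#include <linux/device.h>
#include <linux/types.h>

/* Hypothetical helper: stands in for whatever device-specific upload path a real driver has. */
static int example_upload_to_gpu(struct device *dev, const u8 *data, size_t size);

static int example_load_gpu_firmware(struct device *dev)
{
    const struct firmware *fw;
    int err;

    /* The kernel fetches the opaque blob from /lib/firmware; its contents stay closed. */
    err = request_firmware(&fw, "example_gpu_fw.bin", dev);
    if (err)
        return err;

    /* The blob is copied into the device and runs there, never as code in kernel space. */
    err = example_upload_to_gpu(dev, fw->data, fw->size);

    release_firmware(fw);
    return err;
}
```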
Copy protection, and security through obscurity are likely factors in why the HDMI forum is so shitty about this, but I think the bigger thing is the licensing revenue. If an open implementation exists, then it could be copied into an uncertified device, and the HDMI forum would have to buy one fewer exotic car or something.
As a counterpoint, DisplayPort does HDCP (copy protection) just fine without this draconian licensing issue, and is equivalent to (or better than) HDMI 2.1 in features. The only drawback is the lack of support from TV manufacturers.
jocnews@reddit
Well, but that is on the licence the open source projects picked, precisely to not allow such things.
They ruled themselves out by their own choice. It's kinda their responsibility to deal with it however they want: convince others to comply, or go without the feature if they don't want to. HDMI's creators don't have to comply; open source projects aren't magically entitled to have everybody go out of their way to serve their needs.
RealModeX86@reddit
Sure, the HDMI forum has that right and I have a right to call them assholes for it. They'll still be getting their licensing fees on the hardware either way, and it's doubtful HDCP would be any less secure because of an open implementation. Previous implementations have never been an issue in that regard.
There's also practical reasons why a closed source module is a bad idea. It becomes a debugging black box if there's ever an issue caused by that code.
I think the main thing with stuff rearranging on sleep is from the monitor appearing to be disconnected. I've seen the same thing happen with HDMI, but as with all things, your mileage may vary
jocnews@reddit
Ironically, Steam Machine doesn't actually offer DisplayPort 2.1 either, I just noticed.
The GPU used in the box could do it (40Gbps UHBR10 mode only), but Valve only exposes DisplayPort 1.4.
RealModeX86@reddit
DP 1.4a can do up to 8K@60, 4K@120, or 1440p@144 with HDR and VRR, so it's definitely in the "good enough" camp for most, unlike the rather gimped HDMI 2.0 fallback modes.
jocnews@reddit
DP 1.4 really gets saved by the DSC compression these days though, too bad it was not retrofitted to HDMI 2.0 modes.
delusionald0ctor@reddit
Only problem is I’m pretty sure the Steam Machine would be hardware final already, so the only chance Valve has is convincing the HDMI Forum to allow support in the open source driver, which AMD already tried to do and failed.
RealModeX86@reddit
Right, but that wouldn't preclude them from having possibly included a hardware converter like Intel did on ARC. They don't have to convince the hdmi forum if they ship a hardware implementation with the GPU just outputting DP signals.
Of course, all speculation until the final hardware is out
delusionald0ctor@reddit
If they had done that, then this wouldn't be an issue, because they would just use the hardware converter. But seeing as it is an issue, they don't have a hardware converter in there. The older Intel Arc GPUs don't have both a straight HDMI connection from the GPU and a converted one; the ones that had converted HDMI only had converted HDMI.
RealModeX86@reddit
My point is we don't know yet, but we DO know they didn't do that on the Steam Deck.
It remains an issue for HDMI on Linux and AMD hardware either way, even if they do end up building in a workaround.
I hold more hope for that idea than Valve/AMD/anyone convincing the HDMI forum to stop being asshats about it though
waxwayne@reddit
Most consoles get custom drivers
lllorrr@reddit
Most consoles are not built with GPL software. A custom Linux driver will either be limited to non-GPL APIs or it will be required to be open source.
whatThePleb@reddit
Why would you even still want HDMI, the most inferior of them all?
Fluxriflex@reddit
Because if you want to directly connect to a TV, it’s the only game in town, outside of buying an unreliable DP to HDMI adapter.
Material_Ad_554@reddit
If Microsoft and Nintendo can't influence it, I doubt Valve can, man.
leaflock7@reddit
Were MS and Nintendo asking to open up 2.1?
noiserr@reddit
This is why open standards matter. I've been going out of my way to make sure I have Display Port in all my displays.
hurtfulthingsourway@reddit
AMD had a working opensource driver with the HDMI firmware that loaded somewhat like Nvidia does and it was rejected by the HDMI Forum.
advester@reddit
Bastards
Rodot@reddit
Do Tizen TVs not support it?
harbour37@reddit
Hisense also has its linux os.
Green_Struggle_1815@reddit
https://media.tenor.com/QgTx6fv4IpAAAAAM/el-risitas-juan-joya-borja.gif
akera099@reddit
I think that would be objectively worse indeed. Would kinda defeat the whole point.
Lstgamerwhlstpartner@reddit
Isn't the HDMI drama all boiling down to licensing bullshit? My understanding is DisplayPort is pretty much free for manufacturers, but the owners of the HDMI license charge by the port and it's pretty expensive to get.
Hamza9575@reddit
Blocked on Linux, even if you have infinite money. That's the problem. Pure insanity by the HDMI Forum.
Ceftiofur@reddit
Not insanity. Dickhead behaviour.
jocnews@reddit
No, it's probably technical. HDMI involves DRM to protect stuff like Netflix streaming content from being copied too easily. Open sourcing would potentially compromise the DRM bits, so the DRM players don't want it, and hence HDMI Forum wants the implementations to be binary only/obfuscated, to satisfy the needs of those users.
dahauns@reddit
To my knowledge, the only "DRM bits" in HDMI would be HDCP - which isn't specific to HDMI 2.1 (or even to HDMI in general), though.
TheBraveGallade@reddit
Well, it's because they don't want HDMI to be open source, and by nature a Linux implementation will basically be open source.
Strazdas1@reddit
Linux kernels are full of binary blob implementations for things that cannot be open sourced.
goodnames679@reddit
I wish displayport was even remotely reliable in comparison to HDMI. On the surface it seems like such a better standard, but dude the number of displayport cables/adapters that die if you roll them out in an even remotely large number is insane
Kyanche@reddit
I've had zero issues with the Cable Matters displayport cables, but you do have to buy the right version for your use case. Also there's no long distance displayport 2.0 unless you use fiber, I think?
goodnames679@reddit
The issues aren't really as visible on a smaller scale. Displayport was my preferred format until I had to deal with it on a large scale.
Perhaps a quarter of the PCs at my job have displayport as their only video output. Those DP cables make up over 90% of the cables we have to replace. When we had significant amounts of VGA / DVI those basically never died even when they got beat up badly by the users (I think I had to replace literally one that had died, ever.) HDMI isn't quite on that level, but they mostly only die from users smooshing their PCs against the wall and bending the hell out of the cables. I replace a dead HDMI cable maybe once every couple months due to this.
Our DisplayPort cables never get beat up due to the PCs associated with them being under counters in a spot where users have no reason to move them around, but despite this they die like crazy. We replace maybe three or four a month. We've tried a variety of brands including Cable Matters, because we'd really like to stop dealing with this. No dice.
Strazdas1@reddit
I've never seen a DP cable die unless it was physically damaged by the user. Same goes for all other cables except USB, to be honest. Didn't consider this to be such a common issue.
Kyanche@reddit
I wonder what's going on there lol.
RBeck@reddit
I really think all DP and HDMI cables (and ports for that matter) should list the version on them.
MumrikDK@reddit
HDMI's existence is sort of down to licensing and DRM bullshit.
C4Cole@reddit
So that's why every GPU I've gotten recently has 3 DP ports and only one HDMI. I thought they were just out to get me and my never-ending stash of old HDMI 1.2 cables (which are perfectly fine for 99% of the stuff I want to plug in).
Down with HDMI, long live DisplayPort!
spooker11@reddit
Would honestly be open to GPU makers dropping HDMI altogether. Pulling an Apple on us as a forcing function to switch off HDMI. Maybe TVs and consoles would catch up.
RBeck@reddit
I totally agree, but wish GPUs would have mini-DP and at least one USB-C with DP-Alt
Wait_for_BM@reddit
WRONG. Don't say expensive without finding out the actual amount, which is in the pennies.
The admin fee is actually low if you are making multiple products with hundreds of thousands of units each.
Whether or not they use a "license free" port, there is always a hidden fee somewhere else. If a company wants to make their own chip, chances are they'll license some premade IP from someone.
Lstgamerwhlstpartner@reddit
You left off the first part of the quote, "my understanding is..." And this misrepresented my comment. That aside, thank you for the rest of the information regarding pricing.
Wait_for_BM@reddit
You are pointing out that HDMI is expensive when it is actually $0.04 per device. You didn't have the facts to back up your claim, like all the others.
Can you point out how you WERE misrepresented? Why does an incorrect statement get so many more upvotes than a fact-based one?
fuddlappe@reddit
hdmi is drm, in a way. it's always down to ~~licensing~~ money
WalkySK@reddit
It's not about licensing. AMD and GPU/laptop manufacturers already pay for it. It's about the HDMI Forum not wanting the driver for HDMI 2.1 to be open source.
QuadraQ@reddit
HDMI is one of the worst ports ever made.
ReddusMaximus@reddit
Yup, it's basically DVI with added sound and subtracted screw fitting, with a much flimsier plug.
QuadraQ@reddit
And horrible licensing requirements, extreme cable length limitations, etc
Technonomia@reddit
In addition to HDMI 2.1 proprietary nonsense and anti-consumer restrictions on Linux..., don't forget that many big companies represented in the HDMI Forum also enjoy snooping on user content for marketing purposes with a feature called ACR, which stands for Automatic Content Recognition. It takes a digital fingerprint of all content watched on TVs, from internal apps to external devices connected via the HDMI ports. Owners of those TVs need to dig deep into the settings menu to switch it off. ACR sends a few kilobytes of data to the cloud almost every minute, describing what is watched, when, and for how long. Please check your settings and watch any YT video about it and how to disable it.
There is no other way to fight for opening the HDMI 2.1 standard for Linux devices than taking political action and rallying local consumer rights and protection groups, which could then make representations before national consumer protection agencies, and ultimately national parliaments. As HDMI is a global standard backed by big, rich, global corporations, it's the only way. Those companies will not cut the hand that feeds them on their own.
Stable_Orange_Genius@reddit
Why not use DisplayPort
Nihilistic_Mystics@reddit
Because they need to be as universally compatible as possible. Not many people have TVs with DisplayPort.
anethma@reddit
And many TVs use eARC to get the audio from their TV smart apps and streaming boxes to their speakers.
And if not, you'd use the passthroughs on your amp, which are HDMI because it carries audio.
DisplayPort just doesn’t do the things needed for home theatre use.
Strazdas1@reddit
If you are transferring video and audio through an external device to the TV, the TV does not need to run its own audio.
anethma@reddit
No but none of those devices have displayport. No streaming box or AVR that I am aware of does anyways.
It would have to be some kind of setup where the streaming box connects to the AVR with HDMI, then your AVR outputs to the TV with displayport?
Strazdas1@reddit
We are working in a fictional scenario where DisplayPort is an option here. But also, I guess it depends on your setup? Everyone I know just connects their devices directly to the TV rather than to any AVR. Most people have simple setups with 1 or 2 devices connected max. Even cable TV is now an app inside the TV rather than a separate box.
anethma@reddit
Ya true I’m just saying that ya it would work for the one situation where you’d be doing everything through the AVR and then ya it uses DisplayPort without sound.
But many may use eARC so since HDMI works with sound etc I don’t see it going anywhere unless they implement something similar.
aes110@reddit
Given that the one thing companies like most is saving money, I can't understand why HDMI is still used and why they didn't all just switch to DP.
starke_reaver@reddit
I always thought it was: 1. Profit 2. Shareholders 3. Not paying taxes. 4. Screwing over brand loyal customers by reducing quality/functions while increasing prices. Etc…
Strazdas1@reddit
You listed it backwards.
aes110@reddit
Ehh, it all comes down to having more money one way or the other.
starke_reaver@reddit
If only the Notorious B.I.G. had been correct…
fntd@reddit
Because HDMI is deeply entrenched in the whole ecosystem and DisplayPort doesn't cover all features that are useful in that space. (e)ARC, CEC, Lip sync correction, etc. If you want to offer devices with DisplayPort support, you'd need a loooong transition period where you offer both, so why even bother? HDMI license fees are not that much to begin with ($0.04 per device if you implement HDCP which you probably have to do in the TV space anyway).
KR4T0S@reddit
AMD tried something like this and the HDMI Forum quickly shut them down. Might even be related to this device though it was a while ago they were trying to push it through. Personally I use DP when I can and am looking forward to GPMI.
starburstases@reddit
GPMI protocol will use the USB-C connector, and it's unclear whether or not it will be free. What are the odds that a standard developed by a Chinese company is fully USB compliant? I don't have high hopes. If we're talking about display interfaces that use the USB-C connector why not look forward to devices implementing DP 2.1 Alt mode, or heck, even Thunderbolt?
ffpeanut15@reddit
GPMI supports USB-C, but it also has its own connector for maximum capabilities.
reddit_equals_censor@reddit
the hdmi forum is SCUM.
they are an active enemy of gnu + linux.
and this situation isn't new.
amd basically begged this scum organization to get "hdmi 2.1" working on gnu + linux for ages, and all they did was show amd the middle finger.
the quotation marks are for how partially meaningless hdmi 2.1 is, as people can put hdmi 2.1 on boxes with hdmi "1.4" bandwidth. another scam run by the hdmi forum, although dp does it too now, but dp at least has the direct bandwidth marketing option next to it if desired.
npquanh30402@reddit
Corporate greed. Use display port
kuddlesworth9419@reddit
It would be nice to move away from HDMI altogether. In my opinion it would be nice to just have audio and video as separate connections. They won't do it because of DRM and Dolby, but it would be nice.
Gippy_@reddit
Won't happen. RCA composite dominated the scene until HDMI arrived even though there were superior analog options. HDMI cables are also very very cheap to produce. DisplayPort has a locking mechanism but that increases the cost of the cable.
capran@reddit
I'm wondering if it will have surround sound capability? I bought a Minisforum mini gaming PC, about the size of an Xbox Series S, and installed Bazzite on it. Only to discover that over HDMI, only stereo is supported. I have to reboot into Windows if I want surround sound. To be fair, that's really just for movies, but it'd be nice if it worked in Bazzite.
Used-Rabbit-8517@reddit
It looks like SteamOS only does Stereo sound for games so it’s not looking good.
your_mind_aches@reddit
If I had surround sound downstairs, I would absolutely game on it in surround
frostygrin@reddit
Oh, so it's HDMI 2.0 bandwidth with chroma subsampling... People were hoping for HDMI 2.1 bandwidth without HDMI 2.1 features.
Used-Rabbit-8517@reddit
Yeah this is pretty disappointing. This makes it significantly less capable than all modern consoles.
advester@reddit
FRL is specifically the thing being gatekept, even though FRL is barely different from DisplayPort HBR.
Used-Rabbit-8517@reddit
One error in that article, they say DP 1.4 has more bandwidth than HDMI 2.1 which definitely isn’t true.
Used-Rabbit-8517@reddit
Why can’t AMD just make some closed source drivers?
stonerbobo@reddit
This is the exact reason I can't buy a Steam Machine or Steam Deck now. I would LOVE to buy a Steam Deck if I could use it both to game on and to feed 4K@120Hz HDR 4:4:4 to my TV via Moonlight. I wish HDMI would just fucking die already. DP 2.1 already supports up to 80Gbps, whereas HDMI is going to crawl up in bandwidth step by step to milk as much money in forced upgrades as they can, in addition to their open-source shenanigans.
Hamza9575@reddit
The Steam Machine does have DisplayPort 2.1, though, and the Deck has DisplayPort 1.4. If you want HDMI to die then buy DisplayPort devices, like the exact ones you are complaining about not having HDMI.
stonerbobo@reddit
I don't think you understand - I know the Steam devices do, but the TV's I want to connect them to don't. There are almost no TVs at all with DP. Only PC monitors have DP inputs.
ThatOnePerson@reddit
Specs say DisplayPort 1.4
Gippy_@reddit
Keep dreaming, just like how RCA composite held off every other analog input for TVs (remember S-Video and component?) until HDMI finally toppled it.
At this point with HDMI 2.2 supporting 96gbps, the only practical use for DisplayPort for most people is the one-cable solution for laptops to plug into a monitor dock and charge the laptop at the same time.
PrysmX@reddit
I'm honestly annoyed that a pure optical cable didn't just become the standard. A single optical cable is absolutely capable of carrying the bandwidth necessary for 4K+ streaming plus uncompressed audio, and over much longer distances. If this became the standard years ago we wouldn't have so much HDMI cable waste from having to upgrade so many times.
stonerbobo@reddit
That would cut out like 10 forced upgrade cycles across billions of cables, TVs, GPUs, peripherals and cost all of those industries billions of dollars. I'm honestly quite sure that's the only reason we see these stupid standards inch up their bandwidths step by step instead of just fixing it one go.
AndreaCicca@reddit
We have had very few changes in the industry in recent decades, even for cables.
Having an optical base standard wouldn’t change anything from this point. You would still be forced to upgrade if you wanted the latest feature. Sure the cable could still be the same, but it’s always the least expensive item inside a home theatre setup.
PrysmX@reddit
Yes, devices on either end would need to be upgraded over time, but my point was more that we would have way, way less cable waste. How many HDMI cables do people have lying around from needing to upgrade to higher-bandwidth cables, or worse, threw away so they're in a landfill somewhere?
AndreaCicca@reddit
But you can still use old cables, even with HDMI 2.1 full-bandwidth devices. It's not recommended, but nobody will stop you. And if it doesn't work, an Ultra High Speed certified cable is like $10/€10. It's nothing when the AV receiver or soundbar alone costs one or more orders of magnitude more.
They can still use them.
PrysmX@reddit
I was talking about the waste more than the cost (but that too). And when you get up to Dolby Vision HDR with lossless Atmos audio, a lot of the time the older HDMI cables do not hold up. I have dozens in my closet that can't handle that bandwidth, nor the bandwidth of modern PC displays with high refresh rates and ultrawide resolutions. PCs would also have benefited from the optical cable solution.
AndreaCicca@reddit
Nothing is stopping you from using them with HDR, Dolby Vision, lossless Atmos or whatever.
If they don't hold up, it's because they were poorly manufactured in the first place and can't handle high enough bandwidth. If you buy good cables today they will still be good cables tomorrow.
An optical cable solution is even worse in the PC space, because nowadays you expect one cable that carries everything.
PrysmX@reddit
What are you even talking about? You can't use the vast majority of the older cables with the latest bandwidth requirements. You end up with signal dropouts. And no, I did not cheap out. It's just the reality that if you're talking about cables from years ago, they were only made to the specs required at the time. It's like having a cable now that supports 48Gbps and, in 10 more years, needing a cable for 16K video or something (theoretical, obviously); the best of the best cables of today probably wouldn't support 192Gbps or whatever the new bandwidth requirement is. We've reached a point where the quality of the HDMI cable absolutely does matter, whereas years ago it did not because of far lower bandwidth requirements. This is my point about repeated obsolescence: the cables are only made to be just enough for whatever the current best quality is.
AndreaCicca@reddit
HDMI has used the same physical standard for ages at this point. With an optical standard you would have the same exact problems that you have now with HDMI.
noonetoldmeismelled@reddit
Valve should work with some budget TV company and release some 55-75" rebranded TVs without HDMI, just DisplayPort. Keep the optical audio port, I need that. Pack in adapters. Someone needs to champion DisplayPort on televisions.
fntd@reddit
DisplayPort has no alternative to eARC and therefore you can't fully get rid of HDMI in the TV space.
hishnash@reddit
DisplayPort over TB does. You have lots of extra bandwidth options there.
AndreaCicca@reddit
You need a common standard such as eARC.
hishnash@reddit
USB and PCIe are both standards that are channeled through TB.
A thunderbolt display can act as a TB/USB multiplexer so it exposes attached USB/PCIe devices to the upstream video source. Thus letting any audio device attached to it be directly addressable from the video source.
AndreaCicca@reddit
DisplayPort has to walk on its own legs; Thunderbolt (or USB4) won't be used on TVs. You have to be able to do the exact thing that you do with eARC.
hishnash@reddit
If you were to get rid of HDMI, then why not replace it with a simple TB/USB4 connection?
AndreaCicca@reddit
Because a simple TB/USB4 port needs support from the SoC maker and it would be a more expensive solution than just a DisplayPort input.
DisplayPort should be able to walk on its own legs. In order to start a transition (not a replacement, because you'd still need HDMI support for a long time) it has to have the same features as HDMI: it must have an eARC replacement, CEC, etc.
HDMI is used because it has everything a TV needs and its licensing costs are negligible.
hishnash@reddit
It requires you to put a USB4 dock inside the TV. You would not directly attach the TV SoC to the USB4 port, as those chips do not support the dock-like features needed to let the TV pass through other devices as audio targets.
Why would DisplayPort add eARC when the entire point of modern DisplayPort is that it can be tunnelled within USB, so it can use that ecosystem and its protocols for device handshakes?
AndreaCicca@reddit
"it requires you put in a USB-4 Doc within the TV"
It's frankly a very Frankenstein solution for something that is not needed. As I said "it would be a more expensive solution".
Because that's the current workflow in the audio market. You connect your device to the TV and then you use the eARC port to connect the AV receiver, everything is handled by the TV.
If you want to make a monitor just make a monitor.
hishnash@reddit
I don't think there are any SoCs on the market today that can act as a dock (i.e. pass through TB/USB4) and also act as the main SoC.
The chips made for TB/USB4 dock features are separate from the system SoC you use.
There is a good reason for this: as you reduce the node size, it gets harder to convert the raw voltage signal into bits without defects, so even if your main chip is on 6nm, the parts of the chip that decode and encode the raw signal will not shrink and would use up much more of the die area. A dock needs a LOT of connections into the chip and does not do much on-board logic itself, so it is much cheaper to build the dock chip on a larger node (which is orders of magnitude cheaper).
Then the much higher-end SoC just needs a connection for a single USB/TB link, rather than the 4 or 8 that would be needed for the dock, saving a lot of money.
AndreaCicca@reddit
With my comment I mean that a dock isn't needed because a TB port isn't needed. It's a solution to a problem that doesn't exist.
hishnash@reddit
The problem is that DisplayPort itself does not provide the means for a back-channel audio stream to a secondary video source (aka eARC).
If you want to be able to pass through other connected devices, using TB/USB4 would be the best choice.
It could be used for much more than just passing through audio devices, another useful feature would be passing through a network interface so that you only need to attach one of the many devices under the TV to an ethernet cable.
fntd@reddit
The bandwidth alone doesn't help if there is no standard around it. Or is there something?
hishnash@reddit
There are multiple ways to expose an audio device over TB. It is completely possible for a TB display to also act as a bridge that forwards all other attached TB devices to the host (video source), meaning from the video source you can then select which audio output to use; it would show up just the same as if you had attached that audio output directly to it.
advester@reddit
eARC really sucks. So does CEC.
fntd@reddit
How does eARC suck?
AndreaCicca@reddit
In no way.
akera099@reddit
You don’t need eARC on the HDMI if you have a dedicated optical cable going from your TV to your AVR.
fullsaildan@reddit
Optical is also extremely limited on audio capability/quality. So it’s a dead technology to any of us with 7.1, much less Atmos
advester@reddit
Unless your tv doesn't happen to support pass through of the codec you want because absolutely everything that touches the stream must be licensed for that specific codec.
Protonion@reddit
But then as a side effect you lose the volume control via HDMI CEC, so with optical you're forced to use the AVR remote for just the volume control and TV remote for everything else.
Kyanche@reddit
Ugh why didn't they just come up with a dedicated audio connection instead.
AndreaCicca@reddit
Optical cable is a dead standard at this point
noonetoldmeismelled@reddit
Damn I do use eARC. It'd be nice to have HDMI and eARC then
lordosthyvel@reddit
eArc is the audio return channel. Why would you need both optical and eArc at the same time?
noonetoldmeismelled@reddit
I don't. I used to use eARC but switched to optical for my cheap class D amp. I used to use eARC. Memories flooding in
lordosthyvel@reddit
Optical quality is worse than eARC. eARC supports lossless audio…
Loose_Skill6641@reddit
which chipset do they use?
noonetoldmeismelled@reddit
Damn. That is a problem. Can they stick the cheapest brand of N100 mini-PC into a 55-75" television and make a SteamOS TV?
AndreaCicca@reddit
We are talking TVs, not a PC.
c010rb1indusa@reddit
Optical audio is not ideal for PC gaming either, because you can't actually output 5.1 surround sound for games unless it's a Dolby Digital 5.1 or DTS 5.1 bitstream, which are compressed and lossy surround sound formats.
Optical works if you have premade content like a video file with DD5.1 or DTS tracks built in, but a standard PC cannot encode TO DD5.1 or DTS in real time unless the soundcard has a feature called Dolby Digital Live.
Consoles DO have this capability to encode to DD5.1/DTS5.1 in real time but PCs don't.
coltonbyu@reddit
I can't imagine many people being okay buying a TV with NO hdmi. HDMI + Displayport is a far friendlier solution, and more convenient for just about everybody.
Sure, it's less of a protest, but that HDMI adapter isn't suddenly going to make eARC and CEC stuff work nicely.
Die4Ever@reddit
Use both at the same time lol, the HDMI 2.0 for CEC and audio, just the DP for the video feed
coltonbyu@reddit
hence my comment about the TV needing both. His comment said to avoid HDMI entirely.
A TV with a handful of both ports would be excellent. A TV without any HDMI will be returned heavily
Kemaro@reddit
Just to add some additional info, this is only a problem for AMD (and maybe Intel?) on Linux. Nvidia fully supports HDMI 2.1 and all of its features. This is because the Nvidia driver is still mostly proprietary, even though they have open-sourced the kernel modules.
cabbeer@reddit
I didn't realize that was a thing... I can do 4K 120 out on Linux with my DisplayPort-to-HDMI cable, I thought it was 2.1?
AndreaCicca@reddit
Your converter is likely hdmi 2.1
shroudedwolf51@reddit
I guess I feel like I don't understand why bother dealing with all of the licensing drama for a discount PC that most people will not have hooked up to anything more than a cheap 1080p TV.
AndreaCicca@reddit
Cheap 1080p TVs are dead at this point. We aren't in the 2010s.
Ploddit@reddit
TL;DR, hardware interface standards should not be proprietary.
Kyanche@reddit
Ahem.
"ECOSYSTEM"
My most hated word. -runs-
Lucie-Goosey@reddit
Amen.
Lucie-Goosey@reddit
We should have some sort of global international agreements in place for developing open protocol standards for hardware and software.
DaMan619@reddit
If only we had an International Organization for Standardization
FibreTTPremises@reddit
ah yes... iOS.
advester@reddit
HDMI is just DisplayPort with different branding that you have to pay for 4 times over.
smartsass99@reddit
Feels like HDMI standards are drama every year.
bick_nyers@reddit
Oh so that's why my linux laptop can't leverage HDMI 2.1, TIL.
I wonder if a thunderbolt to HDMI 2.1 adapter will work or not... (my guess is no)
Unfortunately many monitors only have one displayport input.
yyytobyyy@reddit
It could. Video over usb-c/thunderbolt is transported using DisplayPort protocol.
SANICTHEGOTTAGOFAST@reddit
Only TBT5/USB4v2 supports tunneling DP2 links, you're limited to DP1.4 otherwise - though you can still do two independent DP1.4 tunneled links with a tiled MST config over TBT3/4/USB4v1.
DarianYT@reddit
HDMI has always been like this; it's the exact reason why VESA wanted to kill it many years ago.
Loose-Internal-1956@reddit
The HDMI Forum needs to be dissolved.
arandomguy111@reddit
I don't follow this as much since I don't use Linux to that extent, but doesn't Nvidia support HDMI 2.1 in Linux (both the closed and open source drivers) because they use a closed-source binary blob for it?
If so this seems like it's also addressable on the AMD and Valve side as well. However is there an ideological road block related to not wanting to implement a closed source solution?
YouDoNotKnowMeSir@reddit
It should be open source and I’d like them to be vocal about it. It would be nice if they had the hardware capabilities for 2.1 and then roll out a software update later if they ever make progress on it being open source.
kwirky88@reddit
Can Valve do what Compaq did and clean-room this?
Lucie-Goosey@reddit
Let's pray HDMI forum sees sense with open source
Routine-Lawfulness24@reddit
“Digging in” haha it’s like the most surface level shit lol
Euler007@reddit
I can understand Linux not paying because it's free software and it's unmanageable, but this is a physical box. How much could it be, one dollar?
biscotte-nutella@reddit
Man screw HDMI, display port is here anyways
mckirkus@reddit
Isn't this a limitation of the older GPU they're using?
angry_RL_player@reddit
people be yapping anything just to shit on AMD
surf_greatriver_v4@reddit
brother read the article