NVidia sucks for Linux
Posted by aih1013@reddit | linux | View on Reddit | 48 comments
Sorry, this is going to be a vent. I've owned a host of NVidia GPUs, including a 1080 Ti Founders Edition, for some time now. Probably 10 years or so. My workstation is used purely for work, so even if I have minor glitches here and there, I cannot justify spending a lot of time troubleshooting. But recently all Chromium-based browsers started to crash on video playback.
That was a blocker, so I took out my old gdb and pinpointed the problem to… the NVidia drivers; to a conflict between the glue layer and the drivers, actually. So I bought a Radeon.
Crashes were solved. But!
Video update latency - gone!
Flickering - gone!
Wake from sleep issues - gone!
Sound problems - gone.
OMG!
dorkofeverything@reddit
Jensen Huang told his employees not to have 1 on 1 conversations with him
To me he's just another sociopath in a jacket
Seriously, play nice with open source. Or face criminal punishments, I say
vanillaknot@reddit
I work for a company that does engineering simulation software -- very impressive, incredibly functional, alarmingly expen$ive software. There are literally hundreds of machines inside our offices running RHEL, Rocky, Ubuntu, and SLES with nvidia GPUs. Quite a few are VMs using data center share-able GPUs in VDI configuration -- my RHEL 9 VDI has its assigned piece of an "NVIDIA Corporation GA102GL [A40]" per lspci. My piece has just 4G memory out of the far larger total because my work doesn't involve solving magnetic meshes of 100M points like some of the other folks do.
nvidia is actually fine. If it weren't, big enterprises like my company, and the big enterprises to whom we sell software (you would recognize all their names), would collapse outright.
When you have to maintain hundreds or thousands of machines, and deploy new ones literally every single day, you see where the problems lie. And nvidia is not ever one of them.
martyn_hare@reddit
Enterprise workloads typically use a subset of the driver functionality consumers use, and that functionality is heavily tested by NVIDIA on Linux - the rest totally isn't.
NVIDIA drivers on Linux can't do hardware-accelerated encode/decode of real-time WebRTC video calls inside any major web browser in any vendor-certified capacity. This is basic functionality even an Intel iGPU is capable of. Since you wouldn't use VDI to make video calls due to the added latency (and since the packages needed to do it aren't part of RHEL either), the complete lack of official VAAPI support to make this possible becomes completely irrelevant.
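As a point of comparison, on an Intel or AMD box you can see the VAAPI decode profiles a browser could use with `vainfo` from the `libva-utils` package (a diagnostic sketch; the exact output varies by hardware and driver):

```shell
# List the VAAPI profiles/entrypoints the driver exposes (requires libva-utils).
# Decode support shows up as VAEntrypointVLD next to each codec profile.
vainfo | grep VLD

# In Chromium, chrome://gpu should then report
# "Video Decode: Hardware accelerated" when the browser is actually using them.
```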
Sometimes they remove functionality for market-segmentation purposes too, like deliberately crippling Linux and FreeBSD multi-monitor support one day just because they felt like it, or the time they randomly decided to "accidentally" block PCI-E passthrough with deliberate detection routines (only for GeForce cards; Quadros were fine) until prominent developers (including Red Hat, SUSE and Canonical employees) played a cat-and-mouse game writing code to let everyone bypass their checks at the hypervisor level.
None of that matters to enterprise users because they plan their IT accordingly, but all of it does matter to consumers and hobbyists running Linux on the desktop, where NVIDIA does make things suck.
Tee-hee64@reddit
Person complaining has a GPU that isn't supported by the Nvidia open driver. Seeing a pattern here.
hangint3n@reddit
The best AMD cards don't even compare to the best Nvidia cards in performance. I've been running Nvidia cards for 10 years with little to no issues. They run my desktop just fine and play the games I like. So until something changes I'll keep on keeping on.
Dr_Hexagon@reddit
I'm using Bazzite with an Nvidia 3060 card and the built in driver that comes with the distro. Video playback in Chromium works fine, no crashes.
Bazzite does have a desktop and can be used as a general-purpose Linux distro, not only for gaming.
johncate73@reddit
Or you can just avoid Nvidia in the first place and be assured that your GPU will function properly. If they can't be arsed to care about Linux users, then why should Linux users care about them?
numberoneshodanstan@reddit
Are you offering to replace the GPUs of thousands of people?
johncate73@reddit
Congratulations. You're the second obnoxious person to make this comment. Being a Johnny-come-lately troll is kind of sad.
mWo12@reddit
So your recommendation is to trash nvidia GPUs and just get AMD instead? Are you going to sponsor this replacement for people or what?
johncate73@reddit
No, my recommendation is not to buy them in the first place if you intend to use Linux.
But to answer your obnoxious comment, if you are coming over from Windows and have Nvidia, my suggestion would be to sell the GPU and then replace it with an AMD one if you have a desktop, and if you have a laptop, pray that the Nvidia drivers work since you'd be stuck with it.
As for you, I'd be willing to sponsor you for your remedial preschool class.
Dr_Hexagon@reddit
I was running Windows 10 and then switched to linux because of Windows 11 being so bad. So I already had the Nvidia card. I thought my comment might be of value to other people that might consider switching to Linux but be put off by hearing it has bad support for Nvidia.
The reality is that recent Nvidia drivers work for most people on Linux. I get very close to, if not the same, frame rates in high-end games as I would under Windows.
johncate73@reddit
Fair enough. I'll only say when they don't work, it's not fun. Having been on Linux a long time, for me Nvidia is best avoided.
mikul_@reddit
Sucks on windows too tbh
Significant_Pen3315@reddit
agreed
Blue-Pineapple389@reddit
AMD on Linux is the way. I am on my second card and it is a breeze.
loozerr@reddit
Page-flip timeout issues made me switch back to Windows after getting a Radeon GPU. I guess mixed high refresh rates are not a common use case.
natermer@reddit
I just bought a new Framework 13 laptop. Fedora 43 running GNOME with an 890M iGPU. 120 Hz laptop display with 200% scaling, connected to a 60 Hz external monitor with 125% scaling.
As far as I can tell it "just works".
The only issue I have noticed is that Steam, being the only real X11 app I still use regularly, tries to scale based on whatever monitor it gets started on. The adjustment isn't automatic when moving from one display to another like other apps as far as I can tell.
Using terminals, browsers, emacs (wayland pgtk) work just fine.
Been playing around with hot-plugging an eGPU over USB4 (AMD's non-Thunderbolt-licensed version of Thunderbolt). While that works, it is quite a bit more finicky. It seems to cause issues when trying to resume the laptop after disconnecting the eGPU. It is nice for gaming, but it does mean it is a good idea to reboot the laptop when done.
Also, to get Windows games to behave well when switching between apps and full screen, I need to use Gamescope. But that is the same regardless of using the internal or external GPU. Otherwise every game has its own weirdness with window mode, switching to full screen and back, cursor trapping, and stuff like that. Gamescope makes everything behave, at least for the games I've tried so far.
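For anyone who hasn't used it: Gamescope runs the game inside its own nested compositor, so the game only ever sees one well-behaved virtual display. As a sketch, the Steam launch options look something like this (the resolution here is just an example, not a recommendation):

```shell
# Steam > game Properties > Launch Options
# Run the game inside a nested Gamescope compositor at 2560x1440, fullscreen
gamescope -W 2560 -H 1440 -f -- %command%
```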
Synthetic451@reddit
Eh there's an entire thread of people experiencing the issue: https://gitlab.freedesktop.org/drm/amd/-/issues/4141
AMD is not "just works". All of my AMD GPUs have had more instability issues than my Nvidia 3090.
loozerr@reddit
Yeah, I mentioned high refresh rate; in my case 240 and 480 hertz with a 9070 XT.
natermer@reddit
I see. 480hz setup is indeed probably uncommon among driver devs and testers.
loozerr@reddit
I'd imagine so, I have reported my woes
DoubleOwl7777@reddit
That's more of an X11 issue. Wayland has better support for high-refresh-rate multi-monitor setups.
loozerr@reddit
No it's not. I'm on wayland. The driver is unstable.
Substantial-Sea3046@reddit
Yep, browsers flicker on my AMD/Nvidia laptop; I've learned the only solution is to disable the Nvidia dGPU. This bug has literally existed for years.
On my desktop with a full AMD iGPU/dGPU setup, no problem.
my-comp-tips@reddit
My next card will be an AMD.
mWo12@reddit
Same
oxez@reddit
User admits they are clueless and have no idea what they are doing, then blames everything else. What's new?
"Sound problems - gone."
In a post related to GPU drivers
Welcome to /r/linux everybody.
the_abortionat0r@reddit
Did you not know GPUs are audio devices? Did you think magic happens when you plug a GPU into a TV and sound comes out?
natermer@reddit
Audio over HDMI is pretty common.
Ezmiller_2@reddit
My 2060 II is working fine. You probably needed a new card anyway. 10 years is a good run for modern GPUs IMO, especially with Nvidia cutting support for them recently. You could always try the nouveau driver, but you probably won't like it.
mmmboppe@reddit
I have a 25-year-old laptop with a mobile Radeon and it works out of the box in Linux with zero config.
Ezmiller_2@reddit
Yeah, it's hit and miss. My ThinkPad T430 won't use the 340/390 Nvidia driver with any kernel outside of the 5.x series. Depending on the distro, nouveau works decently, like with Slackware and Debian. But if I run CachyOS, I get really bad screen tearing, regardless of whether I use the nouveau or Intel GPU. Then again, everything runs great with Cachy on my 2060.
SG_87@reddit
I just swapped my fully functional RTX4080 for a 9070XT. Feels great to be team red now.
Ezmiller_2@reddit
Great that you can spend money like that without consequence.
SG_87@reddit
Well, it was basically a trade: I sold the 4080 and bought the 9070 XT with the money.
Ezmiller_2@reddit
Oh ok lol. I was kinda grouchy this morning anyways.
mmmboppe@reddit
you can even remove "for Linux" from your statement and nothing will change. Nvidia took a giant shit on home users and isn't worth any further attention
DevilGeorgeColdbane@reddit
In other news: Water is wet
johncate73@reddit
Beat me to it.
If you want reliable Linux performance, use AMD or Intel GPUs.
aih1013@reddit (OP)
Yes, I’m that old tired monkey.
BinkReddit@reddit
Nvidia hasn't cared in quite a while; all of their money now comes from AI. If you want good graphics, good Linux support, and some AI, AMD is the clear winner here.
Armageddon_Bound@reddit
Now you'll just get page-flip timeout crashes in the middle of your games instead. Joy!
pomcomic@reddit
switched from a (perfectly cromulent) 3070 to an RX 7800 XT for exactly that reason. I've grown tired of Nvidia's driver shenanigans - one version worked fine, the next broke *something*, it got tiring real quick.
also double the VRAM go BRRRRRR
04_996_C2@reddit
"A noble AMD GPU embiggens the smallest VRAM"
TipAfraid4755@reddit
People mostly use Nvidia with Linux on laptops, and that's pretty much what is available for them.
For desktops, AMD is the 100% sane choice.
siete82@reddit
I use nvidia in desktop because cuda
MrLewGin@reddit
I bought a RX 9060XT GPU for that reason.