reGPU - Reviving old Optimus laptops on Linux!
Posted by NotSoEpicKebap@reddit | linux | 30 comments
Legacy Optimus cards have always been a pain to set up on Linux, and in my case, nothing worked at all.
My GT520M partially works on older distros, even though some applications pick the iGPU instead. Unfortunately, Debian 11 can't reliably run modern applications without a glibc update, which breaks NVIDIA's libGLX for some reason, making my dGPU unusable once again.
So what did I do?
I wrote a completely new way of using legacy Optimus cards on Linux. The iGPU copies the frames from the NVIDIA X server directly to /dev/fb0, which effectively makes the NVIDIA card the primary X device.
It has some limitations, though, like high power consumption (the card is always on, quite the opposite of what Optimus was built for).
But if you're someone who doesn't really care about power saving and all you want is raw performance, it's totally fine.
Note: This project is still WIP
az-hafez@reddit
Would that work on FreeBSD (now, or at least in the future)?
I tried FreeBSD before with NVIDIA's 470 driver, but found that FreeBSD's legacy 470 driver doesn't support the DRM kernel module, unlike on Linux,
and that left the laptop using only the Intel iGPU (I also tried VirtualGL, but I couldn't get it to work, and even if it had worked it wouldn't run Vulkan games, even though the NVIDIA 470 driver has Vulkan 1.2 support).
NotSoEpicKebap@reddit (OP)
470 should support PRIME, which is way simpler and more straightforward to use. Or does FreeBSD not support it?
This thing I've made isn't really dependent on Linux; any UNIX-like system that has NVIDIA support and an exposed framebuffer device should technically work.
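For reference, the PRIME render offload mentioned above is usually just environment variables on Linux. A generic sketch of a wrapper (glxgears is only an example app; `DRI_PRIME` is Mesa's switch, the `__NV_*` variables are NVIDIA's documented proprietary-driver switches):

```shell
#!/bin/sh
# PRIME render offload: run a single app on the dGPU while the iGPU
# keeps driving the display.
run_on_dgpu() {
    DRI_PRIME=1 \
    __NV_PRIME_RENDER_OFFLOAD=1 \
    __GLX_VENDOR_LIBRARY_NAME=nvidia \
    "$@"
}

# With a real X session you would do: run_on_dgpu glxgears
# Here we just show the variables reach the child process:
run_on_dgpu sh -c 'echo "DRI_PRIME=$DRI_PRIME"'
```

This is essentially what Arch's `prime-run` script does for the proprietary driver.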
az-hafez@reddit
On FreeBSD there are no DRM modules for NVIDIA 470.
DRM modules only exist on FreeBSD for later driver versions, because FreeBSD developers took the DRM module source code from NVIDIA's open kernel modules.
NVIDIA didn't provide FreeBSD DRM support themselves.
Longjumping-Youth934@reddit
Can I already use your software?
NotSoEpicKebap@reddit (OP)
It's extremely unstable right now. I will publish it once it manages stable FPS in a game like Minecraft
LemonXy@reddit
I have a laptop with a GT 520MX and HD 3000. With DRI_PRIME=1 I can easily get a modern OS (Debian 13) to use the NVIDIA GPU, but nouveau means the dedicated GPU ends up being slower than the integrated Intel iGPU... A real shame that NVIDIA's drivers aren't available for modern kernels (and that nouveau is stuck in power-saving mode).
NotSoEpicKebap@reddit (OP)
Nouveau is easy to set up but gives you far less performance. That's one of the reasons I'm working on this project.
GradSchoolDismal429@reddit
I thought Nouveau was on par with NVIDIA's driver for older cards, up to the 700 series.
LemonXy@reddit
Yeah... but the NVIDIA drivers aren't available on more modern kernels, so at some point you have to choose between a supported OS and having the NVIDIA drivers with their performance.
NotSoEpicKebap@reddit (OP)
In the screenshot provided above, I was using a 6.19-zen kernel on modern Arch Linux. So they do work; you just need a patched version (like the one on the AUR).
LemonXy@reddit
I did not know modded drivers were a thing, that makes this super interesting.
redundant78@reddit
Yeah, nouveau's reclocking support for Fermi cards is basically nonexistent, so it stays stuck at the lowest performance level, which is why it ends up slower than the iGPU. That's exactly what makes OP's approach cool: it actually uses the proprietary NVIDIA driver and just copies frames to the framebuffer, so you get real GPU performance instead of nouveau's crippled output.
Shished@reddit
You should get rid of it, it is 15 years old.
NotSoEpicKebap@reddit (OP)
Downvoted this from my MSI CX640!
c3h7oh@reddit
Let me downvote your comment from my 2010 ThinkPad X201.
xz3phyr@reddit
Folks, not all of us are rich.
Shished@reddit
You don't need to be rich to buy some actually usable stuff.
phantomzero@reddit
Your head is dangerously far up your own ass.
xz3phyr@reddit
Not everyone lives in a first-world country; hardware should be preserved as long as it's usable for its purpose.
TheSapphicDoll@reddit
??? what kind of silly take is that?
Consider: you just might not have the budget to do this. Just not. Because that is a thing. It can happen.
PsyOmega@reddit
Optimus has some weird configs out there.
On my T420 you can disable the dGPU, because the iGPU was the primary.
Yet on my P51, the dGPU was the primary, and you could disable the iGPU.
NotSoEpicKebap@reddit (OP)
I've never seen an Optimus device with the discrete card as primary, nor have I heard of one.
PsyOmega@reddit
Thinkpad P51 (and likely P50, P52, P53). Now you have.
NotSoEpicKebap@reddit (OP)
Cool device you got there mate
Drwankingstein@reddit
does cosmic not handle these devices properly?
NotSoEpicKebap@reddit (OP)
The issue my program fixes here isn't the way the discrete card is power-managed, but rather the way its output is copied to the actual display.
On muxless systems, the discrete card is not physically connected to the display, so it needs the integrated card's cooperation to actually output anything to it.
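For context, the stock way this iGPU/dGPU cooperation is wired up on Xorg (with drivers that support it) is PRIME output via xrandr. A generic sketch, guarded so it no-ops without an X session; provider names like `modesetting` and `NVIDIA-0` vary per system:

```shell
#!/bin/sh
# PRIME output setup on Xorg: the iGPU (sink) scans out what the
# dGPU (source) renders. Only meaningful inside a running X session.
if command -v xrandr >/dev/null 2>&1 && [ -n "$DISPLAY" ]; then
    xrandr --listproviders
    # Tell the iGPU's modesetting provider to display the dGPU's output
    xrandr --setprovideroutputsource modesetting NVIDIA-0
    xrandr --auto
    configured=yes
else
    configured=no   # no X session here; nothing to configure
fi
echo "configured=$configured"
```

OP's project presumably exists because this path is flaky or unavailable with the legacy drivers these old cards need, hence the manual copy to /dev/fb0 instead.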
xz3phyr@reddit
holy shit i really needed this
Fabulous_Tea_322@reddit
B. L
opa_brass@reddit
Did you forget to share the link to the project or is it deemed not ready yet?
NotSoEpicKebap@reddit (OP)
It's barely optimized right now, plus input isn't working yet.