So for the last … maybe 6 years I’ve been using an Acer Aspire E5-571G, which back when I got it was quite a nice gaming setup. Nowadays it’s a bit weak, but for my intents and purposes it’s still good, so until it actually breaks down I’ll stick with it. Besides, it’s red.
The specs:
System: Host: kirika.eregion.home Kernel: 4.12.14-lp151.28.44-default x86_64 bits: 64 Console: tty 0 Distro: openSUSE Leap 15.1
Machine: Type: Laptop System: Acer product: NC-E5-571G-53GC v: V1.32 serial: NXMM0EG0014180FA073400
Mobo: Acer model: EA50_HB v: V1.32 serial: Type2BoardSerialNumber UEFI: Insyde v: 1.32 date: 09/15/2015
Battery: ID-1: BAT1 charge: 54.7 Wh condition: 54.7/55.9 Wh (98%)
CPU: Dual Core: Intel Core i5-4210U type: MT MCP speed: 2394 MHz min/max: 800/2700 MHz
Graphics: Device-1: Intel Haswell-ULT Integrated Graphics driver: i915 v: kernel
Device-2: NVIDIA GM108M [GeForce 840M] driver: nvidia v: 440.82
After maybe three years I upgraded it from rotating rust to a 1 TB SSD and doubled the memory, so it’s still a pretty decent setup – even with the kind of wimpy NVIDIA 840M, since the screen only does 1366×768… but since it’s “only” a 15.6″ screen, that still gives you 100 DPI – and that’s actually better than my desktop system, which only gets me 82 DPI at full HD resolution.
Anyway, the issue at hand here is that the laptop has two graphics cards: one integrated Intel chip, and the NVIDIA one. On Linux, that means you can either configure your system to use one of them for everything and ignore the other one completely – which gives you the choice between no 3D performance or no battery life – or you can use the Linux version of what NVIDIA has dubbed Optimus, which in short means you use the Intel chip for everything that doesn’t need 3D performance, and the NVIDIA chip for games.
Until a while ago on Linux you’d do this with bumblebeed (to power off the NVIDIA card when you don’t need it) and either optirun or primusrun as a wrapper for all apps that need decent OpenGL… so you’d have to put “primusrun %command%” as launch parameters in Steam for every game that needs the NVIDIA card. And this is where the problems start: there are two such wrappers, optirun and primusrun, and of those only optirun works with NVIDIA drivers past version 418.113. But optirun does not work with Steam – or at least I haven’t gotten it to work. Also, old drivers = old security holes = not a good idea. Add to that mix that bumblebeed has not been updated at all in the last seven years, and you have the potential for all kinds of trouble…
But enough of the past, enter the present: NVIDIA PRIME render offload. In short, it achieves the same thing as the whole Optimus/bumblebeed affair: the desktop and all regular apps use the Intel card, and apps that use OpenGL and need actual performance get offloaded to the NVIDIA card.
Here’s what you need to do to get this to work on openSUSE:
- wait for openSUSE Leap 15.2, or run openSUSE Tumbleweed…
If that’s not an option for you, follow these steps:
- uninstall the nvidia-bumblebee* packages with “zypper rm -u nvidia-bumblebee nvidia-bumblebee-32bit”
- blacklist the nvidia-bumblebee* packages with “zypper al nvidia-bumblebee nvidia-bumblebee-32bit”
- uninstall bumblebee, bbswitch and primus with “zypper rm -u bumblebee bbswitch primus”
- add and activate the openSUSE nvidia repository (it’s an option in YaST)
- install the right nvidia drivers for your card, in my case the G05 drivers are the right ones – YMMV, I actually have no idea if this works with cards that can’t use the G05 driver set…
- add the X11:XOrg repo for 15.1 with “wget -P /etc/zypp/repos.d https://download.opensuse.org/repositories/X11:/XOrg/openSUSE_Leap_15.1/X11:XOrg.repo”
- make sure that repo has priority 98, autorefresh on, and is enabled; with “zypper lr -p” it should look like this:
  9 | X11_XOrg | Yes | (r ) Yes | Yes | 98
- upgrade your whole X11 stack to the versions from the X11:XOrg repo with “zypper dup -l --allow-vendor-change --allow-arch-change --recommends --allow-downgrade”
- install suse-prime: “zypper in suse-prime”
- reboot
- log in on the graphical environment, and open a shell, in which you’ll run “sudo prime-select intel”
- log out of the graphical environment, and log back in on it
- open a shell, and verify that “xrandr --listproviders” produces output that shows two providers: the Intel card and the NVIDIA card.
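For reference, the package and repo juggling from the steps above condenses into one (root) shell session. Treat it as a sketch, not a script to run blindly: the NVIDIA repo URL and the x11-video-nvidiaG05 package name are how it looks on my Leap 15.1 install, and may differ on yours.

```shell
# remove and lock the bumblebee driver packages
zypper rm -u nvidia-bumblebee nvidia-bumblebee-32bit
zypper al nvidia-bumblebee nvidia-bumblebee-32bit
zypper rm -u bumblebee bbswitch primus

# add the NVIDIA repo (also doable via YaST) and install the G05 drivers
zypper ar -f https://download.nvidia.com/opensuse/leap/15.1 NVIDIA
zypper in x11-video-nvidiaG05

# pull in the newer X11 stack from X11:XOrg at priority 98
wget -P /etc/zypp/repos.d https://download.opensuse.org/repositories/X11:/XOrg/openSUSE_Leap_15.1/X11:XOrg.repo
zypper mr -p 98 X11_XOrg
zypper dup -l --allow-vendor-change --allow-arch-change --recommends --allow-downgrade

# install suse-prime, then reboot and run "sudo prime-select intel"
zypper in suse-prime
```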
Now, whenever you want to run an app on the NVIDIA card, all you have to do is set __NV_PRIME_RENDER_OFFLOAD=1 and __GLX_VENDOR_LIBRARY_NAME=nvidia, for example like this:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"
In fact, if all went well, this command should produce exactly one line, with the name of your NVIDIA card in it.
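If you don’t want to type those two variables every time, a tiny wrapper script does the trick – “nvrun” is just a name I made up, drop it anywhere in your PATH and make it executable:

```shell
#!/bin/sh
# nvrun (hypothetical name): run any command offloaded to the NVIDIA card
export __NV_PRIME_RENDER_OFFLOAD=1
export __GLX_VENDOR_LIBRARY_NAME=nvidia
exec "$@"
```

Then “nvrun glxgears” runs glxgears on the NVIDIA card, and in Steam you can set a game’s launch options to “nvrun %command%” instead of spelling out both variables.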
Optionally you can install “plasma5-applet-suse-prime”, a Plasma widget that lets you switch between the Intel and the NVIDIA card – but with render offloading set up you don’t really want to switch; you just want to offload individual apps.
One last thing: this whole process is one of the reasons why I have set up my systems based on LVM – I do have a snapshot of my root from just before I did all this, but I don’t have to suffer all the pain with BTRFS…
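For the curious, that safety net is a one-liner with LVM – the volume group and LV names below are just examples from my layout, adjust them to yours (and the snapshot needs free extents in the VG):

```shell
# take a snapshot of the root LV before touching the drivers
lvcreate --size 10G --snapshot --name root_pre_prime /dev/system/root

# if the X11/driver surgery goes wrong: merge the snapshot back into
# the origin (the merge completes on the next activation/reboot)
# lvconvert --merge /dev/system/root_pre_prime

# if everything works: just drop the snapshot
# lvremove /dev/system/root_pre_prime
```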
Have a lot of fun!