Bug 1103816 - nvidia-prime for openSUSE
Status: RESOLVED FIXED
Duplicates: 1091828
Classification: openSUSE
Product: openSUSE Tumbleweed
Component: X11 3rd Party Driver
Version: Current
Hardware: x86-64 Other
Priority: P5 - None
Severity: Enhancement
Assigned To: Michal Srb
QA Contact: Stefan Dirsch
Reported: 2018-08-05 06:53 UTC by Mauro Gaspari
Modified: 2018-12-18 05:42 UTC
CC: 6 users

Attachments:
  nvidia prime gui on kubuntu (78.03 KB, image/png), 2018-08-06 09:37 UTC, Mauro Gaspari
  driver manager on kubuntu (48.65 KB, image/png), 2018-08-06 09:40 UTC, Mauro Gaspari
  autobind GPUs to the screen, (v5) (3.99 KB, patch), 2018-08-15 11:58 UTC, Michal Srb
  Xorg.0.log (46.30 KB, text/plain), 2018-08-22 12:16 UTC, Stefan Dirsch
  Xorg.8.log (16.08 KB, text/plain), 2018-10-11 21:22 UTC, Damian Zaręba

Description Mauro Gaspari 2018-08-05 06:53:15 UTC
Hello,

I know this is a very old topic that has been discussed on forums many times. My only goal here is to help improve adoption of openSUSE on the desktop. I hope AMD Ryzen APUs will take over soon and eliminate the issue for everyone on GNU/Linux. Unfortunately, as of today, Nvidia Optimus based laptops are still the majority out there.

I have fiddled with this optimus issue for quite a while. I tried openSUSE, Arch, and Ubuntu (and derivatives).
In my humble opinion, as of today, the one distro that has managed to get the best out of this largely unsupported Optimus hardware is Ubuntu. I know Nvidia-Prime is actually a dumber switching method compared to bumblebee. However, because it is a dumb switch, it works fine out of the box and is easy for a regular desktop user to understand and operate.

Basically, nvidia-prime exposes only one GPU at a time to the OS. Therefore all software, games, Steam, even Wine apps, do not need any special parameter to choose the GPU. They just use the only one that is enabled, as if it were a regular desktop PC with a single GPU. Performance is really good too.
And, as I said, this is really easy and straightforward for any user who either comes from another OS or never had to deal with the infamous Optimus on GNU/Linux.
The main drawback is that switching between proprietary Nvidia and Intel requires a system reboot. That's a bit of a bummer, but I believe the performance and simplicity advantages greatly outweigh this annoyance.

There are some issues with nvidia-prime, but those are being actively worked on. In the bug linked below, for example, Alberto Milone introduced some really nice fixes that made nvidia-prime even better, use less power, and switch faster. I submitted some feedback on that bug, showing power consumption, GPU switching speed, and reboot times. https://bugs.launchpad.net/ubuntu/bionic/+source/gdm3/+bug/1778011

I hope the openSUSE team considers this as an alternative to bumblebee.

Best Regards
Mauro
Comment 1 Michal Srb 2018-08-06 07:34:50 UTC
I do not have much practical experience with optimus laptops, since I do not own such hardware, so please correct me if I am wrong about something. However, I know the graphical stack and how it is supposed to work.

At this moment, you should be able to use both GPUs out-of-the-box, and either:
* If the nvidia GPU is primary, then intel should be automatically configured for additional outputs. In this case everything renders on nvidia. Power consumption is higher.
* If the intel GPU is primary, then nvidia should be automatically configured for render offloading. In this case, you can use the DRI_PRIME variable to select individual applications for render offloading (see the example below). If you do not offload anything to nvidia, it should be idle and power consumption should be low.
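
For example, with the open-source drivers a single application can be offloaded like this (a minimal sketch; the renderer string reported depends on the actual GPU):

  DRI_PRIME=1 glxinfo | grep 'OpenGL renderer string'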

The primary GPU is by default the boot GPU, i.e. the one selected by the BIOS. On some machines this can be configured, on some not. You can select a different primary GPU by creating a configuration file for the X server.
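
As a sketch, such a file could contain little more than a Device section for the desired primary GPU (the Identifier and BusID are illustrative; the real bus ID can be read from lspci):

  Section "Device"
      Identifier "nvidia"
      Driver "nvidia"
      BusID "PCI:1:0:0"
  EndSection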

If I understand it correctly, nvidia-prime is a set of scripts that modifies the configuration files to change which GPU is primary. That sounds like a fine tool to have, although ideally users would not need it.

Is there any other aspect of nvidia-prime that I am missing?
Comment 2 Mauro Gaspari 2018-08-06 09:37:59 UTC
Created attachment 778973 [details]
nvidia prime gui on kubuntu
Comment 3 Mauro Gaspari 2018-08-06 09:38:43 UTC
Hello Michal,

I am happy to provide some background on this optimus chip and the solutions currently used by various distributions, including openSUSE.

Background:
Basically, around 2010 Nvidia released Optimus chipsets, supported on Windows, with an intel GPU and an nvidia GPU. The Nvidia software uses the intel GPU to render the video outputs. Then, when an application is GPU intensive, such as a game or 3D design software, the Nvidia driver detects that, lets the Nvidia GPU render that software, and passes the result to the intel GPU that manages the desktop.
This is largely unsupported on Linux even after so many years, which led Linus Torvalds himself to express his thoughts on Nvidia in a video available on the net. The video explains well how unsupported the solution was.

After a while, a project called bumblebee came out, providing similar features, minus the automatic choice of GPU for 3D rendering. A user on GNU/Linux must manually launch software using "optirun" as a prefix to run it on the Nvidia GPU, e.g.: "optirun glxgears".
bumblebee webpage: https://www.bumblebee-project.org/
It seems the last version, coincidentally called tumbleweed, was released in 2013. I see no further updates there. However, I am not sure if distribution developers update the code themselves.

Bumblebee:
Bumblebee, while being the closest thing to the nvidia solution for Windows, is not officially supported and is a bit hit and miss. Performance is not always good, it adds some lag to the visual output, and it straight up does not work with some apps/games. However, it is still the default in most distributions. openSUSE and Arch use it as their primary solution to tackle the Optimus issue.

Nvidia prime:
Canonical and Nvidia recently worked on a better supported solution called Nvidia PRIME. To understand how it works, I point you to a few additional resources:

Nvidia Post explaining prime and primesync
https://devtalk.nvidia.com/default/topic/957814/linux/prime-and-prime-synchronization/

Gentoo Wiki does a good job explaining prime
https://wiki.gentoo.org/wiki/NVIDIA/Optimus

Ubuntu nvidia-prime package
https://packages.ubuntu.com/bionic/nvidia-prime

nvidia-prime is now the default on Ubuntu and Ubuntu-based distributions; it is actively developed and provides excellent performance and power savings. A user chooses which card to use for rendering in the Nvidia GUI and reboots if there was a change; from then on the OS always uses that GPU, even across reboots. So if you choose to use the intel GPU, it will be the GPU used until you use the CLI or GUI to change to the Nvidia GPU. Also, PRIME sync seems to be the next step that will improve the solution even more. Some other distributions outside Ubuntu clone this: for example Fedora has a clone, and openSUSE has an unofficial package called suse-prime. In the suse-prime case, however, as far as I know it is not officially supported by openSUSE; I think it was made for 42.3 and likely does not work on the latest Leap or Tumbleweed. https://software.opensuse.org/package/suse-prime
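
For reference, the same switch can be done from the command line on Ubuntu with the prime-select script shipped in the nvidia-prime package (as of 18.04; a reboot is needed after switching):

  sudo prime-select query
  sudo prime-select intel
  sudo prime-select nvidia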

I am attaching a couple of screenshots I took from my Kubuntu 18.04 laptop. All a user has to do to get a fully functional Nvidia Optimus laptop is:
1. Install the OS using the "nomodeset" parameter in grub
2. Reboot, using the "nomodeset" parameter in grub only for the first boot
3. Go to Driver Manager in Ubuntu and switch from the nouveau display driver to the Nvidia display driver.
4. Reboot.
5. Enjoy

Sorry, it was a very long post. I hope it does a fair job of explaining why I created this request. I am available to discuss this further. I do have a few laptops that use Optimus, and I am happy to run tests if needed.
Comment 4 Mauro Gaspari 2018-08-06 09:40:14 UTC
Created attachment 778974 [details]
driver manager on kubuntu
Comment 5 Michal Srb 2018-08-06 11:29:15 UTC
Thank you for the overview. I am familiar with the PRIME infrastructure; I made a few fixes for it in the X server back when it was new. It has been present upstream for a while now, and also in openSUSE. Note that it is not specific to nvidia; many other drivers support it. I am personally using it for intel+udl and amdgpu+udl combinations.

I am trying to understand why that is not enough for optimus laptops. What do the scripts in nvidia-prime actually do? (You certainly don't need them to use PRIME.)

So far it seems that they are there for easier configuration, which I am certainly in favor of having, especially if users want to switch the primary GPU regularly. Although we would need to adapt it.

For example, the prime-offload script: it is run at the start of a graphical session and runs the xrandr tool multiple times to query and set up output sourcing between multiple GPUs. We don't need that, because we have a patch in the X server that does the same auto-configuration directly from within. Our patch also handles the render offloading configuration automatically. In my opinion that is a cleaner solution.
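
For context, the xrandr calls such a script makes boil down to something like this (a sketch; the provider names vary with the drivers in use):

  xrandr --setprovideroutputsource modesetting NVIDIA-0
  xrandr --auto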

The screenshot in comment 2 is nvidia-settings. The "PRIME Profiles" tab is not normally there; Ubuntu added it with their custom patch "08_add_prime_support.patch". In the background it runs the prime-select script. YaST would be the natural place for such graphical configuration in openSUSE.

Can you please describe how openSUSE behaves out-of-the-box on optimus laptop at this moment? And what functionality is missing? Is it just the switching of primary GPU, or something more?
Comment 6 Mauro Gaspari 2018-08-06 15:17:04 UTC
Michal,

As far as I understand from the openSUSE wiki, the default in openSUSE is to use bumblebee, not PRIME.
SDB:NVIDIA Bumblebee for Optimus hardware (common on laptops with Intel chipsets)
https://en.opensuse.org/SDB:NVIDIA_Bumblebee

I found some forum posts and git pages about suse-prime, but most of those refer to Leap 42.x, and multiple replies confirm it is no longer working and has not been updated. https://software.opensuse.org/package/suse-prime
Comment 7 Michal Srb 2018-08-13 11:37:19 UTC
I got my hands on an optimus laptop and did some testing.

Unfortunately I was not able to test with nouveau, because it fails to initialize on that laptop. I plan to look into that, but it is a separate bug. I expect that most owners of optimus laptops will want to use the nvidia proprietary driver anyway to get the maximum possible performance.

With the nvidia proprietary driver I made two observations:
* The Nvidia driver does not support render offloading! And we can not do anything about that; Nvidia would have to implement it. After searching around online I see that this is common knowledge among optimus laptop owners, but I did not know it. This is the key point that explains why one can not configure the Windows-like setup where everything runs on the intel GPU and only selected applications sometimes run on the nvidia GPU. Without render offloading, one can either run everything on intel or everything on nvidia. Switching between these two setups requires changes to the X configuration. The nvidia-prime scripts serve to switch the configuration automatically.

* When nvidia is used, the output sourcing from intel is not configured automatically, even though it should be. I debugged this: our auto-configuration patch bails out because nvidia does not set up xf86CrtcConfig for the main screen. The output sourcing can be set up externally using xrandr, and the script in nvidia-prime can be used to do that on every start. I started working on an alternative patch that will be able to auto-configure nvidia from within the X server as well, but it is not ready yet.

I will work on getting nvidia-prime into the distribution (with the exception of the auto-configuration, which should be done by the X server internally). Then it should be documented in the wiki as a possible alternative to bumblebee.
Comment 8 Mauro Gaspari 2018-08-13 12:22:48 UTC
Hello Michal,

That is great news, thanks for the effort. I have an nvidia Optimus based laptop, a Gigabyte AERO14. I was unable to install Tumbleweed on it a few months ago; I think it was due to a problem with Skylake CPUs, which I later found documented on the Tumbleweed wiki.
After I failed with Tumbleweed, I tried Arch and managed to install it, but using bumblebee + the proprietary nvidia driver I also had issues rendering 3D apps and games.

For now I settled for Kubuntu on that laptop, as nvidia-prime works well there. It is far from a Windows-like solution, but at least it does its job, with good performance.
I took a full image with Clonezilla, and I am available to help with testing and troubleshooting when needed.
There are a few days a week when I do not need that laptop, so I can focus the testing efforts then and revert back to Kubuntu until Tumbleweed becomes stable with nvidia-prime.
I will also try to find another spare Optimus based laptop that I can keep on Tumbleweed to help with testing.

If you need my help with testing but do not want to constantly write on this bug tracker, please PM me.

Thanks!
Mauro
Comment 9 Michal Srb 2018-08-15 11:58:41 UTC
Created attachment 779786 [details]
autobind GPUs to the screen, (v5)

Here is an updated auto-configuration patch that is able to auto-configure both the open-source and the nvidia proprietary drivers. An xorg-x11-server package is built with it here:
https://build.opensuse.org/project/show/home:michalsrb:optimus-experiments

No need to test anything at the moment.
Comment 10 Michal Srb 2018-08-15 14:36:02 UTC
One more thing the prime-select script does is switch the /usr/lib64/xorg/modules/extensions/libglx.so alternative between the one from the X server and the one from NVidia.

With the introduction of server-side GLVND this shouldn't be necessary anymore. Server-side GLVND is supported by X server 1.20.0+ (already in Tumbleweed) and reportedly by NVidia driver 396.24+ (not yet in Tumbleweed). Once we have both in Tumbleweed, we should stop using update-alternatives for GLX and use server-side GLVND instead.
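
For reference, the currently selected alternative can be inspected like this (assuming the alternative is registered under the name libglx.so, matching the path above):

  update-alternatives --display libglx.so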

Then the only thing needed to switch between NVidia/Intel will be picking one of two X configs.
Comment 11 Stefan Dirsch 2018-08-15 15:34:14 UTC
Ah, cool. I wasn't aware that NVIDIA already supports server-side GLVND. Unfortunately 396.xx is the current short-lived branch, which I usually don't package. So this will still need some time before we can switch ...
Comment 13 Michal Srb 2018-08-16 09:50:39 UTC
Hm. I tried a local build of the 396 driver and it seems I was wrong. It does not appear to support server-side GLVND. So we will keep the alternatives switching for now.
Comment 14 Stefan Dirsch 2018-08-16 10:06:27 UTC
(In reply to Michal Srb from comment #13)
> Hm. I tried a local build of the 396 driver and it seems I was wrong. It does
> not appear to support server-side GLVND. So we will keep the alternatives
> switching for now.

Hmm. Maybe you need to set some option (check the README). Not sure why you thought NVIDIA would support this already. Has this been announced somewhere?
Comment 15 Michal Srb 2018-08-17 12:20:20 UTC
(In reply to Stefan Dirsch from comment #14)
> Hmm. Maybe you need to set some option (check the README). Not sure why you
> thought NVIDIA would support this already. Has this been announced somewhere?

I saw an announcement that 396 supports the 1.20.0 ABI and saw multiple GLVND-related options in the help of the .run file. But upon closer inspection, they are all client-GLVND related. No server-side GLVND support yet.


I took the suse-prime scripts and removed things that are no longer needed:
https://github.com/michalsrb/SUSEPrime
https://build.opensuse.org/package/show/home:michalsrb:optimus-experiments/suse-prime

It works for me now. With xorg-x11-server and suse-prime from that repository I can easily switch between intel and nvidia.

As root:
  prime-select nvidia
  systemctl restart display-manager

As user in session:
  xrandr --listproviders
    Providers: number : 2
    Provider 0: ... associated providers: 1; name: NVIDIA-0
    Provider 1: ... associated providers: 1; name: Intel
  glxinfo | grep 'OpenGL renderer string'
    OpenGL renderer string: GeForce GT 640M LE/PCIe/SSE2

As root:
  prime-select intel
  systemctl restart display-manager

As user in session:
  xrandr --listproviders
    Providers: number : 1
    Provider 0: ... associated providers: 0; name: modesetting
  glxinfo | grep 'OpenGL renderer string'
    OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile


Mauro, can you please test the xorg-x11-server and suse-prime from the https://download.opensuse.org/repositories/home:/michalsrb:/optimus-experiments/openSUSE_Tumbleweed/ repository?
Comment 16 Mauro Gaspari 2018-08-18 12:55:57 UTC
Michal,

I installed Tumbleweed on my optimus laptop (Gigabyte AERO14 v7, i7-7700, Nvidia 1050ti).
As often happens with my optimus laptops and Linux graphical installers, the installer freezes during startup. Below are the steps I followed during installation:

1. Add "nomodeset" at boot parameters, installed tumbleweed via text based installer
2. Update system, using zypper ref and zypper dup
3. Reboot
4. Add optimus-experiments repo : https://download.opensuse.org/repositories/home:/michalsrb:/optimus-experiments/openSUSE_Tumbleweed/
5. Install prime packages: zypper in optimus-experiments:suse-prime optimus-experiments:xorg-x11-server
5b. An error reports unable to find Nvidia drivers package nvidia-glG04
5c. Add nvidia community repository from yast.
5d.  Install prime packages: zypper in optimus-experiments:suse-prime optimus-experiments:xorg-x11-server works (noticed the nvidia-glG04 and also nouveau packages being pulled and installed).
6. remove nomodeset from grub boot parameters
7. Run prime-select nvidia
8. Reboot
9. SDDM fails to load, Systemd logs show this line: sddm-greeter[1854]: segfault at 28 ip 00007f5d99bcdb44 sp 00007fff603de4b8 error 4 in libnvidia-glsi.so.390.77[7f5d99b76000+7c000]
10. ctrl+alt+f2 drop to terminal, login, run prime-select intel and reboot
11. system boots with in intel mode, sddm and kde work fine.

So, for now intel mode works fine, but I am unable to switch to nvidia mode. Please let me know if further system logs or other info are needed from my side. Also please check whether the installation procedure is correct.

Thanks
Mauro
Comment 17 Stefan Dirsch 2018-08-22 12:13:27 UTC
Michal, I tried this on our brand new Dell Precision 5520 laptop on Tumbleweed, with your updated xorg-x11-server and suse-prime packages + our G04 nvidia packages.

Graphics hardware:
00:02.0 VGA compatible controller: Intel Corporation HD Graphics 530 (rev 06)
01:00.0 3D controller: NVIDIA Corporation GM107GLM [Quadro M1200 Mobile] (rev a2)

# prime-select intel
# systemctl restart display-manager

Looks good!
# xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name: modesetting
    output eDP-1
    output DP-1
    output HDMI-1
    output DP-2
    output HDMI-2

# glxinfo | grep 'OpenGL renderer string'
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2) 

# prime-select nvidia
# systemctl restart display-manager

=>> Blank screen

# DISPLAY=:0.0
# xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x219; cap: 0x1 (Source Output); crtcs: 0; outputs: 0; associated providers: 0; name: NVIDIA-0

# glxinfo | grep 'OpenGL renderer string'
OpenGL renderer string: Quadro M1200/PCIe/SSE2

I will attach X logfile when switching to nvidia.
Comment 18 Stefan Dirsch 2018-08-22 12:16:40 UTC
Created attachment 780409 [details]
Xorg.0.log

X logfile when using nvidia prime configuration.
Comment 19 Michal Srb 2018-08-22 12:40:45 UTC
Thank you Mauro and Stefan for testing! I will focus on Stefan's case first, since it looks clearer.

I just noticed that the /etc/prime/xorg.conf (which is copied to /etc/X11/xorg.conf.d/90-nvidia.conf when nvidia is selected) is set to use the intel driver for the intel GPU.

Stefan, can you please try to edit it to use modesetting? Also can you try with the original and with xf86-video-intel installed?
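
The edit in question is a one-line driver change in the Device section for the intel GPU, along these lines (a sketch; the Identifier is illustrative):

  Section "Device"
      Identifier "intel"
      Driver "modesetting"    # was: Driver "intel"
  EndSection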
Comment 20 Stefan Dirsch 2018-08-22 13:19:44 UTC
Indeed with modesetting things look better.

xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x29f; cap: 0x1 (Source Output); crtcs: 0; outputs: 0; associated providers: 1; name: NVIDIA-0
Provider 1: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 1; name: modesetting
    output eDP-1-1
    output DP-1-1
    output HDMI-1-1
    output DP-1-2
    output HDMI-1-2

# glxinfo|grep "OpenGL renderer string"
OpenGL renderer string: Quadro M1200/PCIe/SSE2

Unfortunately screen remains blank. :-(

The original intel setting with xf86-video-intel installed didn't change anything about the original problem.
Comment 22 Damian Zaręba 2018-09-13 14:21:30 UTC
(In reply to Stefan Dirsch from comment #20)
> Indeed with modesetting things look better.
> 
> xrandr --listproviders
> Providers: number : 2
> Provider 0: id: 0x29f; cap: 0x1 (Source Output); crtcs: 0; outputs: 0;
> associated providers: 1; name: NVIDIA-0
> Provider 1: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload,
> Sink Offload); crtcs: 3; outputs: 5; associated providers: 1; name:
> modesetting
>     output eDP-1-1
>     output DP-1-1
>     output HDMI-1-1
>     output DP-1-2
>     output HDMI-1-2
> 
> # glxinfo|grep "OpenGL renderer string"
> OpenGL renderer string: Quadro M1200/PCIe/SSE2
> 
> Unfortunately screen remains blank. :-(
> 
> The original intel setting with xf86-video-intel installed didn't change
> anything to the original problem.
Try adding:
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
to your .xinitrc or to the startup script of the login manager (e.g. LightDM or GDM) and check if it helps.
Comment 23 Stefan Dirsch 2018-09-14 13:39:56 UTC
Ok. Made another test today with our Dell Precision 5510, the predecessor of the 5520 (same Intel/NVIDIA GPU combo AFAIK), and there I get, without the intel X driver installed:

# prime-select intel
# xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name: modesetting
    output eDP-1
    output DP-1
    output HDMI-1
    output DP-2
    output HDMI-2
# glxinfo |grep 'OpenGL rend'
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2) 

--> Wonderful (as before)

# prime-select nvidia
# xrandr --listproviders 
Providers: number : 1
Provider 0: id: 0x219; cap: 0x1 (Source Output); crtcs: 0; outputs: 0; associated providers: 0; name: NVIDIA-0

So this time the modesetting driver is not loaded at all when the nvidia driver is in use. :-( The results are even worse this time. And I can't give Damian's suggestion a try, obviously. And of course I also get a black screen in NVIDIA mode. Sigh.

Damian, seems you have better results on your hardware, right?
Comment 24 Damian Zaręba 2018-09-14 15:58:57 UTC
(In reply to Stefan Dirsch from comment #23)
> Ok. Made another test today with our Dell Precision 5510, the predecessor of
> the 5520 (same Intel/NVIDIA GPU combo AFAIK), and there I get, without the
> intel X driver installed:
> 
> # prime-select intel
> # xrandr --listproviders
> Providers: number : 1
> Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload,
> Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name:
> modesetting
>     output eDP-1
>     output DP-1
>     output HDMI-1
>     output DP-2
>     output HDMI-2
> # glxinfo |grep 'OpenGL rend'
> OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2) 
> 
> --> Wonderful (as before)
> 
> # prime-select nvidia
> # xrandr --listproviders 
> Providers: number : 1
> Provider 0: id: 0x219; cap: 0x1 (Source Output); crtcs: 0; outputs: 0;
> associated providers: 0; name: NVIDIA-0
> 
> So this time the modesetting driver is not loaded at all when the nvidia
> driver is in use. :-( The results are even worse this time. And I can't give
> Damian's suggestion a try, obviously. And of course I also get a black screen
> in NVIDIA mode. Sigh.
> 
> Damian, seems you have better results on your hardware, right?

Notatki really, I have a black screen without NVIDIA PRIME, so we can try with original Ubuntu. I have a Dell Precision 5510 too, and I get a black screen when trying to use it without suse-prime. I think we should get it nowy from fedora but from Ubuntu directly, or focus on primus-vk instead of full switching, oraz look at how nvidia-xrun did it: mix it with primus-vk and bumblebee and then we have perfect switching, just like DRI_PRIME.
Comment 25 Stefan Dirsch 2018-09-15 10:09:44 UTC
You lost me. No idea what you mean exactly with "Notatki", "Original Ubuntu", "nowy", "primus-vk" and "oraz" ...
Comment 26 Damian Zaręba 2018-09-15 10:16:55 UTC
(In reply to Stefan Dirsch from comment #25)
> You lost me. No idea what you mean exactly with "Notatki", "Original
> Ubuntu", "nowy", "primus-vk" and "oraz" ...

Sorry, Polish autocorrect on a phone. "Notatki" meant "Not", "Original Ubuntu" meant the original PRIME implementation from Ubuntu, "nowy" meant "not", and "oraz" meant "or". This is primus-vk: https://github.com/felixdoerre/primus_vk
Comment 27 Michal Srb 2018-10-04 15:01:58 UTC
I've put more time into investigating the black screen situation: to my surprise, it seems that desktop environments and display managers don't do the equivalent of `xrandr --auto` on start. KDE seems to be the only exception. So they just use the initial monitor setup from the X server. Normally that is fine, because the X server auto-configures all connected monitors.

However, this auto-configuration is done only for the monitors directly connected to the main GPU, not for the monitors connected to additional GPUs. That's why the screen remains black.

I am trying to extend our GPU auto-configuration patch to also auto-configure the outputs of the auto-configured GPUs. It is difficult, since the internal auto-configuration in the X server does not go through RandR; it basically creates a "virtual" xorg.conf with the required configuration and lets the driver interpret it. But AFAIK one can not set outputs from different providers using xorg.conf. I'll keep trying...
Comment 28 Stefan Dirsch 2018-10-05 12:27:31 UTC
Thanks for the update, Michal. I'll try again with an additional "xrandr --auto" once I have the Dell Precision 5520 back from Oliver. It looks like it won't work with the Dell 5510 (see comment #23). BTW, with which laptop are you testing?
Comment 29 Michal Srb 2018-10-05 12:35:59 UTC
(In reply to Stefan Dirsch from comment #28)
> BTW, with which laptop are you testing?

It is a laptop I privately borrowed, a Sony Vaio SVS15117FLB. It has an Nvidia GeForce GT 640M LE.
Comment 30 Mauro Gaspari 2018-10-08 07:21:06 UTC
I have been doing some tests on a few optimus laptops, with help from a friend. This is what I have gathered so far:

Ubuntu with nvidia-prime works, but it is also struggling with some power issues. It appears that the team is working hard on some upgrades; so far they seem to work for some people but result in black screens for others.
https://bugs.launchpad.net/bugs/1778011

Arch with nvidia-xrun seems a decent option; a friend also using a Sony SVS laptop had good results with it. External monitors work fine too. The only issue is that it lacks the "easy switch tools", so he needs to manually start a new X session from the command line.
With some tool to switch the session, perhaps in YaST or, even better, at the login screen, it could be a good alternative. Something like the login option available in Fedora and Ubuntu that lets the user choose between a Wayland or X login. If we could have a menu to log in with intel or log in with nvidia, that would make it really neat.
If users could choose the X session at login, it would have a few benefits:
1. No need to reboot to switch cards.
2. No need to switch from the command line.
3. If something goes wrong while switching to nvidia, it won't display a black screen on reboot; SDDM (or GDM, LightDM) would still be available. The user could then choose to log in to an intel X session and have a usable desktop.
Also, a switching tool in YaST would work if it is easier to implement.

If you want me to run some tests of nvidia-xrun on Arch, I can do that. To run them on Tumbleweed, I might need some help adapting the documentation from Arch to Tumbleweed.

cheers
Mauro
Comment 31 Michal Srb 2018-10-11 09:05:00 UTC
Stefan, can you please retest with updated xorg-x11-server and suse-prime from the https://build.opensuse.org/project/show/home:michalsrb:optimus-experiments repository?

I have improved the auto-configuration patch in X server to configure all outputs of the additional GPUs and also modified the way suse-prime modifies the X configuration.

The prime-select script switches the GLX extension and adds 90-intel.conf or 90-nvidia.conf to /etc/X11/xorg.conf.d. No other changes are necessary. Make sure there is no other /etc/X11/xorg.conf or /etc/X11/xorg.conf.d/* file with Device/Screen/ServerLayout sections, as they could conflict with the configuration from prime-select. It should work fine on top of a clean installation.
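
A quick way to check for leftover configuration that could conflict (a sketch):

  ls /etc/X11/xorg.conf /etc/X11/xorg.conf.d/
  grep -l -e Device -e Screen -e ServerLayout /etc/X11/xorg.conf.d/*.conf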

The modesetting driver is used for the Intel GPU, so xf86-video-intel does not have to be installed.

It worked fine for me with GDM, SDDM, LightDM and XDM and also with Plasma, Gnome shell, icewm and twm.

Please test as described in comment 15.
Comment 32 Stefan Dirsch 2018-10-11 10:44:22 UTC
Michal, great stuff! Works perfectly for me on my Dell Inspiron 5510 here (Tumbleweed, XFCE).

NVIDIA
---------
# prime-select nvidia
# xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x29f; cap: 0x1 (Source Output); crtcs: 0; outputs: 0; associated providers: 1; name: NVIDIA-0
Provider 1: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 1; name: modesetting
    output eDP-1-1
    output DP-1-1
    output HDMI-1-1
    output DP-1-2
    output HDMI-1-2

# glxinfo |grep "OpenGL renderer"
OpenGL renderer string: Quadro M1000M/PCIe/SSE2

INTEL
--------
# prime-select intel

# xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name: modesetting
    output eDP-1
    output DP-1
    output HDMI-1
    output DP-2
    output HDMI-2

# glxinfo |grep "OpenGL renderer"
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
Comment 33 Michal Srb 2018-10-11 12:11:05 UTC
Great! I am glad that it works.

I sent email to Bo Simonsen (original author of the suse-prime scripts) asking if he is interested in getting the changes back to his repository.

I could submit the suse-prime and xorg-x11-server to X11:XOrg and from there to Factory and Tumbleweed. Then I can update the opensuse wiki to document how to use this.

We could go a bit further and make it selectable in the display manager if we decide that it is worth it and we have the time. I would do it like this:
* Add option to xorg.conf to select the libglx implementation, so we can switch between xorg-libglx and nvidia-libglx without having to change the symlink on disk.
* The intel-only setup would be the default and the nvidia+intel setup would be selected by launching X server with alternative configuration. (X -config ...)
* Add intel/nvidia selection to display managers, at least to SDDM and GDM. So the display manager itself would always use the intel GPU, but it would be able to launch user sessions using either the intel or the intel+nvidia setup. All it needs to do is start X with alternative configuration.
Comment 34 Damian Zaręba 2018-10-11 19:13:21 UTC
Will it be configurable to use the Intel driver instead of the modesetting driver? Modesetting gives me huge tearing and acceleration problems on my Intel i7-6820HQ machine. I will try this out right now and post results here.
Comment 35 Michal Srb 2018-10-11 19:16:25 UTC
(In reply to Damian Zaręba from comment #34)
> Will it be configurable to use the Intel driver instead of the modesetting
> driver? Modesetting gives me huge tearing and acceleration problems on my
> Intel i7-6820HQ machine. I will try this out right now and post results here.

Yes, you would just need to change "modesetting" to "intel" in both /etc/prime/xorg-intel.conf and /etc/prime/xorg-nvidia.conf. It should work the same.
Comment 36 Damian Zaręba 2018-10-11 21:22:41 UTC
Created attachment 785791 [details]
Xorg.8.log

Unfortunately it doesn't work for me; it still uses Intel, checked with glxinfo. I have a Dell Precision 5510 with a Quadro M1000M dGPU. The system generates only Xorg.8.log, which is weird, because I don't have bumblebee on the system at all. I've attached that Xorg.8.log.
Comment 37 Stefan Dirsch 2018-10-12 03:47:25 UTC
(In reply to Damian Zaręba from comment #36)
> Created attachment 785791 [details]
> Xorg.8.log
> 
> Unfortunately it doesn't work for me; it still uses Intel, checked with
> glxinfo. I have a Dell Precision 5510 with a Quadro M1000M dGPU. The system
> generates only Xorg.8.log, which is weird, because I don't have bumblebee on
> the system at all. I've attached that Xorg.8.log.

... iommu=on intel_iommu=on,igfx_off i915.enable_guc=3 rdblacklist=nouveau,nvidia nouveau.modeset=0 noibpb noibrs nopti nokpti threadirqs intel_pstate=skylake_hwp nmi.watchdog=0 spectre_v2=off scsi_mod.use_blk_mq=1 dm_mod.use_blk_mq=1

Please reduce your kernel options to the minimum. I was testing on my Inspiron 5510 with no additional kernel options.

We are trying to enhance things for TW. So if you are not using TW yet, things may not work for you for this reason.

[ 11906.312] (++) Using config file: "/etc/bumblebee/xorg.conf.nvidia"
[ 11906.312] (++) Using config directory: "/etc/bumblebee/xorg.conf.d"
[ 11906.312] (==) Using system config directory "/usr/share/X11/xorg.conf.d"

Apparently you still have Bumblebee installed. I doubt this is a good idea; it may mess things up.
Comment 38 Stefan Dirsch 2018-10-12 09:14:50 UTC
(In reply to Michal Srb from comment #33)
> I sent email to Bo Simonsen (original author of the suse-prime scripts)
> asking if he is interested in getting the changes back to his repository.

Thanks a lot!

> I could submit the suse-prime and xorg-x11-server to X11:XOrg and from there
> to Factory and Tumbleweed. Then I can update the opensuse wiki to document
> how to use this.

Yes! Please go ahead! :-)

> We could go a bit further and make it selectable in the display manager if we
> decide that it is worth it and we have the time. I would do it like this:
> * Add option to xorg.conf to select the libglx implementation, so we can
> switch between xorg-libglx and nvidia-libglx without having to change the
> symlink on disk.

Not sure how this is going to work. Rename (also internally) both glx modules? I would assume you simply cannot easily rename NVIDIA's glx module internally.

> * The intel-only setup would be the default and the nvidia+intel setup would
> be selected by launching X server with alternative configuration. (X -config
> ...)

Hmm. This makes things rather complicated, possibly even more complicated than
using symlinks.

> * Add intel/nvidia selection to display managers, at least to SDDM and GDM.
> So the display manager itself would always use the intel GPU, but it would
> be able to launch user sessions using either the intel or the intel+nvidia
> setup. All it needs to do is start X with alternative configuration.

Hmm. I would assume that if intel+nvidia works within a full X session, it would also work in the greeter.
Comment 39 Swamp Workflow Management 2018-10-12 11:40:12 UTC
This is an autogenerated message for OBS integration:
This bug (1103816) was mentioned in
https://build.opensuse.org/request/show/641644 Factory / suse-prime
Comment 40 Swamp Workflow Management 2018-10-12 15:00:07 UTC
This is an autogenerated message for OBS integration:
This bug (1103816) was mentioned in
https://build.opensuse.org/request/show/641701 Factory / suse-prime
Comment 41 Mauro Gaspari 2018-10-14 06:05:02 UTC
Hello,

Great work, guys, this looks great! Also sorry for the slow reply. I tested, and it all works great on my Gigabyte AERO14 v7 (i7-7700HQ, Nvidia GTX 1050 Ti).
HDMI output also works fine in both intel and nvidia mode. Please see the results below, including some glxgears tests with vsync disabled.



--- NVIDIA MODE ---
xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x251; cap: 0x1 (Source Output); crtcs: 0; outputs: 0; associated providers: 1; name: NVIDIA-0
Provider 1: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 1; name: modesetting
    output eDP-1-1
    output DP-1-1
    output HDMI-1-1
    output DP-1-2
    output DP-1-3

glxinfo | grep 'OpenGL renderer string'
OpenGL renderer string: GeForce GTX 1050 Ti/PCIe/SSE2

vblank_mode=0 glxgears
95011 frames in 5.0 seconds = 19002.146 FPS
95405 frames in 5.0 seconds = 19080.961 FPS
96071 frames in 5.0 seconds = 19214.139 FPS
95391 frames in 5.0 seconds = 19078.059 FPS



--- INTEL MODE ---

xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name: modesetting
    output eDP-1
    output DP-1
    output HDMI-1
    output DP-2
    output DP-3
    
glxinfo | grep 'OpenGL renderer string'
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 630 (Kaby Lake GT2) 

vblank_mode=0 glxgears
44186 frames in 5.0 seconds = 8837.184 FPS
42856 frames in 5.0 seconds = 8571.031 FPS
45093 frames in 5.0 seconds = 9018.593 FPS
44705 frames in 5.0 seconds = 8940.845 FPS
Comment 42 Swamp Workflow Management 2018-10-16 08:40:07 UTC
This is an autogenerated message for OBS integration:
This bug (1103816) was mentioned in
https://build.opensuse.org/request/show/642211 Factory / suse-prime
Comment 43 Stefan Dirsch 2018-10-17 09:48:57 UTC
Michal, this also works fine for me on my Dell Inspiron *5520* here (Tumbleweed, XFCE).

NVIDIA
---------
# prime-select nvidia
# xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x29f; cap: 0x1 (Source Output); crtcs: 0; outputs: 0; associated providers: 1; name: NVIDIA-0
Provider 1: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 1; name: modesetting
    output eDP-1-1
    output DP-1-1
    output HDMI-1-1
    output DP-1-2
    output HDMI-1-2

# glxinfo |grep "OpenGL ren"
OpenGL renderer string: Quadro M1200/PCIe/SSE2

INTEL
---------
# prime-select intel
# xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name: modesetting
    output eDP-1
    output DP-1
    output HDMI-1
    output DP-2
    output HDMI-2

# glxinfo |grep "OpenGL ren"
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
Comment 44 Stefan Dirsch 2018-10-19 12:29:17 UTC
*** Bug 1091828 has been marked as a duplicate of this bug. ***
Comment 45 Andrei Amuraritei 2018-10-24 19:14:47 UTC
Hi,

I can confirm this also works for me with the following:

openSUSE Tumbleweed on Acer Aspire V15 (v5-591G) with Intel HD Graphics 530 and NVIDIA GM107M [GeForce GTX 950M].

I have tried both the modesetting driver and xf86-video-intel, and also x11-video-nvidiaG05 and x11-video-nvidiaG04.

Thanks for this, Michal. Any news / progress on the display manager GPU select or change as detailed in comment #33?
Comment 46 Michal Srb 2018-10-25 11:43:41 UTC
(In reply to Andrei Amuraritei from comment #45)
> Thanks for this, Michal. Any news / progress on the display manager GPU select
> or change as detailed in comment #33?

I don't think I will have time to work on that. Hopefully Nvidia will implement their server-side GLVND some time next year and that will make all these workarounds obsolete.

My plan now is to wait for the suse-prime package to get accepted into Factory, then I'll document it in openSUSE wiki and then close this bug.
Comment 51 Mauro Gaspari 2018-11-25 12:57:04 UTC
(In reply to Michal Srb from comment #46)
> (In reply to Andrei Amuraritei from comment #45)
> > Thanks for this, Michal. Any news / progress on the display manager GPU select
> > or change as detailed in comment #33?
> 
> I don't think I will have time to work on that. Hopefully Nvidia will
> implement their server-side GLVND some time next year and that will make all
> these workarounds obsolete.
> 
> My plan now is to wait for the suse-prime package to get accepted into
> Factory, then I'll document it in openSUSE wiki and then close this bug.

Michal, I have a quick question for you.
Until suse-prime gets accepted into Factory and becomes available in Tumbleweed, do you encourage us to use your optimus-experiments repository and use suse-prime in production, reporting bugs, breakages etc.? Or is it better to wait until we get it in Tumbleweed?

Thanks
Mauro
Comment 52 Michal Srb 2018-11-26 08:01:55 UTC
(In reply to Mauro Gaspari from comment #51)
> Until suse-prime gets accepted into Factory and becomes available in
> Tumbleweed, do you encourage us to use your optimus-experiments repository
> and use suse-prime in production, reporting bugs, breakages etc.?

The package should be in Factory/Tumbleweed soon, hopefully, so feel free to use it as it is in my repo. Bug reports are welcome; please assign them directly to me. But consider it bleeding-edge, just like the rest of Tumbleweed.
Comment 53 Stefan Dirsch 2018-11-26 11:06:05 UTC
Seems suse-prime is already in openSUSE:Factory.

# osc ls openSUSE:Factory suse-prime
SUSEPrime-0.2.tar.gz
suse-prime.changes
suse-prime.spec

Not sure when it will be in TW though. Possibly this needs to be requested in addition. I'll take care of this.
Comment 54 Stefan Dirsch 2018-11-26 13:25:59 UTC
Oh, suse-prime has already been in Tumbleweed since Nov 13.
Comment 55 Stefan Dirsch 2018-11-28 16:18:52 UTC
(In reply to Michal Srb from comment #46)
> ... then I'll document it in openSUSE wiki and then close this bug.

So that would be next. ;-)
Comment 56 Michal Srb 2018-11-29 12:23:50 UTC
Documented: https://en.opensuse.org/SDB:NVIDIA_SUSE_Prime
Closing the bug.
Comment 57 Stefan Dirsch 2018-11-29 13:07:46 UTC
Thanks a lot, Michal. Looks great!
Comment 58 Mauro Gaspari 2018-12-18 05:42:38 UTC
I wish to add some info.

My Optimus laptop has an Intel i7-7700HQ and an Nvidia GTX 1050 Ti. The openSUSE Tumbleweed installation freezes while loading the green bars at the bottom of the screen. This is actually not an openSUSE issue; the same thing happens with any Linux live DVD or graphical installer. The lockup does not happen with text-based installers, such as Arch or Gentoo.

I thought I would contribute a bit by documenting the installation procedure; hopefully it helps someone. Feel free to correct any typos, edit as needed, and add this to the openSUSE documentation or post it anywhere it might be relevant.

1. Boot the laptop with an openSUSE Tumbleweed live USB/DVD.
2a. If you are installing using UEFI, move down to select "Installation" and press "e" to edit the grub entry called "Installation". Find the line that starts with "linuxefi /boot/vmlinuzxxx", move after "splash=silent" and add the following (without the quotes): "acpi=off". Press F10 to boot. Note that the installer no longer hangs.
2b. If you are installing using legacy BIOS boot, move down to select "Installation", press F5, move down to select "No ACPI", and press Enter to confirm. Note that below "F5 Kernel" it now shows "No ACPI". Hit Enter to begin the installation. Note that the installer no longer hangs.
3. Install openSUSE Tumbleweed.
4. After the installation completes, reboot, log in, and update the system. Open a terminal and enter: "sudo zypper ref && sudo zypper dup -y". Complete the updates and reboot the system.
5. Remove the "No ACPI" kernel option. After the reboot, log in, open a terminal and follow the instructions below (see the example after these steps):
sudo nano /etc/default/grub
Find the line starting with "GRUB_CMDLINE_LINUX_DEFAULT=", delete the "acpi=off" entry, save the file (Ctrl+O), exit nano (Ctrl+X).
6. Update grub. Enter this command in a terminal: sudo grub2-mkconfig -o /boot/grub2/grub.cfg
7. Reboot the system.
8. Proceed with the nvidia driver installation per the official documentation: https://en.opensuse.org/SDB:NVIDIA_drivers
9. Proceed with the suse-prime installation per the official documentation: https://en.opensuse.org/SDB:NVIDIA_SUSE_Prime
10. Test: reboot the system and test per the instructions in the suse-prime official documentation: https://en.opensuse.org/SDB:NVIDIA_SUSE_Prime
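
As an illustration of steps 5 and 6, the kernel command line in /etc/default/grub changes like this (the surrounding options are examples only; keep whatever else is already on your line):

  # before
  GRUB_CMDLINE_LINUX_DEFAULT="splash=silent quiet acpi=off"
  # after
  GRUB_CMDLINE_LINUX_DEFAULT="splash=silent quiet"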