Bug 450110

Summary: [nvidia] Poor performance on second monitor with on-demand PRIME profile
Product: [Plasma] kwin
Reporter: Arnaud <arnaud.vergnet>
Component: multi-screen
Assignee: KWin default assignee <kwin-bugs-null>
Status: RESOLVED UPSTREAM
Severity: normal
CC: aleixpol, andre.vmatos, nate, notmart, stefan.hoffmeister, xaver.hugl
Priority: NOR
Version First Reported In: 5.24.0
Target Milestone: ---
Platform: Other
OS: Linux

Description Arnaud 2022-02-12 23:00:32 UTC
SUMMARY

I have a Lenovo Y520 laptop with an integrated Intel GPU and a dedicated Nvidia GTX 1060 GPU, using the proprietary drivers with PRIME. With the performance-mode PRIME profile (Nvidia GPU always on), the second monitor works great. But with the on-demand profile (the Nvidia GPU wakes on demand for apps started with environment variables), the second screen runs at a very low frame rate.
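
For reference, in on-demand mode an individual app is pushed to the dGPU with the standard NVIDIA render-offload variables, along these lines (glxgears is only an arbitrary test client here):

  __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears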

I also noticed that GPU usage was much higher in on-demand mode: about 5% when idle in performance mode versus about 20% in on-demand mode.
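
(For anyone reproducing: utilization can be sampled with the nvidia-smi tool that ships with the driver, e.g.

  nvidia-smi dmon -s u   # prints one utilization sample per GPU per second

though the exact percentages will vary by setup.)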

This is for X11 only, as I was not able to use the second monitor at all on Wayland.

I don't know what other information would help, or whether you can do anything at all, since this involves the proprietary drivers.

STEPS TO REPRODUCE
1. Install the Nvidia proprietary drivers (apt install nvidia-driver-510)
2. Switch to on-demand mode (prime-select on-demand)
3. Reboot
4. Plug in a second monitor (the active profile can be verified first, as shown below)
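
Before step 4, the active profile can be double-checked with the stock tools (exact output varies by distro):

  prime-select query       # should print: on-demand
  xrandr --listproviders   # should list both the Intel and the Nvidia provider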

OBSERVED RESULT

Very poor performance on the second monitor

EXPECTED RESULT

The second monitor should work as well as it does in performance mode.

SOFTWARE/OS VERSIONS
Operating System: KDE neon 5.24
KDE Plasma Version: 5.24.0
KDE Frameworks Version: 5.90.0
Qt Version: 5.15.3
Kernel Version: 5.13.0-28-generic (64-bit)
Graphics Platform: X11
Processors: 4 × Intel® Core™ i5-7300HQ CPU @ 2.50GHz
Memory: 15.5 GiB of RAM
Graphics Processor: NVIDIA GeForce GTX 1060 with Max-Q Design/PCIe/SSE2
NVIDIA driver: 510.47.03
Comment 1 Stefan Hoffmeister 2022-06-05 20:39:04 UTC
I think it would be very helpful to describe the physical connector setup you are running.

Case in point: On X11, I have a setup where
* Intel iGPU (only) controls internal display and HDMI
* Nvidia dGPU (only) controls the USB-C output path (i.e. DisplayPort Alternate Mode et al)
* Nvidia is in PRIME offload mode; Intel is primary

Now connect an external 4K screen to the Nvidia output path, i.e. USB-C / DisplayPort.

Now the *Xorg* process starts consuming between 25% and 40% of one CPU on an otherwise totally idle system.
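
For reference, that figure comes from nothing more exotic than plain process monitoring of the Xorg PID:

  top -p "$(pidof Xorg)"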

Remove the 4K screen from the Nvidia output path and attach it to the HDMI port (Intel).

Now Xorg is totally fine.

Some sleuthing with perf suggests that all that CPU is burnt on getting the current system time (gettimeofday / clock_gettime) via vdso and kernel calls, with this originating from the Nvidia driver (510).
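
The sleuthing was roughly of this shape (options from memory, not a verbatim transcript; perf attaches to the running Xorg process and needs root):

  sudo perf record -g -p "$(pidof Xorg)" -- sleep 10
  sudo perf report   # hot frames resolve to clock_gettime / gettimeofday under the Nvidia driver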
Comment 2 Arnaud 2022-06-05 21:38:42 UTC
My computer has only a single HDMI port. From what I could test, this port appears to be wired to the Nvidia dGPU, because I cannot get the second monitor to work with only the Intel iGPU powered on.
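
One way to check which GPU owns a connector, without rebooting between profiles, is sysfs (card numbering varies per machine):

  ls /sys/class/drm/                            # e.g. card0-eDP-1, card1-HDMI-A-1
  readlink /sys/class/drm/card1/device/driver   # ends in .../nvidia if the dGPU owns card1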
Comment 3 Zamundaaa 2022-11-30 12:32:19 UTC
On Xorg, the Xorg server and its driver modules are responsible for multi-GPU support. Please report this to NVIDIA.