[Bug 51420] Running any program in Wine causes 100% cpu usage in Xorg

WineHQ Bugzilla wine-bugs at winehq.org
Mon Dec 6 17:25:30 CST 2021


https://bugs.winehq.org/show_bug.cgi?id=51420

--- Comment #36 from Kurt Kartaltepe <kkartaltepe at gmail.com> ---
I feel like this thread is the blind leading the deaf; the author of the
offending code has even responded here, but has not deigned to explain
the issue.

When nvidia drivers are detected, wine falls back to xrandr 1.0/1.1 for
querying modes. As the author notes in their commit, this is because they
didn't like the modes reported by newer xrandr versions for DVI connections.

This is why d171d1116764260f4ae272c69b54e5dfd13c6835, which introduces many
more mode-information queries, causes a significant performance hit on nvidia
with DVI connections. It is also why reverting the 1.0 fallback fixes it (as
we get the explicitly fast xrandr functions instead).

If you want to see whether you are affected, the first step is to check if you
have a DVI device. The second step is to run the same queries wine does and
watch whether your X11 session becomes unusable. For example, `for x in $(seq 1
100); do xrandr --q1; done`: testing with a single DVI device shows the system
crippled, while testing with a single DP device shows no significant impact.

Matt's rebase of the revert looks fine to me. When I tested the actual revert
it also worked to avoid the excessive mode queries, as did removing the xrandr
1.0 fallback.

-- 
Do not reply to this email, post in Bugzilla using the
above URL to reply.
You are receiving this mail because:
You are watching all bug changes.
