[PATCH 0/1] MR132: opengl32: Don't prioritize low bit depth formats with non-matching stencil.
Paul Gofman (@gofman)
wine at gitlab.winehq.org
Mon Jun 20 11:08:35 CDT 2022
On Fri Jun 17 19:11:37 2022 +0000, Matteo Bruni wrote:
> I have tweaked / extended the tests a little further, see
> I only tested that on Nvidia for now, curious if they also pass with AMD
> with those changes.
I ran the test on my AMD / Windows machine and added some traces / additional tests. I am attaching a diff to the test (which includes your patch as well) and the output from AMD / Windows.
It seems that, unfortunately, the 32/8 test is a bit inconclusive here, as somehow the output pixel format is 32 / 8 (honestly not sure what that means, but that's what I see here; note no test failure on line 382 and the trace output from line 381).
The rest of the tests suggest that it prefers 24 bit whenever in doubt; see, e.g., the 8x8 test, trace at line 364: it could have chosen 16x8 but preferred 24x8. From what I see, the pattern is that whenever stencil is requested it returns depth >= 24, which is what my current patch does.

If we instead prioritized stencil formats over the depth match whenever stencil is requested, that would probably look more straightforward logic-wise, but it would break all those tests if we tightened them to what is actually returned on AMD. Also, where specific games depending on the stencil choice are concerned, plainly prioritizing stencil presence would give them 16 bit depth on Wine while they get 24 on Windows, and that difference may matter (even if it does not break things completely the way returning no stencil format would).
What do you think?