[PATCH 1/4] d3d9/tests: Test more formats in srgbtexture_test().

Paul Gofman gofmanp at gmail.com
Tue Jan 21 14:17:05 CST 2020


On 1/21/20 19:52, Henri Verbeet wrote:
>
> Not for d3d9 + EXT_texture_sRGB_decode, no. In that case textures
> supporting sRGB reads are created with the sRGB format, and sRGB read
> conversion is disabled unless the appropriate sampler state is
> enabled.
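
In EXT_texture_sRGB_decode terms, the behaviour described above roughly
corresponds to the following sketch (plain GL, not wined3d code; the
texture target and the storage format are just examples):

#include <GL/gl.h>
#include <GL/glext.h>

/* Texture storage was allocated with an sRGB internal format such as
 * GL_SRGB8_ALPHA8; decoding stays off unless the d3d9 application sets
 * D3DSAMP_SRGBTEXTURE on the sampler. */
static void bind_srgb_capable_texture(GLuint texture, GLboolean srgb_sampling)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SRGB_DECODE_EXT,
            srgb_sampling ? GL_DECODE_EXT : GL_SKIP_DECODE_EXT);
}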

I additionally verified, by recording an apitrace, that the game works the
same way on Windows with Nvidia. So it works the same way under Wine, and
there is very little chance we can make it avoid this format on Nvidia in
any sensible way other than faking an AMD card through the registry.

It looks to me like, when the texture storage is allocated in
wined3d_texture_gl_prepare_texture(), format_gl->srgb_internal is used only
if wined3d_texture_gl_prepare_texture() is called with the srgb parameter
set to TRUE, which happens only if the texture is loaded to the sRGB
location, which happens only if the texture is bound as sRGB, which should
happen only if sRGB is enabled for the sampler (the sampler() state handler
in state.c controls that). Unless I missed something important here, the
unfortunate _SRGB8 GL internal format can only affect d3d9 if the
application actually tries to use a 16-bit format as sRGB. That might not
hold for binding shader resources in d3d10+, but in that case maybe we can
fix up WINED3DFMT_FLAG_SRGB_READ for 16-bit formats based on the
WINED3D_SRGB_READ_WRITE_CONTROL wined3d creation flag?
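
Such a fixup could look roughly like the sketch below. This is only an
illustration under the assumption that clearing the flag when
WINED3D_SRGB_READ_WRITE_CONTROL is absent is the right direction; the
function name, the lookup_format() helper and the hook point are made up,
not existing wined3d code, and format_clear_flag() stands in for whatever
the format table code actually uses.

/* Hypothetical sketch: drop sRGB reads from 16-bit formats for devices
 * created without WINED3D_SRGB_READ_WRITE_CONTROL, so the _SRGB8 internal
 * format is never selected for them. */
static void fixup_16bit_srgb_read(struct wined3d_adapter *adapter,
        uint32_t wined3d_creation_flags)
{
    static const enum wined3d_format_id ids_16bit[] =
    {
        WINED3DFMT_B5G6R5_UNORM,
        WINED3DFMT_B5G5R5A1_UNORM,
        WINED3DFMT_B5G5R5X1_UNORM,
        WINED3DFMT_B4G4R4A4_UNORM,
        WINED3DFMT_B4G4R4X4_UNORM,
    };
    unsigned int i;

    if (wined3d_creation_flags & WINED3D_SRGB_READ_WRITE_CONTROL)
        return; /* d3d9-style device, sampler state already gates sRGB reads. */

    for (i = 0; i < ARRAY_SIZE(ids_16bit); ++i)
    {
        struct wined3d_format *format = lookup_format(adapter, ids_16bit[i]);

        format_clear_flag(format, WINED3DFMT_FLAG_SRGB_READ);
    }
}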

Alternatively, it should be possible to introduce an option for some d3d
formats to perform the sRGB read without using a GL sRGB internal format,
by doing the conversion explicitly in the GLSL shader. This would add a GL
shader setting and require corresponding changes throughout format table
building and sRGB flag handling.
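
The GLSL side of that would essentially be the standard sRGB-to-linear
formula applied to the sampled colour. Below is a sketch of the helper such
a shader setting could emit; the function and variable names are made up,
only the conversion formula itself is the standard one.

/* Sketch only: GLSL helper an explicit sRGB-read shader setting could
 * emit, packaged as a C string the way the shader backend builds GLSL. */
static const char srgb_read_decode_glsl[] =
    "vec3 srgb_decode(vec3 c)\n"
    "{\n"
    "    /* Per channel: c / 12.92 below the 0.04045 cutoff,\n"
    "     * pow((c + 0.055) / 1.055, 2.4) above it. */\n"
    "    vec3 low = c / 12.92;\n"
    "    vec3 high = pow((c + vec3(0.055)) / 1.055, vec3(2.4));\n"
    "    vec3 use_low = vec3(lessThanEqual(c, vec3(0.04045)));\n"
    "    return mix(high, low, use_low);\n"
    "}\n";

The fragment shader would then call srgb_decode() on the sampled colour
when the setting is active, instead of relying on the GL sRGB internal
format.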



