[PATCH 03/10] wined3d: Use GL_DEPTH_COMPONENT16 for 16 bit depth texture formats.
Matteo Bruni
matteo.mystral at gmail.com
Wed Dec 6 07:58:20 CST 2017
2017-12-06 11:00 GMT+01:00 Józef Kucia <jkucia at codeweavers.com>:
> From: Matteo Bruni <mbruni at codeweavers.com>
>
> Signed-off-by: Józef Kucia <jkucia at codeweavers.com>
> - {WINED3DFMT_D16_UNORM, GL_DEPTH_COMPONENT24_ARB, GL_DEPTH_COMPONENT24_ARB, 0,
> + {WINED3DFMT_D16_UNORM, GL_DEPTH_COMPONENT16, GL_DEPTH_COMPONENT16, 0,
FTR this has always been "wrong" as far as the git history goes. I suspect
it was intentional and that the original reason is moot at this point.
I can think of two possible motivations: working around depth-buffer
precision issues, probably caused by the different depth ranges in d3d
and OpenGL, and avoiding a color / depth buffer bit-depth mismatch,
which some old Nvidia GPUs didn't support. The former should be fixed
by ARB_clip_control, and the latter is something d3d applications also
have to take care of themselves, so we shouldn't need to do anything
special on our side.
Anyway:
Signed-off-by: Matteo Bruni <mbruni at codeweavers.com>