Recognize cards that expose GLSL 1.30 as DX10 capable even if they don't support EXT_GPU_SHADER4

Andrei Slavoiu andrei.slavoiu at gmail.com
Mon Jun 2 11:51:15 CDT 2014


> On 27 May 2014 17:35, Stefan Dösinger <stefandoesinger at gmail.com> wrote:
> > No, I'd say use something like if (EXT_gpu_shader4 || (
> > ARB_shader_texture_lod && glsl_version >= 1.30).
> 
> That's just wrong, GLSL 1.30 + ARB_shader_texture_lod doesn't imply
> SM4, only SM3.
Actually, it does. No SM3 card can expose GLSL 1.30.
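
For reference, the GLSL version such a check would compare against is whatever 
the driver reports through GL_SHADING_LANGUAGE_VERSION. A minimal sketch of 
turning that string into a comparable major/minor pair, using plain GL rather 
than wined3d's own helpers, could look like this (it assumes a current GL 
context and skips error handling):

#include <stdio.h>
#include <GL/gl.h>

#ifndef GL_SHADING_LANGUAGE_VERSION
#define GL_SHADING_LANGUAGE_VERSION 0x8B8C
#endif

/* Parse the GLSL version string ("1.30", "3.30 NVIDIA ...", ...) into a
 * major/minor pair. Assumes a GL context is current; error handling is
 * kept minimal for brevity. */
static void get_glsl_version(unsigned int *major, unsigned int *minor)
{
    const char *str = (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION);

    *major = *minor = 0;
    if (str)
        sscanf(str, "%u.%u", major, minor);
}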

> On 28 May 2014 21:48, Andrei Slăvoiu <andrei.slavoiu at gmail.com> wrote:
> > Actually, d3d_level_from_gl_info is misleading (broken even), as it will
> > check for SM3 capability and then return that the card is capable of
> > directx 10
> EXT_gpu_shader4 adds (among others) support for "native" integers and
> bitwise operations. It can't be supported (in hardware) on a SM3 GPU.
> The specific extensions used in d3d_level_from_gl_info() are somewhat
> arbitrary, we could have used e.g. ARB_geometry_shader4 instead of
> EXT_gpu_shader4 here as well.
Like you say, the extensions are arbitrary, so why not use the GLSL version as 
well? GLSL 1.30 also adds support for native integers and bitwise operations, 
and all of the functionality of EXT_gpu_shader4 is exposed by either GLSL 1.30 
or ARB_shader_texture_lod.
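
To make that concrete, here is a self-contained sketch of how the SM4 fallback 
test could take the GLSL version into account. The struct and enum below are 
simplified stand-ins rather than the actual wined3d types, and the 1.30 cutoff 
follows the reasoning above:

#include <stdbool.h>

/* Simplified stand-ins for the wined3d types; illustration only. */
struct gl_caps
{
    bool ext_gpu_shader4;
    bool arb_shader_texture_lod;
    unsigned int glsl_major, glsl_minor;
};

enum d3d_level
{
    D3D_LEVEL_9_SM3,
    D3D_LEVEL_10,
};

/* Treat the card as SM4-class if either EXT_gpu_shader4 is present, or the
 * driver exposes GLSL 1.30 (which already requires native integers and
 * bitwise operations) together with ARB_shader_texture_lod. */
static enum d3d_level d3d_level_from_caps(const struct gl_caps *caps)
{
    bool glsl_130 = caps->glsl_major > 1
            || (caps->glsl_major == 1 && caps->glsl_minor >= 30);

    if (caps->ext_gpu_shader4 || (glsl_130 && caps->arb_shader_texture_lod))
        return D3D_LEVEL_10;
    return D3D_LEVEL_9_SM3;
}

The point is only that the GLSL version carries the same information as 
EXT_gpu_shader4 for this purpose; an actual patch would of course use the 
version and extension flags wined3d already tracks instead of these stand-ins.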

> Note that the intention of the code this function is a part of is to
> come up with a somewhat reasonable card to report to the application
> in case of unrecognized GL implementations, while e.g.
> shader_glsl_get_caps() is meant to report actual capabilities. It
> would perhaps be nice to use the actual shader backend etc. caps for
> d3d_level_from_gl_info(), but note that that wouldn't necessarily be
> easy, or make the situation much better. For the Mesa case you mention
> in the original patch it would arguably make the situation worse,
> since we can't currently properly do shader model 4 on Mesa.
I'm not actually interested in using DirectX 10 (yet). The reason I started 
messing with this part of the code is that World of Warcraft renders complete 
garbage when the card is presented as a Radeon 9500. Changing this to a Radeon 
X1600 improves things a bit: only the background is broken, while characters 
and menus appear fine. Finally, with a Radeon HD 2900 the only broken rendering 
is the shadows. Those remain broken even when using the PCI ID of my real card, 
a Kaveri, so it's probably a Mesa bug.

The reason I prefer to improve the fallback instead of simply adding my card's 
PCI ID to the list of known cards is that, with the current model, there will 
always be cards that wine does not recognize, and a good fallback keeps a 
newcomer's first experience from being "nothing works, wine sucks".


