Recognize cards that expose GLSL 1.30 as DX10 capable even if they don't support EXT_gpu_shader4

Henri Verbeet hverbeet at gmail.com
Tue Jun 3 01:30:31 CDT 2014


On 3 June 2014 08:13, Stefan Dösinger <stefandoesinger at gmail.com> wrote:
> No, because without ARB_shader_texture_lod we only support SM 2.0,
> leading to inconsistencies between caps and PCI ID.
>
And without e.g. ARB_geometry_shader4 we can only do SM 3.0. You'll
always have that issue unless you either use the computed D3D caps to
guess a card, or copy the code used to compute them.
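
(For reference, the mapping in question looks roughly like the sketch
below. The types, field names and version encoding are simplified
stand-ins I made up for illustration; the real code lives in
dlls/wined3d/directx.c and looks different.)

#include <stdbool.h>

enum d3d_level
{
    D3D_LEVEL_8,
    D3D_LEVEL_9_SM2,
    D3D_LEVEL_9_SM3,
    D3D_LEVEL_10,
};

/* Stand-in for the wined3d GL info; not the actual structure. */
struct gl_info
{
    unsigned int glsl_version;      /* e.g. 130 for GLSL 1.30 */
    bool arb_vertex_shader;
    bool arb_fragment_shader;
    bool arb_shader_texture_lod;
    bool ext_gpu_shader4;
};

static enum d3d_level d3d_level_from_gl_info(const struct gl_info *gl)
{
    /* The patch under discussion: GLSL 1.30 requires the same hardware
     * features as EXT_gpu_shader4, so treat it as evidence of a DX10
     * capable card even when the extension itself isn't exported. */
    if (gl->ext_gpu_shader4 || gl->glsl_version >= 130)
        return D3D_LEVEL_10;
    /* Stefan's point: without ARB_shader_texture_lod the caps code only
     * reports SM 2.0, so the level guessed here (and hence the PCI ID)
     * can disagree with the caps.  The same applies one level up:
     * without e.g. ARB_geometry_shader4 the caps won't report SM 4.0. */
    if (gl->arb_shader_texture_lod)
        return D3D_LEVEL_9_SM3;
    if (gl->arb_vertex_shader && gl->arb_fragment_shader)
        return D3D_LEVEL_9_SM2;
    return D3D_LEVEL_8;
}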

> We can also decide that we don't care about such consistency. In this
> case the version logic can be removed entirely, or just used when we
> cannot match the GL renderer string and have to guess a GPU.
>
That's the only place where this function actually does something
anyway. There are calls to d3d_level_from_gl_info() in
select_card_nvidia_binary() and select_card_amd_binary() for
historical reasons, but at this point removing them wouldn't alter the
behaviour of those functions in a significant way. (Hypothetical
future versions of the proprietary drivers that remove extensions
aside.)
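
(Continuing the sketch from above, the one remaining useful caller is
roughly the fallback below, only reached when the GL renderer string
didn't match any known card. The card constants and PCI IDs here are
placeholders, not the real table.)

/* Hypothetical fallback, reusing the d3d_level_from_gl_info() sketch
 * above.  The defaults chosen per level are illustrative only. */
enum pci_device
{
    CARD_NVIDIA_GEFORCE_8800GTX = 0x0191,   /* DX10-class default */
    CARD_NVIDIA_GEFORCE_6800    = 0x0041,   /* SM3-class default */
    CARD_NVIDIA_GEFORCEFX_5800  = 0x0302,   /* SM2-class default */
    CARD_NVIDIA_GEFORCE4_TI4200 = 0x0253,   /* pre-SM2 default */
};

static enum pci_device select_card_fallback_nvidia(const struct gl_info *gl)
{
    switch (d3d_level_from_gl_info(gl))
    {
        case D3D_LEVEL_10:    return CARD_NVIDIA_GEFORCE_8800GTX;
        case D3D_LEVEL_9_SM3: return CARD_NVIDIA_GEFORCE_6800;
        case D3D_LEVEL_9_SM2: return CARD_NVIDIA_GEFORCEFX_5800;
        default:              return CARD_NVIDIA_GEFORCE4_TI4200;
    }
}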


