[PATCH 4/9] d3d11/tests: Add test for depth bias.

Henri Verbeet hverbeet at gmail.com
Thu Dec 7 09:04:55 CST 2017


On 7 December 2017 at 14:44, Józef Kucia <jkucia at codeweavers.com> wrote:
> +    static const unsigned int bias_tests[] =
> +    {
> +        -10000, -1000, -100, -10, -9, -8, -7, -6, -5, -4, -3, -2, -1,
> +        1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 50, 100, 200, 500, 1000, 10000,
> +    };
I think it would be more appropriate for these to be signed integers; the initializers are negative, and D3D11_RASTERIZER_DESC's DepthBias field is an INT.

> +    static const float quad_slopes[] =
> +    {
> +        0.0f, 0.5f, 1.0f
> +    };
...
> +                m = quad_slopes[i] / texture_desc.Height;
> +                m = sqrtf(m * m);
sqrtf(m * m) == fabsf(m) == m here, since the quad slopes are all non-negative.

Right?

> +                bias = rasterizer_desc.SlopeScaledDepthBias * m;
> +                get_texture_readback(texture, 0, &rb);
> +                for (y = 0; y < texture_desc.Height; ++y)
> +                {
> +                    depth = min(max(0.0f, depth_values[y] + bias), 1.0f);
> +                    switch (format)
> +                    {
> +                        case DXGI_FORMAT_D32_FLOAT:
> +                            data = get_readback_float(&rb, 0, y);
> +                            ok(compare_float(data, depth, 2),
> +                                    "Got depth %.8e, expected %.8e.\n", data, depth);
This needs a tolerance of 64 ULPs to pass here on Intel SKL.


