[PATCH 09/10] d3d10/effect: Use correct vertex shader element when returning input signature.

Matteo Bruni matteo.mystral at gmail.com
Fri Oct 15 04:50:51 CDT 2021


On Thu, Oct 14, 2021 at 8:42 AM Nikolay Sivov <nsivov at codeweavers.com> wrote:
>
> Signed-off-by: Nikolay Sivov <nsivov at codeweavers.com>
> ---
>  dlls/d3d10/effect.c       | 12 +++++++-----
>  dlls/d3d10/tests/effect.c |  7 +++++++
>  2 files changed, 14 insertions(+), 5 deletions(-)

> @@ -3611,6 +3612,12 @@ if (0)
>      ok(typedesc.UnpackedSize == 0x0, "UnpackedSize is %#x, expected 0x0\n", typedesc.UnpackedSize);
>      ok(typedesc.Stride == 0x0, "Stride is %#x, expected 0x0\n", typedesc.Stride);
>
> +    /* Get input signature from vertex shader set from array element. */
> +    hr = p->lpVtbl->GetDesc(p, &pass_desc);
> +    ok(hr == S_OK, "Unexpected hr %#x.\n", hr);
> +    ok(!!pass_desc.pIAInputSignature, "Expected input signature.\n");
> +    ok(pass_desc.IAInputSignatureSize == 88, "Unexpected input signature size.\n");

It would be nice to extend this test to also verify that the signature
of the correct array element was returned (probably by using reflection).
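Something along these lines, as an untested sketch. It assumes the returned
blob can be passed to D3D10ReflectShader() directly and that the two array
elements declare different input semantics (here "POSITION" is just a
placeholder for whatever the element under test actually uses):

```c
    /* Untested sketch: reflect the returned input signature and check a
     * semantic that differs between the array elements, to make sure the
     * right element's signature was returned. */
    ID3D10ShaderReflection *reflection;
    D3D10_SIGNATURE_PARAMETER_DESC param_desc;

    hr = D3D10ReflectShader(pass_desc.pIAInputSignature,
            pass_desc.IAInputSignatureSize, &reflection);
    ok(hr == S_OK, "Unexpected hr %#x.\n", hr);

    hr = reflection->lpVtbl->GetInputParameterDesc(reflection, 0, &param_desc);
    ok(hr == S_OK, "Unexpected hr %#x.\n", hr);
    /* "POSITION" is a placeholder semantic, not taken from the patch. */
    ok(!strcmp(param_desc.SemanticName, "POSITION"),
            "Unexpected semantic %s.\n", param_desc.SemanticName);

    reflection->lpVtbl->Release(reflection);
```

If reflecting the signature blob alone doesn't work, the original shader
bytecode for the expected element could be reflected instead and its
parameter descriptions compared against the returned blob.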
