[PATCH 01/10] d3d10: Allocate buffers for effect interface local_buffers.

Connor McAdams conmanx360 at gmail.com
Wed Jan 22 12:05:44 CST 2020


Not that I'm aware of. It's been a little while, but I think that since
there were two different size values, I was making sure that both
actually had a non-zero size. It would probably work fine to base
constant buffer creation on one or the other, and which one to use is
probably a matter of opinion. size_unpacked makes the most sense to
me, but it's up to you.
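
For what it's worth, if we went with size_unpacked alone, the hunk
could be simplified to something like the following (just a sketch,
untested):

    if (l->type->size_unpacked)
    {
        if (FAILED(hr = create_variable_buffer(l, d3d10_cbuffer_type)))
            return hr;
    }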

On Wed, Jan 22, 2020 at 10:29 AM Matteo Bruni <matteo.mystral at gmail.com> wrote:
>
> On Sat, Dec 7, 2019 at 7:23 PM Connor McAdams <conmanx360 at gmail.com> wrote:
>
> >  static HRESULT parse_fx10_local_buffer(const char *data, size_t data_size,
> >          const char **ptr, struct d3d10_effect_variable *l)
> >  {
> > @@ -2282,6 +2329,12 @@ static HRESULT parse_fx10_local_buffer(const char *data, size_t data_size,
> >      TRACE("\tBasetype: %s.\n", debug_d3d10_shader_variable_type(l->type->basetype));
> >      TRACE("\tTypeclass: %s.\n", debug_d3d10_shader_variable_class(l->type->type_class));
> >
> > +    if (l->type->size_unpacked && l->type->size_packed)
> > +    {
> > +        if (FAILED(hr = create_variable_buffer(l, d3d10_cbuffer_type)))
> > +            return hr;
> > +    }
>
> Mostly curious, is there any case where size_unpacked is non-zero but
> size_packed isn't?


