[PATCH 01/10] d3d10: Allocate buffers for effect interface local_buffers.
Matteo Bruni
matteo.mystral at gmail.com
Wed Jan 22 09:29:43 CST 2020
On Sat, Dec 7, 2019 at 7:23 PM Connor McAdams <conmanx360 at gmail.com> wrote:
> static HRESULT parse_fx10_local_buffer(const char *data, size_t data_size,
>         const char **ptr, struct d3d10_effect_variable *l)
> {
> @@ -2282,6 +2329,12 @@ static HRESULT parse_fx10_local_buffer(const char *data, size_t data_size,
> TRACE("\tBasetype: %s.\n", debug_d3d10_shader_variable_type(l->type->basetype));
> TRACE("\tTypeclass: %s.\n", debug_d3d10_shader_variable_class(l->type->type_class));
>
> + if (l->type->size_unpacked && l->type->size_packed)
> + {
> + if (FAILED(hr = create_variable_buffer(l, d3d10_cbuffer_type)))
> + return hr;
> + }
Mostly curious, is there any case where size_unpacked is non-zero but
size_packed isn't?