[PATCH 2/5] winemac: Implement wglCreateContextAttribsARB.

Matteo Bruni matteo.mystral at gmail.com
Mon Jan 5 16:09:07 CST 2015


2015-01-05 21:58 GMT+01:00  <cdavis5x at gmail.com>:
>
>> On Jan 5, 2015, at 9:17 AM, Matteo Bruni <mbruni at codeweavers.com> wrote:
>>
>> As an aside, reported WGL extensions don't depend on the specific
>> GL context (e.g. WGL_ARB_pbuffer is reported as supported even on core
>> profile contexts).
> Do real Windows drivers behave like this?

Yep, both AMD and Nvidia.
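
For reference, this is easy to check with something like the
following (untested sketch; assumes a 3.2+ core profile context is
current on hdc):

    typedef const char * (WINAPI *PFNWGLGETEXTENSIONSSTRINGARBPROC)(HDC);

    PFNWGLGETEXTENSIONSSTRINGARBPROC pwglGetExtensionsStringARB =
            (PFNWGLGETEXTENSIONSSTRINGARBPROC)wglGetProcAddress("wglGetExtensionsStringARB");
    /* Both AMD and Nvidia still list WGL_ARB_pbuffer here even though
     * the current context is core profile. */
    if (pwglGetExtensionsStringARB &&
            strstr(pwglGetExtensionsStringARB(hdc), "WGL_ARB_pbuffer"))
        printf("WGL_ARB_pbuffer reported on a core profile context\n");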

>> @@ -1272,6 +1273,175 @@ static BOOL init_gl_info(void)
> […]
>> +/**********************************************************************
>> + *              create_context
>> + */
>> +static BOOL create_context(struct wgl_context *context, CGLContextObj share, BOOL core)
>> +{
> […]
>> +    attribs[n++] = kCGLPFAAuxBuffers;
>> +    attribs[n++] = pf->aux_buffers;
> You must reject any pixel format with >0 auxiliary buffers when creating a core profile context. CGL will specifically fail the ChoosePixelFormat (with error 10000, kCGLBadAttribute) if you specify both a GL version >= 3.2 and a non-zero number of auxiliary buffers.

Well, that's the responsibility of the application. If the
application tries to create a core context on a pixel format with aux
buffers, the context creation will indeed fail, as expected.
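
E.g. an untested sketch of what such an application would effectively
be asking CGL for:

    CGLPixelFormatObj pix;
    GLint virtualScreens;
    CGLPixelFormatAttribute attribs[] =
    {
        kCGLPFAAuxBuffers, (CGLPixelFormatAttribute)2,
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        (CGLPixelFormatAttribute)0
    };
    /* This fails with error 10000 (kCGLBadAttribute): core profiles
     * can't be combined with auxiliary buffers. */
    CGLError err = CGLChoosePixelFormat(attribs, &pix, &virtualScreens);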

>> +
>> +    attribs[n++] = kCGLPFAColorSize;
>> +    attribs[n++] = color_modes[pf->color_mode].color_bits;
>> +    attribs[n++] = kCGLPFAAlphaSize;
>> +    attribs[n++] = color_modes[pf->color_mode].alpha_bits;
>> +    if (color_modes[pf->color_mode].is_float)
>> +        attribs[n++] = kCGLPFAColorFloat;
>> +
>> +    attribs[n++] = kCGLPFADepthSize;
>> +    attribs[n++] = pf->depth_bits;
>> +
>> +    attribs[n++] = kCGLPFAStencilSize;
>> +    attribs[n++] = pf->stencil_bits;
>> +
>> +    if (pf->stereo)
>> +        attribs[n++] = kCGLPFAStereo;
>> +
>> +    if (pf->accum_mode)
>> +    {
>> +        attribs[n++] = kCGLPFAAccumSize;
>> +        attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
>> +    }
> You must also reject any pixel format with an accumulation buffer when creating a core profile context, for the same reason.

Same as above.

For pbuffers things are different, because Windows has no problem
creating core contexts on pixel formats that support rendering to
pbuffers. Actually, on my Windows boxes pretty much all pixel formats
do support pbuffer rendering.

>> +
>> +    /* Explicitly requesting pbuffers in CGLChoosePixelFormat fails with core contexts. */
>> +    if (pf->pbuffer && !core)
>> +        attribs[n++] = kCGLPFAPBuffer;
>> +
>> +    if (pf->sample_buffers && pf->samples)
>> +    {
>> +        attribs[n++] = kCGLPFASampleBuffers;
>> +        attribs[n++] = pf->sample_buffers;
>> +        attribs[n++] = kCGLPFASamples;
>> +        attribs[n++] = pf->samples;
>> +    }
>> +
>> +    if (pf->backing_store)
>> +        attribs[n++] = kCGLPFABackingStore;
>> +
>> +    if (core)
>> +    {
>> +        attribs[n++] = kCGLPFAOpenGLProfile;
>> +        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;
>> +    }
> There’s a constant for requesting a 4.x core context, too. (But it’s only defined in the 10.9 and 10.10 SDKs.) You might consider using it if the requested version is >= 4.0. That way, creation will fail if the system doesn’t support it.

I ignored that constant for the time being but yes, it would make
sense to use it. FWIW, on my OS X box I get a 4.1 context back
anyway.
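
Something along these lines should do, I think (untested; assumes the
requested major version is plumbed down into create_context()):

    attribs[n++] = kCGLPFAOpenGLProfile;
    #if MAC_OS_X_VERSION_MAX_ALLOWED >= 1090
    /* kCGLOGLPVersion_GL4_Core only exists in the 10.9+ SDKs; with it,
     * context creation fails cleanly when the system can't do GL 4.x. */
    if (major >= 4)
        attribs[n++] = (int)kCGLOGLPVersion_GL4_Core;
    else
    #endif
        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;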

>> +
>> +    attribs[n] = 0;
>> +
>> +    err = CGLChoosePixelFormat(attribs, &pix, &virtualScreens);
>> +    if (err != kCGLNoError || !pix)
>> +    {
>> +        WARN("CGLChoosePixelFormat() failed with error %d %s\n", err, CGLErrorString(err));
>> +        SetLastError(ERROR_INVALID_OPERATION);
> This is somewhat nitpicking, but one thing you might consider is setting the last error based on what CGL returned. For example, if you get the error kCGLBadAlloc, you could set the last error to ERROR_NO_SYSTEM_RESOURCES.

True, I took the easy way out by not mapping CGL errors to Win32
ones. To be honest, I don't feel all that inclined to add a facility
to map those errors just for this one function...
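
If it ever turns out to matter, a small helper would probably be
enough, something like (sketch, exact mappings debatable):

    static DWORD map_cgl_error(CGLError err)
    {
        switch (err)
        {
            case kCGLBadAlloc:
                return ERROR_NO_SYSTEM_RESOURCES;
            case kCGLBadAttribute:
            case kCGLBadPixelFormat:
                return ERROR_INVALID_PIXEL_FORMAT;
            default:
                return ERROR_INVALID_OPERATION;
        }
    }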

>> +        return FALSE;
>> +    }
>> +
>> +    err = CGLCreateContext(pix, share, &context->cglcontext);
>> +    CGLReleasePixelFormat(pix);
>> +    if (err != kCGLNoError || !context->cglcontext)
>> +    {
>> +        context->cglcontext = NULL;
>> +        WARN("CGLCreateContext() failed with error %d %s\n", err, CGLErrorString(err));
>> +        SetLastError(ERROR_INVALID_OPERATION);
> Ditto.
>> @@ -2076,6 +2246,133 @@ cant_match:
> […]
>> +/***********************************************************************
>> + *              macdrv_wglCreateContextAttribsARB
>> + *
>> + * WGL_ARB_create_context: wglCreateContextAttribsARB
>> + */
>> +static struct wgl_context *macdrv_wglCreateContextAttribsARB(HDC hdc,
>> +                                                             struct wgl_context *share_context,
>> +                                                             const int *attrib_list)
>> +{
> […]
>> +    if (major > 3 || (major == 3 && minor >= 2))
>> +    {
>> +        if (!(flags & WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB))
>> +        {
>> +            WARN("OS X only supports forward-compatible 3.2+ contexts\n");
>> +            SetLastError(ERROR_INVALID_VERSION_ARB);
>> +            return NULL;
>> +        }
> Just so you know, a side effect of this is that our GL 3.x tests get skipped here, because they don’t specify the FORWARD_COMPATIBLE bit.

Good to know, although core GL on OS X really is better exposed as a
forward-compatible context.
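
I.e. on OS X a test or application is expected to ask for a core
context along these lines (sketch; pwglCreateContextAttribsARB loaded
through wglGetProcAddress()):

    const int ctx_attribs[] =
    {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 2,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        WGL_CONTEXT_FLAGS_ARB,         WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0
    };
    HGLRC ctx = pwglCreateContextAttribsARB(hdc, NULL, ctx_attribs);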

> Also, you should consider rejecting the DEBUG flag, if it’s set: OS X never returns that flag. (Or do you want to hook glGetIntegerv(3G) to return the debug flag if it’s set?)

AFAIK, as long as the context supports neither GL 4.3 nor
ARB_debug_output, the DEBUG flag doesn't need to have any effect. The
general idea here was to ignore the issue until Apple implements this
kind of functionality in its GL.
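
In practice that means something like this in the flags validation
(sketch):

    /* CGL has no debug contexts. Since we expose neither GL 4.3 nor
     * ARB_debug_output, accepting the flag as a no-op is harmless. */
    if (flags & WGL_CONTEXT_DEBUG_BIT_ARB)
        TRACE("ignoring WGL_CONTEXT_DEBUG_BIT_ARB\n");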

>> +        if (profile != WGL_CONTEXT_CORE_PROFILE_BIT_ARB)
>> +        {
>> +            WARN("Compatibility profiles for GL version >= 3.2 not supported\n");
>> +            SetLastError(ERROR_INVALID_PROFILE_ARB);
>> +            return NULL;
>> +        }
>> +        core = TRUE;
>> +    }
>> +    else if (major == 3)
>> +    {
>> +        WARN("OS X doesn't support 3.0 or 3.1 contexts\n");
>> +        SetLastError(ERROR_INVALID_VERSION_ARB);
>> +        return NULL;
>> +    }
> I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.

Unfortunately not, because 3.2 deprecates MAX_VARYING_COMPONENTS and
MAX_VARYING_FLOATS.
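
E.g. a 3.1 application can legitimately do:

    GLint varyings = 0;
    glGetIntegerv(GL_MAX_VARYING_COMPONENTS, &varyings);

but on a 3.2 core context that query is gone, so it just raises
GL_INVALID_ENUM and leaves the value untouched.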

> Chip