X11, ATI and GDI Error

celticht32 at aol.com celticht32 at aol.com
Wed Jul 30 15:33:54 CDT 2008

While looking at bug 13335 (http://bugs.winehq.org/show_bug.cgi?id=13335), which is
an ATI and Wine bug, I noticed something in one of the updates:


Now this set me to searching, as I really didn't like the idea of wrapping the libGL calls with 'winmemory' allocations
(too many possibilities for corruption, and it just isn't an elegant solution). The Debian post above points out something
very interesting:

" After more investigation, I was able to trace this to a bug in my OpenGL  
initialization logic.

The following (psuedocode) sequence, forces fglrx to revert to indirect  
dpy = XOpenDisplay()
visual = glXChooseVisual(dpy)
dpy2 = XOpenDisplay()
ctx = glXCreateContext(dpy2, visual) <-- fglrx reverts to indirect  

Changing the code to use the same display connection for glXChooseVisual  
and glXCreateContext solves this problem."
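
To make the failure mode concrete, here is a minimal, self-contained C
illustration of that pattern (my own sketch, not Wine code); the only
difference between the broken and the working case is which connection
glXCreateContext receives:

    /* Build with: gcc glx-two-displays.c -lX11 -lGL */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void)
    {
        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };

        Display *dpy = XOpenDisplay(NULL);               /* first connection */
        if (!dpy) return 1;

        XVisualInfo *visual = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!visual) return 1;

        /* Broken pattern: a second, separate connection for context creation.
         * According to the Debian post, fglrx silently falls back to
         * indirect rendering here. */
        Display *dpy2 = XOpenDisplay(NULL);
        GLXContext bad = glXCreateContext(dpy2, visual, NULL, True);

        /* Working pattern: reuse the connection the visual was chosen on. */
        GLXContext good = glXCreateContext(dpy, visual, NULL, True);

        if (bad)  printf("second-connection context direct: %d\n", glXIsDirect(dpy2, bad));
        if (good) printf("same-connection context direct:   %d\n", glXIsDirect(dpy, good));
        return 0;
    }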

So, looking at that, I began to grep the latest git tree to find out where
these calls are used. There are only two places (and yes, I know they are
critical places): opengl.c and x11drv_attach.c, both in the winex11.drv
directory. I went through and started looking at how they were called, and
discovered that the display variable (dpy and dpy2 in the above pseudocode)
is not reused at all. It is allocated off the local stack of the subroutine
it is in. Later on, the global variable gdi_display gets assigned display,
which holds the result of the XOpenDisplay call. This gdi_display is then
used as the global handle, if I am reading the code correctly.

Now this is where my pointer knowledge goes south, and I think this is where
the problem might be:

In the process_attach routine the code does gdi_display = display. display is
a pointer local to that routine and will go away when the function returns.
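
For reference, the shape of the code I am describing is roughly this (a
simplified sketch from memory, so names and details are approximate, not a
verbatim copy of the Wine source):

    /* global handle used by the rest of the driver */
    Display *gdi_display;

    static BOOL process_attach(void)
    {
        Display *display;                /* pointer local to this routine */

        display = XOpenDisplay( NULL );  /* open the X connection */
        if (!display) return FALSE;

        gdi_display = display;           /* global receives the local pointer's value */
        return TRUE;                     /* local variable 'display' goes out of scope here */
    }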

Here is my theory:

With the ATI card being a pig for memory, and Wine allocating a lot of memory
itself, the pointer to the display gets invalidated or somehow corrupted, and
when it is used later X thinks it's another display and everything happily
crashes.

Mind you, this is only a theory right now, but is this possible? I think an easy fix would be to do the following instead:

    gdi_display = XOpenDisplay( NULL );  /* allocates the Display structure off the global heap, not the local stack */
    display = gdi_display;

and then leave the rest of the code alone.
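
In context, the proposed change would look something like this (again a
sketch against the simplified routine above, not an actual patch):

    static BOOL process_attach(void)
    {
        Display *display;                     /* local alias, as before */

        gdi_display = XOpenDisplay( NULL );   /* the connection handle lands directly in the global */
        if (!gdi_display) return FALSE;

        display = gdi_display;                /* the rest of the routine keeps using 'display' */
        /* ... rest of the initialization unchanged ... */
        return TRUE;
    }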

Am I going down the right track with this? If so, it would be a much simpler
fix than some of the ones being floated around, and it should not affect any
of the other supported cards.


