fenix at club-internet.fr
Thu Dec 22 02:23:36 CST 2005
On Thursday 22 December 2005 03:29, Aric Cyr wrote:
> Tom Spear <speeddymon <at> gmail.com> writes:
> > Aric Cyr wrote:
> > >>>I took a look at the D3D_OK hack, and I believe the problem to be
> > >>>CheckDeviceFormat in wined3d/directx.c. This function should return
> > >>> an error if
> > >>>D3DFMT_D32 is checked for on cards which don't support 32bit depth.
> > >>>Currently
> > >>>it just returns OK for most formats though. This code is really just
> > >>> a stub as
> > >>>it stands, and needs to be converted to check if there are any visuals
> > >>> that meet
> > >>>the requested format's requirements, and if so, return D3D_OK,
> > >>>otherwise
> > >>>D3D_NOTAVAILABLE.
> > >>>
> > >>>To test my theory, try returning D3D_NOTAVAILABLE for the D3DFMT_D32
> > >>> case (you'll need to add it) in wined3d/directx.c in
> > >>>IWineD3DImpl_CheckDeviceFormat()
> > >>>and see if that fixes that issue or not. This would be just another
> > >>> hack though, so a real patch would still be necessary as I described
> > >>> above.
> > Well, I took a stab at adding the case for D3DFMT_D32 at the bottom of
> > the other cases and had it return D3DERR_NOTAVAILABLE (as opposed to
> > D3D_NOTAVAILABLE), then ran the benchmarks again. Now it finishes the
> > first one and goes on to the second, but crashes in a different spot, so
> > it seems we also have some stack corruption (as was mentioned in the
> > bug). So that hack works for now; since the rest of that code is stubbed
> > out, I would suggest we go ahead and submit a patch so we can at least
> > run the darn thing, and then start debugging the stack corruption issue.
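[Editor's note: the hack Tom describes above might look roughly like the following. This is a self-contained toy, not the real wined3d code; the constant values, enum, and function name are simplified stand-ins for the actual WineD3D types.]

```c
typedef long HRESULT;
#define D3D_OK              ((HRESULT)0)
#define D3DERR_NOTAVAILABLE ((HRESULT)0x8876086AL)

/* Simplified stand-in for the format enum; only the formats discussed here. */
enum wined3d_format { WINED3DFMT_D16, WINED3DFMT_D24S8, WINED3DFMT_D32 };

/* Toy version of the CheckDeviceFormat() hack: reject D3DFMT_D32 outright
 * instead of optimistically reporting every format as supported.  A real
 * fix would probe the available GLX visuals rather than hard-coding the
 * answer per format. */
static HRESULT check_device_format(enum wined3d_format fmt)
{
    switch (fmt) {
    case WINED3DFMT_D32:
        return D3DERR_NOTAVAILABLE; /* no 32-bit depth visuals on this card */
    default:
        return D3D_OK;              /* stub behaviour for everything else */
    }
}
```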
> Thanks for testing this out. You have proved my theory correct, so I'll
> see about making a patch which will correct CheckDeviceFormat(). Basically
> that whole function needs a re-write, so I'd rather do it that way than to
> put this hack in there. Especially since, I assume, this problem is not
> present on nVidia systems, only ATI.
No. Since X11 only handles 24-bpp depth buffers (instead of 32), GLX always
reports 24 bpp on such cards (for depth buffers), and many games want 16- or
32-bit depth buffers (only a few support 24 bits).
For example, on my card I have a lot of configs like:
id dep cl sp sz l ci b ro r g b a bf th cl r g b a ns b eat
0x21 24 tc 0 32 0 r y . 8 8 8 0 4 24 8 16 16 16 16 0 0 None
0x67 24 dc 0 32 0 r . . 8 8 8 0 4 24 8 16 16 16 16 4 1 Ncon
It is always a 24-bpp depth buffer for a 32-bpp pixel buffer (a stupid
limitation?). I hate X11 limitations :(
Maybe one day we will have native alpha support and 32-bpp buffers on X.
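[Editor's note: the mismatch described here, an application asking for a 16- or 32-bit depth buffer while GLX only advertises 24-bit visuals, boils down to a membership test like the following. `depth_supported` is a hypothetical helper, written only to illustrate the failure mode.]

```c
#include <stddef.h>

/* Hypothetical helper: does any advertised visual offer the requested
 * depth-buffer size?  On the cards above, GLX only advertises 24-bit
 * depth visuals, so requests for 16 or 32 bits find no match and the
 * format check should fail. */
static int depth_supported(const int *visual_depths, size_t n, int wanted)
{
    for (size_t i = 0; i < n; i++)
        if (visual_depths[i] == wanted)
            return 1;
    return 0;
}
```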
> > I should add that on the first run, I disabled the title screens between
> > benchmarks, and changed the "Display and CPU Settings" so that I was
> > using 32-bit textures and triple buffering, and it ran thru several of
> > the tests, while on the 2nd and 3rd runs, I left all settings at
> > defaults; during run 2, it died just after the title screen for mark #2,
> > and during run 3, it died in the middle of mark #2...
> If I'm not mistaken, doesn't 3DMark change resolutions between benchmarks
> and title screens? If so, it is possible, and quite likely, that there is a
> resolution-change bug in Wine. If I recall, I had similar crashing
> problems with World of Warcraft if I tried to change resolutions from
Wine cannot change resolution "in-game": the created window and its
associated context are managed by the x11drv init, so we cannot handle this
case (and we can never handle the case where you want to change the bpp,
because of X11 limitations).
Anyway, I plan (when I have time, and once ddraw uses wined3d) to rethink the
x11drv GLX init so that it only initializes the GLX window context on demand
(so in the future only for opengl32 and wined3d), using the requested
rendering options.
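[Editor's note: the lazy-init idea sketched in that plan might look something like this toy once-guard. `get_glx_context` and `struct glx_ctx` are invented for illustration; nothing here is real x11drv code.]

```c
#include <stddef.h>

struct glx_ctx { int depth_bits; };   /* stand-in for a real GLX context */

static struct glx_ctx *cached_ctx = NULL;
static int init_count = 0;            /* lets us observe the lazy behaviour */

/* Create the GLX context only on first use, honouring the caller's
 * requested rendering options, instead of unconditionally at x11drv
 * startup.  Subsequent calls reuse the cached context. */
static struct glx_ctx *get_glx_context(int wanted_depth_bits)
{
    static struct glx_ctx ctx;
    if (!cached_ctx) {
        ctx.depth_bits = wanted_depth_bits; /* pretend we picked a visual */
        cached_ctx = &ctx;
        init_count++;
    }
    return cached_ctx;
}
```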