Speed of gdi vs opengl DirectDrawRenderer
stefandoesinger at gmail.com
Fri Feb 20 03:19:31 CST 2015
On 2015-02-19 at 21:12, Jonas Maebe wrote:
> I profiled wine (Instruments time profile) and noticed that most
> time was spent in wined3d's convert_r5g6b5_x8r8g8b8. Replacing
> that routine with an optimized sse2 version from pixman did not
> make much of a difference.
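For reference, the conversion the quoted routine performs amounts to expanding each 16-bit 5-6-5 pixel into a 32-bit x8r8g8b8 word. A minimal scalar sketch (illustrative only, not Wine's actual convert_r5g6b5_x8r8g8b8; the bit-replication trick is a common choice so that full-scale 0x1f/0x3f maps to 0xff):

```c
#include <stdint.h>

/* Expand one r5g6b5 pixel to x8r8g8b8. The low bits of each 8-bit
 * channel are filled by replicating the high bits, so 0 stays 0 and
 * the channel maximum becomes 0xff. */
uint32_t r5g6b5_to_x8r8g8b8(uint16_t src)
{
    uint32_t r = (src >> 11) & 0x1f;
    uint32_t g = (src >> 5)  & 0x3f;
    uint32_t b =  src        & 0x1f;

    r = (r << 3) | (r >> 2);   /* 5 -> 8 bits */
    g = (g << 2) | (g >> 4);   /* 6 -> 8 bits */
    b = (b << 3) | (b >> 2);   /* 5 -> 8 bits */

    return (r << 16) | (g << 8) | b;
}
```

Per-pixel work like this is exactly what an SSE2 path (e.g. pixman's) vectorizes, which is why it is surprising that the optimized version made little difference; that suggests the bottleneck is memory bandwidth or the number of conversions, not the arithmetic.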
Is there a demo version of this game somewhere? I am reworking the d3d
blitting code at the moment, and it seems like this would be an
interesting game to look at.
If the game does video memory r5g6b5 to video memory x8r8g8b8 blits,
then we should be able to do this on the GPU. It's quite possible,
however, that this is a system memory to system memory blit, in which
case doing it on the CPU is the correct thing. If the game uploads from
a system memory r5g6b5 surface to a video memory x8r8g8b8 texture, then
we can in theory let OpenGL do the work via glTexSubImage2D, but that
mostly means OpenGL converts the data on the CPU before sending it to
the GPU.
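The upload path could look roughly like this (a sketch only, not Wine's code: it assumes a current OpenGL context and an already-allocated texture with an x8r8g8b8-style internal format, and the function name is made up):

```c
#include <GL/gl.h>

/* Hand r5g6b5 system-memory data straight to GL; the driver converts
 * it to the texture's internal format, usually on the CPU, during the
 * upload. */
void upload_r5g6b5(GLuint tex, GLsizei w, GLsizei h, const void *sysmem)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* r5g6b5 rows are 2-byte aligned, not the default 4. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGB, GL_UNSIGNED_SHORT_5_6_5, sysmem);
}
```

Whether this beats converting ourselves depends on the driver; the GL_UNSIGNED_SHORT_5_6_5 path at least avoids a second copy on our side.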
This may be a game bug - you say the game has trouble on Windows too.
It may also be a bug in our modesetting code, in the sense that the
game sets the display format to r5g6b5, but we stay at x8r8g8b8
because X11 (and I think OS X; Ken, correct me please) can't switch the
color depth. Ideally OpenGL takes care of the resulting conversion,
but that's not always the case.