Tricking program into seeing actual gfx driver not WINE's
stefan at codeweavers.com
Fri Jul 4 07:45:26 CDT 2008
Actually we have quite a bit of code to tell the app more about the GPU than
just a generic Wine one. This is needed because some games insist on a
proper GPU PCI ID. We don't report any GPU-specific renderer strings yet,
but that should be rather easy to add if you look at the PCI ID reporting
code. Currently you have to recompile for that, but you are welcome to
write a patch that solves this problem in a generic way and send it to
wine-patches.
The more troublesome problem is that Wine does not have any CUDA support at
this point. The Windows CUDA DLL will not make you happy, because it talks
to the Windows hardware drivers. Thus we need an implementation of this
cudart.dll which calls the Linux libcudart.so instead. (And then hope it
works.)
From: wine-devel-bounces at winehq.org [mailto:wine-devel-bounces at winehq.org]
On Behalf Of Seth Shelnutt
Sent: Thursday, July 03, 2008 10:24 PM
To: wine-devel at winehq.org
Subject: Tricking program into seeing actual gfx driver not WINE's
We have run into an interesting problem while trying to get the latest
version of Stanford's Folding at Home GPU client to work in Linux via WINE.
The program says it does not detect a compatible GPU, even when the user
has installed the correct Nvidia drivers (with CUDA support) and has a
compatible GPU. The problem, I believe, lies in the fact that the program
is not told that there is an Nvidia 8800 installed; by the nature of WINE
it sees that "WINE" is the graphics card, as WINE first translates the
Direct3D calls into OpenGL calls that are then passed on to the GPU. So the
question is, is it possible to trick programs into believing they are
running on the right hardware? (As in fact they are.)
I remember a while ago the Steam system spec survey was used to see how
many people run Steam via WINE. This was done by noting the graphics driver
installed and how the Wine one appeared when running under WINE. That is
fine, but what we need is a way to make the program see that it is actually
running on Nvidia hardware, because if the client would just start, the
Direct3D calls could be translated into OpenGL calls and the Nvidia Linux
drivers could then handle them and run it all fine and dandy.
Here is the post, with the error message about the wrong graphics card
being detected: