Tricking program into seeing actual gfx driver not WINE's

Seth Shelnutt shelnutt2 at gmail.com
Fri Jul 4 23:17:09 CDT 2008


What options do I need to change in order to compile WINE with support for
the more GPU-specific information?

Also, when changing the following lines of code so that
IWineD3DImpl_GetAdapterIdentifier identifies the card (for now) as an 8800 GT
with the 173 drivers, would the second pair of lines be correct? I just want
to make sure that "driver" actually means the driver, which would be
"Nvidia 173.14", and that "description" is simply the card, correct?

        Adapters[0].driver = "Display";
        Adapters[0].description = "Direct3D HAL";



        Adapters[0].driver = "Nvidia 173.14";
        Adapters[0].description = "Nvidia 8800 GT";


Also, if this is the case, would it not be easy to simply grab the driver
version from the X server? At the least, the X server would give you the card
and brand, "Nvidia 8800 GT", but I am not sure how to get specific driver
information. I'm looking for a command, but glxinfo reports only OpenGL info,
and I've yet to find anything else.

On Fri, Jul 4, 2008 at 8:45 AM, Stefan Dösinger <stefan at codeweavers.com>
wrote:

>  Actually we have quite a bit of code to tell the app more about the GPU
> and not just report a generic Wine one. This is needed because some games
> insist on a proper GPU PCI ID. We don't report any GPU-specific renderer
> strings yet, but that should be rather easy to add if you look at the PCI
> ID reporting code. Currently you have to recompile for that, but you are
> welcome to write a patch that solves this problem in a generic way and send
> it to wine-patches.
>
>
>
> The more troublesome problem is that Wine does not have any CUDA support at
> this point. The Windows CUDA DLL will not make you happy, because it talks
> to the Windows hardware drivers. Thus we need an implementation of
> cudart.dll which calls the Linux CUDA library (libcudart.so) instead. (And
> then hope it works out.)
>
>
>
> From: wine-devel-bounces at winehq.org [mailto:
> wine-devel-bounces at winehq.org] On Behalf Of Seth Shelnutt
> Sent: Thursday, July 03, 2008 10:24 PM
> To: wine-devel at winehq.org
> Subject: Tricking program into seeing actual gfx driver not WINE's
>
>
>
> Hello All,
>
> We have run into an interesting problem while trying to get the latest
> version of Stanford's Folding at Home GPU client to work in Linux via WINE.
> The program says it does not detect a compatible GPU, even when the user
> has installed the correct Nvidia drivers (with CUDA support) and has a
> compatible GPU. The problem, I believe, lies in the fact that the program is
> not told that there is an Nvidia 8800 installed; instead, by the nature of
> WINE, it sees "WINE" as the graphics card, since WINE first translates the
> Direct3D calls into OpenGL calls that are then passed on to the GPU. So the
> question is, is it possible to trick programs into believing they are
> running on the right hardware? (As in fact they are.)
>
> I remember a while ago the Steam system spec survey was used to see how
> many people run Steam via WINE. This was done by noting the graphics driver
> installed and how the WINE one appeared when running under WINE. That is
> fine, but what we need is a way to make the program see that it is actually
> running on Nvidia hardware, because if the client would just start, the
> Direct3D calls could be translated into OpenGL calls, and the Nvidia Linux
> drivers could then handle them and run it all fine and dandy.
>
> Here is the post, with error message about wrong graphics card detected,
> http://www.ocforums.com/showpost.php?p=5698997&postcount=19 .
>
>
> Thanks,
>
> Seth Shelnutt
>

