Wine, fullscreen applications, and RandR 1.2

Andy Ritger aritger@nvidia.com
Wed Sep 5 11:52:11 CDT 2012


Thanks, Henri.

On Wed, Sep 05, 2012 at 01:34:47AM -0700, Henri Verbeet wrote:
> On 5 September 2012 08:07, Andy Ritger <aritger@nvidia.com> wrote:
> > Questions:
> >
> > * Looking at dlls/winex11.drv/xrandr.c, the first RandR CRTC/output's
> >   modelist is used to populate Wine's list of available modes.  Is the
> >   data flow between Wine and Windows applications always such that you
> >   need to advertise a list of (width, height, refreshRate)s?  Or would
> >   an application ever tell Wine what resolution it wants?
> >
> Windows applications use EnumDisplaySettingsEx() to query supported
> modes, and ChangeDisplaySettingsEx() to set one. Applications can't
> make up modes on their own.

Thanks for clarifying.
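
To make sure I understand the constraint, here is a rough sketch of that
flow from the application's side (untested, error handling omitted):

    #include <windows.h>
    #include <stdio.h>

    /* Walk the mode list the way a Win32 application would; the
     * application can only pick from this list, never invent a mode. */
    int main(void)
    {
        DEVMODEA dm = {0};
        DWORD i;

        dm.dmSize = sizeof(dm);
        for (i = 0; EnumDisplaySettingsExA(NULL, i, &dm, 0); ++i)
            printf("%lux%lu @ %lu Hz\n", dm.dmPelsWidth,
                   dm.dmPelsHeight, dm.dmDisplayFrequency);

        /* To switch, the application hands one of those DEVMODEs back:
         * ChangeDisplaySettingsExA(NULL, &dm, NULL, CDS_FULLSCREEN, NULL); */
        return 0;
    }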

> > * Would you be open to patches to make dlls/winex11.drv/xrandr.c generate
> >   a larger set of (width, height, refreshRate)s, and then have
> >   xrandr12_set_current_mode() use RandR transformation matrix and Border
> >   property to satisfy those?  I was envisioning something where we take
> >   the "preferred" mode for the RandR output, and create all of the
> >   following resolutions using ViewPort{In,Out}:
> >
> >         1920 x 1200
> >         1920 x 1080
> >         1600 x 1200
> >         1280 x 1024
> >         1280 x 720
> >         1024 x 768
> >         800 x 600
> >         640 x 480
> >
> It's ultimately not up to me whether such a patch would be accepted,
> but it's not something I would be particularly happy about. I think
> the preferred way to handle this would be to generate the standard DMT
> etc. modes in the kernel, and use the "scaling mode" output property
> to control the scaling mode, pretty much like all the other drivers.

At first glance, I agree that would be easier for applications, but that
approach has some drawbacks:

* it lies to the user/application about the timings actually being
  driven to the monitor
* that lie causes confusion: the timings reported in the monitor's
  on-screen display don't match what the X server reports
* the user/application doesn't get complete control over the actual
  timings sent to the monitor
* it doesn't expose the full flexibility of the hardware, e.g.,
  arbitrary positioning of the ViewPortOut within the active raster

I imagine the counter-arguments include:

* we already have the "scaling mode" output property in most drivers,
  and that is good enough
* the transformation matrix and Border property are too low-level for
  most applications

For the first counter-argument: I'm trying to make the case that
providing the full flexibility, and being truthful about mode timings to
users/applications, is valuable enough to merit a change (hopefully even
in the drivers that currently expose a "scaling mode" output property).

For the second counter-argument: maybe the viewport configuration
belongs in a library rather than directly in applications like Wine.
In that case, I'd like to build a better understanding of Wine's needs
so that I can properly design the API for such a library.
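
To make that concrete, a library entry point wrapping the RandR 1.3 CRTC
transform might look roughly like the sketch below.  It is a hypothetical
helper, untested; it assumes the transform maps CRTC coordinates into
framebuffer coordinates (so scale factors below 1.0 magnify), and it
assumes the CRTC has already been set to the output's preferred mode with
XRRSetCrtcConfig():

    #include <string.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrender.h>  /* XTransform, XDoubleToFixed() */
    #include <X11/extensions/Xrandr.h>

    /* Scale an emulated in_w x in_h viewport up to the output's native
     * out_w x out_h raster using the CRTC transformation matrix. */
    static void set_viewport_scale(Display *dpy, RRCrtc crtc,
                                   int in_w, int in_h,
                                   int out_w, int out_h)
    {
        XTransform xform;

        memset(&xform, 0, sizeof(xform));
        xform.matrix[0][0] = XDoubleToFixed((double)in_w / out_w);
        xform.matrix[1][1] = XDoubleToFixed((double)in_h / out_h);
        xform.matrix[2][2] = XDoubleToFixed(1.0);

        /* Bilinear filtering avoids blocky nearest-neighbour scaling. */
        XRRSetCrtcTransform(dpy, crtc, &xform, "bilinear", NULL, 0);
    }

Positioning the ViewPortOut within the active raster would then layer on
top of this via the Border output property.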

> > * The current xrandr.c code picks the first CRTC/output, which may not
> >   be currently active.  At the least, it should scan for an active
> >   CRTC+output.  I imagine it would be even better if the user could
> >   configure which RandR output they want.  Would that be reasonable?  What
> >   mechanisms are available in Wine for users to provide runtime configuration?
> >
> The RandR primary display should be CRTC 0, output 0.

That is true most of the time, but I don't believe it is strictly mandated
by the RandR specification:

    http://cgit.freedesktop.org/xorg/proto/randrproto/tree/randrproto.txt

When the RandR primary output (as queried/set by RR[SG]etOutputPrimary)
is non-None, its CRTC will be sorted to the front of the CRTC list
reported by RRGetScreenResources{,Current}.  However, None is a valid
value for the primary output, in which case nothing is guaranteed about
the CRTC/output ordering in the RRGetScreenResources{,Current} reply.
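
At minimum, then, something like the following scan seems safer than
assuming index 0 (a rough sketch, untested, error handling omitted):

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    /* Prefer the RandR primary output when one is set; otherwise fall
     * back to the first connected output actively driven by a CRTC. */
    static RROutput pick_output(Display *dpy, Window root)
    {
        XRRScreenResources *res = XRRGetScreenResources(dpy, root);
        RROutput primary = XRRGetOutputPrimary(dpy, root);
        RROutput chosen = None;
        int i;

        for (i = 0; i < res->noutput; ++i)
        {
            XRROutputInfo *info = XRRGetOutputInfo(dpy, res,
                                                   res->outputs[i]);
            int active = (info->connection == RR_Connected &&
                          info->crtc != None);
            XRRFreeOutputInfo(info);

            if (!active)
                continue;
            if (res->outputs[i] == primary)
            {
                chosen = primary;   /* an active primary always wins */
                break;
            }
            if (chosen == None)
                chosen = res->outputs[i];
        }
        XRRFreeScreenResources(res);
        return chosen;
    }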

Further, while the RandR primary output seems like a reasonable default,
the spec frames "primary" in terms of window management (e.g., "primary"
is where the menu bar should be placed).  A valid use case would be for
the user to have his window manager's primary output on one monitor, but
run his fullscreen Wine application on another.  Given that, would it be
reasonable for the user to specify the RandR output he wants Wine to use?

> Users can
> typically change this through xrandr or xorg.conf. Unfortunately not
> all drivers do something reasonable by default here, so we'll probably
> add code to pick the first connected display as Win32 primary instead
> if no primary is defined through RandR. For the moment we end up
> falling back to the older RandR version though, so at least the
> behaviour isn't any worse than before.
>
> > * From the current code, it does not look like Wine's RandR support tries
> >   to do anything with multiple simultaneous RandR outputs.
> >
> Yes, proper multihead support is something we still need to implement.
> There aren't a lot of Win32 applications that do something useful with
> multiple displays though, so it's not something that has a very high
> priority at the moment.

I can definitely believe that plumbing individual RandR outputs through
to separate Win32 display devices is not a compelling use case, since not
many Win32 applications would do anything useful with that.  What seems
more useful, though, is driving multiple RandR outputs and presenting
them to Win32 as a single big screen.  E.g., "immersive gaming", where a
Wine application spans two, three, or more RandR outputs (NVIDIA Kepler
GPUs can drive up to four heads).

> >   Ironically:
> >   this actually works better with RandR 1.1 + NVIDIA: users can configure
> >   their MetaModes to describe what mode (plus viewport configuration)
> >   they want on each monitor, and then RandR 1.1 chooses the MetaMode.
> >
> No. With RandR 1.1 you get one big screen, and you can choose between
> getting fullscreen applications stretched across all your displays, or
> turning off all displays except one.

There were plenty of deficiencies in RandR 1.1 and NVIDIA's implementation
of it, but with MetaModes the user could configure whatever combination of
displays he wanted.  You are right, though, that any displays not in the
MetaMode would get turned off when RandR 1.1 switched to that MetaMode.

> What you actually want is for the
> application to be fullscreen on a specific display, or multiple
> displays if the application supports that, and leave everything else
> alone.

I agree that displays not used by Wine should be left alone, not turned
off.

But you said above that most Win32 applications don't natively take
advantage of multiple displays themselves.  I expect some users are
interested in having their Wine application fullscreen across multiple
displays.  Does that seem reasonable?

> As an aside, the fake refresh rates generated by "DynamicTwinView"
> aren't very helpful either. Some Win32 applications expect modes like
> 800x600@60Hz or 1024x768@60Hz to always exist, and just die if they
> don't.

Yes, the fake refresh rate reporting in DynamicTwinView was unfortunate,
but it was necessary in RandR 1.1 to distinguish between MetaModes of the
same size.  Now that we have RandR 1.2, hopefully we can move past that.
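
For what it's worth, I imagine the fragile pattern you describe looks
something like this on the application side (hypothetical sketch):

    #include <stdlib.h>
    #include <windows.h>

    /* Require 800x600 @ 60Hz and bail out if the mode test fails. */
    static void require_800x600_60(void)
    {
        DEVMODEA dm = {0};

        dm.dmSize = sizeof(dm);
        dm.dmPelsWidth = 800;
        dm.dmPelsHeight = 600;
        dm.dmDisplayFrequency = 60;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

        /* CDS_TEST asks whether the mode would work, without switching. */
        if (ChangeDisplaySettingsExA(NULL, &dm, NULL, CDS_TEST, NULL)
                != DISP_CHANGE_SUCCESSFUL)
            exit(1);  /* "just die" */
    }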

Thanks,
- Andy



