[Bug 28723] Sound stutter in Rage when emulated windows version is set to "Windows 7" (XAudio2 -> mmdevapi sound output path)

wine-bugs at winehq.org wine-bugs at winehq.org
Wed Nov 2 20:59:24 CDT 2011


http://bugs.winehq.org/show_bug.cgi?id=28723

--- Comment #19 from Alexey Loukianov <mooroon2 at mail.ru> 2011-11-02 20:59:24 CDT ---
[offtopic]Austin, IMHO adding a remark about "the correct" version to set when
reporting bugs for Wine built from git wouldn't hurt.[/offtopic]

OK, I have tracked this bug down. It looks like the bug is caused by the way
XAudio2 uses the mmdevapi devices under certain circumstances. The "direct"
trigger is the "DefaultPeriod" value that winealsa.drv's mmdevdrv reports to
XA2, but the actual cause is a bit more complicated. Let's get a bit deeper
into the details.

Looks like the code execution flow is as follows (see the sketch after this
list):
1. XA2 calls GetDevicePeriod() to get the default and minimum period sizes for
the selected Core Audio device.
2. Code in winealsa.drv/mmdevdrv.c handles the above call and returns
hard-coded values, namely 100000 and 50000 for wine-1.3.30-10-gb0652dd+.
3. XA2 calls Initialize() with "duration" based on the "DefaultPeriod" value it
has just retrieved and "period" set to 0 (a request to use the default device
period).
4. XA2 calls SetEventHandle() to set an event it will use as an "interrupt"
telling it when to check the buffer padding.
5. XA2 calls GetBuffer() to fill the buffer with the initial portion of audio
frames.
6. XA2 waits for the event to be signaled and calls GetCurrentPadding() as soon
as it fires. If the retrieved padding indicates there is enough free buffer
space for the next 10ms of audio frames (441 frames at a 44.1kHz sample rate),
it pumps in a 10ms chunk of audio data.

The first step on the road towards the bug lies in points 2 and 3 of the list
above. As long as the DefaultPeriod exposed by the AudioClient COM object
instance is equal to or larger than 105000 hns (hundred-nanosecond units), XA2
requests a shared buffer duration that is ~4x the DefaultPeriod. The first call
to GetBuffer (point 5 above) then fills in 3/4 of the requested buffer
duration.

When the advertised default period is less than 105000 hns (104999 hns and
lower), XA2 switches to a buffer duration that is ~2x the DefaultPeriod and
fills in only half of the requested buffer on the first call to GetBuffer.
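
In other words, the observed relation boils down to roughly the following
(a reconstruction from the logs, not XA2's actual code):

#include <windows.h>
#include <audioclient.h>   /* REFERENCE_TIME */

/* Approximate relation reconstructed from logs (values in 100ns units). */
static REFERENCE_TIME xa2_requested_duration(REFERENCE_TIME default_period)
{
    if (default_period >= 105000)     /* >= 10.5ms */
        return 4 * default_period;    /* first GetBuffer fills ~3/4 of it */
    return 2 * default_period;        /* first GetBuffer fills only ~1/2 */
}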

I will attach a PDF document illustrating the experimentally determined
dependency between the "DefaultPeriod" value and the "duration" requested by
XA2 shortly after posting this comment.

What happens next is that winealsa.drv schedules a timer that fires once per
requested period. The timer handler does nothing more than pump as much data
as it can to ALSA and then signal the app-provided event (if one is set). This
has one important consequence: if an app (A) calls GetCurrentPadding/GetBuffer
only in reaction to the signaled event, (B) wants to pump data into the Core
Audio device in 10ms chunks, and (C) has requested a fairly small buffer
duration, then underruns are unavoidable. That's exactly the case with XA2 +
the winealsa mmdevdrv when DefaultPeriod < 10.5ms. What we get is roughly two
event signals per requested buffer duration, and the number of frames played
back by ALSA between consecutive event signals corresponds to about 10ms
(actually roughly the number of frames that fit into one DefaultPeriod).
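
As a rough illustration (the real handler in winealsa.drv/mmdevdrv.c is more
involved, and the struct/field names below are purely illustrative), the
per-period timer callback effectively does this:

#include <errno.h>
#include <alsa/asoundlib.h>
#include <windows.h>

/* Hypothetical bookkeeping struct standing in for the driver's real state. */
struct stream
{
    snd_pcm_t *pcm;
    BYTE *local_buf;          /* frames queued via GetBuffer/ReleaseBuffer */
    snd_pcm_uframes_t held;   /* frames waiting to be handed to ALSA */
    HANDLE event;             /* app-provided event from SetEventHandle() */
};

static void CALLBACK timer_cb(void *user, BOOLEAN timer_fired)
{
    struct stream *s = user;
    snd_pcm_sframes_t avail, written;

    /* recover if ALSA has already hit an underrun */
    if (snd_pcm_state(s->pcm) == SND_PCM_STATE_XRUN)
        snd_pcm_recover(s->pcm, -EPIPE, 1);

    /* pump as much locally buffered data as ALSA will take right now */
    avail = snd_pcm_avail_update(s->pcm);
    if (avail > 0 && s->held > 0)
    {
        snd_pcm_uframes_t to_write = s->held;
        if (to_write > (snd_pcm_uframes_t)avail)
            to_write = avail;
        written = snd_pcm_writei(s->pcm, s->local_buf, to_write);
        if (written > 0)
            s->held -= written;
    }

    /* ... and only now tell the app it may refill its buffer */
    if (s->event)
        SetEvent(s->event);
}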

The result is the following sequence:
1. Let's assume DefaultPeriod = 10ms. The buffer duration requested by XA2
would be 20ms, and XA2 would pump in 10ms of data prior to starting the sound
output.
2. winealsa.drv pumps the available 10ms of data to ALSA and schedules a timer
to fire once every 10ms (+ scheduling delays).
3. 10ms+ passes and the timer callback is invoked. It finds (what a surprise!)
that ALSA has hit an underrun and recovers it. Then it checks whether any data
is available to pump into ALSA and finds there is none. OK then, time to signal
the event.
4. XA2 handles the signaled event and determines that the current padding is 0.
Instead of filling the entire buffer (i.e. 20ms of data), it pumps in only 10ms
of audio data.
5. Rinse, wash, repeat. Scheduling delays, plus the fact that underrun recovery
and data transfers take a somewhat random amount of time, bring enough
variability into the process. Logs show that sometimes the stars align and XA2
manages to fill the whole buffer, but the granularity of the event signals
leaves no chance for that condition to remain stable.

Now to the hard part: I actually have no idea how to "properly" fix this bug.
What we want is finer granularity of the event signaling. One way to achieve it
is to use snd_async_add_pcm_handler(), but I don't know what the consequences
of receiving a SIGIO callback in a Wine-driven process would be (if that's
possible at all).
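
For reference, the async-handler route would look roughly like the snippet
below. It is only a sketch on top of the hypothetical struct from the previous
snippet, and whether a SIGIO-driven callback is safe inside a Wine process is
exactly the open question:

#include <alsa/asoundlib.h>
#include <windows.h>

/* Trimmed-down version of the hypothetical struct from the timer sketch. */
struct stream { snd_pcm_t *pcm; HANDLE event; };

/* Invoked by ALSA from SIGIO context each time a period elapses. Whether
 * doing anything at all here (even SetEvent) is safe under Wine is the
 * unresolved question. */
static void pcm_async_cb(snd_async_handler_t *handler)
{
    struct stream *s = snd_async_handler_get_callback_private(handler);
    if (s->event)
        SetEvent(s->event);
}

static int install_async_handler(struct stream *s)
{
    snd_async_handler_t *handler;
    return snd_async_add_pcm_handler(&handler, s->pcm, pcm_async_cb, s);
}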

Another possible way to work around the problem is to use a higher timer rate.
I tried a 4x timer rate (used "This->mmdev_period_rt / 40000" in the
CreateTimerQueueTimer() call) and it "fixed" the bug. The obvious downside of
this approach is that we rely on the timer scheduling granularity, which might
not be fine enough to handle the extra-small buffer case, and we also hog at
least 4x more CPU with the 4x timer rate.
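
In driver code terms the experiment amounts to something like the following
fragment; only the /40000 divisor is literal, while the surrounding variable
and callback names (and the original /10000 divisor) are my guesses at the
winealsa.drv code:

/* Fragment, not a complete function: schedule the period timer at 4x rate.
 * mmdev_period_rt is in 100ns units; the Period argument is in ms, so
 * /10000 gives one full period and /40000 gives a quarter of it. */
if (!CreateTimerQueueTimer(&This->timer, g_timer_q, timer_cb, This, 0,
        This->mmdev_period_rt / 40000, /* presumably /10000 originally */
        WT_EXECUTEINTIMERTHREAD))
    WARN("Unable to create the period timer: %u\n", GetLastError());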

A "quick" workaround would be to set DefaultPeriod to anything larger than
10.5ms. It would "automagically fix XA2+mmdevapi" making users happy but OTOH
wouldn't really fix the bug.
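
For completeness, that quick workaround would amount to something like this in
winealsa.drv/mmdevdrv.c (a sketch; I'm assuming the constant and function
layout, the point is only bumping the advertised period above 105000 hns):

#include <windows.h>
#include <audioclient.h>

static const REFERENCE_TIME DefaultPeriod = 110000; /* 11ms, was 100000 */
static const REFERENCE_TIME MinimumPeriod =  50000; /*  5ms, unchanged  */

static HRESULT WINAPI AudioClient_GetDevicePeriod(IAudioClient *iface,
        REFERENCE_TIME *defperiod, REFERENCE_TIME *minperiod)
{
    if (!defperiod && !minperiod)
        return E_POINTER;
    if (defperiod)
        *defperiod = DefaultPeriod;
    if (minperiod)
        *minperiod = MinimumPeriod;
    return S_OK;
}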

Andrew, I hope you will come up with a brilliant idea about how to fix this
pretty fun race condition the "Right Way".
