Is the test.winehq.org front page too pessimistic?

Juan Lang juan.lang at gmail.com
Wed Feb 11 15:58:33 CST 2009


The front page of test.winehq.org shows statistics about failed tests,
but it doesn't seem to take into account the number of individual
tests that passed and failed; rather, it counts the number of files
that had any failures at all.
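To illustrate the difference between the two metrics (with made-up numbers, not actual test.winehq.org data), here is a minimal sketch. The file names and counts are hypothetical; the point is just that a file-level rate and a test-level rate can diverge widely:

```python
# Hypothetical sketch: two ways of aggregating test results.
# Each entry maps a test file to (tests_run, tests_failed) on one machine.
# The file names and numbers below are invented for illustration.
results = {
    "mapi32:prop": (139, 134),
    "kernel32:process": (200, 0),
    "user32:msg": (150, 2),
}

# Per-file rate (what the front page appears to show):
# a file counts as failing if it had any failures at all.
files_failing = sum(1 for run, failed in results.values() if failed > 0)
per_file_rate = files_failing / len(results)

# Per-test rate (what a particular day's results show):
total_run = sum(run for run, _ in results.values())
total_failed = sum(failed for _, failed in results.values())
per_test_rate = total_failed / total_run

print(f"per-file: {per_file_rate:.1%}, per-test: {per_test_rate:.1%}")
```

Note how fixing 134 of the mapi32 failures would leave the per-file rate unchanged (the file still has failures), while the per-test rate would drop sharply.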

So, for example, about a week ago I got a fix committed for some
failing mapi32 tests.  Looking at the machines with test failures:
before the fix was committed, 139 tests were run and 134 of them
failed, whereas after the fix, the same number of tests were run and
only 6 of them failed.  Nonetheless, the 4th of February shows a
higher failure rate (14.6%) than the 3rd of February (12.4%).

I know other tests could have started failing in the interim, but it
seems like we've been putting a fair amount of effort into reducing
test failures lately, while the percentage of failed tests isn't going
down, at least not on the main page.  If you look at a particular
day's results, the numbers do look a bit better over time.

I'm not sending a patch, because there may be different opinions on
this.  That is, perhaps some people like to see a statistic on the
number of files with failing tests on any machine, which the front
page appears to show, while others may like to see the number of
failures in a particular file, which a day's results show.  My own
opinion is that it's hard to get motivated to fix something without
some sort of positive feedback for it, so changing the front page
would be better.

My own feeling is that there are far fewer failing tests now than
there used to be, and I'd sure like to see that reflected somewhere at
a quick glance.  Thoughts?
--Juan



More information about the wine-devel mailing list