[webkit-dev] How to deal with 1-pixel differences in reftests?
ap at webkit.org
Wed Nov 18 09:12:27 PST 2015
I do not think that there is a way to algorithmically define what an acceptable difference is. Here are a few cases where it's critical to detect small differences:
- color management, e.g. testing different code paths that should match precisely;
- finding uninitialized-memory-use bugs in the rendering pipeline, which do cause minor pixel noise;
- testing antialiasing and scaling behavior (e.g. that we should revert to high quality scaling after an animation is done);
- testing text rendering, where the difference can easily be small, e.g. one accent over one letter on a mostly blank test result.
Speaking from our past experience with pixel test tolerance and with retrying failing tests, I believe that leeway in reporting failures quickly causes significant deterioration in infrastructure quality, to the point where we can't tell what's going on with many tests.
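For concreteness, the percentage tolerance being proposed amounts to a check like the following minimal sketch. This is an illustration only, not WebKit's actual ImageDiff implementation; the function names and the flat-list pixel representation are assumptions made for the example.

```python
# Illustrative sketch of a percentage-based pixel tolerance check.
# NOT WebKit's ImageDiff: images are assumed to be flat lists of
# (R, G, B, A) tuples of equal length; names are hypothetical.

def difference_percent(pixels_a, pixels_b):
    """Return the percentage of pixels that differ between two images."""
    assert len(pixels_a) == len(pixels_b), "images must have the same size"
    differing = sum(1 for a, b in zip(pixels_a, pixels_b) if a != b)
    return 100.0 * differing / len(pixels_a)

def images_match(pixels_a, pixels_b, tolerance_percent=0.0):
    """With tolerance 0 (the current reftest behavior), any single
    differing pixel is a failure; a nonzero tolerance lets a
    corresponding fraction of pixels slip through."""
    return difference_percent(pixels_a, pixels_b) <= tolerance_percent

# A 100x100 image with one differing pixel is a 0.01% difference,
# so it fails at a 0.001% tolerance but passes at 0.1%.
white = [(255, 255, 255, 255)] * 10000
noisy = list(white)
noisy[0] = (254, 255, 255, 255)
```

Note that this is exactly the kind of check that hides the failure classes listed above: a single-pixel accent glyph or one pixel of uninitialized-memory noise falls well under any plausible percentage threshold.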
> On Nov 18, 2015, at 4:36, Carlos Alberto Lopez Perez <clopez at igalia.com> wrote:
> Some reference tests show a 1-pixel or very-few-pixel difference.
> I'm not sure whether this really indicates a problem in the WebKit
> code, or whether we are just too strict in not allowing even a very
> small percentage of pixel differences for this kind of test.
> Should we tolerate a few pixels of difference in reftests?
> I have done some tests, and the test in  passes for any tolerance
> value >= 0.00001% (with the GTK port).
> I'm inclined to allow a very small value, for example 0.001% (which
> would be 100 times stricter than the tolerance value we use for the
> other tests).
> For example, this is happening in the GTK port:
> The diff image, normalized (so you can see where the diff is):
> webkit-dev mailing list
> webkit-dev at lists.webkit.org