[Webkit-unassigned] [Bug 157966] New: In accelerated drawing mode, ImageBuffer::putByteArray() should copy the bytes directly to the IOSurface backing store

bugzilla-daemon at webkit.org bugzilla-daemon at webkit.org
Fri May 20 20:33:07 PDT 2016


https://bugs.webkit.org/show_bug.cgi?id=157966

            Bug ID: 157966
           Summary: In accelerated drawing mode,
                    ImageBuffer::putByteArray() should copy the bytes
                    directly to the IOSurface backing store
    Classification: Unclassified
           Product: WebKit
           Version: WebKit Nightly Build
          Hardware: Unspecified
                OS: Unspecified
            Status: NEW
          Severity: Normal
          Priority: P2
         Component: Images
          Assignee: webkit-unassigned at lists.webkit.org
          Reporter: sabouhallawa at apple.com

This is related to https://bugs.webkit.org/show_bug.cgi?id=65767, which tried to fix the following problem:

Repro steps:

    var canvas = document.querySelector('canvas');
    var context = canvas.getContext('2d');

    var img_a = canvas.toDataURL( 'image/png' );

    // Allocate and set the colors of the 'data' buffer
    // (illustrative fill; the original report leaves this step out).
    var data = context.createImageData(canvas.width, canvas.height);
    for (var i = 0; i < data.data.length; i += 4)
        data.data[i] = data.data[i + 3] = 255; // opaque red

    context.putImageData( data, 0, 0 );

    var img_b = canvas.toDataURL( 'image/png' );

Result:
    img_a and img_b are identical, although the context should have changed when putImageData() was called.

What happened before the fix for that bug was the following: ImageBuffer::putByteArray() was doing the right thing by writing the bytes directly to the IOSurface backing store. The real problem was in the CG SPI CGIOSurfaceContextCreateImage(). Once this function is called, it caches the last returned image. Only when the context changes, by calling a CG drawing function, is this cache invalidated; only then will subsequent calls to CGIOSurfaceContextCreateImage() recreate the image. In the above scenario, context.putImageData() was not calling any CG drawing function, so the cached image was never thrown away.
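To make that caching behavior concrete, here is a minimal sketch of the sequence in plain IOSurface/CoreGraphics terms. This is not WebKit code: 'surfaceContext' and 'surface' stand for the canvas's IOSurface-backed context and its IOSurface, and the raw memcpy stands in for what putByteArray() was doing.

    #include <CoreGraphics/CoreGraphics.h>
    #include <IOSurface/IOSurfaceRef.h>
    #include <string.h>

    // CG SPI mentioned above; declared here because it has no public header.
    extern CGImageRef CGIOSurfaceContextCreateImage(CGContextRef);

    static void demonstrateStaleImage(CGContextRef surfaceContext, IOSurfaceRef surface,
        const void* pixelBytes, size_t byteCount)
    {
        // First snapshot: CG creates an image for this context and caches it.
        CGImageRef before = CGIOSurfaceContextCreateImage(surfaceContext);

        // Write the new pixels straight into the IOSurface backing store
        // (a real implementation would respect bytes-per-row; omitted here).
        IOSurfaceLock(surface, 0, NULL);
        memcpy(IOSurfaceGetBaseAddress(surface), pixelBytes, byteCount);
        IOSurfaceUnlock(surface, 0, NULL);

        // No CG drawing function has touched surfaceContext, so the cache is not
        // invalidated: 'after' is the same stale snapshot as 'before'.
        CGImageRef after = CGIOSurfaceContextCreateImage(surfaceContext);

        CGImageRelease(before);
        CGImageRelease(after);
    }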

Instead of putting the bytes directly into the IOSurface backing store, the fix http://trac.webkit.org/changeset/106836 did the following (sketched below in CoreGraphics terms):

1. Putting the data into another ImageBuffer.
2. Creating a native CGImageRef from this ImageBuffer.
3. Drawing this native CGImageRef into the destination context.

This of course fixed the issue of getting a stale canvas image after calling putImageData(), because drawing the CGImageRef into the context flushes the cache of CGIOSurfaceContextCreateImage(), and subsequent calls to this SPI get a fresh image.
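A rough sketch of that indirect path, using only public CoreGraphics calls (the real change lives in WebKit's ImageBufferCG.cpp; the function name and parameters below are illustrative, not the actual code):

    #include <CoreGraphics/CoreGraphics.h>

    // Illustrative only: 'ioSurfaceContext' is the accelerated destination context.
    static void putBytesViaTemporaryBuffer(CGContextRef ioSurfaceContext, void* pixelBytes,
        size_t width, size_t height, size_t bytesPerRow, CGRect destRect)
    {
        // 1. Put the data into another, bitmap-backed buffer.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef bitmapContext = CGBitmapContextCreate(pixelBytes, width, height, 8,
            bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast);

        // 2. Create a native CGImageRef from this buffer.
        CGImageRef image = CGBitmapContextCreateImage(bitmapContext);

        // 3. Draw the CGImageRef into the destination context. This CG draw is
        //    also what flushes the image cached by CGIOSurfaceContextCreateImage().
        CGContextDrawImage(ioSurfaceContext, destRect, image);

        CGImageRelease(image);
        CGContextRelease(bitmapContext);
        CGColorSpaceRelease(colorSpace);
    }

(A real implementation would also have to deal with premultiplication and with drawing in copy mode so the bytes replace, rather than blend with, the destination pixels; those details are omitted here.)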

The fix was a rather heavy hack. It should instead have been one of the following:

1. IOSurface should be smart enough to know that the cached image, which is created by CGIOSurfaceContextCreateImage(), is invalid if the IOSurface backing store is changed directly.
2. A new IOSurface API is needed to allow invalidating the cached IOSurface image manually. After putting the data in the IOSurface backing store, this function would be called so no stale cached image is used.
3. A lightweight fix can be used, for now, to force invalidation of the cached IOSurface image (see the sketch after this list).
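For example, option 3 could look roughly like the sketch below, assuming (as described above) that issuing a CG drawing call on the IOSurface context is what invalidates the cached image. This is purely illustrative, not a shipped fix, and whether a fully clipped-out fill is enough to count as "changing the context" is an assumption:

    #include <CoreGraphics/CoreGraphics.h>

    // Purely illustrative: after copying the bytes into the IOSurface backing
    // store exactly as in the first sketch above, poke the context with a
    // drawing call that paints nothing so the cached image is thrown away.
    static void invalidateCachedIOSurfaceImage(CGContextRef surfaceContext)
    {
        CGContextSaveGState(surfaceContext);
        CGContextClipToRect(surfaceContext, CGRectZero);            // clip everything out
        CGContextFillRect(surfaceContext, CGRectMake(0, 0, 1, 1));  // draw call that paints nothing
        CGContextRestoreGState(surfaceContext);
    }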

This has been causing a performance problem for canvas putImageData(): the Animometer 'Images' test has a very low score because of the current implementation of ImageBuffer::putByteArray().
