[Webkit-unassigned] [Bug 234920] ImageBitmap created from a video element has poor performance

bugzilla-daemon at webkit.org bugzilla-daemon at webkit.org
Tue Jan 11 04:58:37 PST 2022


https://bugs.webkit.org/show_bug.cgi?id=234920

--- Comment #8 from Simon Taylor <simontaylor1 at ntlworld.com> ---
(In reply to Kimmo Kinnunen from comment #7)
> (In reply to Simon Taylor from comment #6)
> > - Our code uses the WebGL API directly, but our users want to make use of
> > WebGL engines (Three, Babylon, PlayCanvas etc) for their rendering. Engines
> > often maintain a cache of underlying WebGL state to avoid un-necessarily
> > resetting bits that are unchanged.
> 
> But this doesn't explain why you want to snapshot video to ImageBitmap and
> then use the ImageBitmap in two different contexts.
> 
> Currently you can snapshot the video to your processing context via just
> texImage2D.

Yes. But once processing is complete we want to render that frame in a rendering context, with a guarantee that it is exactly the frame that was processed, so the processed results stay in sync with the image.

One straightforward approach, as you suggest, is to use a single WebGL context for both processing and rendering, with a texture pool so a new frame can be processed while an older frame is kept around for rendering. That's what we do now, and it is of course a strategy that works.
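For concreteness, here is a minimal sketch of the texture-pool idea. This is illustrative only - the class name and shape are my own, not from any library - and it assumes a WebGL context `gl`:

```javascript
// Minimal texture-pool sketch (illustrative, not from any library).
// Frames are uploaded into pooled textures, so a frame still being used
// for rendering stays alive while a newer frame is acquired for processing.
class TexturePool {
  constructor(gl) {
    this.gl = gl;
    this.free = [];
  }
  acquire() {
    // Reuse a previously released texture if one exists,
    // otherwise create a fresh one.
    return this.free.pop() || this.gl.createTexture();
  }
  release(tex) {
    // Return a texture to the pool once nothing references its frame.
    this.free.push(tex);
  }
}
```

The caller uploads each video frame into an acquired texture via texImage2D and releases it only after both processing and rendering are done with that frame.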

The motivation for wanting to split into two contexts on Safari right now is really about encapsulation and ease of integration with third-party engines.

Imagine a library like TensorFlow that might want to implement a WebGL backend to speed up some ML inference operation - let's say something like human pose estimation. A user then wants to run that inference on video frames, and then use Three.js to render a virtual skeleton on top of the most recent frame that has results available, so the video frame and results appear perfectly synchronised.

Three.js abstracts away the underlying WebGL context and caches its state internally. There are no public APIs for accessing the underlying WebGL objects, or for telling Three.js that its state cache may be stale.

If the user wants a single WebGL context shared between TensorFlow's backend and the Three.js renderer, the only really supported way to do that is for TensorFlow to write all of its WebGL code against the Three.js abstractions for shaders, programs, renderbuffers and so on, so that Three.js remains solely responsible for the overall context state. Of course that wouldn't help a user who wants to use TensorFlow but render with Babylon. Nor does a dependency on a specific engine really make sense for a library project like TensorFlow.

Hopefully that helps to explain the justification for using a separate context for processing, even on current Safari where OffscreenCanvas doesn't exist.

Using a separate context is easy; the only real requirement is a primitive for quickly getting the same video frame as a texture in both contexts.

Just doing a separate texImage2D on the rendering context from the same video element doesn't guarantee it will be the same frame - the video is playing, and may have advanced to a new frame by the second texImage2D call.

ImageBitmap seemed the right primitive, but isn't suitably performant with the current implementation.

It does look like readPixels into a PIXEL_PACK_BUFFER and then into an ArrayBuffer might fit the bill on iOS 15, though. With that approach the processing context would texImage2D from the video and readPixels to copy the frame back to JS (we need it there for processing anyway, although not necessarily as full RGBA). The rendering context would then texImage2D from the RGBA ArrayBuffer.
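A sketch of that readPixels path, assuming a WebGL2 processing context `gl` whose framebuffer already contains the uploaded video frame (the function name and structure are mine, purely illustrative):

```javascript
// Read the current framebuffer back to an ArrayBuffer via a
// PIXEL_PACK_BUFFER (WebGL2 only). The returned pixels can then be
// uploaded into the rendering context with texImage2D.
function readFrameToArrayBuffer(gl, width, height) {
  const pbo = gl.createBuffer();
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, pbo);
  gl.bufferData(gl.PIXEL_PACK_BUFFER, width * height * 4, gl.STREAM_READ);
  // With a PIXEL_PACK_BUFFER bound, readPixels targets the buffer
  // (final argument is a byte offset) instead of copying to the CPU here.
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, 0);
  const pixels = new Uint8Array(width * height * 4);
  // getBufferSubData is where the GPU-to-CPU transfer actually happens.
  gl.getBufferSubData(gl.PIXEL_PACK_BUFFER, 0, pixels);
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, null);
  gl.deleteBuffer(pbo);
  return pixels;
}
```

In a real implementation the readback would ideally be deferred (e.g. gated on a fence sync) rather than issued immediately, to avoid stalling the GPU pipeline.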

There's no strict need for ImageBitmap.

> > > Currently in WebKit ImageBitmap is not implemented to be an optimisation
> > > across multiple Context2D and WebGL elements.
> > 
> > Without wishing to sound rude - what's the intention of the current
> > implementation then?
> 
> I think the main use-case is to convert a blob to an image to be drawn in
> WebGL or Context2D?
> As in, it's not feasible to convert a blob to an image otherwise.
> As in, it is possible to convert a video element to a texture otherwise by
> just directly via texImage2D.

From a blob it's possible to do something like:

var im = document.createElement('img');
im.onload = () => { gl.texImage2D(..., im); };
im.src = URL.createObjectURL(blob);

createImageBitmap is definitely a cleaner API and offers more options (like specifying a smaller resolution up-front).
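For example, something along these lines (a sketch only - the resize option values are illustrative, and support for the options dictionary varies by browser):

```javascript
// Decode a Blob straight to a downscaled ImageBitmap, off the main
// thread, then upload it to a WebGL texture.
async function blobToTexture(gl, blob) {
  const bitmap = await createImageBitmap(blob, {
    resizeWidth: 640,     // decode directly at the smaller size,
    resizeHeight: 360,    // rather than downscaling after the fact
    resizeQuality: 'medium',
  });
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, bitmap);
  bitmap.close();         // release the decoded pixels promptly
  return tex;
}
```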

> > I guess it's natural to see APIs how you want them to be, but for me it
> > feels the intention of ImageBitmap is to keep hold of potentially-large,
> > uncompressed images so they can be easily consumed in various places.
> 
> Sure, in abstract it can be that. Currently WebKit is not there, though.
> 
> And since we are not there, I'm trying to understand what is the use-case,
> e.g. is getting there the only way to solve the use-case.

Good to know, thanks. Hopefully I've justified why an efficient way to transfer images between different canvas contexts is a useful primitive to have.

That primitive doesn't need to be ImageBitmap, it's just what I thought was the main purpose for it.

On Safari I'm actually pretty happy that PIXEL_PACK_BUFFER readPixels can solve the need.

> > For me createImageBitmap means "please do any prep work / decoding / etc to
> > get this source ready for efficient use in other web APIs - and off the main
> > thread please, just let me know when it's ready".
> 
> Right. But from WebGL perspective that's what video element is -- for the
> simple case the prep work is already "done" and it's efficient to use
> already.
> 
> From WebGL perspective you can upload the same video element 1,2 or 77 times
> in different contexts and textures and it's going to be observably as fast
> as it ever is going to be..

Performance of a direct texImage2D(video) is pretty good, as you say. The main issue with using it across separate processing and rendering contexts is that the contexts may end up with different frames (the later upload might get a newer frame).

There is also some "conversion" work that happens, and blocks the JS thread, in every texImage2D call. It is acceptably fast (< 2ms on a Performance core at high clocks), so it's not a major concern. On an Efficiency core, however, it can take over 5ms - if that work happened off the main thread, before createImageBitmap resolved its promise, the rest of the WebGL workload would be more likely to fit on the main thread without needing to move to a Performance core.

So my hope for createImageBitmap was twofold: that the Metal conversion into an RGBA texture would happen without blocking the main thread, and that the resulting ImageBitmap would be so quick to consume in WebGL that splitting the processing into a dedicated context would have effectively no CPU overhead.

There's no spec requirement for that of course so it's still just a nice-to-have wishlist thing.

> Yes, due to various reasons, mostly that not all components are in GPU
> Process, ImageBitmap is not equivalent to a buffer that could be mapped to
> various GPU-based implementations (Context2D or WebGL). We're working on
> this part.

That's great to hear!

> However, to prioritise the work, it would still be useful to
> understand if zero-overhead ImageBuffer is something that is a must for
> implementing the feature or a nice to have for implementing the feature.

It's definitely not a requirement, and actually readPixels with PIXEL_PACK_BUFFER is a pretty good fit for us anyway. This one is not really a major concern, I just wanted to flag that the performance of the current ImageBitmap implementation didn't really match my expectations for the API.

In terms of my personal priorities, the rAF weirdness is the biggest issue I have with current iOS Safari (Bug 234923). Not WebGL-related, but the catastrophic iOS 15 performance regression in <video> playback when the data is in a blob or data URI (Bug 232076) has also caused us some headaches.

-- 
You are receiving this mail because:
You are the assignee for the bug.