[Webkit-unassigned] [Bug 86410] [texmap][GStreamer] Composited Video support
bugzilla-daemon at webkit.org
Fri Nov 2 03:11:44 PDT 2012
https://bugs.webkit.org/show_bug.cgi?id=86410
--- Comment #57 from Simon Hausmann <hausmann at webkit.org> 2012-11-02 03:13:05 PST ---
(In reply to comment #56)
> Created an attachment (id=172019)
> (https://bugs.webkit.org/attachment.cgi?id=172019&action=review)
> support gst-vaapi to upload vaSurface to webkit texture
>
> revised Simon's patch with fixes (share the X Display between WebKit and the gst pipeline);
> it runs on my Sandy Bridge PC; however, playback stops after several seconds.
I've observed the same :)
> it is not my target, since gst-vaapi uses TFP and an FBO to support gst_surface_converter_upload();
> that amounts to something like three memory copies.
>
> next, I will work on the following proposal:
> 1. gst-vaapi exports a 'handler' for the video surface
> 2. an EGLImage can be created from this handler by eglCreateImageKHR()
> 3. a texture is created from the above EGLImage by glEGLImageTargetTexture2DOES()
>
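For reference, steps 2 and 3 would boil down to something like the sketch below. This is only an illustration, not runnable code: the EGLImage target and attribute list depend on what kind of handle gst-vaapi actually exports, so EGL_NATIVE_PIXMAP_KHR and the function name import_surface_as_texture are placeholders.

```c
/* Sketch only: assumes gst-vaapi exports some opaque surface handle
 * and the driver supports EGL_KHR_image_base plus OES_EGL_image.
 * EGL_NATIVE_PIXMAP_KHR is a placeholder target; the real target
 * depends on the kind of handle exported. */
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

GLuint import_surface_as_texture(EGLDisplay dpy, EGLContext ctx,
                                 EGLClientBuffer surface_handle)
{
    PFNEGLCREATEIMAGEKHRPROC createImage =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    PFNGLEGLIMAGETARGETTEXTURE2DOESPROC imageTargetTexture =
        (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
            eglGetProcAddress("glEGLImageTargetTexture2DOES");

    /* Step 2: wrap the exported handle in an EGLImage. */
    EGLImageKHR image = createImage(dpy, ctx,
                                    EGL_NATIVE_PIXMAP_KHR, /* placeholder */
                                    surface_handle, NULL);

    /* Step 3: bind the EGLImage to a GLES texture. */
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    imageTargetTexture(GL_TEXTURE_2D, (GLeglImageOES)image);
    return tex;
}
```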
> some things that are not certain:
> a) should the step 2 be done in gst-vaapi?
> pros: EGLImage is generic, webkit will be happy
> cons: an EGL context would have to be shared between webkit and gst-vaapi, which increases complexity
What if WebKit provides the EGL context but gst-vaapi creates its own context that is initialized to _share_
resources with the one provided by WebKit (or the app in general)?
> b) should we export YUV (indicated by the 'handler') from gst-vaapi?
> pros: it is better for performance; the driver needn't convert YUV to RGB before exporting the buffer.
> cons: complexity is introduced on the webkit side: detailed info about the YUV buffer layout, and a corresponding shader to render such a YUV texture.
> anyway, I will try RGB first. my Sandy Bridge can convert 720p video frames to RGB at ~400fps
I wonder if it would be possible to have an API in the gst_surface_converter interface that would allow
the app/WebKit to call into the implementation (gst-vaapi) to create (compile) the fragment shader required for rendering
the texture. Then WebKit could call that function when it needs to.
Another approach would be for gst-vaapi to provide the shader as a string, but I don't think that's as clean.
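Either way, the shader involved is short. A sketch for an NV12 surface split into a luma texture and an interleaved LUMINANCE_ALPHA chroma texture, assuming BT.601 video range; the plane layout and uniform names are assumptions, since the real buffer layout would come from gst-vaapi:

```glsl
precision mediump float;
varying vec2 v_texCoord;
uniform sampler2D u_lumaTex;   /* Y plane */
uniform sampler2D u_chromaTex; /* interleaved CbCr plane (NV12) */

void main()
{
    float y  = texture2D(u_lumaTex,   v_texCoord).r;
    vec2  uv = texture2D(u_chromaTex, v_texCoord).ra; /* Cb, Cr */
    y  = 1.164 * (y - 0.0627);  /* 16/255, expand video range */
    uv = uv - vec2(0.5);
    gl_FragColor = vec4(y + 1.596 * uv.y,
                        y - 0.391 * uv.x - 0.813 * uv.y,
                        y + 2.018 * uv.x,
                        1.0);
}
```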
> I'm not familiar with webkit,
> could someone give clues on adding glEGLImageTargetTexture2DOES?
Would this actually be needed in WebKit if gst-vaapi continues to support a texture as the interface instead of an EGL image?
Then again, I suppose one argument in favour of using EGL images instead of textures is that OpenMAX IL has, IIRC, an extension
that allows binding a video surface to an EGL image. So maybe that would make the gstreamer interface more usable in the long run.
> and how could I make sure EGL/GLES is used instead of GLX/OGL?
If you decide to use the Qt port for testing/development, just make sure Qt was configured with -opengl es2. You might also want to use Wayland as the windowing system, i.e. build the qtwayland module.