[Webkit-unassigned] [Bug 86410] [texmap][GStreamer] Composited Video support

bugzilla-daemon at webkit.org
Fri Oct 26 05:24:39 PDT 2012


https://bugs.webkit.org/show_bug.cgi?id=86410

--- Comment #52 from Simon Hausmann <hausmann at webkit.org>  2012-10-26 05:25:47 PST ---
(In reply to comment #51)
> > I'm sure it's called for each _screen visible_ frame. Every time you want to re-render the entire WebKit scene onto the screen.
> > 
> > I don't see a way around it, though; it's not specific to video frames, is it? I mean, besides the video, the player's other content needs updating too (although only every second).
> > 
> > Can you elaborate how your ideal rendering pass would look like?
> 
> Eventually, we just need one texture for each GraphicsLayer.

That is what we have. Actually, regular content layers usually have multiple textures because of tiling, but video, for example, has only one texture. That texture is an implementation detail of MediaPlayerPrivateGStreamer, which acts as an implementation of TextureMapperPlatformLayer.
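
For illustration, roughly like the sketch below (method names and signatures are approximations of the TextureMapper API, not copied from the tree): the player owns a single GL texture and hands it to the TextureMapper when the layer is painted.

    // Hypothetical sketch; paintToTextureMapper/drawTexture signatures
    // are illustrative, not the exact ones in the tree.
    class MediaPlayerPrivateGStreamer : public TextureMapperPlatformLayer {
    public:
        // Called during the compositing traversal. The latest video frame
        // lives in m_videoTexture, a GL texture updated by the GStreamer sink.
        virtual void paintToTextureMapper(TextureMapper* textureMapper,
                                          const FloatRect& targetRect,
                                          const TransformationMatrix& matrix,
                                          float opacity)
        {
            // Draw the current frame's texture directly; no copy into an
            // intermediate surface happens here.
            textureMapper->drawTexture(m_videoTexture, targetRect, matrix, opacity);
        }

    private:
        uint32_t m_videoTexture; // single GL texture holding the latest frame
    };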

> So the 3D pipeline (glClear/drawTexture/swapBuffer) isn't necessary when we already have such a texture. For example, we could use bindSurface only with the video texture as input. Is there any gap in doing that?
> 
> If we reach the above solution, it means TextureMapper is too heavy for us, and we can create a simpler class for it.
> 
> Compared to the controls, the video layer is larger and updates more frequently, so it is really expensive to do glClear and copy the texture.

But we don't copy the texture; we only draw it once, onto the screen (into the back-buffer).

Can you elaborate on what you mean by bindSurface()? I think I must be misunderstanding something here.

Note that the video texture is _not_ copied into an intermediate surface; it is drawn straight onto the screen.
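
To make that concrete, one compositing pass looks roughly like the sketch below (illustrative GL-level pseudocode, not the actual TextureMapper code path; drawLayerTextures and swapBuffers are placeholders). The video texture is sampled exactly once, directly into the default framebuffer.

    // Illustrative sketch of one compositing pass.
    void compositeFrame()
    {
        // Render target is the window's back-buffer (default framebuffer),
        // not an offscreen surface.
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glClear(GL_COLOR_BUFFER_BIT);

        // Each TextureMapperPlatformLayer draws its texture(s) as a textured
        // quad; the video layer samples its single texture here.
        drawLayerTextures();   // placeholder for the scene traversal

        // Present the back-buffer; no extra copy of the video texture
        // is made anywhere in this pass.
        swapBuffers();         // placeholder for the platform swap call
    }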
