[Webkit-unassigned] [Bug 102994] [META][Qt] Implement Threaded Coordinated Graphics in WebKit1

bugzilla-daemon at webkit.org
Tue Dec 18 04:14:23 PST 2012


https://bugs.webkit.org/show_bug.cgi?id=102994


Simon Hausmann <hausmann at webkit.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |abecsi at webkit.org,
                   |                            |jocelyn.turcotte at digia.com,
                   |                            |zeno at webkit.org




--- Comment #6 from Simon Hausmann <hausmann at webkit.org>  2012-12-18 04:16:39 PST ---
(In reply to comment #5)
[...]
> > What exactly is it that the compositor thread does in terms of graphics? I understand that you want to spin timers there to update animation states. But what does it do in terms of rendering?
> > 
> > The actual layers themselves cannot be rendered from within the compositor thread without blocking the main thread. Or do you populate the layers from within the main thread via the ShareableSurface? 
> > 
> > Assuming you do that - populate the layers in the main thread, maintain the layer composition in the compositor thread - then how do you integrate that with the rendering in Qt? In WK1 the rendering happens in the main thread, when ::paintEvent() is called. In your scenario, what would the ::paintEvent() (or QGraphicsWidget::paint) implementation conceptually look like?
> 
> Qt WK2 coordinated graphics already draws contents using TextureMapper off the main thread in the UI process.
> You can see Vyacheslav's explanation: http://lists.webkit.org/pipermail/webkit-dev/2012-November/022924.html
> The QtQuick 5.0 URL has changed: http://qt-project.org/doc/qt-5.0/qtquick/qquickitem.html#updatePaintNode
> 
> Coordinated Graphics maintains a duplicated layer tree in the rendering thread, and the rendering thread paints contents at 60 fps when an animation is running.
> 
> Our approach to separating the threads would be similar to the current Coordinated Graphics approach. If the web view is backed by a QQuickItem, the rendering thread draws the UI widgets as well as the web contents, and synchronizes with the main thread only when updating state.

"_If_ webview is backed by qquickitem" - That is exactly the point. With WebKit1 we do not have that situation, there is currently no support for using WebKit1 with QQuickItem (QtQuick2).

> I haven't decided yet whether to use a QQuickItem, like WK2 does, to get a rendering thread, or to use something like WK2's RunLoop. We cannot just use a plain thread, because we need to run a timer in the separate thread. If we use RunLoop, we draw contents to a texture (FBO) off the main thread and blit the texture to the web view's surface on the main thread. The latter way is harder to implement but is a platform-independent approach. We need to discuss which approach is appropriate.

Yeah, I think we do need to discuss this. In my opinion it boils down to the question of where we want to go with WebKit1 in the Qt port. Right now WebKit1 is used to integrate with QWidget and QGraphicsView, and WebKit2 is used for the QtQuick2 integration.
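
To make the threading part of the quoted proposal concrete: in Qt terms "a thread that can run a timer" means a thread with its own event loop. A minimal, self-contained sketch of that structure could look like the following (assuming Qt 5; compositorThread/frameTimer are made-up names, and the actual FBO painting and the blit back to the view are only placeholder comments, this is not WebKit code):

    #include <QCoreApplication>
    #include <QDebug>
    #include <QObject>
    #include <QThread>
    #include <QTimer>

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);

        // Secondary thread with its own event loop (QThread::run() calls
        // exec() by default); this plays the role of the proposed
        // compositor/RunLoop thread.
        QThread compositorThread;
        compositorThread.start();

        // The timer has to live in the compositor thread so that its
        // timeout events are dispatched by that thread's event loop;
        // this is why a bare thread without an event loop is not enough.
        QTimer frameTimer;
        frameTimer.setInterval(16); // roughly 60 fps
        frameTimer.moveToThread(&compositorThread);

        // A functor connection without a receiver runs in the emitting
        // thread, i.e. in the compositor thread.
        QObject::connect(&frameTimer, &QTimer::timeout, []() {
            // Placeholder: update animation state and paint layers into
            // an FBO here, then hand the resulting texture to the main
            // thread to be blitted to the web view's surface.
            qDebug() << "frame tick on" << QThread::currentThread();
        });

        // Start the timer from its own thread via a queued invocation.
        QMetaObject::invokeMethod(&frameTimer, "start", Qt::QueuedConnection);

        // Let the demo run for a second, then shut down cleanly.
        QTimer::singleShot(1000, &app, SLOT(quit()));
        int ret = app.exec();

        QMetaObject::invokeMethod(&frameTimer, "stop", Qt::BlockingQueuedConnection);
        compositorThread.quit();
        compositorThread.wait();
        return ret;
    }

Whether that event loop is driven by a QThread or by WebCore's RunLoop is plumbing; the interesting questions are who owns the GL context and where the final blit to the view's surface happens.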

Technically we have three rendering code paths to maintain right now:

    (1) The coordinated graphics way with WebKit2 and a tiny bit of private Qt API that allows us to integrate with the QtQuick2 scene graph. This implies the availability of OpenGL, and rendering happens in a secondary thread (Qt's render thread, which finishes off the frame with a swapbuffers call).

    (2) The QWidget (and regular QGraphicsView) way, where we know we're rendering in software into a given QPainter. On the WebKit side that's done using the software texture mapper and WebKit1. Rendering is required to happen in the main thread and ends up in the (image) backing store of the QWindow/QWidget.

    (3) The QGraphicsView-with-a-QGLWidget-viewport way, where we're again required to render in the main thread but can use OpenGL and therefore the GL texture mapper - with WebKit1 (a minimal setup sketch follows below).
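
For reference, option (3) corresponds to application-side code along these lines (a rough sketch against the public QtWebKit1 API; the URL and sizes are arbitrary, and accelerated compositing is switched on explicitly just for illustration):

    #include <QApplication>
    #include <QGLWidget>
    #include <QGraphicsScene>
    #include <QGraphicsView>
    #include <QGraphicsWebView>
    #include <QUrl>
    #include <QWebSettings>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);

        // Enable accelerated compositing so the texture mapper is used
        // for composited layers.
        QWebSettings::globalSettings()->setAttribute(
            QWebSettings::AcceleratedCompositingEnabled, true);

        QGraphicsScene scene;
        QGraphicsWebView *webView = new QGraphicsWebView;
        webView->resize(800, 600);
        webView->load(QUrl("http://www.webkit.org/"));
        scene.addItem(webView);

        QGraphicsView view(&scene);
        // A QGLWidget viewport gives us an OpenGL surface, but painting
        // is still driven from the main thread via QGraphicsView.
        view.setViewport(new QGLWidget);
        view.setViewportUpdateMode(QGraphicsView::FullViewportUpdate);
        view.resize(800, 600);
        view.show();

        return app.exec();
    }

The GL texture mapper can be used for compositing in this setup, but everything is still driven by the main thread's paint events.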


We consciously promote option (1) over the others and would like to keep the effort required to maintain (2) and (3) at a minimum. The way (2) and (3) are tied to the main thread is indeed one of the main reasons for favoring (1).

This bug is effectively suggesting that we add a fourth way of rendering that is similar to option (1) but is based on WebKit1 and would potentially be based on a QtQuick2 integration. I would like to see some compelling arguments for why we should add this fourth way to the list of supported and maintained rendering code paths for the Qt port.

In other words: why would somebody choose a potential WebKit1-based QtQuick2 integration over the existing WebKit2-based one?

-- 
Configure bugmail: https://bugs.webkit.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.

