[Webkit-unassigned] [Bug 59064] [GTK] Support EGL backend for WebGL

bugzilla-daemon at webkit.org
Wed Aug 3 23:20:47 PDT 2011


https://bugs.webkit.org/show_bug.cgi?id=59064


Martin Robinson <mrobinson at webkit.org> changed:

           What              |Removed                     |Added
----------------------------------------------------------------------------
 Attachment #102630 Flag     |review?                     |review-




--- Comment #18 from Martin Robinson <mrobinson at webkit.org>  2011-08-03 23:20:46 PST ---
(From update of attachment 102630)
View in context: https://bugs.webkit.org/attachment.cgi?id=102630&action=review

I think some of these #ifdefs can be removed entirely. For instance, in GraphicsContext3DGtk.cpp you can eliminate almost all of them. It probably makes sense to split GraphicsContext3DInternal, though.

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DGtk.cpp:99
> +#if USE(EGL)
> +    ::glGenFramebuffers(1, &m_fbo);
> +    ::glBindFramebuffer(GL_FRAMEBUFFER, m_fbo);
> +#else
>      ::glGenFramebuffersEXT(1, &m_fbo);
>      ::glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, m_fbo);
> +#endif

Adding an alias in OpenGLShims can eliminate this #ifdef.
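To illustrate the kind of alias I mean (just a sketch; the real OpenGLShims machinery differs, and lookupGLProcAddress here is a hypothetical stand-in for however the shims resolve symbols):

    #include <GL/gl.h>

    // Sketch: prefer the core symbol, fall back to the EXT_framebuffer_object
    // variant, and expose a single unsuffixed entry point to callers.
    typedef void (*PFNGenFramebuffers)(GLsizei n, GLuint* framebuffers);

    extern void* lookupGLProcAddress(const char* name); // hypothetical resolver

    static PFNGenFramebuffers resolveGenFramebuffers()
    {
        if (void* core = lookupGLProcAddress("glGenFramebuffers"))
            return reinterpret_cast<PFNGenFramebuffers>(core);
        // Older desktop GL only ships the EXT variant.
        return reinterpret_cast<PFNGenFramebuffers>(lookupGLProcAddress("glGenFramebuffersEXT"));
    }

With that in place, the call sites above can use ::glGenFramebuffers and ::glBindFramebuffer unconditionally under both backends.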

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DGtk.cpp:171
> +        if (m_attrs.stencil || m_attrs.depth) {
> +            ::glDeleteRenderbuffers(1, &m_depthBuffer);
> +            ::glDeleteRenderbuffers(1, &m_stencilBuffer);
> +        }
> +#else
>          if (m_attrs.stencil || m_attrs.depth)

Does the EGL build use OpenGLShims.h? If so, you should be able to alias glDeleteRenderbuffersEXT to glDeleteRenderbuffers and remove this #ifdef entirely.
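For example, the teardown in the hunk above could then read the same under both backends (sketch, assuming the alias exists):

    // No #if USE(EGL) guard needed once the unsuffixed names resolve everywhere.
    if (m_attrs.stencil || m_attrs.depth) {
        ::glDeleteRenderbuffers(1, &m_depthBuffer);
        ::glDeleteRenderbuffers(1, &m_stencilBuffer);
    }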

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DGtk.cpp:176
> +    ::glDeleteFramebuffers(1, &m_fbo);

Ditto for glDeleteFramebuffers and glDeleteFramebuffersEXT.

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:113
> +#if USE(EGL)
> +    EGLDisplay eglDisplay = eglGetDisplay(sharedDisplay());
> +    if (eglDisplay == EGL_NO_DISPLAY)
> +        return nullptr;
> +    if (!initialized) {
> +        if (!eglInitialize(eglDisplay, 0, 0))
> +            return nullptr;
> +        initialized = true;
> +    }
> +#else

Instead of doing this, why not have sharedDisplay return an EGLDisplay?
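Something along these lines (rough sketch; nativeSharedDisplay() is a hypothetical stand-in for whatever currently backs sharedDisplay):

    #include <EGL/egl.h>

    extern EGLNativeDisplayType nativeSharedDisplay(); // hypothetical native handle

    // Sketch: return an already-initialized EGLDisplay so every caller gets
    // the eglGetDisplay/eglInitialize handling for free.
    static EGLDisplay sharedDisplay()
    {
        static EGLDisplay display = EGL_NO_DISPLAY;
        static bool triedInitialization = false;
        if (!triedInitialization) {
            triedInitialization = true;
            display = eglGetDisplay(nativeSharedDisplay());
            if (display != EGL_NO_DISPLAY && !eglInitialize(display, 0, 0))
                display = EGL_NO_DISPLAY; // initialization failed
        }
        return display;
    }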

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:116
>          success = initializeOpenGLShims();

Why shouldn't EGL use the OpenGL shims?

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:134
> +#if USE(EGL)
> +    GraphicsContext3DInternal* internal = createPbufferContext(eglDisplay);
> +#else
>      GraphicsContext3DInternal* internal = createPbufferContext();
> +#endif
>      if (!internal)
> +#if USE(EGL)
> +        internal = createPixmapContext(eglDisplay);
> +#else
>          internal = createPixmapContext();
> +#endif
>      if (!internal)

If you take my advice above, you can remove these #ifdefs and just call sharedDisplay in these methods.
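Concretely, the factory would then collapse to something like this (sketch; the helpers fetch the display themselves via sharedDisplay):

    // No EGLDisplay parameter to thread through and no #if USE(EGL) guards.
    GraphicsContext3DInternal* internal = createPbufferContext();
    if (!internal)
        internal = createPixmapContext();
    if (!internal)
        return 0;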

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:413
>  } // namespace WebCore

I don't think there is any shared code in this file. Would it make sense to split it out into GraphicsContext3DInternalEGL.cpp?

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.h:79
>  }

Almost nothing is shared here either.
