[webkit-reviews] review denied: [Bug 59064] [GTK] Support EGL backend for WebGL : [Attachment 102630] updated patch

bugzilla-daemon at webkit.org bugzilla-daemon at webkit.org
Wed Aug 3 23:20:45 PDT 2011


Martin Robinson <mrobinson at webkit.org> has denied ChangSeok Oh
<kevin.cs.oh at gmail.com>'s request for review:
Bug 59064: [GTK] Support EGL backend for WebGL
https://bugs.webkit.org/show_bug.cgi?id=59064

Attachment 102630: updated patch
https://bugs.webkit.org/attachment.cgi?id=102630&action=review

------- Additional Comments from Martin Robinson <mrobinson at webkit.org>
View in context: https://bugs.webkit.org/attachment.cgi?id=102630&action=review


I think some of these #ifdefs can be removed entirely. For instance, in
GraphicsContext3DGtk.cpp you can eliminate almost all of them. It probably
makes sense to split GraphicsContext3DInternal, though.

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DGtk.cpp:99
> +#if USE(EGL)
> +    ::glGenFramebuffers(1, &m_fbo);
> +    ::glBindFramebuffer(GL_FRAMEBUFFER, m_fbo);
> +#else
>      ::glGenFramebuffersEXT(1, &m_fbo);
>      ::glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, m_fbo);
> +#endif

Adding an alias in OpenGLShims can eliminate this #ifdef.
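
For example, a hypothetical OpenGLShims.h fragment along these lines (just a
sketch; the exact mechanism depends on how the shims resolve entry points)
would let the call sites stay unconditional:

    // Hypothetical aliasing sketch: under EGL/GLES2 the core framebuffer
    // entry points exist directly, so the EXT names and tokens can map to
    // them and GraphicsContext3DGtk.cpp keeps a single code path.
    #if USE(EGL)
    #define glGenFramebuffersEXT glGenFramebuffers
    #define glBindFramebufferEXT glBindFramebuffer
    #define GL_FRAMEBUFFER_EXT GL_FRAMEBUFFER
    #endif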

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DGtk.cpp:171
> +	   if (m_attrs.stencil || m_attrs.depth) {
> +	       ::glDeleteRenderbuffers(1, &m_depthBuffer);
> +	       ::glDeleteRenderbuffers(1, &m_stencilBuffer);
> +	   }
> +#else
>	   if (m_attrs.stencil || m_attrs.depth)

Does the EGL backend use OpenGLShims.h? If so, you should be able to alias
glDeleteRenderbuffersEXT to glDeleteRenderbuffers and remove this #ifdef
entirely.

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DGtk.cpp:176
> +    ::glDeleteFramebuffers(1, &m_fbo);

Ditto for glDeleteFramebuffers and glDeleteFramebuffersEXT.

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:113
> +#if USE(EGL)
> +    EGLDisplay eglDisplay = eglGetDisplay(sharedDisplay());
> +    if (eglDisplay == EGL_NO_DISPLAY)
> +	   return nullptr;
> +    if (!initialized) {
> +	   if (!eglInitialize(eglDisplay, 0, 0))
> +	       return nullptr;
> +	   initialized = true;
> +    }
> +#else

Instead of doing this, why not have sharedDisplay return an EGLDisplay?
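
Something like this (completely untested sketch; sharedEGLDisplay is a
hypothetical name, and it could just as well replace sharedDisplay() itself
in the EGL build) would keep the one-time eglInitialize() out of the call
sites:

    #include <EGL/egl.h>

    // Hypothetical helper: turns the shared native display into an
    // EGLDisplay and runs eglInitialize() exactly once, so callers drop
    // the #if USE(EGL) boilerplate.
    static EGLDisplay sharedEGLDisplay()
    {
        static EGLDisplay display = EGL_NO_DISPLAY;
        static bool triedInitialization = false;
        if (!triedInitialization) {
            triedInitialization = true;
            display = eglGetDisplay(sharedDisplay());
            if (display == EGL_NO_DISPLAY || !eglInitialize(display, 0, 0))
                display = EGL_NO_DISPLAY;
        }
        return display;
    }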

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:116
>	   success = initializeOpenGLShims();

Why shouldn't EGL use the OpenGL shims?

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:134
> +#if USE(EGL)
> +    GraphicsContext3DInternal* internal = createPbufferContext(eglDisplay);
> +#else
>      GraphicsContext3DInternal* internal = createPbufferContext();
> +#endif
>      if (!internal)
> +#if USE(EGL)
> +	   internal = createPixmapContext(eglDisplay);
> +#else
>	   internal = createPixmapContext();
> +#endif
>      if (!internal)

If you take my advice above, you can remove these #ifdefs and just call
sharedDisplay() inside these methods.
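
i.e. (sketch only, assuming the display lookup moves inside the create*
helpers) the call site collapses to:

    // No backend-specific branches remain here once the create* helpers
    // fetch the display themselves.
    GraphicsContext3DInternal* internal = createPbufferContext();
    if (!internal)
        internal = createPixmapContext();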

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.cpp:413
>  } // namespace WebCore

I don't think there is any shared code in this file. Would it make sense to
split it out into GraphicsContext3DInternalEGL.cpp?

> Source/WebCore/platform/graphics/gtk/GraphicsContext3DInternal.h:79
>  }

Almost nothing is shared here either.

