[webkit-reviews] review denied: [Bug 206132] [Media in GPU process] Synchronize the properties of video layers in the GPU process with the hosting layer in the web process : [Attachment 390594] Patch

bugzilla-daemon at webkit.org bugzilla-daemon at webkit.org
Wed Feb 12 17:56:20 PST 2020


Simon Fraser (smfr) <simon.fraser at apple.com> has denied review:
Bug 206132: [Media in GPU process] Synchronize the properties of video layers
in the GPU process with the hosting layer in the web process
https://bugs.webkit.org/show_bug.cgi?id=206132

Attachment 390594: Patch

https://bugs.webkit.org/attachment.cgi?id=390594&action=review




--- Comment #4 from Simon Fraser (smfr) <simon.fraser at apple.com> ---
Comment on attachment 390594
  --> https://bugs.webkit.org/attachment.cgi?id=390594
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=390594&action=review

> Source/WebCore/ChangeLog:8
> +    Add an interface to MediaPlayer to set the frame of a video in the inline mode.

"set the frame" is ambiguous when used in the context of video.

> Source/WebCore/platform/graphics/MediaPlayer.h:332
> +    void setVideoInlineFrame(const FloatRect&);

Is this really a frame, or just a size? Does it ever have a non-zero origin?
Maybe call it setSize, which would be much less confusing.

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.cpp:470
> +    // Initially the size of the platformLayer will be 0 because we do not provide mediaPlayerContentBoxRect() in this class.

A size will be 0x0

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.cpp:473
> +    // Since the renderer in the Web process will take care of the video layers configurations,

video layer's configuration

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.cpp:474
> +    // the RemoteMediaPlayerProxy does not need to manage the layers by itself.

This is a rather long-winded comment. Maybe prune it down to the minimum that
will help a future you remember why the code is this way.

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.cpp:515
> +void RemoteMediaPlayerProxy::setVideoInlineFrameFenced(const FloatRect bounds, IPC::Attachment fencePort)

const FloatRect& or FloatRect, but not const FloatRect.
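
For example (a sketch of the same declaration, not the final signature), either of these would do; a top-level const on a by-value parameter only makes the callee's local copy const and buys the caller nothing:

    void setVideoInlineFrameFenced(const FloatRect& bounds, IPC::Attachment fencePort);
    // or
    void setVideoInlineFrameFenced(FloatRect bounds, IPC::Attachment fencePort);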

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.cpp:521
> +    m_player->setVideoInlineFrame(bounds);

"bounds" and "frame" have different meanings in AppKit/UIKit. "frame" is in the
coordinate space of the superview. "bounds" establishes your local coordinate
system Don't mix and match them.
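
To illustrate with a generic CALayer (not code from this patch):

    CALayer *layer = [CALayer layer];
    layer.frame = CGRectMake(10, 20, 100, 50); // frame is expressed in the superlayer's coordinate space
    // layer.bounds is now { {0, 0}, {100, 50} } -- its origin is local, not (10, 20)
    // layer.position is (60, 45), the frame's center in the superlayer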

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h:112
> +    void setVideoInlineFrameFenced(const WebCore::FloatRect bounds, IPC::Attachment);

Please use MachPort, not IPC::Attachment (existing code also needs to be
migrated).

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.messages.in:60
> +    SetVideoInlineFrameFenced(WebCore::FloatRect bounds, IPC::Attachment fencePort)

MachPort

> Source/WebKit/WebKit.xcodeproj/project.pbxproj:2685
> +		1D32F89B23A84BA600B1EA6A /* RemoteMediaResourceProxy.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = RemoteMediaResourceProxy.h; sourceTree = "<group>"; tabWidth = 4; };
> +		1D32F89D23A84C5B00B1EA6A /* RemoteMediaResourceProxy.cpp */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = RemoteMediaResourceProxy.cpp; sourceTree = "<group>"; tabWidth = 4; };

Why tabWidth?

> Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.cpp:1009
> +void MediaPlayerPrivateRemote::setVideoInlineFrameFenced(const FloatRect& rect, mach_port_name_t fencePort)

MachPort

> Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.h:123
> +    void setVideoInlineFrameFenced(const WebCore::FloatRect&, mach_port_name_t);

MachPort

> Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.h:233
> +    void setSize(const WebCore::IntSize&) final { };

No semicolon

> Source/WebKit/WebProcess/GPU/media/VideoLayerRemote.h:38
> +using LayerHostingContextID = uint32_t;

Pull in LayerHostingContext.h?

> Source/WebKit/WebProcess/GPU/media/VideoLayerRemote.h:40
> +RetainPtr<CALayer> createVideoLayerRemote(MediaPlayerPrivateRemote*, LayerHostingContextID);

This is cocoa code inside ENABLE(GPU_PROCESS). I think we should assume that
other platforms will want to build with ENABLE(GPU_PROCESS), so you need #if
PLATFORM(COCOA) as well.
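
Something like the following guard (a sketch, assuming the declaration stays in this header):

    #if ENABLE(GPU_PROCESS)
    #if PLATFORM(COCOA)
    RetainPtr<CALayer> createVideoLayerRemote(MediaPlayerPrivateRemote*, LayerHostingContextID);
    #endif
    #endif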

> Source/WebKit/WebProcess/GPU/media/cocoa/VideoLayerRemoteCocoa.h:34
> +@interface WKVideoLayerRemote : CALayer

PLATFORM(COCOA)

> Source/WebKit/WebProcess/GPU/media/cocoa/VideoLayerRemoteCocoa.h:36
> +@property (nonatomic, retain) CALayer *videoSublayer;

Layers retain their sublayers already. Does this need to be retain?

> Source/WebKit/WebProcess/GPU/media/cocoa/VideoLayerRemoteCocoa.h:37
> +@property CGRect videoLayerFrame;

nonatomic?

> Source/WebKit/WebProcess/GPU/media/cocoa/VideoLayerRemoteCocoa.mm:100
> +    [self performSelector:@selector(resolveBounds) withObject:nil afterDelay:animationDuration + 0.1];

Noooooo. Source of test flakiness for decades to come!

Also please don't use performSelector:withObject:afterDelay:. Use an explicit
timer, or dispatch with a block. Then you don't need all the
cancelPreviousPerformRequestsWithTarget: calls.
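
For instance, a one-shot dispatched block (a rough sketch; it keeps the fixed delay that the point above objects to) needs nothing cancelled:

    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)((animationDuration + 0.1) * NSEC_PER_SEC)),
        dispatch_get_main_queue(), ^{
            [self resolveBounds];
        });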

> Source/WebKit/WebProcess/GPU/media/cocoa/VideoLayerRemoteCocoa.mm:121
> +	       _context = nil;

Why is the context nulled out here?

> Source/WebKit/WebProcess/GPU/media/cocoa/VideoLayerRemoteCocoa.mm:123
> +    mach_port_name_t fencePort = _fenceSendRight.leakSendRight();
> +    mediaPlayerPrivateRemote->setVideoInlineFrameFenced(self.videoLayerFrame, fencePort);

MachPort cleans this up.

