<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[214120] trunk/Source/WebCore</title>
</head>
<body>

<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/214120">214120</a></dd>
<dt>Author</dt> <dd>eric.carlson@apple.com</dd>
<dt>Date</dt> <dd>2017-03-17 14:14:28 -0700 (Fri, 17 Mar 2017)</dd>
</dl>

<h3>Log Message</h3>
<pre>[MediaStream] Compensate for video capture orientation
https://bugs.webkit.org/show_bug.cgi?id=169313
&lt;rdar://problem/30994785&gt;

Reviewed by Jer Noble.

No new tests; the mock video source doesn't support rotation. A test will be added once this
is fixed in https://bugs.webkit.org/show_bug.cgi?id=169822.

Add 'orientation' and 'mirrored' attributes to MediaSample.
* platform/MediaSample.h:
(WebCore::MediaSample::videoOrientation):
(WebCore::MediaSample::videoMirrored):
* platform/graphics/avfoundation/MediaSampleAVFObjC.h:

A video sample can be rotated and/or mirrored, so the video layer may need to be rotated
and resized for display. We don't want to expose this information to the renderer, so
allocate and return a generic CALayer as the player's platformLayer, and add the video layer
as a sublayer so we can adjust it to display correctly. Also add an enum for playback state,
as well as display mode, so we correctly display a black frame when video frames are
available but playback has not yet started.

* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
(-[WebAVSampleBufferStatusChangeListener initWithParent:]):
(-[WebAVSampleBufferStatusChangeListener invalidate]):
(-[WebAVSampleBufferStatusChangeListener beginObservingLayers]):
(-[WebAVSampleBufferStatusChangeListener stopObservingLayers]): Ditto.
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::videoTransformationMatrix):
(WebCore::runWithoutAnimations):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::displayLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::activeStatusChanged):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setShouldBufferData):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayerBoundsChanged):
(-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]): Deleted.
(-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paused): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldBePlaying): Deleted.

* platform/mediastream/mac/AVVideoCaptureSource.h:
* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::processNewFrame): Add a connection parameter so we can get
the video orientation.
(WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection):

Pass sample orientation to libwebrtc.
* platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
(WebCore::RealtimeOutgoingVideoSource::sendFrame):
(WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable):
* platform/mediastream/mac/RealtimeOutgoingVideoSource.h:</pre>
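The orientation compensation described above can be sketched platform-neutrally. In this hedged sketch, `Transform2D` and `makeVideoTransform` are illustrative stand-ins for `CGAffineTransform` and the new `videoTransformationMatrix()` member function; the per-platform sensor-angle tables match the ones in the patch, and the mirroring step mimics `CGAffineTransformScale(t, -1, 1)`:

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Mirror of MediaSample::VideoOrientation from MediaSample.h.
enum class VideoOrientation { Unknown, Portrait, PortraitUpsideDown, LandscapeRight, LandscapeLeft };

// Rotation (degrees) to compensate for the capture sensor, indexed by
// VideoOrientation. The Mac table differs because its camera sensor is
// mounted differently than on iOS devices.
constexpr std::array<float, 5> macSensorAngle = { 0, 0, 180, 90, 270 };
constexpr std::array<float, 5> iosSensorAngle = { 180, 180, 0, 90, 270 };

// 2x2 linear part of an affine transform (translation omitted for brevity).
struct Transform2D { float a, b, c, d; };

Transform2D makeVideoTransform(VideoOrientation orientation, bool mirrored, bool isMac)
{
    const auto& table = isMac ? macSensorAngle : iosSensorAngle;
    float radians = table[static_cast<size_t>(orientation)] * static_cast<float>(M_PI) / 180;

    // Rotation matrix, as CGAffineTransformMakeRotation would build it.
    Transform2D t { std::cos(radians), std::sin(radians), -std::sin(radians), std::cos(radians) };

    // Mirrored capture: flip horizontally, like CGAffineTransformScale(t, -1, 1).
    if (mirrored) {
        t.a = -t.a;
        t.b = -t.b;
    }
    return t;
}
```

For example, on Mac a `Portrait` sample needs no rotation (identity transform), while `LandscapeRight` yields a 90-degree rotation; a mirrored sample additionally negates the x column.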

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreplatformMediaSampleh">trunk/Source/WebCore/platform/MediaSample.h</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsavfoundationMediaSampleAVFObjCh">trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCh">trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCmm">trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourceh">trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourcemm">trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourcecpp">trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourceh">trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.h</a></li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/ChangeLog        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -1,3 +1,75 @@
</span><ins>+2017-03-17  Eric Carlson  &lt;eric.carlson@apple.com&gt;
+
+        [MediaStream] Compensate for video capture orientation
+        https://bugs.webkit.org/show_bug.cgi?id=169313
+        &lt;rdar://problem/30994785&gt;
+
+        Reviewed by Jer Noble.
+
+        No new tests; the mock video source doesn't support rotation. A test will be added once this
+        is fixed in https://bugs.webkit.org/show_bug.cgi?id=169822.
+
+        Add 'orientation' and 'mirrored' attributes to MediaSample
+        * platform/MediaSample.h:
+        (WebCore::MediaSample::videoOrientation):
+        (WebCore::MediaSample::videoMirrored):
+        * platform/graphics/avfoundation/MediaSampleAVFObjC.h:
+
+        A video sample can be rotated and/or mirrored, so the video layer may need to be rotated
+        and resized for display. We don't want to expose this information to the renderer, so
+        allocate and return a generic CALayer as the player's platformLayer, and add the video layer
+        as a sublayer so we can adjust it to display correctly. Also add an enum for playback state,
+        as well as display mode, so we correctly display a black frame when video frames are
+        available but playback has not yet started.
+
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+        (-[WebAVSampleBufferStatusChangeListener initWithParent:]):
+        (-[WebAVSampleBufferStatusChangeListener invalidate]):
+        (-[WebAVSampleBufferStatusChangeListener beginObservingLayers]):
+        (-[WebAVSampleBufferStatusChangeListener stopObservingLayers]): Ditto.
+        (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::videoTransformationMatrix):
+        (WebCore::runWithoutAnimations):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::displayLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::activeStatusChanged):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setShouldBufferData):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayerBoundsChanged):
+        (-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]): Deleted.
+        (-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paused): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldBePlaying): Deleted.
+
+        * platform/mediastream/mac/AVVideoCaptureSource.h:
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::processNewFrame): Add a connection parameter so we can get
+        the video orientation.
+        (WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection):
+
+        Pass sample orientation to libwebrtc.
+        * platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
+        (WebCore::RealtimeOutgoingVideoSource::sendFrame):
+        (WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable):
+        * platform/mediastream/mac/RealtimeOutgoingVideoSource.h:
+
</ins><span class="cx"> 2017-03-17  Zalan Bujtas  &lt;zalan@apple.com&gt;
</span><span class="cx"> 
</span><span class="cx">         Fix the flow thread state on the descendants of out of flow positioned replaced elements.
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformMediaSampleh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/MediaSample.h (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/MediaSample.h        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/MediaSample.h        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -77,6 +77,16 @@
</span><span class="cx">     virtual SampleFlags flags() const = 0;
</span><span class="cx">     virtual PlatformSample platformSample() = 0;
</span><span class="cx"> 
</span><ins>+    enum class VideoOrientation {
+        Unknown,
+        Portrait,
+        PortraitUpsideDown,
+        LandscapeRight,
+        LandscapeLeft,
+    };
+    virtual VideoOrientation videoOrientation() const { return VideoOrientation::Unknown; }
+    virtual bool videoMirrored() const { return false; }
+
</ins><span class="cx">     bool isSync() const { return flags() &amp; IsSync; }
</span><span class="cx">     bool isNonDisplaying() const { return flags() &amp; IsNonDisplaying; }
</span><span class="cx"> 
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformgraphicsavfoundationMediaSampleAVFObjCh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -37,7 +37,7 @@
</span><span class="cx"> public:
</span><span class="cx">     static Ref&lt;MediaSampleAVFObjC&gt; create(CMSampleBufferRef sample, int trackID) { return adoptRef(*new MediaSampleAVFObjC(sample, trackID)); }
</span><span class="cx">     static Ref&lt;MediaSampleAVFObjC&gt; create(CMSampleBufferRef sample, AtomicString trackID) { return adoptRef(*new MediaSampleAVFObjC(sample, trackID)); }
</span><del>-    static Ref&lt;MediaSampleAVFObjC&gt; create(CMSampleBufferRef sample) { return adoptRef(*new MediaSampleAVFObjC(sample)); }
</del><ins>+    static Ref&lt;MediaSampleAVFObjC&gt; create(CMSampleBufferRef sample, VideoOrientation orientation = VideoOrientation::Unknown, bool mirrored = false) { return adoptRef(*new MediaSampleAVFObjC(sample, orientation, mirrored)); }
</ins><span class="cx">     static RefPtr&lt;MediaSampleAVFObjC&gt; createImageSample(Ref&lt;JSC::Uint8ClampedArray&gt;&amp;&amp;, unsigned long width, unsigned long height);
</span><span class="cx">     static RefPtr&lt;MediaSampleAVFObjC&gt; createImageSample(Vector&lt;uint8_t&gt;&amp;&amp;, unsigned long width, unsigned long height);
</span><span class="cx"> 
</span><span class="lines">@@ -56,6 +56,13 @@
</span><span class="cx">         , m_id(String::format(&quot;%d&quot;, trackID))
</span><span class="cx">     {
</span><span class="cx">     }
</span><ins>+    MediaSampleAVFObjC(CMSampleBufferRef sample, VideoOrientation orientation, bool mirrored)
+        : m_sample(sample)
+        , m_orientation(orientation)
+        , m_mirrored(mirrored)
+    {
+    }
+
</ins><span class="cx">     virtual ~MediaSampleAVFObjC() { }
</span><span class="cx"> 
</span><span class="cx">     MediaTime presentationTime() const override;
</span><span class="lines">@@ -79,8 +86,13 @@
</span><span class="cx">     std::pair&lt;RefPtr&lt;MediaSample&gt;, RefPtr&lt;MediaSample&gt;&gt; divide(const MediaTime&amp; presentationTime) override;
</span><span class="cx">     Ref&lt;MediaSample&gt; createNonDisplayingCopy() const override;
</span><span class="cx"> 
</span><ins>+    VideoOrientation videoOrientation() const final { return m_orientation; }
+    bool videoMirrored() const final { return m_mirrored; }
+
</ins><span class="cx">     RetainPtr&lt;CMSampleBufferRef&gt; m_sample;
</span><span class="cx">     AtomicString m_id;
</span><ins>+    VideoOrientation m_orientation { VideoOrientation::Unknown };
+    bool m_mirrored { false };
</ins><span class="cx"> };
</span><span class="cx"> 
</span><span class="cx"> }
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -31,6 +31,7 @@
</span><span class="cx"> #include &quot;MediaPlayerPrivate.h&quot;
</span><span class="cx"> #include &quot;MediaSample.h&quot;
</span><span class="cx"> #include &quot;MediaStreamPrivate.h&quot;
</span><ins>+#include &lt;CoreGraphics/CGAffineTransform.h&gt;
</ins><span class="cx"> #include &lt;wtf/Function.h&gt;
</span><span class="cx"> #include &lt;wtf/MediaTime.h&gt;
</span><span class="cx"> #include &lt;wtf/WeakPtr.h&gt;
</span><span class="lines">@@ -75,11 +76,16 @@
</span><span class="cx"> 
</span><span class="cx">     WeakPtr&lt;MediaPlayerPrivateMediaStreamAVFObjC&gt; createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }
</span><span class="cx"> 
</span><del>-    void ensureLayer();
-    void destroyLayer();
</del><ins>+    void ensureLayers();
+    void destroyLayers();
</ins><span class="cx"> 
</span><span class="cx">     void layerStatusDidChange(AVSampleBufferDisplayLayer*);
</span><ins>+    void layerErrorDidChange(AVSampleBufferDisplayLayer*);
+    void backgroundLayerBoundsChanged();
</ins><span class="cx"> 
</span><ins>+    PlatformLayer* displayLayer();
+    PlatformLayer* backgroundLayer();
+
</ins><span class="cx"> private:
</span><span class="cx">     // MediaPlayerPrivateInterface
</span><span class="cx"> 
</span><span class="lines">@@ -99,7 +105,7 @@
</span><span class="cx"> 
</span><span class="cx">     void play() override;
</span><span class="cx">     void pause() override;
</span><del>-    bool paused() const override;
</del><ins>+    bool paused() const override { return !playing(); }
</ins><span class="cx"> 
</span><span class="cx">     void setVolume(float) override;
</span><span class="cx">     void setMuted(bool) override;
</span><span class="lines">@@ -156,7 +162,7 @@
</span><span class="cx"> 
</span><span class="cx">     bool ended() const override { return m_ended; }
</span><span class="cx"> 
</span><del>-    bool shouldBePlaying() const;
</del><ins>+    void setShouldBufferData(bool) override;
</ins><span class="cx"> 
</span><span class="cx">     MediaPlayer::ReadyState currentReadyState();
</span><span class="cx">     void updateReadyState();
</span><span class="lines">@@ -177,6 +183,13 @@
</span><span class="cx">     bool updateDisplayMode();
</span><span class="cx">     void updateCurrentFrameImage();
</span><span class="cx"> 
</span><ins>+    enum class PlaybackState {
+        None,
+        Playing,
+        Paused,
+    };
+    bool playing() const { return m_playbackState == PlaybackState::Playing; }
+
</ins><span class="cx">     // MediaStreamPrivate::Observer
</span><span class="cx">     void activeStatusChanged() override;
</span><span class="cx">     void characteristicsChanged() override;
</span><span class="lines">@@ -200,6 +213,8 @@
</span><span class="cx"> 
</span><span class="cx">     AudioSourceProvider* audioSourceProvider() final;
</span><span class="cx"> 
</span><ins>+    CGAffineTransform videoTransformationMatrix(MediaSample&amp;);
+
</ins><span class="cx">     MediaPlayer* m_player { nullptr };
</span><span class="cx">     WeakPtrFactory&lt;MediaPlayerPrivateMediaStreamAVFObjC&gt; m_weakPtrFactory;
</span><span class="cx">     RefPtr&lt;MediaStreamPrivate&gt; m_mediaStreamPrivate;
</span><span class="lines">@@ -208,6 +223,7 @@
</span><span class="cx"> 
</span><span class="cx">     RetainPtr&lt;WebAVSampleBufferStatusChangeListener&gt; m_statusChangeListener;
</span><span class="cx">     RetainPtr&lt;AVSampleBufferDisplayLayer&gt; m_sampleBufferDisplayLayer;
</span><ins>+    RetainPtr&lt;PlatformLayer&gt; m_backgroundLayer;
</ins><span class="cx">     std::unique_ptr&lt;Clock&gt; m_clock;
</span><span class="cx"> 
</span><span class="cx">     MediaTime m_pausedTime;
</span><span class="lines">@@ -232,6 +248,10 @@
</span><span class="cx">     FloatSize m_intrinsicSize;
</span><span class="cx">     float m_volume { 1 };
</span><span class="cx">     DisplayMode m_displayMode { None };
</span><ins>+    PlaybackState m_playbackState { PlaybackState::None };
+    MediaSample::VideoOrientation m_videoOrientation { MediaSample::VideoOrientation::Unknown };
+    CGAffineTransform m_videoTransform;
+    bool m_videoMirrored { false };
</ins><span class="cx">     bool m_playing { false };
</span><span class="cx">     bool m_muted { false };
</span><span class="cx">     bool m_ended { false };
</span><span class="lines">@@ -238,6 +258,8 @@
</span><span class="cx">     bool m_hasEverEnqueuedVideoFrame { false };
</span><span class="cx">     bool m_pendingSelectedTrackCheck { false };
</span><span class="cx">     bool m_shouldDisplayFirstVideoFrame { false };
</span><ins>+    bool m_transformIsValid { false };
+    bool m_videoSizeChanged;
</ins><span class="cx"> 
</span><span class="cx"> #if PLATFORM(MAC) &amp;&amp; ENABLE(VIDEO_PRESENTATION_MODE)
</span><span class="cx">     std::unique_ptr&lt;VideoFullscreenLayerManager&gt; m_videoFullscreenLayerManager;
</span></span></pre></div>
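The header diff above replaces the `m_playing` flag style with a `PlaybackState` enum so the player can distinguish "never started" from "paused", letting `currentDisplayMode()` show a black frame when frames are available but playback hasn't begun. The sketch below illustrates that decision; the `DisplayMode` values and the exact branch order are assumptions, since `currentDisplayMode()`'s body is not shown in this excerpt:

```cpp
#include <cassert>

// Mirrors the PlaybackState enum added in MediaPlayerPrivateMediaStreamAVFObjC.h.
enum class PlaybackState { None, Playing, Paused };

// Hypothetical subset of the player's DisplayMode values.
enum class DisplayMode { None, PaintItBlack, LivePreview };

DisplayMode currentDisplayMode(bool hasEverEnqueuedVideoFrame, PlaybackState state)
{
    // No video frame has ever arrived: nothing to display yet.
    if (!hasEverEnqueuedVideoFrame)
        return DisplayMode::None;

    // Frames are available but playback has not started (or is paused):
    // display a black frame instead of a stale preview.
    if (state != PlaybackState::Playing)
        return DisplayMode::PaintItBlack;

    return DisplayMode::LivePreview;
}
```

With only a boolean `m_playing`, the "frames enqueued but `play()` never called" state was indistinguishable from "paused"; the three-valued enum makes that distinction explicit.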
<a id="trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -57,20 +57,24 @@
</span><span class="cx"> 
</span><span class="cx"> SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
</span><span class="cx"> 
</span><del>-#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral()
-#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed()
</del><ins>+SOFT_LINK_POINTER(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *)
+SOFT_LINK_POINTER(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *)
+SOFT_LINK_POINTER(AVFoundation, AVLayerVideoGravityResize, NSString *)
</ins><span class="cx"> 
</span><ins>+#define AVLayerVideoGravityResizeAspect getAVLayerVideoGravityResizeAspect()
+#define AVLayerVideoGravityResizeAspectFill getAVLayerVideoGravityResizeAspectFill()
+#define AVLayerVideoGravityResize getAVLayerVideoGravityResize()
+
</ins><span class="cx"> using namespace WebCore;
</span><span class="cx"> 
</span><span class="cx"> @interface WebAVSampleBufferStatusChangeListener : NSObject {
</span><span class="cx">     MediaPlayerPrivateMediaStreamAVFObjC* _parent;
</span><del>-    Vector&lt;RetainPtr&lt;AVSampleBufferDisplayLayer&gt;&gt; _layers;
</del><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> - (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback;
</span><span class="cx"> - (void)invalidate;
</span><del>-- (void)beginObservingLayer:(AVSampleBufferDisplayLayer *)layer;
-- (void)stopObservingLayer:(AVSampleBufferDisplayLayer *)layer;
</del><ins>+- (void)beginObservingLayers;
+- (void)stopObservingLayers;
</ins><span class="cx"> @end
</span><span class="cx"> 
</span><span class="cx"> @implementation WebAVSampleBufferStatusChangeListener
</span><span class="lines">@@ -81,6 +85,7 @@
</span><span class="cx">         return nil;
</span><span class="cx"> 
</span><span class="cx">     _parent = parent;
</span><ins>+
</ins><span class="cx">     return self;
</span><span class="cx"> }
</span><span class="cx"> 
</span><span class="lines">@@ -92,9 +97,7 @@
</span><span class="cx"> 
</span><span class="cx"> - (void)invalidate
</span><span class="cx"> {
</span><del>-    for (auto&amp; layer : _layers)
-        [layer removeObserver:self forKeyPath:@&quot;status&quot;];
-    _layers.clear();
</del><ins>+    [self stopObservingLayers];
</ins><span class="cx"> 
</span><span class="cx">     [[NSNotificationCenter defaultCenter] removeObserver:self];
</span><span class="cx"> 
</span><span class="lines">@@ -101,22 +104,28 @@
</span><span class="cx">     _parent = nullptr;
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-- (void)beginObservingLayer:(AVSampleBufferDisplayLayer*)layer
</del><ins>+- (void)beginObservingLayers
</ins><span class="cx"> {
</span><span class="cx">     ASSERT(_parent);
</span><del>-    ASSERT(!_layers.contains(layer));
</del><ins>+    ASSERT(_parent-&gt;displayLayer());
+    ASSERT(_parent-&gt;backgroundLayer());
</ins><span class="cx"> 
</span><del>-    _layers.append(layer);
-    [layer addObserver:self forKeyPath:@&quot;status&quot; options:NSKeyValueObservingOptionNew context:nullptr];
</del><ins>+    [_parent-&gt;displayLayer() addObserver:self forKeyPath:@&quot;status&quot; options:NSKeyValueObservingOptionNew context:nil];
+    [_parent-&gt;displayLayer() addObserver:self forKeyPath:@&quot;error&quot; options:NSKeyValueObservingOptionNew context:nil];
+    [_parent-&gt;backgroundLayer() addObserver:self forKeyPath:@&quot;bounds&quot; options:NSKeyValueObservingOptionNew context:nil];
</ins><span class="cx"> }
</span><span class="cx"> 
</span><del>-- (void)stopObservingLayer:(AVSampleBufferDisplayLayer*)layer
</del><ins>+- (void)stopObservingLayers
</ins><span class="cx"> {
</span><del>-    ASSERT(_parent);
-    ASSERT(_layers.contains(layer));
</del><ins>+    if (!_parent)
+        return;
</ins><span class="cx"> 
</span><del>-    [layer removeObserver:self forKeyPath:@&quot;status&quot;];
-    _layers.remove(_layers.find(layer));
</del><ins>+    if (_parent-&gt;displayLayer()) {
+        [_parent-&gt;displayLayer() removeObserver:self forKeyPath:@&quot;status&quot;];
+        [_parent-&gt;displayLayer() removeObserver:self forKeyPath:@&quot;error&quot;];
+    }
+    if (_parent-&gt;backgroundLayer())
+        [_parent-&gt;backgroundLayer() removeObserver:self forKeyPath:@&quot;bounds&quot;];
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
</span><span class="lines">@@ -125,23 +134,47 @@
</span><span class="cx">     UNUSED_PARAM(keyPath);
</span><span class="cx">     ASSERT(_parent);
</span><span class="cx"> 
</span><del>-    RetainPtr&lt;WebAVSampleBufferStatusChangeListener&gt; protectedSelf = self;
</del><span class="cx">     if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) {
</span><span class="cx">         RetainPtr&lt;AVSampleBufferDisplayLayer&gt; layer = (AVSampleBufferDisplayLayer *)object;
</span><del>-        RetainPtr&lt;NSNumber&gt; status = [change valueForKey:NSKeyValueChangeNewKey];
</del><ins>+        ASSERT(layer.get() == _parent-&gt;displayLayer());
</ins><span class="cx"> 
</span><del>-        ASSERT(_layers.contains(layer.get()));
-        ASSERT([keyPath isEqualToString:@&quot;status&quot;]);
</del><ins>+        if ([keyPath isEqualToString:@&quot;status&quot;]) {
+            RetainPtr&lt;NSNumber&gt; status = [change valueForKey:NSKeyValueChangeNewKey];
+            callOnMainThread([protectedSelf = WTFMove(self), layer = WTFMove(layer), status = WTFMove(status)] {
+                if (!protectedSelf-&gt;_parent)
+                    return;
</ins><span class="cx"> 
</span><del>-        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer)] {
-            if (!protectedSelf-&gt;_parent)
-                return;
</del><ins>+                protectedSelf-&gt;_parent-&gt;layerStatusDidChange(layer.get());
+            });
+            return;
+        }
</ins><span class="cx"> 
</span><del>-            protectedSelf-&gt;_parent-&gt;layerStatusDidChange(layer.get());
-        });
</del><ins>+        if ([keyPath isEqualToString:@&quot;error&quot;]) {
+            RetainPtr&lt;NSNumber&gt; status = [change valueForKey:NSKeyValueChangeNewKey];
+            callOnMainThread([protectedSelf = WTFMove(self), layer = WTFMove(layer), status = WTFMove(status)] {
+                if (!protectedSelf-&gt;_parent)
+                    return;
</ins><span class="cx"> 
</span><del>-    } else
-        ASSERT_NOT_REACHED();
</del><ins>+                protectedSelf-&gt;_parent-&gt;layerErrorDidChange(layer.get());
+            });
+            return;
+        }
+    }
+
+    if ([[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue])
+        return;
+
+    if ((CALayer *)object == _parent-&gt;backgroundLayer()) {
+        if ([keyPath isEqualToString:@&quot;bounds&quot;]) {
+            callOnMainThread([protectedSelf = WTFMove(self)] {
+                if (!protectedSelf-&gt;_parent)
+                    return;
+
+                protectedSelf-&gt;_parent-&gt;backgroundLayerBoundsChanged();
+            });
+        }
+    }
+
</ins><span class="cx"> }
</span><span class="cx"> @end
</span><span class="cx"> 
</span><span class="lines">@@ -177,7 +210,7 @@
</span><span class="cx">             track-&gt;removeObserver(*this);
</span><span class="cx">     }
</span><span class="cx"> 
</span><del>-    destroyLayer();
</del><ins>+    destroyLayers();
</ins><span class="cx"> 
</span><span class="cx">     [m_statusChangeListener invalidate];
</span><span class="cx"> 
</span><span class="lines">@@ -268,6 +301,46 @@
</span><span class="cx">     return timelineOffset;
</span><span class="cx"> }
</span><span class="cx"> 
</span><ins>+CGAffineTransform MediaPlayerPrivateMediaStreamAVFObjC::videoTransformationMatrix(MediaSample&amp; sample)
+{
+    if (m_transformIsValid)
+        return m_videoTransform;
+
+    CMSampleBufferRef sampleBuffer = sample.platformSample().sample.cmSampleBuffer;
+    CVPixelBufferRef pixelBuffer = static_cast&lt;CVPixelBufferRef&gt;(CMSampleBufferGetImageBuffer(sampleBuffer));
+    size_t width = CVPixelBufferGetWidth(pixelBuffer);
+    size_t height = CVPixelBufferGetHeight(pixelBuffer);
+    if (!width || !height)
+        return CGAffineTransformIdentity;
+
+    ASSERT(m_videoOrientation &gt;= MediaSample::VideoOrientation::Unknown);
+    ASSERT(m_videoOrientation &lt;= MediaSample::VideoOrientation::LandscapeLeft);
+
+    // Unknown, Portrait, PortraitUpsideDown, LandscapeRight, LandscapeLeft,
+#if PLATFORM(MAC)
+    static float sensorAngle[] = { 0, 0, 180, 90, 270 };
+#else
+    static float sensorAngle[] = { 180, 180, 0, 90, 270 };
+#endif
+    float rotation = sensorAngle[static_cast&lt;int&gt;(m_videoOrientation)];
+    m_videoTransform = CGAffineTransformMakeRotation(rotation * M_PI / 180);
+
+    if (sample.videoMirrored())
+        m_videoTransform = CGAffineTransformScale(m_videoTransform, -1, 1);
+
+    m_transformIsValid = true;
+    return m_videoTransform;
+}
+
+static void runWithoutAnimations(std::function&lt;void()&gt; function)
+{
+    [CATransaction begin];
+    [CATransaction setAnimationDuration:0];
+    [CATransaction setDisableActions:YES];
+    function();
+    [CATransaction commit];
+}
+
</ins><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPrivate&amp; track, MediaSample&amp; sample)
</span><span class="cx"> {
</span><span class="cx">     ASSERT(m_videoTrackMap.contains(track.id()));
</span><span class="lines">@@ -296,6 +369,25 @@
</span><span class="cx">     updateSampleTimes(sample, timelineOffset, &quot;MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample&quot;);
</span><span class="cx"> 
</span><span class="cx">     if (m_sampleBufferDisplayLayer) {
</span><ins>+        if (sample.videoOrientation() != m_videoOrientation || sample.videoMirrored() != m_videoMirrored) {
+            m_videoOrientation = sample.videoOrientation();
+            m_videoMirrored = sample.videoMirrored();
+            m_transformIsValid = false;
+        }
+
+        if (m_videoSizeChanged || !m_transformIsValid) {
+            runWithoutAnimations([this, &amp;sample] {
+                auto backgroundBounds = m_backgroundLayer.get().bounds;
+                auto videoBounds = backgroundBounds;
+                if (m_videoOrientation == MediaSample::VideoOrientation::LandscapeRight || m_videoOrientation == MediaSample::VideoOrientation::LandscapeLeft)
+                    std::swap(videoBounds.size.width, videoBounds.size.height);
+                m_sampleBufferDisplayLayer.get().bounds = videoBounds;
+                m_sampleBufferDisplayLayer.get().position = { backgroundBounds.size.width / 2, backgroundBounds.size.height / 2};
+                m_sampleBufferDisplayLayer.get().affineTransform = videoTransformationMatrix(sample);
+                m_videoSizeChanged = false;
+            });
+        }
+
</ins><span class="cx">         if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
</span><span class="cx">             addSampleToPendingQueue(m_pendingVideoSampleQueue, sample);
</span><span class="cx">             requestNotificationWhenReadyForVideoData();
</span><span class="lines">@@ -334,6 +426,12 @@
</span><span class="cx">     return nullptr;
</span><span class="cx"> }
</span><span class="cx"> 
</span><ins>+void MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange(AVSampleBufferDisplayLayer* layer)
+{
+    UNUSED_PARAM(layer);
+    LOG(Media, &quot;MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange(%p) - error = %s&quot;, this, [[layer.error localizedDescription] UTF8String]);
+}
+
</ins><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer)
</span><span class="cx"> {
</span><span class="cx">     LOG(Media, &quot;MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(%p) - status = %d&quot;, this, (int)layer.status);
</span><span class="lines">@@ -359,7 +457,7 @@
</span><span class="cx">     [m_sampleBufferDisplayLayer flushAndRemoveImage];
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
</del><ins>+void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers()
</ins><span class="cx"> {
</span><span class="cx">     if (m_sampleBufferDisplayLayer)
</span><span class="cx">         return;
</span><span class="lines">@@ -373,30 +471,44 @@
</span><span class="cx">         return;
</span><span class="cx">     }
</span><span class="cx"> 
</span><ins>+    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
+    m_sampleBufferDisplayLayer.get().anchorPoint = { .5, .5 };
+    m_sampleBufferDisplayLayer.get().needsDisplayOnBoundsChange = YES;
+    m_sampleBufferDisplayLayer.get().videoGravity = AVLayerVideoGravityResizeAspectFill;
+
+    m_backgroundLayer = adoptNS([[CALayer alloc] init]);
+    m_backgroundLayer.get().backgroundColor = cachedCGColor(Color::black);
+    m_backgroundLayer.get().needsDisplayOnBoundsChange = YES;
+
+    [m_statusChangeListener beginObservingLayers];
+
+    [m_backgroundLayer addSublayer:m_sampleBufferDisplayLayer.get()];
+
</ins><span class="cx"> #ifndef NDEBUG
</span><span class="cx">     [m_sampleBufferDisplayLayer setName:@&quot;MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer&quot;];
</span><ins>+    [m_backgroundLayer setName:@&quot;MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer parent&quot;];
</ins><span class="cx"> #endif
</span><del>-    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
-    [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
</del><span class="cx"> 
</span><span class="cx">     updateRenderingMode();
</span><span class="cx">     
</span><span class="cx"> #if PLATFORM(MAC) &amp;&amp; ENABLE(VIDEO_PRESENTATION_MODE)
</span><del>-    m_videoFullscreenLayerManager-&gt;setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player-&gt;client().mediaPlayerContentBoxRect()).size());
</del><ins>+    m_videoFullscreenLayerManager-&gt;setVideoLayer(m_backgroundLayer.get(), snappedIntRect(m_player-&gt;client().mediaPlayerContentBoxRect()).size());
</ins><span class="cx"> #endif
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer()
</del><ins>+void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers()
</ins><span class="cx"> {
</span><span class="cx">     if (!m_sampleBufferDisplayLayer)
</span><span class="cx">         return;
</span><span class="cx"> 
</span><ins>+    [m_statusChangeListener stopObservingLayers];
</ins><span class="cx">     if (m_sampleBufferDisplayLayer) {
</span><span class="cx">         m_pendingVideoSampleQueue.clear();
</span><del>-        [m_statusChangeListener stopObservingLayer:m_sampleBufferDisplayLayer.get()];
</del><span class="cx">         [m_sampleBufferDisplayLayer stopRequestingMediaData];
</span><span class="cx">         [m_sampleBufferDisplayLayer flush];
</span><ins>+        m_sampleBufferDisplayLayer = nullptr;
</ins><span class="cx">     }
</span><ins>+    m_backgroundLayer = nullptr;
</ins><span class="cx"> 
</span><span class="cx">     updateRenderingMode();
</span><span class="cx">     
</span><span class="lines">@@ -446,7 +558,7 @@
</span><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad()
</span><span class="cx"> {
</span><span class="cx">     LOG(Media, &quot;MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad(%p)&quot;, this);
</span><del>-    if (m_playing)
</del><ins>+    if (playing())
</ins><span class="cx">         pause();
</span><span class="cx"> }
</span><span class="cx"> 
</span><span class="lines">@@ -457,17 +569,26 @@
</span><span class="cx"> 
</span><span class="cx"> PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::platformLayer() const
</span><span class="cx"> {
</span><del>-    if (!m_sampleBufferDisplayLayer || m_displayMode == None)
</del><ins>+    if (!m_backgroundLayer || m_displayMode == None)
</ins><span class="cx">         return nullptr;
</span><span class="cx"> 
</span><span class="cx"> #if PLATFORM(MAC) &amp;&amp; ENABLE(VIDEO_PRESENTATION_MODE)
</span><span class="cx">     return m_videoFullscreenLayerManager-&gt;videoInlineLayer();
</span><span class="cx"> #else
</span><ins>+    return m_backgroundLayer.get();
+#endif
+}
</ins><span class="cx"> 
</span><ins>+PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::displayLayer()
+{
</ins><span class="cx">     return m_sampleBufferDisplayLayer.get();
</span><del>-#endif
</del><span class="cx"> }
</span><span class="cx"> 
</span><ins>+PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayer()
+{
+    return m_backgroundLayer.get();
+}
+
</ins><span class="cx"> MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode() const
</span><span class="cx"> {
</span><span class="cx">     if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer)
</span><span class="lines">@@ -474,16 +595,19 @@
</span><span class="cx">         return None;
</span><span class="cx"> 
</span><span class="cx">     if (auto* track = m_mediaStreamPrivate-&gt;activeVideoTrack()) {
</span><del>-        if (!m_shouldDisplayFirstVideoFrame || !track-&gt;enabled() || track-&gt;muted())
</del><ins>+        if (!track-&gt;enabled() || track-&gt;muted())
</ins><span class="cx">             return PaintItBlack;
</span><span class="cx">     }
</span><span class="cx"> 
</span><del>-    if (m_playing) {
</del><ins>+    if (playing()) {
</ins><span class="cx">         if (!m_mediaStreamPrivate-&gt;isProducingData())
</span><span class="cx">             return PausedImage;
</span><span class="cx">         return LivePreview;
</span><span class="cx">     }
</span><span class="cx"> 
</span><ins>+    if (m_playbackState == PlaybackState::None)
+        return PaintItBlack;
+
</ins><span class="cx">     return PausedImage;
</span><span class="cx"> }
</span><span class="cx"> 
</span><span class="lines">@@ -493,11 +617,13 @@
</span><span class="cx"> 
</span><span class="cx">     if (displayMode == m_displayMode)
</span><span class="cx">         return false;
</span><del>-
</del><span class="cx">     m_displayMode = displayMode;
</span><span class="cx"> 
</span><del>-    if (m_displayMode &lt; PausedImage &amp;&amp; m_sampleBufferDisplayLayer)
-        flushAndRemoveVideoSampleBuffers();
</del><ins>+    if (m_sampleBufferDisplayLayer) {
+        runWithoutAnimations([this] {
+            m_sampleBufferDisplayLayer.get().hidden = m_displayMode &lt; PausedImage;
+        });
+    }
</ins><span class="cx"> 
</span><span class="cx">     return true;
</span><span class="cx"> }
</span><span class="lines">@@ -506,10 +632,10 @@
</span><span class="cx"> {
</span><span class="cx">     LOG(Media, &quot;MediaPlayerPrivateMediaStreamAVFObjC::play(%p)&quot;, this);
</span><span class="cx"> 
</span><del>-    if (!metaDataAvailable() || m_playing || m_ended)
</del><ins>+    if (!metaDataAvailable() || playing() || m_ended)
</ins><span class="cx">         return;
</span><span class="cx"> 
</span><del>-    m_playing = true;
</del><ins>+    m_playbackState = PlaybackState::Playing;
</ins><span class="cx">     if (!m_clock-&gt;isRunning())
</span><span class="cx">         m_clock-&gt;start();
</span><span class="cx"> 
</span><span class="lines">@@ -528,11 +654,11 @@
</span><span class="cx"> {
</span><span class="cx">     LOG(Media, &quot;MediaPlayerPrivateMediaStreamAVFObjC::pause(%p)&quot;, this);
</span><span class="cx"> 
</span><del>-    if (!metaDataAvailable() || !m_playing || m_ended)
</del><ins>+    if (!metaDataAvailable() || !playing() || m_ended)
</ins><span class="cx">         return;
</span><span class="cx"> 
</span><span class="cx">     m_pausedTime = currentMediaTime();
</span><del>-    m_playing = false;
</del><ins>+    m_playbackState = PlaybackState::Paused;
</ins><span class="cx"> 
</span><span class="cx">     for (const auto&amp; track : m_audioTrackMap.values())
</span><span class="cx">         track-&gt;pause();
</span><span class="lines">@@ -541,11 +667,6 @@
</span><span class="cx">     flushRenderers();
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-bool MediaPlayerPrivateMediaStreamAVFObjC::paused() const
-{
-    return !m_playing;
-}
-
</del><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::setVolume(float volume)
</span><span class="cx"> {
</span><span class="cx">     LOG(Media, &quot;MediaPlayerPrivateMediaStreamAVFObjC::setVolume(%p)&quot;, this);
</span><span class="lines">@@ -593,7 +714,7 @@
</span><span class="cx"> 
</span><span class="cx"> MediaTime MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime() const
</span><span class="cx"> {
</span><del>-    if (!m_playing)
</del><ins>+    if (paused())
</ins><span class="cx">         return m_pausedTime;
</span><span class="cx"> 
</span><span class="cx">     return streamTime();
</span><span class="lines">@@ -650,7 +771,7 @@
</span><span class="cx"> {
</span><span class="cx">     scheduleDeferredTask([this] {
</span><span class="cx">         bool ended = !m_mediaStreamPrivate-&gt;active();
</span><del>-        if (ended &amp;&amp; m_playing)
</del><ins>+        if (ended &amp;&amp; playing())
</ins><span class="cx">             pause();
</span><span class="cx"> 
</span><span class="cx">         updateReadyState();
</span><span class="lines">@@ -670,6 +791,7 @@
</span><span class="cx">         return;
</span><span class="cx"> 
</span><span class="cx">     scheduleDeferredTask([this] {
</span><ins>+        m_transformIsValid = false;
</ins><span class="cx">         if (m_player)
</span><span class="cx">             m_player-&gt;client().mediaPlayerRenderingModeChanged(m_player);
</span><span class="cx">     });
</span><span class="lines">@@ -825,7 +947,7 @@
</span><span class="cx"> 
</span><span class="cx">         if (oldVideoTrack != m_activeVideoTrack)
</span><span class="cx">             m_imagePainter.reset();
</span><del>-        ensureLayer();
</del><ins>+        ensureLayers();
</ins><span class="cx">         m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer;
</span><span class="cx">         m_pendingSelectedTrackCheck = false;
</span><span class="cx">         updateDisplayMode();
</span><span class="lines">@@ -914,11 +1036,10 @@
</span><span class="cx">     if (m_displayMode == None || !metaDataAvailable() || context.paintingDisabled())
</span><span class="cx">         return;
</span><span class="cx"> 
</span><del>-    GraphicsContextStateSaver stateSaver(context);
-
</del><span class="cx">     if (m_displayMode != PaintItBlack &amp;&amp; m_imagePainter.mediaSample)
</span><span class="cx">         updateCurrentFrameImage();
</span><span class="cx"> 
</span><ins>+    GraphicsContextStateSaver stateSaver(context);
</ins><span class="cx">     if (m_displayMode == PaintItBlack || !m_imagePainter.cgImage || !m_imagePainter.mediaSample) {
</span><span class="cx">         context.fillRect(IntRect(IntPoint(), IntSize(destRect.width(), destRect.height())), Color::black);
</span><span class="cx">         return;
</span><span class="lines">@@ -926,15 +1047,18 @@
</span><span class="cx"> 
</span><span class="cx">     auto image = m_imagePainter.cgImage.get();
</span><span class="cx">     FloatRect imageRect(0, 0, CGImageGetWidth(image), CGImageGetHeight(image));
</span><del>-    context.drawNativeImage(image, imageRect.size(), destRect, imageRect);
</del><ins>+    AffineTransform videoTransform = videoTransformationMatrix(*m_imagePainter.mediaSample);
+    FloatRect transformedDestRect = videoTransform.inverse().value_or(AffineTransform()).mapRect(destRect);
+    context.concatCTM(videoTransform);
+    context.drawNativeImage(image, imageRect.size(), transformedDestRect, imageRect);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged()
</span><span class="cx"> {
</span><span class="cx">     if (m_player-&gt;client().mediaPlayerRenderingCanBeAccelerated(m_player))
</span><del>-        ensureLayer();
</del><ins>+        ensureLayers();
</ins><span class="cx">     else
</span><del>-        destroyLayer();
</del><ins>+        destroyLayers();
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> String MediaPlayerPrivateMediaStreamAVFObjC::engineDescription() const
</span><span class="lines">@@ -943,11 +1067,6 @@
</span><span class="cx">     return description;
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-bool MediaPlayerPrivateMediaStreamAVFObjC::shouldBePlaying() const
-{
-    return m_playing &amp;&amp; m_readyState &gt;= MediaPlayer::HaveFutureData;
-}
-
</del><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::setReadyState(MediaPlayer::ReadyState readyState)
</span><span class="cx"> {
</span><span class="cx">     if (m_readyState == readyState)
</span><span class="lines">@@ -969,6 +1088,12 @@
</span><span class="cx">     m_player-&gt;networkStateChanged();
</span><span class="cx"> }
</span><span class="cx"> 
</span><ins>+void MediaPlayerPrivateMediaStreamAVFObjC::setShouldBufferData(bool shouldBuffer)
+{
+    if (!shouldBuffer)
+        flushAndRemoveVideoSampleBuffers();
+}
+
</ins><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::scheduleDeferredTask(Function&lt;void ()&gt;&amp;&amp; function)
</span><span class="cx"> {
</span><span class="cx">     ASSERT(function);
</span><span class="lines">@@ -987,6 +1112,14 @@
</span><span class="cx">     pixelBufferConformer = nullptr;
</span><span class="cx"> }
</span><span class="cx"> 
</span><ins>+void MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayerBoundsChanged()
+{
+    if (!m_backgroundLayer || !m_sampleBufferDisplayLayer)
+        return;
+
+    m_videoSizeChanged = true;
</ins><span class="cx"> }
</span><span class="cx"> 
</span><ins>+}
+
</ins><span class="cx"> #endif
</span></span></pre></div>
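The core of the `videoTransformationMatrix()` change above is a per-platform sensor-angle lookup followed by a rotation (and an optional horizontal flip for mirrored samples). A minimal portable sketch of that logic, using hypothetical stand-ins (`VideoOrientation`, `AffineTransform`, `sensorAngleDegrees`, `videoTransform`) for the `MediaSample::VideoOrientation` enum and `CGAffineTransform`:

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Hypothetical stand-ins for MediaSample::VideoOrientation and CGAffineTransform,
// used only to illustrate the rotation logic; names are not from the patch.
enum class VideoOrientation { Unknown, Portrait, PortraitUpsideDown, LandscapeRight, LandscapeLeft };

struct AffineTransform { double a, b, c, d; };

constexpr double kPi = 3.14159265358979323846;

// Mirrors the sensorAngle[] table from the patch: on Mac, Unknown/Portrait need
// no rotation; on the non-Mac (device) table the sensor is mounted 180 degrees
// relative to portrait.
double sensorAngleDegrees(VideoOrientation orientation, bool isMac)
{
    static const std::array<double, 5> macAngles    = { 0, 0, 180, 90, 270 };
    static const std::array<double, 5> deviceAngles = { 180, 180, 0, 90, 270 };
    const auto& table = isMac ? macAngles : deviceAngles;
    return table[static_cast<size_t>(orientation)];
}

// Build a rotation matrix, then flip horizontally when the sample is mirrored,
// analogous to CGAffineTransformMakeRotation + CGAffineTransformScale(t, -1, 1).
AffineTransform videoTransform(VideoOrientation orientation, bool mirrored, bool isMac)
{
    double radians = sensorAngleDegrees(orientation, isMac) * kPi / 180.0;
    AffineTransform t = { std::cos(radians), std::sin(radians), -std::sin(radians), std::cos(radians) };
    if (mirrored) {
        // Pre-concatenating Scale(-1, 1) negates the x basis vector (a, b).
        t.a = -t.a;
        t.b = -t.b;
    }
    return t;
}
```

This also shows why the patch swaps the display layer's bounds for `LandscapeRight`/`LandscapeLeft`: those entries rotate by 90 or 270 degrees, which transposes the rendered width and height.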
<a id="trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -77,7 +77,7 @@
</span><span class="cx">     bool updateFramerate(CMSampleBufferRef);
</span><span class="cx"> 
</span><span class="cx">     void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) final;
</span><del>-    void processNewFrame(RetainPtr&lt;CMSampleBufferRef&gt;);
</del><ins>+    void processNewFrame(RetainPtr&lt;CMSampleBufferRef&gt;, RetainPtr&lt;AVCaptureConnection&gt;);
</ins><span class="cx"> 
</span><span class="cx">     RetainPtr&lt;NSString&gt; m_pendingPreset;
</span><span class="cx">     RetainPtr&lt;CMSampleBufferRef&gt; m_buffer;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourcemm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -417,7 +417,7 @@
</span><span class="cx">     return frameRate != m_frameRate;
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void AVVideoCaptureSource::processNewFrame(RetainPtr&lt;CMSampleBufferRef&gt; sampleBuffer)
</del><ins>+void AVVideoCaptureSource::processNewFrame(RetainPtr&lt;CMSampleBufferRef&gt; sampleBuffer, RetainPtr&lt;AVCaptureConnectionType&gt; connection)
</ins><span class="cx"> {
</span><span class="cx">     // Ignore frames delivered when the session is not running, we want to hang onto the last image
</span><span class="cx">     // delivered before it stopped.
</span><span class="lines">@@ -432,8 +432,27 @@
</span><span class="cx">     m_buffer = sampleBuffer;
</span><span class="cx">     m_lastImage = nullptr;
</span><span class="cx"> 
</span><ins>+    MediaSample::VideoOrientation orientation = MediaSample::VideoOrientation::Unknown;
+    switch ([connection videoOrientation]) {
+    case AVCaptureVideoOrientationPortrait:
+        orientation = MediaSample::VideoOrientation::Portrait;
+        break;
+    case AVCaptureVideoOrientationPortraitUpsideDown:
+        orientation = MediaSample::VideoOrientation::PortraitUpsideDown;
+        break;
+    case AVCaptureVideoOrientationLandscapeRight:
+        orientation = MediaSample::VideoOrientation::LandscapeRight;
+        break;
+    case AVCaptureVideoOrientationLandscapeLeft:
+        orientation = MediaSample::VideoOrientation::LandscapeLeft;
+        break;
+    }
+
</ins><span class="cx">     bool settingsChanged = false;
</span><span class="cx">     CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
</span><ins>+    if (orientation == MediaSample::VideoOrientation::LandscapeRight || orientation == MediaSample::VideoOrientation::LandscapeLeft)
+        std::swap(dimensions.width, dimensions.height);
+
</ins><span class="cx">     if (dimensions.width != m_width || dimensions.height != m_height) {
</span><span class="cx">         m_width = dimensions.width;
</span><span class="cx">         m_height = dimensions.height;
</span><span class="lines">@@ -443,15 +462,16 @@
</span><span class="cx">     if (settingsChanged)
</span><span class="cx">         settingsDidChange();
</span><span class="cx"> 
</span><del>-    videoSampleAvailable(MediaSampleAVFObjC::create(m_buffer.get()));
</del><ins>+    videoSampleAvailable(MediaSampleAVFObjC::create(m_buffer.get(), orientation, [connection isVideoMirrored]));
</ins><span class="cx"> }
</span><span class="cx"> 
</span><del>-void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
</del><ins>+void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType* captureConnection)
</ins><span class="cx"> {
</span><span class="cx">     RetainPtr&lt;CMSampleBufferRef&gt; buffer = sampleBuffer;
</span><ins>+    RetainPtr&lt;AVCaptureConnectionType&gt; connection = captureConnection;
</ins><span class="cx"> 
</span><del>-    scheduleDeferredTask([this, buffer] {
-        this-&gt;processNewFrame(buffer);
</del><ins>+    scheduleDeferredTask([this, buffer, connection] {
+        this-&gt;processNewFrame(buffer, connection);
</ins><span class="cx">     });
</span><span class="cx"> }
</span><span class="cx"> 
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -73,9 +73,9 @@
</span><span class="cx">     m_sinks.removeFirst(sink);
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RealtimeOutgoingVideoSource::sendFrame(rtc::scoped_refptr&lt;webrtc::VideoFrameBuffer&gt;&amp;&amp; buffer)
</del><ins>+void RealtimeOutgoingVideoSource::sendFrame(rtc::scoped_refptr&lt;webrtc::VideoFrameBuffer&gt;&amp;&amp; buffer, webrtc::VideoRotation rotation)
</ins><span class="cx"> {
</span><del>-    webrtc::VideoFrame frame(buffer, 0, 0, webrtc::kVideoRotation_0);
</del><ins>+    webrtc::VideoFrame frame(buffer, 0, 0, rotation);
</ins><span class="cx">     for (auto* sink : m_sinks)
</span><span class="cx">         sink-&gt;OnFrame(frame);
</span><span class="cx"> }
</span><span class="lines">@@ -91,7 +91,7 @@
</span><span class="cx">     if (m_muted || !m_enabled) {
</span><span class="cx">         auto blackBuffer = m_bufferPool.CreateBuffer(settings.width(), settings.height());
</span><span class="cx">         blackBuffer-&gt;SetToBlack();
</span><del>-        sendFrame(WTFMove(blackBuffer));
</del><ins>+        sendFrame(WTFMove(blackBuffer), webrtc::kVideoRotation_0);
</ins><span class="cx">         return;
</span><span class="cx">     }
</span><span class="cx"> 
</span><span class="lines">@@ -99,8 +99,25 @@
</span><span class="cx">     auto pixelBuffer = static_cast&lt;CVPixelBufferRef&gt;(CMSampleBufferGetImageBuffer(sample.platformSample().sample.cmSampleBuffer));
</span><span class="cx">     auto pixelFormatType = CVPixelBufferGetPixelFormatType(pixelBuffer);
</span><span class="cx"> 
</span><del>-    if (pixelFormatType == kCVPixelFormatType_420YpCbCr8Planar) {
-        sendFrame(new rtc::RefCountedObject&lt;webrtc::CoreVideoFrameBuffer&gt;(pixelBuffer));
</del><ins>+    webrtc::VideoRotation rotation;
+    switch (sample.videoOrientation()) {
+    case MediaSample::VideoOrientation::Unknown:
+    case MediaSample::VideoOrientation::Portrait:
+        rotation = webrtc::kVideoRotation_0;
+        break;
+    case MediaSample::VideoOrientation::PortraitUpsideDown:
+        rotation = webrtc::kVideoRotation_180;
+        break;
+    case MediaSample::VideoOrientation::LandscapeRight:
+        rotation = webrtc::kVideoRotation_90;
+        break;
+    case MediaSample::VideoOrientation::LandscapeLeft:
+        rotation = webrtc::kVideoRotation_270;
+        break;
+    }
+
+    if (pixelFormatType == kCVPixelFormatType_420YpCbCr8Planar || pixelFormatType == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
+        sendFrame(new rtc::RefCountedObject&lt;webrtc::CoreVideoFrameBuffer&gt;(pixelBuffer), rotation);
</ins><span class="cx">         return;
</span><span class="cx">     }
</span><span class="cx"> 
</span><span class="lines">@@ -116,7 +133,7 @@
</span><span class="cx">         webrtc::ConvertToI420(webrtc::kBGRA, source, 0, 0, settings.width(), settings.height(), 0, webrtc::kVideoRotation_0, newBuffer);
</span><span class="cx">     }
</span><span class="cx">     CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
</span><del>-    sendFrame(WTFMove(newBuffer));
</del><ins>+    sendFrame(WTFMove(newBuffer), rotation);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> } // namespace WebCore
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.h (214119 => 214120)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.h        2017-03-17 21:12:47 UTC (rev 214119)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.h        2017-03-17 21:14:28 UTC (rev 214120)
</span><span class="lines">@@ -50,7 +50,7 @@
</span><span class="cx"> private:
</span><span class="cx">     RealtimeOutgoingVideoSource(Ref&lt;RealtimeMediaSource&gt;&amp;&amp;);
</span><span class="cx"> 
</span><del>-    void sendFrame(rtc::scoped_refptr&lt;webrtc::VideoFrameBuffer&gt;&amp;&amp;);
</del><ins>+    void sendFrame(rtc::scoped_refptr&lt;webrtc::VideoFrameBuffer&gt;&amp;&amp;, webrtc::VideoRotation);
</ins><span class="cx"> 
</span><span class="cx">     // Notifier API
</span><span class="cx">     void RegisterObserver(webrtc::ObserverInterface*) final { }
</span></span></pre>
</div>
</div>

</body>
</html>