<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[214044] trunk</title>
</head>
<body>

<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/214044">214044</a></dd>
<dt>Author</dt> <dd>commit-queue@webkit.org</dd>
<dt>Date</dt> <dd>2017-03-16 09:09:50 -0700 (Thu, 16 Mar 2017)</dd>
</dl>

<h3>Log Message</h3>
<pre>Improve WebRTC track enabled support
https://bugs.webkit.org/show_bug.cgi?id=169727

Patch by Youenn Fablet &lt;youenn@apple.com&gt; on 2017-03-16
Reviewed by Alex Christensen.

Source/WebCore:

Tests: webrtc/peer-connection-audio-mute2.html
       webrtc/peer-connection-remote-audio-mute.html
       webrtc/video-remote-mute.html

Making sure muted/disabled sources produce silence/black frames.
For outgoing audio/video sources, this should be done by the actual a/v providers.
We keep this filtering here until we are sure they implement that.

* platform/audio/mac/AudioSampleDataSource.mm:
(WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
Used for outgoing webrtc tracks.
* platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
(WebCore::MockRealtimeAudioSourceMac::render): Ditto.
* platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
(WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
* platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
(WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
(WebCore::RealtimeIncomingVideoSource::OnFrame):
* platform/mediastream/mac/RealtimeIncomingVideoSource.h:
* platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
(WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending black frame.

LayoutTests:

* TestExpectations:
* webrtc/audio-peer-connection-webaudio.html:
* webrtc/peer-connection-audio-mute-expected.txt:
* webrtc/peer-connection-audio-mute.html:
* webrtc/peer-connection-audio-mute2-expected.txt: Added.
* webrtc/peer-connection-audio-mute2.html: Added.
* webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
* webrtc/peer-connection-remote-audio-mute.html: Added.
* webrtc/video-mute-expected.txt:
* webrtc/video-mute.html:
* webrtc/video-remote-mute-expected.txt: Added.
* webrtc/video-remote-mute.html: Added.</pre>
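
<p>Every layout test in this patch follows the same pattern: capture a local stream, route it through two mock
peer connections, toggle <code>track.enabled</code>, and check that the remote side goes silent (for audio) or
black (for video). Below is a minimal sketch of the audio variant, assuming the <code>createConnections</code>,
<code>waitFor</code> and <code>analyseAudio</code> helpers from <code>LayoutTests/webrtc/routines.js</code>:</p>
<pre>
promise_test(() =&gt; {
    if (window.testRunner)
        testRunner.setUserMediaPermission(true);

    var localTrack, remoteStream;
    return navigator.mediaDevices.getUserMedia({ audio: true }).then((localStream) =&gt; {
        if (window.internals)
            internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");

        localTrack = localStream.getAudioTracks()[0];
        return new Promise((resolve, reject) =&gt; {
            // Feed the captured stream into one connection and wait for it to arrive on the other.
            createConnections((firstConnection) =&gt; {
                firstConnection.addStream(localStream);
            }, (secondConnection) =&gt; {
                secondConnection.onaddstream = (streamEvent) =&gt; { remoteStream = streamEvent.stream; resolve(); };
            });
            setTimeout(() =&gt; reject("Test timed out"), 5000);
        });
    }).then(() =&gt; {
        // Enabled track: the mock source's hum should be audible on the remote side.
        return analyseAudio(remoteStream, 500);
    }).then((results) =&gt; {
        assert_true(results.heardHum, "heard hum from remote enabled track");
        // Disabled track: the outgoing source must now push silence instead of captured audio.
        localTrack.enabled = false;
        return waitFor(500);
    }).then(() =&gt; {
        return analyseAudio(remoteStream, 500);
    }).then((results) =&gt; {
        assert_false(results.heardHum, "no hum from remote disabled track");
    });
}, "Muting a local audio track silences the remote track");
</pre>
<p>The video tests use the same structure, but draw the remote <code>&lt;video&gt;</code> element onto a canvas and
treat a frame as black when every sampled pixel's RGB channels stay at or below a small threshold, which is what the
new black-frame handling in RealtimeIncomingVideoSource and RealtimeOutgoingVideoSource is expected to produce.</p>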

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkLayoutTestsChangeLog">trunk/LayoutTests/ChangeLog</a></li>
<li><a href="#trunkLayoutTestsTestExpectations">trunk/LayoutTests/TestExpectations</a></li>
<li><a href="#trunkLayoutTestswebrtcaudiopeerconnectionwebaudiohtml">trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomuteexpectedtxt">trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomutehtml">trunk/LayoutTests/webrtc/peer-connection-audio-mute.html</a></li>
<li><a href="#trunkLayoutTestswebrtcvideomuteexpectedtxt">trunk/LayoutTests/webrtc/video-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcvideomutehtml">trunk/LayoutTests/webrtc/video-mute.html</a></li>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcemm">trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeIncomingAudioSourcecpp">trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourcecpp">trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourceh">trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourcecpp">trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp</a></li>
</ul>

<h3>Added Paths</h3>
<ul>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomute2expectedtxt">trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomute2html">trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomuteexpectedtxt">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomutehtml">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2expectedtxt">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2html">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html</a></li>
<li><a href="#trunkLayoutTestswebrtcvideoremotemuteexpectedtxt">trunk/LayoutTests/webrtc/video-remote-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcvideoremotemutehtml">trunk/LayoutTests/webrtc/video-remote-mute.html</a></li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkLayoutTestsChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/ChangeLog (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/ChangeLog        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/ChangeLog        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,3 +1,23 @@
</span><ins>+2017-03-16  Youenn Fablet  &lt;youenn@apple.com&gt;
+
+        Improve WebRTC track enabled support
+        https://bugs.webkit.org/show_bug.cgi?id=169727
+
+        Reviewed by Alex Christensen.
+
+        * TestExpectations:
+        * webrtc/audio-peer-connection-webaudio.html:
+        * webrtc/peer-connection-audio-mute-expected.txt:
+        * webrtc/peer-connection-audio-mute.html:
+        * webrtc/peer-connection-audio-mute2-expected.txt: Added.
+        * webrtc/peer-connection-audio-mute2.html: Added.
+        * webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
+        * webrtc/peer-connection-remote-audio-mute.html: Added.
+        * webrtc/video-mute-expected.txt:
+        * webrtc/video-mute.html:
+        * webrtc/video-remote-mute-expected.txt: Added.
+        * webrtc/video-remote-mute.html: Added.
+
</ins><span class="cx"> 2017-03-16  Manuel Rego Casasnovas  &lt;rego@igalia.com&gt;
</span><span class="cx"> 
</span><span class="cx">         [css-grid] Crash on debug removing a positioned child
</span></span></pre></div>
<a id="trunkLayoutTestsTestExpectations"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/TestExpectations (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/TestExpectations        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/TestExpectations        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -711,7 +711,10 @@
</span><span class="cx"> # GTK enables some of these tests on their TestExpectations file.
</span><span class="cx"> [ Release ] webrtc [ Skip ]
</span><span class="cx"> 
</span><del>-[ Debug ] webrtc/audio-peer-connection-webaudio.html [ Failure ]
</del><ins>+[ Debug ] webrtc/peer-connection-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-audio-mute2.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute2.html [ Pass Failure ]
</ins><span class="cx"> fast/mediastream/getUserMedia-webaudio.html [ Skip ]
</span><span class="cx"> fast/mediastream/RTCPeerConnection-AddRemoveStream.html [ Skip ]
</span><span class="cx"> fast/mediastream/RTCPeerConnection-closed-state.html [ Skip ]
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcaudiopeerconnectionwebaudiohtml"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -11,7 +11,7 @@
</span><span class="cx">         if (window.testRunner)
</span><span class="cx">             testRunner.setUserMediaPermission(true);
</span><span class="cx"> 
</span><del>-       return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) =&gt; {
</del><ins>+        return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) =&gt; {
</ins><span class="cx">             if (window.internals)
</span><span class="cx">                 internals.useMockRTCPeerConnectionFactory(&quot;TwoRealPeerConnections&quot;);
</span><span class="cx">             return new Promise((resolve, reject) =&gt; {
</span><span class="lines">@@ -21,14 +21,14 @@
</span><span class="cx">                     secondConnection.onaddstream = (streamEvent) =&gt; { resolve(streamEvent.stream); };
</span><span class="cx">                 });
</span><span class="cx">                 setTimeout(() =&gt; reject(&quot;Test timed out&quot;), 5000);
</span><del>-            }).then((stream) =&gt; {
-                return analyseAudio(stream, 1000);
-            }).then((results) =&gt; {
-                assert_true(results.heardHum, &quot;heard hum&quot;);
-                assert_true(results.heardBip, &quot;heard bip&quot;);
-                assert_true(results.heardBop, &quot;heard bop&quot;);
</del><span class="cx">             });
</span><del>-         });
</del><ins>+        }).then((remoteStream) =&gt; {
+            return analyseAudio(remoteStream, 1000);
+        }).then((results) =&gt; {
+            assert_true(results.heardHum, &quot;heard hum&quot;);
+            assert_true(results.heardBip, &quot;heard bip&quot;);
+            assert_true(results.heardBop, &quot;heard bop&quot;);
+        });
</ins><span class="cx">     }, &quot;Basic audio playback through a peer connection&quot;);
</span><span class="cx">     &lt;/script&gt;
</span><span class="cx"> &lt;/head&gt;
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomuteexpectedtxt"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,3 +1,3 @@
</span><span class="cx"> 
</span><del>-FAIL Muting and unmuting an audio track assert_true: heard hum expected true got false
</del><ins>+PASS Muting a local audio track and making sure the remote track is silent 
</ins><span class="cx"> 
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomutehtml"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/peer-connection-audio-mute.html (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute.html        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -13,17 +13,19 @@
</span><span class="cx">         if (window.testRunner)
</span><span class="cx">             testRunner.setUserMediaPermission(true);
</span><span class="cx"> 
</span><del>-        return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) =&gt; {
</del><ins>+        var localTrack;
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) =&gt; {
</ins><span class="cx">             if (window.internals)
</span><span class="cx">                 internals.useMockRTCPeerConnectionFactory(&quot;TwoRealPeerConnections&quot;);
</span><span class="cx"> 
</span><del>-            var stream;
</del><ins>+            localTrack = localStream.getAudioTracks()[0];
+            var remoteStream;
</ins><span class="cx">             return new Promise((resolve, reject) =&gt; {
</span><span class="cx">                 createConnections((firstConnection) =&gt; {
</span><del>-                    firstConnection.addStream(stream);
</del><ins>+                    firstConnection.addStream(localStream);
</ins><span class="cx">                 }, (secondConnection) =&gt; {
</span><span class="cx">                     secondConnection.onaddstream = (streamEvent) =&gt; {
</span><del>-                        stream = streamEvent.stream;
</del><ins>+                        remoteStream = streamEvent.stream;
</ins><span class="cx">                         resolve();
</span><span class="cx">                     };
</span><span class="cx">                 });
</span><span class="lines">@@ -30,36 +32,19 @@
</span><span class="cx">             }).then(() =&gt; {
</span><span class="cx">                 return waitFor(500);
</span><span class="cx">             }).then(() =&gt; {
</span><del>-                return analyseAudio(stream, 500).then((results) =&gt; {
-                    assert_true(results.heardHum, &quot;heard hum&quot;);
-                    assert_true(results.heardBip, &quot;heard bip&quot;);
-                    assert_true(results.heardBop, &quot;heard bop&quot;);
</del><ins>+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_true(results.heardHum, &quot;heard hum from remote enabled track&quot;);
</ins><span class="cx">                 });
</span><span class="cx">             }).then(() =&gt; {
</span><del>-                stream.getAudioTracks().forEach((track) =&gt; {
-                    track.enabled = false;
-                });
</del><ins>+                localTrack.enabled = false;
</ins><span class="cx">                 return waitFor(500);
</span><span class="cx">             }).then(() =&gt; {
</span><del>-                return analyseAudio(stream, 500).then((results) =&gt; {
-                    assert_false(results.heardHum, &quot;heard hum&quot;);
-                    assert_false(results.heardBip, &quot;heard bip&quot;);
-                    assert_false(results.heardBop, &quot;heard bop&quot;);
</del><ins>+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_false(results.heardHum, &quot;not heard hum from remote disabled track&quot;);
</ins><span class="cx">                 });
</span><del>-            }).then(() =&gt; {
-                stream.getAudioTracks().forEach((track) =&gt; {
-                    track.enabled = true;
-                });
-                return waitFor(500);
-            }).then(() =&gt; {
-                return analyseAudio(stream, 500).then((results) =&gt; {
-                    assert_true(results.heardHum, &quot;heard hum&quot;);
-                    assert_true(results.heardBip, &quot;heard bip&quot;);
-                    assert_true(results.heardBop, &quot;heard bop&quot;);
-                });
</del><span class="cx">             });
</span><span class="cx">         });
</span><del>-    }, &quot;Muting and unmuting an audio track&quot;);
</del><ins>+    }, &quot;Muting a local audio track and making sure the remote track is silent&quot;);
</ins><span class="cx">     &lt;/script&gt;
</span><span class="cx"> &lt;/body&gt;
</span><span class="cx"> &lt;/html&gt;
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomute2expectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt                                (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,3 @@
</span><ins>+
+PASS Muting and unmuting a local audio track 
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomute2htmlfromrev214043trunkLayoutTestswebrtcpeerconnectionaudiomutehtml"></a>
<div class="copfile"><h4>Copied: trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html (from rev 214043, trunk/LayoutTests/webrtc/peer-connection-audio-mute.html) (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html                                (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,57 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+&lt;head&gt;
+    &lt;meta charset=&quot;utf-8&quot;&gt;
+    &lt;title&gt;Testing local audio capture playback causes &quot;playing&quot; event to fire&lt;/title&gt;
+    &lt;script src=&quot;../resources/testharness.js&quot;&gt;&lt;/script&gt;
+    &lt;script src=&quot;../resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+&lt;/head&gt;
+&lt;body&gt;
+    &lt;script src =&quot;routines.js&quot;&gt;&lt;/script&gt;
+    &lt;script&gt;
+    promise_test((test) =&gt; {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        var localTrack;
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) =&gt; {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory(&quot;TwoRealPeerConnections&quot;);
+
+            localTrack = localStream.getAudioTracks()[0];
+            var remoteStream;
+            return new Promise((resolve, reject) =&gt; {
+                createConnections((firstConnection) =&gt; {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) =&gt; {
+                    secondConnection.onaddstream = (streamEvent) =&gt; {
+                        remoteStream = streamEvent.stream;
+                        resolve();
+                    };
+                });
+            }).then(() =&gt; {
+                return waitFor(500);
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_true(results.heardHum, &quot;heard hum from remote enabled track&quot;);
+                });
+            }).then(() =&gt; {
+                localTrack.enabled = false;
+                return waitFor(500);
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_false(results.heardHum, &quot;not heard hum from remote disabled track&quot;);
+                });
+            }).then(() =&gt; {
+                localTrack.enabled = true;
+                return waitFor(500);
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_true(results.heardHum, &quot;heard hum from remote reenabled track&quot;);
+                });
+            });
+        });
+    }, &quot;Muting and unmuting a local audio track&quot;);
+    &lt;/script&gt;
+&lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomuteexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt                                (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,3 @@
</span><ins>+
+PASS Muting an incoming audio track 
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomutehtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html                                (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,47 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+&lt;head&gt;
+    &lt;meta charset=&quot;utf-8&quot;&gt;
+    &lt;title&gt;Testing local audio capture playback causes &quot;playing&quot; event to fire&lt;/title&gt;
+    &lt;script src=&quot;../resources/testharness.js&quot;&gt;&lt;/script&gt;
+    &lt;script src=&quot;../resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+&lt;/head&gt;
+&lt;body&gt;
+    &lt;script src =&quot;routines.js&quot;&gt;&lt;/script&gt;
+    &lt;script&gt;
+    promise_test((test) =&gt; {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) =&gt; {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory(&quot;TwoRealPeerConnections&quot;);
+
+            var remoteTrack;
+            var remoteStream;
+            return new Promise((resolve, reject) =&gt; {
+                createConnections((firstConnection) =&gt; {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) =&gt; {
+                    secondConnection.onaddstream = (streamEvent) =&gt; {
+                        remoteStream = streamEvent.stream;
+                        remoteTrack = remoteStream.getAudioTracks()[0];
+                        resolve();
+                    };
+                });
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_true(results.heardHum, &quot;heard hum from remote enabled track&quot;);
+                });
+            }).then(() =&gt; {
+                remoteTrack.enabled = false;
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_false(results.heardHum, &quot;not heard hum from remote disabled track&quot;);
+                });
+            });
+        });
+    }, &quot;Muting an incoming audio track&quot;);
+    &lt;/script&gt;
+&lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2expectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt                                (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,3 @@
</span><ins>+
+PASS Muting and unmuting an incoming audio track 
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2html"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html                                (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,53 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+&lt;head&gt;
+    &lt;meta charset=&quot;utf-8&quot;&gt;
+    &lt;title&gt;Testing local audio capture playback causes &quot;playing&quot; event to fire&lt;/title&gt;
+    &lt;script src=&quot;../resources/testharness.js&quot;&gt;&lt;/script&gt;
+    &lt;script src=&quot;../resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+&lt;/head&gt;
+&lt;body&gt;
+    &lt;script src =&quot;routines.js&quot;&gt;&lt;/script&gt;
+    &lt;script&gt;
+    promise_test((test) =&gt; {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) =&gt; {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory(&quot;TwoRealPeerConnections&quot;);
+
+            var remoteTrack;
+            var remoteStream;
+            return new Promise((resolve, reject) =&gt; {
+                createConnections((firstConnection) =&gt; {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) =&gt; {
+                    secondConnection.onaddstream = (streamEvent) =&gt; {
+                        remoteStream = streamEvent.stream;
+                        remoteTrack = remoteStream.getAudioTracks()[0];
+                        resolve();
+                    };
+                });
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_true(results.heardHum, &quot;heard hum from remote enabled track&quot;);
+                });
+            }).then(() =&gt; {
+                remoteTrack.enabled = false;
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_false(results.heardHum, &quot;not heard hum from remote disabled track&quot;);
+                });
+            }).then(() =&gt; {
+                remoteTrack.enabled = true;
+            }).then(() =&gt; {
+                return analyseAudio(remoteStream, 500).then((results) =&gt; {
+                    assert_true(results.heardHum, &quot;heard hum from remote reenabled track&quot;);
+                });
+            });
+        });
+    }, &quot;Muting and unmuting an incoming audio track&quot;);
+    &lt;/script&gt;
+&lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcvideomuteexpectedtxt"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/video-mute-expected.txt (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-mute-expected.txt        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/video-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,4 +1,4 @@
</span><span class="cx"> 
</span><span class="cx"> 
</span><del>-PASS Video muted/unmuted track 
</del><ins>+PASS Outgoing muted/unmuted video track 
</ins><span class="cx"> 
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcvideomutehtml"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/video-mute.html (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-mute.html        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/video-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -21,10 +21,11 @@
</span><span class="cx">     canvas.height = video.videoHeight;
</span><span class="cx">     canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
</span><span class="cx"> 
</span><del>-    imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
</del><ins>+    imageData = canvas.getContext('2d').getImageData(0, 0, canvas.width, canvas.height);
</ins><span class="cx">     data = imageData.data;
</span><span class="cx">     for (var cptr = 0; cptr &lt; canvas.width * canvas.height; ++cptr) {
</span><del>-        if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
</del><ins>+        // Approximatively black pixels.
+        if (data[4 * cptr] &gt; 10 || data[4 * cptr + 1] &gt; 10 || data[4 * cptr + 2] &gt; 10)
</ins><span class="cx">             return false;
</span><span class="cx">     }
</span><span class="cx">     return true;
</span><span class="lines">@@ -35,35 +36,36 @@
</span><span class="cx">     if (window.testRunner)
</span><span class="cx">         testRunner.setUserMediaPermission(true);
</span><span class="cx"> 
</span><del>-    return navigator.mediaDevices.getUserMedia({ video: true}).then((stream) =&gt; {
</del><ins>+    return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) =&gt; {
</ins><span class="cx">         return new Promise((resolve, reject) =&gt; {
</span><span class="cx">             if (window.internals)
</span><span class="cx">                 internals.useMockRTCPeerConnectionFactory(&quot;TwoRealPeerConnections&quot;);
</span><span class="cx"> 
</span><ins>+            track = localStream.getVideoTracks()[0];
+
</ins><span class="cx">             createConnections((firstConnection) =&gt; {
</span><del>-                firstConnection.addStream(stream);
</del><ins>+                firstConnection.addStream(localStream);
</ins><span class="cx">             }, (secondConnection) =&gt; {
</span><span class="cx">                 secondConnection.onaddstream = (streamEvent) =&gt; { resolve(streamEvent.stream); };
</span><span class="cx">             });
</span><span class="cx">             setTimeout(() =&gt; reject(&quot;Test timed out&quot;), 5000);
</span><span class="cx">         });
</span><del>-    }).then((stream) =&gt; {
-        video.srcObject = stream;
-        track = stream.getVideoTracks()[0];
</del><ins>+    }).then((remoteStream) =&gt; {
+        video.srcObject = remoteStream;
</ins><span class="cx">         return video.play();
</span><span class="cx">     }).then(() =&gt; {
</span><del>-         assert_false(isVideoBlack());
</del><ins>+         assert_false(isVideoBlack(), &quot;track is enabled, video is not black&quot;);
</ins><span class="cx">     }).then(() =&gt; {
</span><span class="cx">         track.enabled = false;
</span><span class="cx">         return waitFor(500);
</span><span class="cx">     }).then(() =&gt; {
</span><del>-        assert_true(isVideoBlack());
</del><ins>+        assert_true(isVideoBlack(), &quot;track is disabled, video is black&quot;);
</ins><span class="cx">         track.enabled = true;
</span><span class="cx">         return waitFor(500);
</span><span class="cx">     }).then(() =&gt; {
</span><del>-        assert_false(isVideoBlack());
</del><ins>+        assert_false(isVideoBlack(), &quot;track is reenabled, video is not black&quot;);
</ins><span class="cx">     });
</span><del>-}, &quot;Video muted/unmuted track&quot;);
</del><ins>+}, &quot;Outgoing muted/unmuted video track&quot;);
</ins><span class="cx">         &lt;/script&gt;
</span><span class="cx">     &lt;/body&gt;
</span><span class="cx"> &lt;/html&gt;
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcvideoremotemuteexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/video-remote-mute-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-remote-mute-expected.txt                                (rev 0)
+++ trunk/LayoutTests/webrtc/video-remote-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,4 @@
</span><ins>+
+
+PASS Incoming muted/unmuted video track 
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcvideoremotemutehtmlfromrev214043trunkLayoutTestswebrtcvideomutehtml"></a>
<div class="copfile"><h4>Copied: trunk/LayoutTests/webrtc/video-remote-mute.html (from rev 214043, trunk/LayoutTests/webrtc/video-mute.html) (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-remote-mute.html                                (rev 0)
+++ trunk/LayoutTests/webrtc/video-remote-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,69 @@
</span><ins>+&lt;!doctype html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;utf-8&quot;&gt;
+        &lt;title&gt;Testing basic video exchange from offerer to receiver&lt;/title&gt;
+        &lt;script src=&quot;../resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;../resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body&gt;
+        &lt;video id=&quot;video&quot; autoplay=&quot;&quot;&gt;&lt;/video&gt;
+        &lt;canvas id=&quot;canvas&quot; width=&quot;640&quot; height=&quot;480&quot;&gt;&lt;/canvas&gt;
+        &lt;script src =&quot;routines.js&quot;&gt;&lt;/script&gt;
+        &lt;script&gt;
+video = document.getElementById(&quot;video&quot;);
+canvas = document.getElementById(&quot;canvas&quot;);
+// FIXME: We should use tracks
+
+function isVideoBlack()
+{
+    canvas.width = video.videoWidth;
+    canvas.height = video.videoHeight;
+    canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
+
+    imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
+    data = imageData.data;
+    for (var cptr = 0; cptr &lt; canvas.width * canvas.height; ++cptr) {
+        if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
+            return false;
+    }
+    return true;
+}
+
+var track;
+promise_test((test) =&gt; {
+    if (window.testRunner)
+        testRunner.setUserMediaPermission(true);
+
+    return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) =&gt; {
+        return new Promise((resolve, reject) =&gt; {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory(&quot;TwoRealPeerConnections&quot;);
+
+            createConnections((firstConnection) =&gt; {
+                firstConnection.addStream(localStream);
+            }, (secondConnection) =&gt; {
+                secondConnection.onaddstream = (streamEvent) =&gt; { resolve(streamEvent.stream); };
+            });
+            setTimeout(() =&gt; reject(&quot;Test timed out&quot;), 5000);
+        });
+    }).then((remoteStream) =&gt; {
+        video.srcObject = remoteStream;
+        track = remoteStream.getVideoTracks()[0];
+        return video.play();
+    }).then(() =&gt; {
+         assert_false(isVideoBlack());
+    }).then(() =&gt; {
+        track.enabled = false;
+        return waitFor(500);
+    }).then(() =&gt; {
+        assert_true(isVideoBlack());
+        track.enabled = true;
+        return waitFor(500);
+    }).then(() =&gt; {
+        assert_false(isVideoBlack());
+    });
+}, &quot;Incoming muted/unmuted video track&quot;);
+        &lt;/script&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/ChangeLog        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,5 +1,34 @@
</span><span class="cx"> 2017-03-16  Youenn Fablet  &lt;youenn@apple.com&gt;
</span><span class="cx"> 
</span><ins>+        Improve WebRTC track enabled support
+        https://bugs.webkit.org/show_bug.cgi?id=169727
+
+        Reviewed by Alex Christensen.
+
+        Tests: webrtc/peer-connection-audio-mute2.html
+               webrtc/peer-connection-remote-audio-mute.html
+               webrtc/video-remote-mute.html
+
+        Making sure muted/disabled sources produce silence/black frames.
+        For outgoing audio/video sources, this should be done by the actual a/v providers.
+        We keep this filtering here until we are sure they implement that.
+
+        * platform/audio/mac/AudioSampleDataSource.mm:
+        (WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
+        Used for outgoing webrtc tracks.
+        * platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
+        (WebCore::MockRealtimeAudioSourceMac::render): Ditto.
+        * platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
+        (WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
+        (WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
+        (WebCore::RealtimeIncomingVideoSource::OnFrame):
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.h:
+        * platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
+        (WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending black frame.
+
+2017-03-16  Youenn Fablet  &lt;youenn@apple.com&gt;
+
</ins><span class="cx">         LibWebRTC outgoing source should be thread safe refcounted
</span><span class="cx">         https://bugs.webkit.org/show_bug.cgi?id=169726
</span><span class="cx"> 
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcemm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -311,6 +311,16 @@
</span><span class="cx">         timeStamp = startFrame;
</span><span class="cx"> 
</span><span class="cx">     startFrame = timeStamp;
</span><ins>+
+    if (m_muted) {
+        AudioSampleBufferList::zeroABL(buffer, sampleCountPerChunk * m_outputDescription-&gt;bytesPerFrame());
+        while (endFrame - startFrame &gt;= sampleCountPerChunk) {
+            consumeFilledBuffer();
+            startFrame += sampleCountPerChunk;
+        }
+        return true;
+    }
+
</ins><span class="cx">     while (endFrame - startFrame &gt;= sampleCountPerChunk) {
</span><span class="cx">         if (m_ringBuffer-&gt;fetch(&amp;buffer, sampleCountPerChunk, startFrame, CARingBuffer::Copy))
</span><span class="cx">             return false;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -147,9 +147,6 @@
</span><span class="cx"> 
</span><span class="cx"> void MockRealtimeAudioSourceMac::render(double delta)
</span><span class="cx"> {
</span><del>-    if (m_muted || !m_enabled)
-        return;
-
</del><span class="cx">     if (!m_audioBufferList)
</span><span class="cx">         reconfigure();
</span><span class="cx"> 
</span><span class="lines">@@ -162,8 +159,11 @@
</span><span class="cx">         uint32_t bipBopCount = std::min(frameCount, bipBopRemain);
</span><span class="cx">         for (auto&amp; audioBuffer : m_audioBufferList-&gt;buffers()) {
</span><span class="cx">             audioBuffer.mDataByteSize = frameCount * m_streamFormat.mBytesPerFrame;
</span><del>-            memcpy(audioBuffer.mData, &amp;m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
-            addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast&lt;float*&gt;(audioBuffer.mData), bipBopCount);
</del><ins>+            if (!m_muted &amp;&amp; m_enabled) {
+                memcpy(audioBuffer.mData, &amp;m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
+                addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast&lt;float*&gt;(audioBuffer.mData), bipBopCount);
+            } else
+                memset(audioBuffer.mData, 0, sizeof(Float32) * bipBopCount);
</ins><span class="cx">         }
</span><span class="cx">         emitSampleBuffers(bipBopCount);
</span><span class="cx">         m_samplesRendered += bipBopCount;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeIncomingAudioSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -95,13 +95,17 @@
</span><span class="cx">             m_audioSourceProvider-&gt;prepare(&amp;m_streamFormat);
</span><span class="cx">     }
</span><span class="cx"> 
</span><ins>+    // FIXME: We should not need to do the extra memory allocation and copy.
+    // Instead, we should be able to directly pass audioData pointer.
</ins><span class="cx">     WebAudioBufferList audioBufferList { CAAudioStreamDescription(m_streamFormat), WTF::safeCast&lt;uint32_t&gt;(numberOfFrames) };
</span><span class="cx">     audioBufferList.buffer(0)-&gt;mDataByteSize = numberOfChannels * numberOfFrames * bitsPerSample / 8;
</span><span class="cx">     audioBufferList.buffer(0)-&gt;mNumberChannels = numberOfChannels;
</span><del>-    // FIXME: We should not need to do the extra memory allocation and copy.
-    // Instead, we should be able to directly pass audioData pointer.
-    memcpy(audioBufferList.buffer(0)-&gt;mData, audioData, audioBufferList.buffer(0)-&gt;mDataByteSize);
</del><span class="cx"> 
</span><ins>+    if (muted() || !enabled())
+        memset(audioBufferList.buffer(0)-&gt;mData, 0, audioBufferList.buffer(0)-&gt;mDataByteSize);
+    else
+        memcpy(audioBufferList.buffer(0)-&gt;mData, audioData, audioBufferList.buffer(0)-&gt;mDataByteSize);
+
</ins><span class="cx">     audioSamplesAvailable(mediaTime, audioBufferList, CAAudioStreamDescription(m_streamFormat), numberOfFrames);
</span><span class="cx"> }
</span><span class="cx"> 
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -92,13 +92,40 @@
</span><span class="cx">         m_videoTrack-&gt;RemoveSink(this);
</span><span class="cx"> }
</span><span class="cx"> 
</span><ins>+CVPixelBufferRef RealtimeIncomingVideoSource::pixelBufferFromVideoFrame(const webrtc::VideoFrame&amp; frame)
+{
+    if (muted() || !enabled()) {
+        if (!m_blackFrame || m_blackFrameWidth != frame.width() || m_blackFrameHeight != frame.height()) {
+            CVPixelBufferRef pixelBuffer = nullptr;
+            auto status = CVPixelBufferCreate(kCFAllocatorDefault, frame.width(), frame.height(), kCVPixelFormatType_420YpCbCr8Planar, nullptr, &amp;pixelBuffer);
+            ASSERT_UNUSED(status, status == noErr);
+
+            m_blackFrame = pixelBuffer;
+            m_blackFrameWidth = frame.width();
+            m_blackFrameHeight = frame.height();
+
+            status = CVPixelBufferLockBaseAddress(pixelBuffer, 0);
+            ASSERT(status == noErr);
+            void* data = CVPixelBufferGetBaseAddress(pixelBuffer);
+            size_t yLength = frame.width() * frame.height();
+            memset(data, 0, yLength);
+            memset(static_cast&lt;uint8_t*&gt;(data) + yLength, 128, yLength / 2);
+
+            status = CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
+            ASSERT(!status);
+        }
+        return m_blackFrame.get();
+    }
+    auto buffer = frame.video_frame_buffer();
+    return static_cast&lt;CVPixelBufferRef&gt;(buffer-&gt;native_handle());
+}
+
</ins><span class="cx"> void RealtimeIncomingVideoSource::OnFrame(const webrtc::VideoFrame&amp; frame)
</span><span class="cx"> {
</span><span class="cx">     if (!m_isProducingData)
</span><span class="cx">         return;
</span><span class="cx"> 
</span><del>-    auto buffer = frame.video_frame_buffer();
-    CVPixelBufferRef pixelBuffer = static_cast&lt;CVPixelBufferRef&gt;(buffer-&gt;native_handle());
</del><ins>+    auto pixelBuffer = pixelBufferFromVideoFrame(frame);
</ins><span class="cx"> 
</span><span class="cx">     // FIXME: Convert timing information from VideoFrame to CMSampleTimingInfo.
</span><span class="cx">     // For the moment, we will pretend that frames should be rendered asap.
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -70,6 +70,8 @@
</span><span class="cx">     // rtc::VideoSinkInterface
</span><span class="cx">     void OnFrame(const webrtc::VideoFrame&amp;) final;
</span><span class="cx"> 
</span><ins>+    CVPixelBufferRef pixelBufferFromVideoFrame(const webrtc::VideoFrame&amp;);
+
</ins><span class="cx">     RefPtr&lt;Image&gt; m_currentImage;
</span><span class="cx">     RealtimeMediaSourceSettings m_currentSettings;
</span><span class="cx">     RealtimeMediaSourceSupportedConstraints m_supportedConstraints;
</span><span class="lines">@@ -79,6 +81,9 @@
</span><span class="cx">     rtc::scoped_refptr&lt;webrtc::VideoTrackInterface&gt; m_videoTrack;
</span><span class="cx">     RetainPtr&lt;CMSampleBufferRef&gt; m_buffer;
</span><span class="cx">     PixelBufferConformerCV m_conformer;
</span><ins>+    RetainPtr&lt;CVPixelBufferRef&gt; m_blackFrame;
+    int m_blackFrameWidth { 0 };
+    int m_blackFrameHeight { 0 };
</ins><span class="cx"> };
</span><span class="cx"> 
</span><span class="cx"> } // namespace WebCore
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -92,6 +92,7 @@
</span><span class="cx">         auto blackBuffer = m_bufferPool.CreateBuffer(settings.width(), settings.height());
</span><span class="cx">         blackBuffer-&gt;SetToBlack();
</span><span class="cx">         sendFrame(WTFMove(blackBuffer));
</span><ins>+        return;
</ins><span class="cx">     }
</span><span class="cx"> 
</span><span class="cx">     ASSERT(sample.platformSample().type == PlatformSample::CMSampleBufferType);
</span></span></pre>
</div>
</div>

</body>
</html>