<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[214044] trunk</title>
</head>
<body>
<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; }
#msg dl a { font-weight: bold}
#msg dl a:link { color:#fc3; }
#msg dl a:active { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/214044">214044</a></dd>
<dt>Author</dt> <dd>commit-queue@webkit.org</dd>
<dt>Date</dt> <dd>2017-03-16 09:09:50 -0700 (Thu, 16 Mar 2017)</dd>
</dl>
<h3>Log Message</h3>
<pre>Improve WebRTC track enabled support
https://bugs.webkit.org/show_bug.cgi?id=169727

Patch by Youenn Fablet &lt;youenn@apple.com&gt; on 2017-03-16
Reviewed by Alex Christensen.

Source/WebCore:

Tests: webrtc/peer-connection-audio-mute2.html
       webrtc/peer-connection-remote-audio-mute.html
       webrtc/video-remote-mute.html

Making sure muted/disabled sources produce silence/black frames.
For outgoing audio/video sources, this should be done by the actual a/v providers.
We keep this filtering here until we are sure they implement that.

* platform/audio/mac/AudioSampleDataSource.mm:
(WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
Used for outgoing webrtc tracks.
* platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
(WebCore::MockRealtimeAudioSourceMac::render): Ditto.
* platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
(WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
* platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
(WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
(WebCore::RealtimeIncomingVideoSource::OnFrame):
* platform/mediastream/mac/RealtimeIncomingVideoSource.h:
* platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
(WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending black frame.

LayoutTests:

* TestExpectations:
* webrtc/audio-peer-connection-webaudio.html:
* webrtc/peer-connection-audio-mute-expected.txt:
* webrtc/peer-connection-audio-mute.html:
* webrtc/peer-connection-audio-mute2-expected.txt: Added.
* webrtc/peer-connection-audio-mute2.html: Added.
* webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
* webrtc/peer-connection-remote-audio-mute.html: Added.
* webrtc/video-mute-expected.txt:
* webrtc/video-mute.html:
* webrtc/video-remote-mute-expected.txt: Added.
* webrtc/video-remote-mute.html: Added.</pre>
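The core idea of the log message above is that a disabled track must yield silence (audio) or black frames (video). As a rough standalone illustration, assuming plain sample/RGBA arrays and hypothetical helper names (this is not WebCore code):

```javascript
// Sketch of the filtering idea from this patch (hypothetical helpers,
// not WebCore APIs): when a track is disabled, audio samples become
// silence and video pixels become black.

function filterAudioChunk(samples, trackEnabled) {
  // Disabled tracks must produce silence: zero out every sample.
  return trackEnabled ? samples : samples.map(() => 0);
}

function filterVideoFrame(pixels, trackEnabled) {
  // Disabled tracks must produce black frames: RGBA pixels with
  // zeroed color channels and opaque alpha.
  if (trackEnabled)
    return pixels;
  return pixels.map((value, i) => (i % 4 === 3 ? 255 : 0));
}
```

In the patch itself the equivalent zeroing happens in WebCore (for example in AudioSampleDataSource::pullAvalaibleSamplesAsChunks and RealtimeIncomingVideoSource::pixelBufferFromVideoFrame), as listed in the log above.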
<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkLayoutTestsChangeLog">trunk/LayoutTests/ChangeLog</a></li>
<li><a href="#trunkLayoutTestsTestExpectations">trunk/LayoutTests/TestExpectations</a></li>
<li><a href="#trunkLayoutTestswebrtcaudiopeerconnectionwebaudiohtml">trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomuteexpectedtxt">trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomutehtml">trunk/LayoutTests/webrtc/peer-connection-audio-mute.html</a></li>
<li><a href="#trunkLayoutTestswebrtcvideomuteexpectedtxt">trunk/LayoutTests/webrtc/video-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcvideomutehtml">trunk/LayoutTests/webrtc/video-mute.html</a></li>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcemm">trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeIncomingAudioSourcecpp">trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourcecpp">trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourceh">trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourcecpp">trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp</a></li>
</ul>
<h3>Added Paths</h3>
<ul>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomute2expectedtxt">trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionaudiomute2html">trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomuteexpectedtxt">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomutehtml">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2expectedtxt">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2html">trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html</a></li>
<li><a href="#trunkLayoutTestswebrtcvideoremotemuteexpectedtxt">trunk/LayoutTests/webrtc/video-remote-mute-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebrtcvideoremotemutehtml">trunk/LayoutTests/webrtc/video-remote-mute.html</a></li>
</ul>
</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkLayoutTestsChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/ChangeLog (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/ChangeLog        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/ChangeLog        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,3 +1,23 @@
</span><ins>+2017-03-16 Youenn Fablet &lt;youenn@apple.com&gt;
+
+ Improve WebRTC track enabled support
+ https://bugs.webkit.org/show_bug.cgi?id=169727
+
+ Reviewed by Alex Christensen.
+
+ * TestExpectations:
+ * webrtc/audio-peer-connection-webaudio.html:
+ * webrtc/peer-connection-audio-mute-expected.txt:
+ * webrtc/peer-connection-audio-mute.html:
+ * webrtc/peer-connection-audio-mute2-expected.txt: Added.
+ * webrtc/peer-connection-audio-mute2.html: Added.
+ * webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
+ * webrtc/peer-connection-remote-audio-mute.html: Added.
+ * webrtc/video-mute-expected.txt:
+ * webrtc/video-mute.html:
+ * webrtc/video-remote-mute-expected.txt: Added.
+ * webrtc/video-remote-mute.html: Added.
+
</ins><span class="cx"> 2017-03-16 Manuel Rego Casasnovas <rego@igalia.com>
</span><span class="cx">
</span><span class="cx"> [css-grid] Crash on debug removing a positioned child
</span></span></pre></div>
<a id="trunkLayoutTestsTestExpectations"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/TestExpectations (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/TestExpectations        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/TestExpectations        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -711,7 +711,10 @@
</span><span class="cx"> # GTK enables some of these tests on their TestExpectations file.
</span><span class="cx"> [ Release ] webrtc [ Skip ]
</span><span class="cx">
</span><del>-[ Debug ] webrtc/audio-peer-connection-webaudio.html [ Failure ]
</del><ins>+[ Debug ] webrtc/peer-connection-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-audio-mute2.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute2.html [ Pass Failure ]
</ins><span class="cx"> fast/mediastream/getUserMedia-webaudio.html [ Skip ]
</span><span class="cx"> fast/mediastream/RTCPeerConnection-AddRemoveStream.html [ Skip ]
</span><span class="cx"> fast/mediastream/RTCPeerConnection-closed-state.html [ Skip ]
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcaudiopeerconnectionwebaudiohtml"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -11,7 +11,7 @@
</span><span class="cx"> if (window.testRunner)
</span><span class="cx"> testRunner.setUserMediaPermission(true);
</span><span class="cx">
</span><del>- return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
</del><ins>+ return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
</ins><span class="cx"> if (window.internals)
</span><span class="cx"> internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
</span><span class="cx"> return new Promise((resolve, reject) => {
</span><span class="lines">@@ -21,14 +21,14 @@
</span><span class="cx"> secondConnection.onaddstream = (streamEvent) => { resolve(streamEvent.stream); };
</span><span class="cx"> });
</span><span class="cx"> setTimeout(() => reject("Test timed out"), 5000);
</span><del>- }).then((stream) => {
- return analyseAudio(stream, 1000);
- }).then((results) => {
- assert_true(results.heardHum, "heard hum");
- assert_true(results.heardBip, "heard bip");
- assert_true(results.heardBop, "heard bop");
</del><span class="cx"> });
</span><del>- });
</del><ins>+ }).then((remoteStream) => {
+ return analyseAudio(remoteStream, 1000);
+ }).then((results) => {
+ assert_true(results.heardHum, "heard hum");
+ assert_true(results.heardBip, "heard bip");
+ assert_true(results.heardBop, "heard bop");
+ });
</ins><span class="cx"> }, "Basic audio playback through a peer connection");
</span><span class="cx"> </script>
</span><span class="cx"> </head>
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomuteexpectedtxt"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,3 +1,3 @@
</span><span class="cx">
</span><del>-FAIL Muting and unmuting an audio track assert_true: heard hum expected true got false
</del><ins>+PASS Muting a local audio track and making sure the remote track is silent
</ins><span class="cx">
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomutehtml"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/peer-connection-audio-mute.html (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute.html        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -13,17 +13,19 @@
</span><span class="cx"> if (window.testRunner)
</span><span class="cx"> testRunner.setUserMediaPermission(true);
</span><span class="cx">
</span><del>- return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
</del><ins>+ var localTrack;
+ return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
</ins><span class="cx"> if (window.internals)
</span><span class="cx"> internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
</span><span class="cx">
</span><del>- var stream;
</del><ins>+ localTrack = localStream.getAudioTracks()[0];
+ var remoteStream;
</ins><span class="cx"> return new Promise((resolve, reject) => {
</span><span class="cx"> createConnections((firstConnection) => {
</span><del>- firstConnection.addStream(stream);
</del><ins>+ firstConnection.addStream(localStream);
</ins><span class="cx"> }, (secondConnection) => {
</span><span class="cx"> secondConnection.onaddstream = (streamEvent) => {
</span><del>- stream = streamEvent.stream;
</del><ins>+ remoteStream = streamEvent.stream;
</ins><span class="cx"> resolve();
</span><span class="cx"> };
</span><span class="cx"> });
</span><span class="lines">@@ -30,36 +32,19 @@
</span><span class="cx"> }).then(() => {
</span><span class="cx"> return waitFor(500);
</span><span class="cx"> }).then(() => {
</span><del>- return analyseAudio(stream, 500).then((results) => {
- assert_true(results.heardHum, "heard hum");
- assert_true(results.heardBip, "heard bip");
- assert_true(results.heardBop, "heard bop");
</del><ins>+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_true(results.heardHum, "heard hum from remote enabled track");
</ins><span class="cx"> });
</span><span class="cx"> }).then(() => {
</span><del>- stream.getAudioTracks().forEach((track) => {
- track.enabled = false;
- });
</del><ins>+ localTrack.enabled = false;
</ins><span class="cx"> return waitFor(500);
</span><span class="cx"> }).then(() => {
</span><del>- return analyseAudio(stream, 500).then((results) => {
- assert_false(results.heardHum, "heard hum");
- assert_false(results.heardBip, "heard bip");
- assert_false(results.heardBop, "heard bop");
</del><ins>+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_false(results.heardHum, "not heard hum from remote disabled track");
</ins><span class="cx"> });
</span><del>- }).then(() => {
- stream.getAudioTracks().forEach((track) => {
- track.enabled = true;
- });
- return waitFor(500);
- }).then(() => {
- return analyseAudio(stream, 500).then((results) => {
- assert_true(results.heardHum, "heard hum");
- assert_true(results.heardBip, "heard bip");
- assert_true(results.heardBop, "heard bop");
- });
</del><span class="cx"> });
</span><span class="cx"> });
</span><del>- }, "Muting and unmuting an audio track");
</del><ins>+ }, "Muting a local audio track and making sure the remote track is silent");
</ins><span class="cx"> </script>
</span><span class="cx"> </body>
</span><span class="cx"> </html>
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomute2expectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt         (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,3 @@
</span><ins>+
+PASS Muting and unmuting a local audio track
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionaudiomute2htmlfromrev214043trunkLayoutTestswebrtcpeerconnectionaudiomutehtml"></a>
<div class="copfile"><h4>Copied: trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html (from rev 214043, trunk/LayoutTests/webrtc/peer-connection-audio-mute.html) (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html         (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,57 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+&lt;head&gt;
+ &lt;meta charset="utf-8"&gt;
+ &lt;title&gt;Testing local audio capture playback causes "playing" event to fire&lt;/title&gt;
+ &lt;script src="../resources/testharness.js"&gt;&lt;/script&gt;
+ &lt;script src="../resources/testharnessreport.js"&gt;&lt;/script&gt;
+&lt;/head&gt;
+&lt;body&gt;
+ &lt;script src ="routines.js"&gt;&lt;/script&gt;
+ &lt;script&gt;
+ promise_test((test) => {
+ if (window.testRunner)
+ testRunner.setUserMediaPermission(true);
+
+ var localTrack;
+ return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+ if (window.internals)
+ internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+ localTrack = localStream.getAudioTracks()[0];
+ var remoteStream;
+ return new Promise((resolve, reject) => {
+ createConnections((firstConnection) => {
+ firstConnection.addStream(localStream);
+ }, (secondConnection) => {
+ secondConnection.onaddstream = (streamEvent) => {
+ remoteStream = streamEvent.stream;
+ resolve();
+ };
+ });
+ }).then(() => {
+ return waitFor(500);
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_true(results.heardHum, "heard hum from remote enabled track");
+ });
+ }).then(() => {
+ localTrack.enabled = false;
+ return waitFor(500);
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_false(results.heardHum, "not heard hum from remote disabled track");
+ });
+ }).then(() => {
+ localTrack.enabled = true;
+ return waitFor(500);
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_true(results.heardHum, "heard hum from remote reenabled track");
+ });
+ });
+ });
+ }, "Muting and unmuting a local audio track");
+ &lt;/script&gt;
+&lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomuteexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt         (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,3 @@
</span><ins>+
+PASS Muting an incoming audio track
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomutehtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html         (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,47 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+&lt;head&gt;
+ &lt;meta charset="utf-8"&gt;
+ &lt;title&gt;Testing local audio capture playback causes "playing" event to fire&lt;/title&gt;
+ &lt;script src="../resources/testharness.js"&gt;&lt;/script&gt;
+ &lt;script src="../resources/testharnessreport.js"&gt;&lt;/script&gt;
+&lt;/head&gt;
+&lt;body&gt;
+ &lt;script src ="routines.js"&gt;&lt;/script&gt;
+ &lt;script&gt;
+ promise_test((test) => {
+ if (window.testRunner)
+ testRunner.setUserMediaPermission(true);
+
+ return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+ if (window.internals)
+ internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+ var remoteTrack;
+ var remoteStream;
+ return new Promise((resolve, reject) => {
+ createConnections((firstConnection) => {
+ firstConnection.addStream(localStream);
+ }, (secondConnection) => {
+ secondConnection.onaddstream = (streamEvent) => {
+ remoteStream = streamEvent.stream;
+ remoteTrack = remoteStream.getAudioTracks()[0];
+ resolve();
+ };
+ });
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_true(results.heardHum, "heard hum from remote enabled track");
+ });
+ }).then(() => {
+ remoteTrack.enabled = false;
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_false(results.heardHum, "not heard hum from remote disabled track");
+ });
+ });
+ });
+ }, "Muting an incoming audio track");
+ &lt;/script&gt;
+&lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2expectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt         (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,3 @@
</span><ins>+
+PASS Muting and unmuting an incoming audio track
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcpeerconnectionremoteaudiomute2html"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html         (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,53 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+&lt;head&gt;
+ &lt;meta charset="utf-8"&gt;
+ &lt;title&gt;Testing local audio capture playback causes "playing" event to fire&lt;/title&gt;
+ &lt;script src="../resources/testharness.js"&gt;&lt;/script&gt;
+ &lt;script src="../resources/testharnessreport.js"&gt;&lt;/script&gt;
+&lt;/head&gt;
+&lt;body&gt;
+ &lt;script src ="routines.js"&gt;&lt;/script&gt;
+ &lt;script&gt;
+ promise_test((test) => {
+ if (window.testRunner)
+ testRunner.setUserMediaPermission(true);
+
+ return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+ if (window.internals)
+ internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+ var remoteTrack;
+ var remoteStream;
+ return new Promise((resolve, reject) => {
+ createConnections((firstConnection) => {
+ firstConnection.addStream(localStream);
+ }, (secondConnection) => {
+ secondConnection.onaddstream = (streamEvent) => {
+ remoteStream = streamEvent.stream;
+ remoteTrack = remoteStream.getAudioTracks()[0];
+ resolve();
+ };
+ });
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_true(results.heardHum, "heard hum from remote enabled track");
+ });
+ }).then(() => {
+ remoteTrack.enabled = false;
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_false(results.heardHum, "not heard hum from remote disabled track");
+ });
+ }).then(() => {
+ remoteTrack.enabled = true;
+ }).then(() => {
+ return analyseAudio(remoteStream, 500).then((results) => {
+ assert_true(results.heardHum, "heard hum from remote reenabled track");
+ });
+ });
+ });
+ }, "Muting and unmuting an incoming audio track");
+ &lt;/script&gt;
+&lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcvideomuteexpectedtxt"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/video-mute-expected.txt (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-mute-expected.txt        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/video-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,4 +1,4 @@
</span><span class="cx">
</span><span class="cx">
</span><del>-PASS Video muted/unmuted track
</del><ins>+PASS Outgoing muted/unmuted video track
</ins><span class="cx">
</span></span></pre></div>
<a id="trunkLayoutTestswebrtcvideomutehtml"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/webrtc/video-mute.html (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-mute.html        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/video-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -21,10 +21,11 @@
</span><span class="cx"> canvas.height = video.videoHeight;
</span><span class="cx"> canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
</span><span class="cx">
</span><del>- imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
</del><ins>+ imageData = canvas.getContext('2d').getImageData(0, 0, canvas.width, canvas.height);
</ins><span class="cx"> data = imageData.data;
</span><span class="cx"> for (var cptr = 0; cptr < canvas.width * canvas.height; ++cptr) {
</span><del>- if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
</del><ins>+ // Approximatively black pixels.
+ if (data[4 * cptr] > 10 || data[4 * cptr + 1] > 10 || data[4 * cptr + 2] > 10)
</ins><span class="cx"> return false;
</span><span class="cx"> }
</span><span class="cx"> return true;
</span><span class="lines">@@ -35,35 +36,36 @@
</span><span class="cx"> if (window.testRunner)
</span><span class="cx"> testRunner.setUserMediaPermission(true);
</span><span class="cx">
</span><del>- return navigator.mediaDevices.getUserMedia({ video: true}).then((stream) => {
</del><ins>+ return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) => {
</ins><span class="cx"> return new Promise((resolve, reject) => {
</span><span class="cx"> if (window.internals)
</span><span class="cx"> internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
</span><span class="cx">
</span><ins>+ track = localStream.getVideoTracks()[0];
+
</ins><span class="cx"> createConnections((firstConnection) => {
</span><del>- firstConnection.addStream(stream);
</del><ins>+ firstConnection.addStream(localStream);
</ins><span class="cx"> }, (secondConnection) => {
</span><span class="cx"> secondConnection.onaddstream = (streamEvent) => { resolve(streamEvent.stream); };
</span><span class="cx"> });
</span><span class="cx"> setTimeout(() => reject("Test timed out"), 5000);
</span><span class="cx"> });
</span><del>- }).then((stream) => {
- video.srcObject = stream;
- track = stream.getVideoTracks()[0];
</del><ins>+ }).then((remoteStream) => {
+ video.srcObject = remoteStream;
</ins><span class="cx"> return video.play();
</span><span class="cx"> }).then(() => {
</span><del>- assert_false(isVideoBlack());
</del><ins>+ assert_false(isVideoBlack(), "track is enabled, video is not black");
</ins><span class="cx"> }).then(() => {
</span><span class="cx"> track.enabled = false;
</span><span class="cx"> return waitFor(500);
</span><span class="cx"> }).then(() => {
</span><del>- assert_true(isVideoBlack());
</del><ins>+ assert_true(isVideoBlack(), "track is disabled, video is black");
</ins><span class="cx"> track.enabled = true;
</span><span class="cx"> return waitFor(500);
</span><span class="cx"> }).then(() => {
</span><del>- assert_false(isVideoBlack());
</del><ins>+ assert_false(isVideoBlack(), "track is reenabled, video is not black");
</ins><span class="cx"> });
</span><del>-}, "Video muted/unmuted track");
</del><ins>+}, "Outgoing muted/unmuted video track");
</ins><span class="cx"> </script>
</span><span class="cx"> </body>
</span><span class="cx"> </html>
</span></span></pre></div>
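The video-mute.html diff above loosens the black-frame check: instead of requiring exactly zero RGB values on one sampled row, it scans the whole frame and treats channel values of 10 or less as black, since decoded frames are rarely perfectly zero. A standalone sketch of that thresholded check (hypothetical function name, plain array standing in for canvas ImageData.data):

```javascript
// Sketch of the "approximately black" check used by the updated test:
// a frame counts as black if every pixel's R, G, and B channels are <= 10.
// `data` is a flat RGBA byte array, 4 entries per pixel, as returned by
// CanvasRenderingContext2D.getImageData(...).data.
function isFrameBlack(data) {
  for (let i = 0; i < data.length; i += 4) {
    if (data[i] > 10 || data[i + 1] > 10 || data[i + 2] > 10)
      return false; // a visibly non-black pixel
  }
  return true;
}
```

The test's isVideoBlack() applies the same comparison after drawing the video element onto a canvas.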
<a id="trunkLayoutTestswebrtcvideoremotemuteexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webrtc/video-remote-mute-expected.txt (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-remote-mute-expected.txt         (rev 0)
+++ trunk/LayoutTests/webrtc/video-remote-mute-expected.txt        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,4 @@
</span><ins>+
+
+PASS Incoming muted/unmuted video track
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebrtcvideoremotemutehtmlfromrev214043trunkLayoutTestswebrtcvideomutehtml"></a>
<div class="copfile"><h4>Copied: trunk/LayoutTests/webrtc/video-remote-mute.html (from rev 214043, trunk/LayoutTests/webrtc/video-mute.html) (0 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webrtc/video-remote-mute.html         (rev 0)
+++ trunk/LayoutTests/webrtc/video-remote-mute.html        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -0,0 +1,69 @@
</span><ins>+&lt;!doctype html&gt;
+&lt;html&gt;
+ &lt;head&gt;
+ &lt;meta charset="utf-8"&gt;
+ &lt;title&gt;Testing basic video exchange from offerer to receiver&lt;/title&gt;
+ &lt;script src="../resources/testharness.js"&gt;&lt;/script&gt;
+ &lt;script src="../resources/testharnessreport.js"&gt;&lt;/script&gt;
+ &lt;/head&gt;
+ &lt;body&gt;
+ &lt;video id="video" autoplay=""&gt;&lt;/video&gt;
+ &lt;canvas id="canvas" width="640" height="480"&gt;&lt;/canvas&gt;
+ &lt;script src ="routines.js"&gt;&lt;/script&gt;
+ &lt;script&gt;
+video = document.getElementById("video");
+canvas = document.getElementById("canvas");
+// FIXME: We should use tracks
+
+function isVideoBlack()
+{
+ canvas.width = video.videoWidth;
+ canvas.height = video.videoHeight;
+ canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
+
+ imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
+ data = imageData.data;
+ for (var cptr = 0; cptr &lt; canvas.width * canvas.height; ++cptr) {
+ if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
+ return false;
+ }
+ return true;
+}
+
+var track;
+promise_test((test) => {
+ if (window.testRunner)
+ testRunner.setUserMediaPermission(true);
+
+ return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) => {
+ return new Promise((resolve, reject) => {
+ if (window.internals)
+ internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+ createConnections((firstConnection) => {
+ firstConnection.addStream(localStream);
+ }, (secondConnection) => {
+ secondConnection.onaddstream = (streamEvent) => { resolve(streamEvent.stream); };
+ });
+ setTimeout(() => reject("Test timed out"), 5000);
+ });
+ }).then((remoteStream) => {
+ video.srcObject = remoteStream;
+ track = remoteStream.getVideoTracks()[0];
+ return video.play();
+ }).then(() => {
+ assert_false(isVideoBlack());
+ }).then(() => {
+ track.enabled = false;
+ return waitFor(500);
+ }).then(() => {
+ assert_true(isVideoBlack());
+ track.enabled = true;
+ return waitFor(500);
+ }).then(() => {
+ assert_false(isVideoBlack());
+ });
+}, "Incoming muted/unmuted video track");
+ &lt;/script&gt;
+ &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
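The layout test above decides whether the incoming track is rendering by drawing the video into a canvas and checking a sampled scanline for any non-zero RGB byte. A minimal C++ sketch of that predicate (`isRowBlack` is a hypothetical name; the real check is the test's JavaScript `isVideoBlack()`):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical helper mirroring the layout test's isVideoBlack() check:
// a run of RGBA pixels counts as black when every R, G and B byte is zero
// (the alpha channel is ignored, exactly as in the test's loop).
bool isRowBlack(const std::vector<uint8_t>& rgba)
{
    for (std::size_t i = 0; i + 3 < rgba.size(); i += 4) {
        if (rgba[i] || rgba[i + 1] || rgba[i + 2])
            return false;
    }
    return true;
}
```

Note that the committed test loops up to `canvas.width * canvas.height` even though `getImageData(10, 325, 250, 1)` only returned a 250&#215;1 strip; out-of-range reads of the `data` array yield `undefined`, which is falsy, so the predicate still behaves, but the effective bound is `data.length / 4`.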
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/ChangeLog        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -1,5 +1,34 @@
</span><span class="cx"> 2017-03-16 Youenn Fablet &lt;youenn@apple.com&gt;
</span><span class="cx">
</span><ins>+ Improve WebRTC track enabled support
+ https://bugs.webkit.org/show_bug.cgi?id=169727
+
+ Reviewed by Alex Christensen.
+
+ Tests: webrtc/peer-connection-audio-mute2.html
+ webrtc/peer-connection-remote-audio-mute.html
+ webrtc/video-remote-mute.html
+
+ Making sure muted/disabled sources produce silence/black frames.
+ For outgoing audio/video sources, this should be done by the actual a/v providers.
+ We keep this filtering here until we are sure they implement that.
+
+ * platform/audio/mac/AudioSampleDataSource.mm:
+ (WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
+ Used for outgoing webrtc tracks.
+ * platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
+ (WebCore::MockRealtimeAudioSourceMac::render): Ditto.
+ * platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
+ (WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
+ * platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
+ (WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
+ (WebCore::RealtimeIncomingVideoSource::OnFrame):
+ * platform/mediastream/mac/RealtimeIncomingVideoSource.h:
+ * platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
+ (WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending black frame.
+
+2017-03-16 Youenn Fablet <youenn@apple.com>
+
</ins><span class="cx"> LibWebRTC outgoing source should be thread safe refcounted
</span><span class="cx"> https://bugs.webkit.org/show_bug.cgi?id=169726
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcemm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -311,6 +311,16 @@
</span><span class="cx"> timeStamp = startFrame;
</span><span class="cx">
</span><span class="cx"> startFrame = timeStamp;
</span><ins>+
+ if (m_muted) {
+ AudioSampleBufferList::zeroABL(buffer, sampleCountPerChunk * m_outputDescription->bytesPerFrame());
+ while (endFrame - startFrame >= sampleCountPerChunk) {
+ consumeFilledBuffer();
+ startFrame += sampleCountPerChunk;
+ }
+ return true;
+ }
+
</ins><span class="cx"> while (endFrame - startFrame >= sampleCountPerChunk) {
</span><span class="cx"> if (m_ringBuffer->fetch(&buffer, sampleCountPerChunk, startFrame, CARingBuffer::Copy))
</span><span class="cx"> return false;
</span></span></pre></div>
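The muted branch added to `pullAvalaibleSamplesAsChunks()` does two things: it zeroes the output buffer once, and it still walks the read position forward chunk by chunk so the consumer's timeline keeps advancing while receiving silence. A compilable sketch of that control flow, with hypothetical stand-ins (`SilentPuller`, `pullChunks`, `chunksConsumed`) for the WebCore types:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Sketch of the muted fast path added to AudioSampleDataSource::
// pullAvalaibleSamplesAsChunks(). All names here are hypothetical
// stand-ins; only the shape of the logic follows the diff above.
struct SilentPuller {
    bool muted { false };
    uint64_t startFrame { 0 };
    int chunksConsumed { 0 };   // stands in for consumeFilledBuffer() calls

    bool pullChunks(std::vector<float>& buffer, uint64_t endFrame, uint64_t chunkSize)
    {
        if (!muted)
            return false;       // unmuted path (ring-buffer fetch) elided

        // Zero the destination once, then keep consuming whole chunks so
        // the read position advances exactly as it would when unmuted.
        std::memset(buffer.data(), 0, buffer.size() * sizeof(float));
        while (endFrame - startFrame >= chunkSize) {
            ++chunksConsumed;
            startFrame += chunkSize;
        }
        return true;
    }
};
```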
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -147,9 +147,6 @@
</span><span class="cx">
</span><span class="cx"> void MockRealtimeAudioSourceMac::render(double delta)
</span><span class="cx"> {
</span><del>- if (m_muted || !m_enabled)
- return;
-
</del><span class="cx"> if (!m_audioBufferList)
</span><span class="cx"> reconfigure();
</span><span class="cx">
</span><span class="lines">@@ -162,8 +159,11 @@
</span><span class="cx"> uint32_t bipBopCount = std::min(frameCount, bipBopRemain);
</span><span class="cx"> for (auto& audioBuffer : m_audioBufferList->buffers()) {
</span><span class="cx"> audioBuffer.mDataByteSize = frameCount * m_streamFormat.mBytesPerFrame;
</span><del>- memcpy(audioBuffer.mData, &m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
- addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast&lt;float*&gt;(audioBuffer.mData), bipBopCount);
</del><ins>+ if (!m_muted && m_enabled) {
+ memcpy(audioBuffer.mData, &m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
+ addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast&lt;float*&gt;(audioBuffer.mData), bipBopCount);
+ } else
+ memset(audioBuffer.mData, 0, sizeof(Float32) * bipBopCount);
</ins><span class="cx"> }
</span><span class="cx"> emitSampleBuffers(bipBopCount);
</span><span class="cx"> m_samplesRendered += bipBopCount;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeIncomingAudioSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -95,13 +95,17 @@
</span><span class="cx"> m_audioSourceProvider->prepare(&m_streamFormat);
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+ // FIXME: We should not need to do the extra memory allocation and copy.
+ // Instead, we should be able to directly pass audioData pointer.
</ins><span class="cx"> WebAudioBufferList audioBufferList { CAAudioStreamDescription(m_streamFormat), WTF::safeCast&lt;uint32_t&gt;(numberOfFrames) };
</span><span class="cx"> audioBufferList.buffer(0)->mDataByteSize = numberOfChannels * numberOfFrames * bitsPerSample / 8;
</span><span class="cx"> audioBufferList.buffer(0)->mNumberChannels = numberOfChannels;
</span><del>- // FIXME: We should not need to do the extra memory allocation and copy.
- // Instead, we should be able to directly pass audioData pointer.
- memcpy(audioBufferList.buffer(0)->mData, audioData, audioBufferList.buffer(0)->mDataByteSize);
</del><span class="cx">
</span><ins>+ if (muted() || !enabled())
+ memset(audioBufferList.buffer(0)->mData, 0, audioBufferList.buffer(0)->mDataByteSize);
+ else
+ memcpy(audioBufferList.buffer(0)->mData, audioData, audioBufferList.buffer(0)->mDataByteSize);
+
</ins><span class="cx"> audioSamplesAvailable(mediaTime, audioBufferList, CAAudioStreamDescription(m_streamFormat), numberOfFrames);
</span><span class="cx"> }
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -92,13 +92,40 @@
</span><span class="cx"> m_videoTrack->RemoveSink(this);
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+CVPixelBufferRef RealtimeIncomingVideoSource::pixelBufferFromVideoFrame(const webrtc::VideoFrame& frame)
+{
+ if (muted() || !enabled()) {
+ if (!m_blackFrame || m_blackFrameWidth != frame.width() || m_blackFrameHeight != frame.height()) {
+ CVPixelBufferRef pixelBuffer = nullptr;
+ auto status = CVPixelBufferCreate(kCFAllocatorDefault, frame.width(), frame.height(), kCVPixelFormatType_420YpCbCr8Planar, nullptr, &pixelBuffer);
+ ASSERT_UNUSED(status, status == noErr);
+
+ m_blackFrame = pixelBuffer;
+ m_blackFrameWidth = frame.width();
+ m_blackFrameHeight = frame.height();
+
+ status = CVPixelBufferLockBaseAddress(pixelBuffer, 0);
+ ASSERT(status == noErr);
+ void* data = CVPixelBufferGetBaseAddress(pixelBuffer);
+ size_t yLength = frame.width() * frame.height();
+ memset(data, 0, yLength);
+ memset(static_cast&lt;uint8_t*&gt;(data) + yLength, 128, yLength / 2);
+
+ status = CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
+ ASSERT(!status);
+ }
+ return m_blackFrame.get();
+ }
+ auto buffer = frame.video_frame_buffer();
+ return static_cast&lt;CVPixelBufferRef&gt;(buffer->native_handle());
+}
+
</ins><span class="cx"> void RealtimeIncomingVideoSource::OnFrame(const webrtc::VideoFrame& frame)
</span><span class="cx"> {
</span><span class="cx"> if (!m_isProducingData)
</span><span class="cx"> return;
</span><span class="cx">
</span><del>- auto buffer = frame.video_frame_buffer();
- CVPixelBufferRef pixelBuffer = static_cast&lt;CVPixelBufferRef&gt;(buffer->native_handle());
</del><ins>+ auto pixelBuffer = pixelBufferFromVideoFrame(frame);
</ins><span class="cx">
</span><span class="cx"> // FIXME: Convert timing information from VideoFrame to CMSampleTimingInfo.
</span><span class="cx"> // For the moment, we will pretend that frames should be rendered asap.
</span></span></pre></div>
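The new `pixelBufferFromVideoFrame()` caches one black frame per size and fills it with the planar-4:2:0 "black" values: the luma plane is zeroed and both quarter-size chroma planes are set to 128, the neutral chroma value. A compilable sketch of that plane layout, assuming a tightly packed buffer (a real `CVPixelBuffer` may pad rows, which is why WebCore writes through the locked base address; the helper name below is hypothetical):

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// Sketch of the I420 black frame built in pixelBufferFromVideoFrame():
// yLength bytes of Y = 0, followed by yLength / 2 bytes covering both
// (width/2 x height/2) chroma planes, set to the neutral value 128.
std::vector<uint8_t> makeBlackI420(int width, int height)
{
    const std::size_t yLength = static_cast<std::size_t>(width) * height;
    std::vector<uint8_t> frame(yLength + yLength / 2);
    std::memset(frame.data(), 0, yLength);                 // Y plane: black
    std::memset(frame.data() + yLength, 128, yLength / 2); // Cb + Cr: neutral
    return frame;
}
```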
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeIncomingVideoSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -70,6 +70,8 @@
</span><span class="cx"> // rtc::VideoSinkInterface
</span><span class="cx"> void OnFrame(const webrtc::VideoFrame&) final;
</span><span class="cx">
</span><ins>+ CVPixelBufferRef pixelBufferFromVideoFrame(const webrtc::VideoFrame&);
+
</ins><span class="cx"> RefPtr&lt;Image&gt; m_currentImage;
</span><span class="cx"> RealtimeMediaSourceSettings m_currentSettings;
</span><span class="cx"> RealtimeMediaSourceSupportedConstraints m_supportedConstraints;
</span><span class="lines">@@ -79,6 +81,9 @@
</span><span class="cx"> rtc::scoped_refptr&lt;webrtc::VideoTrackInterface&gt; m_videoTrack;
</span><span class="cx"> RetainPtr&lt;CMSampleBufferRef&gt; m_buffer;
</span><span class="cx"> PixelBufferConformerCV m_conformer;
</span><ins>+ RetainPtr&lt;CVPixelBufferRef&gt; m_blackFrame;
+ int m_blackFrameWidth { 0 };
+ int m_blackFrameHeight { 0 };
</ins><span class="cx"> };
</span><span class="cx">
</span><span class="cx"> } // namespace WebCore
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacRealtimeOutgoingVideoSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp (214043 => 214044)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp        2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp        2017-03-16 16:09:50 UTC (rev 214044)
</span><span class="lines">@@ -92,6 +92,7 @@
</span><span class="cx"> auto blackBuffer = m_bufferPool.CreateBuffer(settings.width(), settings.height());
</span><span class="cx"> blackBuffer->SetToBlack();
</span><span class="cx"> sendFrame(WTFMove(blackBuffer));
</span><ins>+ return;
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> ASSERT(sample.platformSample().type == PlatformSample::CMSampleBufferType);
</span></span></pre>
</div>
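The one-line `return` added to `videoSampleAvailable()` matters: without it, a muted track would send the black frame and then fall through to encode and send the real captured frame as well. A sketch of that control-flow fix, with hypothetical counters standing in for the actual frame-sending calls:

```cpp
// Sketch of the fix in RealtimeOutgoingVideoSource::videoSampleAvailable().
// The struct and counters below are hypothetical instrumentation; only the
// early-return shape follows the diff above.
struct OutgoingSource {
    bool muted { false };
    int blackFramesSent { 0 };
    int realFramesSent { 0 };

    void videoSampleAvailable()
    {
        if (muted) {
            ++blackFramesSent;  // stands in for sendFrame(blackBuffer)
            return;             // the fix: do not also send the real sample
        }
        ++realFramesSent;       // stands in for sending the captured frame
    }
};
```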
</div>
</body>
</html>