<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[211728] trunk/Source/WebCore</title>
</head>
<body>

<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/211728">211728</a></dd>
<dt>Author</dt> <dd>eric.carlson@apple.com</dd>
<dt>Date</dt> <dd>2017-02-06 09:22:27 -0800 (Mon, 06 Feb 2017)</dd>
</dl>

<h3>Log Message</h3>
<pre>[MediaStream Mac] Stop using AVSampleBufferAudioRenderer
https://bugs.webkit.org/show_bug.cgi?id=167821

Reviewed by Jer Noble.

* WebCore.xcodeproj/project.pbxproj: Add new files.

* platform/audio/mac/AudioSampleDataSource.cpp:
(WebCore::AudioSampleDataSource::pullSamplesInternal): Don't assume the first timestamp from the
render proc after a pause is zero.

Stop using an audio renderer for each audio track. With no audio renderers, there is no longer any need
for an AVSampleBufferRenderSynchronizer.
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
(-[WebAVSampleBufferStatusChangeListener invalidate]): No more audio renderers.
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC): Pause
  audio tracks explicitly.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider): Remove the existing code;
  it was incorrect and not thread-safe.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers): No more audio renderers.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): No more render synchronizer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): Start each audio track.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): Pause each audio track.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setVolume): Pass the command to each audio track.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setMuted): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::streamTime): No more render synchronizer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): Don't handle audio samples.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateTracks): Update for audio track class change. No
more render synchronizer.
(-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]): Deleted.
(-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange): Deleted.

* platform/mediastream/AudioTrackPrivateMediaStream.h:

* platform/mediastream/MediaStreamTrackPrivate.cpp:
(WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate): add/removeObserver takes a reference,
not a pointer.
(WebCore::MediaStreamTrackPrivate::~MediaStreamTrackPrivate): Ditto.
(WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Renamed from sourceHasMoreMediaData.
(WebCore::MediaStreamTrackPrivate::sourceHasMoreMediaData): Deleted.
* platform/mediastream/MediaStreamTrackPrivate.h:

* platform/mediastream/RealtimeMediaSource.cpp:
(WebCore::RealtimeMediaSource::addObserver): Take a reference, not a pointer.
(WebCore::RealtimeMediaSource::removeObserver): Ditto.
(WebCore::RealtimeMediaSource::videoSampleAvailable): Renamed from mediaDataUpdated.
(WebCore::RealtimeMediaSource::audioSamplesAvailable): New.
(WebCore::RealtimeMediaSource::stop): Drive-by cleanup.
(WebCore::RealtimeMediaSource::requestStop): Ditto.
(WebCore::RealtimeMediaSource::mediaDataUpdated): Deleted.
* platform/mediastream/RealtimeMediaSource.h:

* platform/mediastream/mac/AVAudioCaptureSource.h:
* platform/mediastream/mac/AVAudioCaptureSource.mm:
(WebCore::AVAudioCaptureSource::AVAudioCaptureSource):
(WebCore::AVAudioCaptureSource::addObserver):
(WebCore::AVAudioCaptureSource::shutdownCaptureSession):
(WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection):
(WebCore::operator==): Deleted.
(WebCore::operator!=): Deleted.

* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::processNewFrame): Call videoSampleAvailable, not mediaDataUpdated.

Render audio with a CoreAudio output unit.
* platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp: Added.
(WebCore::AudioTrackPrivateMediaStreamCocoa::AudioTrackPrivateMediaStreamCocoa):
(WebCore::AudioTrackPrivateMediaStreamCocoa::~AudioTrackPrivateMediaStreamCocoa):
(WebCore::AudioTrackPrivateMediaStreamCocoa::playInternal):
(WebCore::AudioTrackPrivateMediaStreamCocoa::play):
(WebCore::AudioTrackPrivateMediaStreamCocoa::pause):
(WebCore::AudioTrackPrivateMediaStreamCocoa::setVolume):
(WebCore::AudioTrackPrivateMediaStreamCocoa::setupAudioUnit):
(WebCore::AudioTrackPrivateMediaStreamCocoa::audioSamplesAvailable):
(WebCore::AudioTrackPrivateMediaStreamCocoa::sourceStopped):
(WebCore::AudioTrackPrivateMediaStreamCocoa::render):
(WebCore::AudioTrackPrivateMediaStreamCocoa::inputProc):
* platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h: Added.

* platform/mediastream/mac/MockRealtimeAudioSourceMac.h:
* platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
(WebCore::alignTo16Bytes):
(WebCore::MockRealtimeAudioSourceMac::emitSampleBuffers):
(WebCore::MockRealtimeAudioSourceMac::reconfigure): Minor cleanup.
(WebCore::MockRealtimeAudioSourceMac::render): Ditto.

* platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
(WebCore::MockRealtimeVideoSourceMac::updateSampleBuffer): Call videoSampleAvailable, not mediaDataUpdated.

* platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h:
* platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm:
(WebCore::WebAudioSourceProviderAVFObjC::~WebAudioSourceProviderAVFObjC):
(WebCore::WebAudioSourceProviderAVFObjC::provideInput): Use a mutex. Get rid of m_writeAheadCount;
it is always 0.
(WebCore::WebAudioSourceProviderAVFObjC::prepare): Use a lock.
(WebCore::WebAudioSourceProviderAVFObjC::unprepare): Ditto.
(WebCore::WebAudioSourceProviderAVFObjC::process): Ditto.
* platform/mock/MockRealtimeAudioSource.h:
(WebCore::MockRealtimeAudioSource::renderInterval): Decrease the render interval.</pre>
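
<p>The log message above replaces the per-track AVSampleBufferAudioRenderer objects with a CoreAudio output unit
(AudioTrackPrivateMediaStreamCocoa::setupAudioUnit and inputProc). The sketch below is not the WebCore code; it is a
minimal, hypothetical illustration of how a default-output AudioUnit with a render callback is typically created with
the CoreAudio C API, with a silence-filling stub standing in for the real sample pull.</p>
<pre>// Hypothetical sketch, not the WebCore implementation: create a default-output
// AudioUnit and install a render callback, roughly the shape of
// AudioTrackPrivateMediaStreamCocoa::setupAudioUnit()/inputProc().
#include &lt;AudioUnit/AudioUnit.h&gt;
#include &lt;cstring&gt;

static OSStatus renderCallback(void*, AudioUnitRenderActionFlags* flags, const AudioTimeStamp*,
    UInt32 /*bus*/, UInt32 /*frameCount*/, AudioBufferList* ioData)
{
    // A real implementation pulls buffered MediaStream samples here; this stub plays silence.
    for (UInt32 i = 0; i &lt; ioData-&gt;mNumberBuffers; ++i)
        memset(ioData-&gt;mBuffers[i].mData, 0, ioData-&gt;mBuffers[i].mDataByteSize);
    *flags |= kAudioUnitRenderAction_OutputIsSilence;
    return noErr;
}

static AudioUnit createOutputUnit(const AudioStreamBasicDescription&amp; format, void* refCon)
{
    AudioComponentDescription description { kAudioUnitType_Output, kAudioUnitSubType_DefaultOutput,
        kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponent component = AudioComponentFindNext(nullptr, &amp;description);

    AudioUnit unit = nullptr;
    if (!component || AudioComponentInstanceNew(component, &amp;unit) != noErr)
        return nullptr;

    AURenderCallbackStruct callback { renderCallback, refCon };
    AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, &amp;callback, sizeof(callback));
    AudioUnitSetProperty(unit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &amp;format, sizeof(format));

    if (AudioUnitInitialize(unit) != noErr)
        return nullptr;
    return unit; // play() would call AudioOutputUnitStart(unit); pause(), AudioOutputUnitStop(unit).
}</pre>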
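<p>The WebAudioSourceProviderAVFObjC notes above also say that provideInput, prepare, unprepare, and process are now
serialized with a lock. The following is an assumed, simplified sketch of that pattern only; std::mutex, the class name,
and the members are stand-ins, not the actual WebCore types.</p>
<pre>// Assumed sketch: serialize the WebAudio pull (provideInput) against
// reconfiguration (prepare/unprepare) with a single mutex.
#include &lt;algorithm&gt;
#include &lt;mutex&gt;

class SampleProvider {
public:
    void prepare(double sampleRate)
    {
        std::lock_guard&lt;std::mutex&gt; lock(m_mutex);
        m_sampleRate = sampleRate;
        m_prepared = true;
    }

    void unprepare()
    {
        std::lock_guard&lt;std::mutex&gt; lock(m_mutex);
        m_prepared = false;
    }

    // Called on the rendering thread; must not observe half-updated state from prepare/unprepare.
    void provideInput(float* destination, size_t frameCount)
    {
        std::lock_guard&lt;std::mutex&gt; lock(m_mutex);
        if (!m_prepared) {
            std::fill_n(destination, frameCount, 0.0f); // render silence when not configured
            return;
        }
        // ... copy up to frameCount frames from the internal buffer into destination ...
    }

private:
    std::mutex m_mutex;
    double m_sampleRate { 0 };
    bool m_prepared { false };
};</pre>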

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreWebCorexcodeprojprojectpbxproj">trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcecpp">trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCh">trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCmm">trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamAudioTrackPrivateMediaStreamh">trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivatecpp">trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivateh">trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamRealtimeMediaSourcecpp">trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamRealtimeMediaSourceh">trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourceh">trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourcemm">trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourcemm">trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMach">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeVideoSourceMacmm">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCh">trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCmm">trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmockMockRealtimeAudioSourceh">trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h</a></li>
</ul>

<h3>Added Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoacpp">trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoah">trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h</a></li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/ChangeLog        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -1,3 +1,115 @@
</span><ins>+2017-02-06  Eric Carlson  &lt;eric.carlson@apple.com&gt;
+
+        [MediaStream Mac] Stop using AVSampleBufferAudioRenderer
+        https://bugs.webkit.org/show_bug.cgi?id=167821
+
+        Reviewed by Jer Noble.
+
+        * WebCore.xcodeproj/project.pbxproj: Add new files.
+
+        * platform/audio/mac/AudioSampleDataSource.cpp:
+        (WebCore::AudioSampleDataSource::pullSamplesInternal): Don't assume the first timestamp from the
+        render proc after a pause is zero.
+
+        Stop using an audio renderer for each audio track. No audio renderers means we don't need to use
+        an AVSampleBufferRenderSynchronizer.
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+        (-[WebAVSampleBufferStatusChangeListener invalidate]): No more audio renderers.
+        (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]): Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC): Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC): Pause
+          audio tracks explicitly.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider): Remove the existing code,
+          it was incorrect and not thread safe.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers): No more audio renderers.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): No more render synchronizer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): Start each audio track.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): Pause each audio track.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setVolume): Pass the command to each audio track.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setMuted): Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::streamTime): No more render synchronizer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): Don't handle audio samples.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateTracks): Update for audio track class change. No
+        more render synchronizer.
+        (-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]): Deleted.
+        (-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange): Deleted.
+
+        * platform/mediastream/AudioTrackPrivateMediaStream.h:
+
+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
+        (WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate): add/removeObserver takes a reference,
+        not a pointer.
+        (WebCore::MediaStreamTrackPrivate::~MediaStreamTrackPrivate): Ditto.
+        (WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Renamed from sourceHasMoreMediaData.
+        (WebCore::MediaStreamTrackPrivate::sourceHasMoreMediaData): Deleted.
+        * platform/mediastream/MediaStreamTrackPrivate.h:
+
+        * platform/mediastream/RealtimeMediaSource.cpp:
+        (WebCore::RealtimeMediaSource::addObserver): Take a reference, not a pointer.
+        (WebCore::RealtimeMediaSource::removeObserver): Ditto.
+        (WebCore::RealtimeMediaSource::videoSampleAvailable): Renamed from mediaDataUpdated.
+        (WebCore::RealtimeMediaSource::audioSamplesAvailable): New.
+        (WebCore::RealtimeMediaSource::stop): Drive-by cleanup.
+        (WebCore::RealtimeMediaSource::requestStop): Ditto.
+        (WebCore::RealtimeMediaSource::mediaDataUpdated): Deleted.
+        * platform/mediastream/RealtimeMediaSource.h:
+
+        * platform/mediastream/mac/AVAudioCaptureSource.h:
+        * platform/mediastream/mac/AVAudioCaptureSource.mm:
+        (WebCore::AVAudioCaptureSource::AVAudioCaptureSource):
+        (WebCore::AVAudioCaptureSource::addObserver):
+        (WebCore::AVAudioCaptureSource::shutdownCaptureSession):
+        (WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection):
+        (WebCore::operator==): Deleted.
+        (WebCore::operator!=): Deleted.
+
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::processNewFrame): Call videoSampleAvailable, not mediaDataUpdated.
+
+        Render audio with a CoreAudio output unit.
+        * platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp: Added.
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::AudioTrackPrivateMediaStreamCocoa):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::~AudioTrackPrivateMediaStreamCocoa):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::playInternal):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::play):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::pause):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::setVolume):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::setupAudioUnit):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::audioSamplesAvailable):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::sourceStopped):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::render):
+        (WebCore::AudioTrackPrivateMediaStreamCocoa::inputProc):
+        * platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h: Added.
+
+        * platform/mediastream/mac/MockRealtimeAudioSourceMac.h:
+        * platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
+        (WebCore::alignTo16Bytes):
+        (WebCore::MockRealtimeAudioSourceMac::emitSampleBuffers):
+        (WebCore::MockRealtimeAudioSourceMac::reconfigure): Minor cleanup.
+        (WebCore::MockRealtimeAudioSourceMac::render): Ditto.
+
+        * platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
+        (WebCore::MockRealtimeVideoSourceMac::updateSampleBuffer): Call videoSampleAvailable, not mediaDataUpdated.
+
+        * platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h:
+        * platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm:
+        (WebCore::WebAudioSourceProviderAVFObjC::~WebAudioSourceProviderAVFObjC):
+        (WebCore::WebAudioSourceProviderAVFObjC::provideInput): Use a mutex. Get rid of m_writeAheadCount,
+        it is always 0.
+        (WebCore::WebAudioSourceProviderAVFObjC::prepare): Use a lock.
+        (WebCore::WebAudioSourceProviderAVFObjC::unprepare): Ditto.
+        (WebCore::WebAudioSourceProviderAVFObjC::process): Ditto.
+        * platform/mock/MockRealtimeAudioSource.h:
+        (WebCore::MockRealtimeAudioSource::renderInterval): Decrease the render interval.
+
</ins><span class="cx"> 2017-02-06  Antoine Quint  &lt;graouts@apple.com&gt;
</span><span class="cx"> 
</span><span class="cx">         [Modern Media Controls] Add a backdrop filter to the start button on macOS
</span></span></pre></div>
<a id="trunkSourceWebCoreWebCorexcodeprojprojectpbxproj"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -159,6 +159,7 @@
</span><span class="cx">                 07638A9A1884487200E15A1B /* MediaSessionManagerIOS.mm in Sources */ = {isa = PBXBuildFile; fileRef = 07638A981884487200E15A1B /* MediaSessionManagerIOS.mm */; };
</span><span class="cx">                 076970861463AD8700F502CF /* TextTrackList.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 076970841463AD8700F502CF /* TextTrackList.cpp */; };
</span><span class="cx">                 076970871463AD8700F502CF /* TextTrackList.h in Headers */ = {isa = PBXBuildFile; fileRef = 076970851463AD8700F502CF /* TextTrackList.h */; };
</span><ins>+                076EC1331E44F56D00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 076EC1321E44F2CB00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp */; };
</ins><span class="cx">                 076F0D0E12B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.h in Headers */ = {isa = PBXBuildFile; fileRef = 076F0D0A12B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.h */; };
</span><span class="cx">                 07707CB01E205EE300005BF7 /* AudioSourceObserverObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = 07707CAF1E205EC400005BF7 /* AudioSourceObserverObjC.h */; };
</span><span class="cx">                 077664FC183E6B5C00133B92 /* JSQuickTimePluginReplacement.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 077664FA183E6B5C00133B92 /* JSQuickTimePluginReplacement.cpp */; };
</span><span class="lines">@@ -284,7 +285,6 @@
</span><span class="cx">                 07B7116F1D899E63009F0FFB /* CaptureDeviceManager.h in Headers */ = {isa = PBXBuildFile; fileRef = 07B7116C1D899E63009F0FFB /* CaptureDeviceManager.h */; };
</span><span class="cx">                 07C046C31E42508B007201E7 /* CAAudioStreamDescription.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 073B87571E40DCFD0071C0EC /* CAAudioStreamDescription.cpp */; };
</span><span class="cx">                 07C046C41E42508B007201E7 /* CAAudioStreamDescription.h in Headers */ = {isa = PBXBuildFile; fileRef = 073B87581E40DCFD0071C0EC /* CAAudioStreamDescription.h */; settings = {ATTRIBUTES = (Private, ); }; };
</span><del>-                07C046C71E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 07C046C51E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp */; };
</del><span class="cx">                 07C046C81E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C046C61E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.h */; };
</span><span class="cx">                 07C046CB1E426413007201E7 /* AudioStreamDescription.h in Headers */ = {isa = PBXBuildFile; fileRef = 073B87561E40DCE50071C0EC /* AudioStreamDescription.h */; settings = {ATTRIBUTES = (Private, ); }; };
</span><span class="cx">                 07C1C0E21BFB600100BD2256 /* MediaTrackSupportedConstraints.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C1C0E01BFB600100BD2256 /* MediaTrackSupportedConstraints.h */; };
</span><span class="lines">@@ -7256,6 +7256,7 @@
</span><span class="cx">                 07638A981884487200E15A1B /* MediaSessionManagerIOS.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = MediaSessionManagerIOS.mm; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 076970841463AD8700F502CF /* TextTrackList.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = TextTrackList.cpp; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 076970851463AD8700F502CF /* TextTrackList.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = TextTrackList.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><ins>+                076EC1321E44F2CB00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = AudioTrackPrivateMediaStreamCocoa.cpp; sourceTree = &quot;&lt;group&gt;&quot;; };
</ins><span class="cx">                 076F0D0912B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = MediaPlayerPrivateAVFoundation.cpp; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 076F0D0A12B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MediaPlayerPrivateAVFoundation.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 07707CAF1E205EC400005BF7 /* AudioSourceObserverObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioSourceObserverObjC.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="lines">@@ -7337,6 +7338,7 @@
</span><span class="cx">                 07B7116A1D899E63009F0FFB /* CaptureDevice.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CaptureDevice.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 07B7116B1D899E63009F0FFB /* CaptureDeviceManager.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = CaptureDeviceManager.cpp; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 07B7116C1D899E63009F0FFB /* CaptureDeviceManager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CaptureDeviceManager.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><ins>+                07C046C61E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioTrackPrivateMediaStreamCocoa.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</ins><span class="cx">                 07C1C0E01BFB600100BD2256 /* MediaTrackSupportedConstraints.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MediaTrackSupportedConstraints.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 07C1C0E11BFB600100BD2256 /* MediaTrackSupportedConstraints.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = MediaTrackSupportedConstraints.idl; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="cx">                 07C1C0E41BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RealtimeMediaSourceSupportedConstraints.h; sourceTree = &quot;&lt;group&gt;&quot;; };
</span><span class="lines">@@ -15402,6 +15404,7 @@
</span><span class="cx">                 0729B14D17CFCCA0004F1D60 /* mac */ = {
</span><span class="cx">                         isa = PBXGroup;
</span><span class="cx">                         children = (
</span><ins>+                                076EC1321E44F2CB00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp */,
</ins><span class="cx">                                 5CDD83391E4324BB00621E92 /* RealtimeIncomingVideoSource.cpp */,
</span><span class="cx">                                 5CDD833A1E4324BB00621E92 /* RealtimeIncomingVideoSource.h */,
</span><span class="cx">                                 5CDD833B1E4324BB00621E92 /* RealtimeOutgoingVideoSource.cpp */,
</span><span class="lines">@@ -15408,7 +15411,6 @@
</span><span class="cx">                                 5CDD833C1E4324BB00621E92 /* RealtimeOutgoingVideoSource.h */,
</span><span class="cx">                                 07707CB11E20649C00005BF7 /* AudioCaptureSourceProviderObjC.h */,
</span><span class="cx">                                 07707CAF1E205EC400005BF7 /* AudioSourceObserverObjC.h */,
</span><del>-                                07C046C51E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp */,
</del><span class="cx">                                 07C046C61E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.h */,
</span><span class="cx">                                 070363D8181A1CDC00C074A5 /* AVAudioCaptureSource.h */,
</span><span class="cx">                                 070363D9181A1CDC00C074A5 /* AVAudioCaptureSource.mm */,
</span><span class="lines">@@ -25245,6 +25247,7 @@
</span><span class="cx">                                 CDE3A85417F5FCE600C5BE20 /* AudioTrackPrivateAVF.h in Headers */,
</span><span class="cx">                                 CDE3A85817F6020400C5BE20 /* AudioTrackPrivateAVFObjC.h in Headers */,
</span><span class="cx">                                 CD54A763180F9F7000B076C9 /* AudioTrackPrivateMediaSourceAVFObjC.h in Headers */,
</span><ins>+                                07C046C81E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.h in Headers */,
</ins><span class="cx">                                 07D6A4F81BF2307D00174146 /* AudioTrackPrivateMediaStream.h in Headers */,
</span><span class="cx">                                 FD31608B12B026F700C1A359 /* AudioUtilities.h in Headers */,
</span><span class="cx">                                 7EE6846012D26E3800E79415 /* AuthenticationCF.h in Headers */,
</span><span class="lines">@@ -31870,6 +31873,7 @@
</span><span class="cx">                                 7C39C3741DDBB8D300FEFB29 /* SVGTransformListValues.cpp in Sources */,
</span><span class="cx">                                 7CE58D571DD7D96D00128552 /* SVGTransformValue.cpp in Sources */,
</span><span class="cx">                                 B2227AE10D00BF220071B782 /* SVGTRefElement.cpp in Sources */,
</span><ins>+                                076EC1331E44F56D00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */,
</ins><span class="cx">                                 B2227AE40D00BF220071B782 /* SVGTSpanElement.cpp in Sources */,
</span><span class="cx">                                 B2227AE90D00BF220071B782 /* SVGURIReference.cpp in Sources */,
</span><span class="cx">                                 B2227AEC0D00BF220071B782 /* SVGUseElement.cpp in Sources */,
</span><span class="lines">@@ -32119,7 +32123,6 @@
</span><span class="cx">                                 49C7B9E51042D32F0009D447 /* WebGLTexture.cpp in Sources */,
</span><span class="cx">                                 6F995A231A7078B100A735F4 /* WebGLTransformFeedback.cpp in Sources */,
</span><span class="cx">                                 0C3F1F5A10C8871200D72CE1 /* WebGLUniformLocation.cpp in Sources */,
</span><del>-                                07C046C71E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */,
</del><span class="cx">                                 6F995A251A7078B100A735F4 /* WebGLVertexArrayObject.cpp in Sources */,
</span><span class="cx">                                 6F222B761AB52D8A0094651A /* WebGLVertexArrayObjectBase.cpp in Sources */,
</span><span class="cx">                                 77A17A7712F28642004E02F6 /* WebGLVertexArrayObjectOES.cpp in Sources */,
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -255,14 +255,13 @@
</span><span class="cx">         const double tenMS = .01;
</span><span class="cx">         const double fiveMS = .005;
</span><span class="cx">         double sampleRate = m_outputDescription-&gt;sampleRate();
</span><ins>+        m_outputSampleOffset = timeStamp + m_timeStamp;
</ins><span class="cx">         if (buffered &gt; sampleRate * twentyMS)
</span><del>-            m_outputSampleOffset = m_timeStamp - sampleRate * twentyMS;
</del><ins>+            m_outputSampleOffset -= sampleRate * twentyMS;
</ins><span class="cx">         else if (buffered &gt; sampleRate * tenMS)
</span><del>-            m_outputSampleOffset = m_timeStamp - sampleRate * tenMS;
</del><ins>+            m_outputSampleOffset -= sampleRate * tenMS;
</ins><span class="cx">         else if (buffered &gt; sampleRate * fiveMS)
</span><del>-            m_outputSampleOffset = m_timeStamp - sampleRate * fiveMS;
-        else
-            m_outputSampleOffset = m_timeStamp;
</del><ins>+            m_outputSampleOffset -= sampleRate * fiveMS;
</ins><span class="cx"> 
</span><span class="cx">         m_transitioningFromPaused = false;
</span><span class="cx">     }
</span></span></pre></div>
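<p>To illustrate the AudioSampleDataSource change above: the old code assumed the first render-proc timestamp after a
pause was zero, while the new code adds that timestamp in before the buffered-duration adjustment. The numbers below
are invented for illustration only.</p>
<pre>// Illustrative values: sampleRate = 48000, so 20 ms = 960 frames.
// Suppose the render proc resumes after a pause at timeStamp = 4096 while the
// source has pushed up to m_timeStamp = 240000, with more than 20 ms buffered.
//
//   before: m_outputSampleOffset = 240000        - 960 = 239040   // assumed timeStamp == 0
//   after:  m_outputSampleOffset = 4096 + 240000 - 960 = 243136   // uses the actual timeStamp</pre>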
<a id="trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -45,7 +45,7 @@
</span><span class="cx"> 
</span><span class="cx"> namespace WebCore {
</span><span class="cx"> 
</span><del>-class AudioTrackPrivateMediaStream;
</del><ins>+class AudioTrackPrivateMediaStreamCocoa;
</ins><span class="cx"> class AVVideoCaptureSource;
</span><span class="cx"> class Clock;
</span><span class="cx"> class MediaSourcePrivateClient;
</span><span class="lines">@@ -55,10 +55,6 @@
</span><span class="cx"> class VideoFullscreenLayerManager;
</span><span class="cx"> #endif
</span><span class="cx"> 
</span><del>-#if __has_include(&lt;AVFoundation/AVSampleBufferRenderSynchronizer.h&gt;)
-#define USE_RENDER_SYNCHRONIZER 1
-#endif
-
</del><span class="cx"> class MediaPlayerPrivateMediaStreamAVFObjC final : public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer {
</span><span class="cx"> public:
</span><span class="cx">     explicit MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer*);
</span><span class="lines">@@ -81,7 +77,6 @@
</span><span class="cx">     void ensureLayer();
</span><span class="cx">     void destroyLayer();
</span><span class="cx"> 
</span><del>-    void rendererStatusDidChange(AVSampleBufferAudioRenderer*, NSNumber*);
</del><span class="cx">     void layerStatusDidChange(AVSampleBufferDisplayLayer*, NSNumber*);
</span><span class="cx"> 
</span><span class="cx"> private:
</span><span class="lines">@@ -144,13 +139,6 @@
</span><span class="cx">     void flushAndRemoveVideoSampleBuffers();
</span><span class="cx">     void requestNotificationWhenReadyForVideoData();
</span><span class="cx"> 
</span><del>-    void enqueueAudioSample(MediaStreamTrackPrivate&amp;, MediaSample&amp;);
-    void createAudioRenderer(AtomicString);
-    void destroyAudioRenderer(AVSampleBufferAudioRenderer*);
-    void destroyAudioRenderer(AtomicString);
-    void destroyAudioRenderers();
-    void requestNotificationWhenReadyForAudioData(AtomicString);
-
</del><span class="cx">     void paint(GraphicsContext&amp;, const FloatRect&amp;) override;
</span><span class="cx">     void paintCurrentFrameInContext(GraphicsContext&amp;, const FloatRect&amp;) override;
</span><span class="cx">     bool metaDataAvailable() const { return m_mediaStreamPrivate &amp;&amp; m_readyState &gt;= MediaPlayer::HaveMetadata; }
</span><span class="lines">@@ -210,9 +198,7 @@
</span><span class="cx"> 
</span><span class="cx">     MediaTime streamTime() const;
</span><span class="cx"> 
</span><del>-#if USE(RENDER_SYNCHRONIZER)
</del><span class="cx">     AudioSourceProvider* audioSourceProvider() final;
</span><del>-#endif
</del><span class="cx"> 
</span><span class="cx">     MediaPlayer* m_player { nullptr };
</span><span class="cx">     WeakPtrFactory&lt;MediaPlayerPrivateMediaStreamAVFObjC&gt; m_weakPtrFactory;
</span><span class="lines">@@ -222,22 +208,14 @@
</span><span class="cx"> 
</span><span class="cx">     RetainPtr&lt;WebAVSampleBufferStatusChangeListener&gt; m_statusChangeListener;
</span><span class="cx">     RetainPtr&lt;AVSampleBufferDisplayLayer&gt; m_sampleBufferDisplayLayer;
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    HashMap&lt;String, RetainPtr&lt;AVSampleBufferAudioRenderer&gt;&gt; m_audioRenderers;
-    RetainPtr&lt;AVSampleBufferRenderSynchronizer&gt; m_synchronizer;
-#else
</del><span class="cx">     std::unique_ptr&lt;Clock&gt; m_clock;
</span><del>-#endif
</del><span class="cx"> 
</span><span class="cx">     MediaTime m_pausedTime;
</span><span class="cx">     RetainPtr&lt;CGImageRef&gt; m_pausedImage;
</span><span class="cx"> 
</span><del>-    HashMap&lt;String, RefPtr&lt;AudioTrackPrivateMediaStream&gt;&gt; m_audioTrackMap;
</del><ins>+    HashMap&lt;String, RefPtr&lt;AudioTrackPrivateMediaStreamCocoa&gt;&gt; m_audioTrackMap;
</ins><span class="cx">     HashMap&lt;String, RefPtr&lt;VideoTrackPrivateMediaStream&gt;&gt; m_videoTrackMap;
</span><span class="cx">     PendingSampleQueue m_pendingVideoSampleQueue;
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    PendingSampleQueue m_pendingAudioSampleQueue;
-#endif
</del><span class="cx"> 
</span><span class="cx">     MediaPlayer::NetworkState m_networkState { MediaPlayer::Empty };
</span><span class="cx">     MediaPlayer::ReadyState m_readyState { MediaPlayer::HaveNothing };
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -29,7 +29,7 @@
</span><span class="cx"> #if ENABLE(MEDIA_STREAM) &amp;&amp; USE(AVFOUNDATION)
</span><span class="cx"> 
</span><span class="cx"> #import &quot;AVFoundationSPI.h&quot;
</span><del>-#import &quot;AudioTrackPrivateMediaStream.h&quot;
</del><ins>+#import &quot;AudioTrackPrivateMediaStreamCocoa.h&quot;
</ins><span class="cx"> #import &quot;Clock.h&quot;
</span><span class="cx"> #import &quot;CoreMediaSoftLink.h&quot;
</span><span class="cx"> #import &quot;GraphicsContext.h&quot;
</span><span class="lines">@@ -52,7 +52,6 @@
</span><span class="cx"> 
</span><span class="cx"> SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
</span><span class="cx"> 
</span><del>-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferAudioRenderer)
</del><span class="cx"> SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
</span><span class="cx"> SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer)
</span><span class="cx"> 
</span><span class="lines">@@ -67,7 +66,6 @@
</span><span class="cx"> @interface WebAVSampleBufferStatusChangeListener : NSObject {
</span><span class="cx">     MediaPlayerPrivateMediaStreamAVFObjC* _parent;
</span><span class="cx">     Vector&lt;RetainPtr&lt;AVSampleBufferDisplayLayer&gt;&gt; _layers;
</span><del>-    Vector&lt;RetainPtr&lt;AVSampleBufferAudioRenderer&gt;&gt; _renderers;
</del><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> - (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback;
</span><span class="lines">@@ -74,8 +72,6 @@
</span><span class="cx"> - (void)invalidate;
</span><span class="cx"> - (void)beginObservingLayer:(AVSampleBufferDisplayLayer *)layer;
</span><span class="cx"> - (void)stopObservingLayer:(AVSampleBufferDisplayLayer *)layer;
</span><del>-- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
-- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
</del><span class="cx"> @end
</span><span class="cx"> 
</span><span class="cx"> @implementation WebAVSampleBufferStatusChangeListener
</span><span class="lines">@@ -101,10 +97,6 @@
</span><span class="cx">         [layer removeObserver:self forKeyPath:@&quot;status&quot;];
</span><span class="cx">     _layers.clear();
</span><span class="cx"> 
</span><del>-    for (auto&amp; renderer : _renderers)
-        [renderer removeObserver:self forKeyPath:@&quot;status&quot;];
-    _renderers.clear();
-
</del><span class="cx">     [[NSNotificationCenter defaultCenter] removeObserver:self];
</span><span class="cx"> 
</span><span class="cx">     _parent = nullptr;
</span><span class="lines">@@ -128,24 +120,6 @@
</span><span class="cx">     _layers.remove(_layers.find(layer));
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
-{
-    ASSERT(_parent);
-    ASSERT(!_renderers.contains(renderer));
-
-    _renderers.append(renderer);
-    [renderer addObserver:self forKeyPath:@&quot;status&quot; options:NSKeyValueObservingOptionNew context:nullptr];
-}
-
-- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
-{
-    ASSERT(_parent);
-    ASSERT(_renderers.contains(renderer));
-
-    [renderer removeObserver:self forKeyPath:@&quot;status&quot;];
-    _renderers.remove(_renderers.find(renderer));
-}
-
</del><span class="cx"> - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
</span><span class="cx"> {
</span><span class="cx">     UNUSED_PARAM(context);
</span><span class="lines">@@ -167,19 +141,6 @@
</span><span class="cx">             protectedSelf-&gt;_parent-&gt;layerStatusDidChange(layer.get(), status.get());
</span><span class="cx">         });
</span><span class="cx"> 
</span><del>-    } else if ([object isKindOfClass:getAVSampleBufferAudioRendererClass()]) {
-        RetainPtr&lt;AVSampleBufferAudioRenderer&gt; renderer = (AVSampleBufferAudioRenderer *)object;
-        RetainPtr&lt;NSNumber&gt; status = [change valueForKey:NSKeyValueChangeNewKey];
-
-        ASSERT(_renderers.contains(renderer.get()));
-        ASSERT([keyPath isEqualToString:@&quot;status&quot;]);
-
-        callOnMainThread([protectedSelf = WTFMove(protectedSelf), renderer = WTFMove(renderer), status = WTFMove(status)] {
-            if (!protectedSelf-&gt;_parent)
-                return;
-
-            protectedSelf-&gt;_parent-&gt;rendererStatusDidChange(renderer.get(), status.get());
-        });
</del><span class="cx">     } else
</span><span class="cx">         ASSERT_NOT_REACHED();
</span><span class="cx"> }
</span><span class="lines">@@ -196,11 +157,7 @@
</span><span class="cx">     : m_player(player)
</span><span class="cx">     , m_weakPtrFactory(this)
</span><span class="cx">     , m_statusChangeListener(adoptNS([[WebAVSampleBufferStatusChangeListener alloc] initWithParent:this]))
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    , m_synchronizer(adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]))
-#else
</del><span class="cx">     , m_clock(Clock::create())
</span><del>-#endif
</del><span class="cx"> #if PLATFORM(MAC) &amp;&amp; ENABLE(VIDEO_PRESENTATION_MODE)
</span><span class="cx">     , m_videoFullscreenLayerManager(VideoFullscreenLayerManager::create())
</span><span class="cx"> #endif
</span><span class="lines">@@ -211,6 +168,9 @@
</span><span class="cx"> MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC()
</span><span class="cx"> {
</span><span class="cx">     LOG(Media, &quot;MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC(%p)&quot;, this);
</span><ins>+    for (const auto&amp; track : m_audioTrackMap.values())
+        track-&gt;pause();
+
</ins><span class="cx">     if (m_mediaStreamPrivate) {
</span><span class="cx">         m_mediaStreamPrivate-&gt;removeObserver(*this);
</span><span class="cx"> 
</span><span class="lines">@@ -219,9 +179,6 @@
</span><span class="cx">     }
</span><span class="cx"> 
</span><span class="cx">     destroyLayer();
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    destroyAudioRenderers();
-#endif
</del><span class="cx"> 
</span><span class="cx">     m_audioTrackMap.clear();
</span><span class="cx">     m_videoTrackMap.clear();
</span><span class="lines">@@ -315,33 +272,6 @@
</span><span class="cx">     return timelineOffset;
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample(MediaStreamTrackPrivate&amp; track, MediaSample&amp; sample)
-{
-    ASSERT(m_audioTrackMap.contains(track.id()));
-    ASSERT(m_audioRenderers.contains(sample.trackID()));
-
-    auto audioTrack = m_audioTrackMap.get(track.id());
-    MediaTime timelineOffset = audioTrack-&gt;timelineOffset();
-    if (timelineOffset == MediaTime::invalidTime()) {
-        timelineOffset = calculateTimelineOffset(sample, rendererLatency);
-        audioTrack-&gt;setTimelineOffset(timelineOffset);
-        LOG(MediaCaptureSamples, &quot;MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample: timeline offset for track %s set to %s&quot;, track.id().utf8().data(), toString(timelineOffset).utf8().data());
-    }
-
-    updateSampleTimes(sample, timelineOffset, &quot;MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample&quot;);
-
-    auto renderer = m_audioRenderers.get(sample.trackID());
-    if (![renderer isReadyForMoreMediaData]) {
-        addSampleToPendingQueue(m_pendingAudioSampleQueue, sample);
-        requestNotificationWhenReadyForAudioData(sample.trackID());
-        return;
-    }
-
-    [renderer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
-}
-#endif
-
</del><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPrivate&amp; track, MediaSample&amp; sample)
</span><span class="cx"> {
</span><span class="cx">     ASSERT(m_videoTrackMap.contains(track.id()));
</span><span class="lines">@@ -400,102 +330,12 @@
</span><span class="cx">     }];
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData(AtomicString trackID)
-{
-    if (!m_audioRenderers.contains(trackID))
-        return;
-
-    auto renderer = m_audioRenderers.get(trackID);
-    [renderer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
-        [renderer stopRequestingMediaData];
-
-        auto audioTrack = m_audioTrackMap.get(trackID);
-        while (!m_pendingAudioSampleQueue.isEmpty()) {
-            if (![renderer isReadyForMoreMediaData]) {
-                requestNotificationWhenReadyForAudioData(trackID);
-                return;
-            }
-
-            auto sample = m_pendingAudioSampleQueue.takeFirst();
-            enqueueAudioSample(audioTrack-&gt;streamTrack(), sample.get());
-        }
-    }];
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer(AtomicString trackID)
-{
-    ASSERT(!m_audioRenderers.contains(trackID));
-    auto renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]);
-    [renderer setAudioTimePitchAlgorithm:(m_player-&gt;preservesPitch() ? AVAudioTimePitchAlgorithmSpectral : AVAudioTimePitchAlgorithmVarispeed)];
-    m_audioRenderers.set(trackID, renderer);
-    [m_synchronizer addRenderer:renderer.get()];
-    [m_statusChangeListener beginObservingRenderer:renderer.get()];
-    if (m_audioRenderers.size() == 1)
-        renderingModeChanged();
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AVSampleBufferAudioRenderer* renderer)
-{
-    [m_statusChangeListener stopObservingRenderer:renderer];
-    [renderer flush];
-    [renderer stopRequestingMediaData];
-
-    CMTime now = CMTimebaseGetTime([m_synchronizer timebase]);
-    [m_synchronizer removeRenderer:renderer atTime:now withCompletionHandler:^(BOOL) { }];
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AtomicString trackID)
-{
-    if (!m_audioRenderers.contains(trackID))
-        return;
-
-    destroyAudioRenderer(m_audioRenderers.get(trackID).get());
-    m_audioRenderers.remove(trackID);
-    if (!m_audioRenderers.size())
-        renderingModeChanged();
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers()
-{
-    m_pendingAudioSampleQueue.clear();
-    for (auto&amp; renderer : m_audioRenderers.values())
-        destroyAudioRenderer(renderer.get());
-    m_audioRenderers.clear();
-}
-
</del><span class="cx"> AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider()
</span><span class="cx"> {
</span><span class="cx">     // FIXME: This should return a mix of all audio tracks - https://bugs.webkit.org/show_bug.cgi?id=160305
</span><del>-    for (const auto&amp; track : m_audioTrackMap.values()) {
-        if (track-&gt;streamTrack().ended() || !track-&gt;streamTrack().enabled() || track-&gt;streamTrack().muted())
-            continue;
-
-        return track-&gt;streamTrack().audioSourceProvider();
-    }
</del><span class="cx">     return nullptr;
</span><span class="cx"> }
</span><del>-#endif
</del><span class="cx"> 
</span><del>-void MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange(AVSampleBufferAudioRenderer* renderer, NSNumber* status)
-{
-#if USE(RENDER_SYNCHRONIZER)
-    String trackID;
-    for (auto&amp; pair : m_audioRenderers) {
-        if (pair.value == renderer) {
-            trackID = pair.key;
-            break;
-        }
-    }
-    ASSERT(!trackID.isEmpty());
-    if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering)
-        m_audioTrackMap.get(trackID)-&gt;setTimelineOffset(MediaTime::invalidTime());
-#else
-    UNUSED_PARAM(renderer);
-    UNUSED_PARAM(status);
-#endif
-}
-
</del><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer, NSNumber* status)
</span><span class="cx"> {
</span><span class="cx">     if (status.integerValue != AVQueuedSampleBufferRenderingStatusRendering)
</span><span class="lines">@@ -513,11 +353,6 @@
</span><span class="cx"> {
</span><span class="cx">     if (m_sampleBufferDisplayLayer)
</span><span class="cx">         [m_sampleBufferDisplayLayer flush];
</span><del>-
-#if USE(RENDER_SYNCHRONIZER)
-    for (auto&amp; renderer : m_audioRenderers.values())
-        [renderer flush];
-#endif
</del><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> bool MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer() const
</span><span class="lines">@@ -549,10 +384,6 @@
</span><span class="cx">     m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
</span><span class="cx">     [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
</span><span class="cx"> 
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
-#endif
-
</del><span class="cx">     renderingModeChanged();
</span><span class="cx">     
</span><span class="cx"> #if PLATFORM(MAC) &amp;&amp; ENABLE(VIDEO_PRESENTATION_MODE)
</span><span class="lines">@@ -570,13 +401,6 @@
</span><span class="cx">         [m_statusChangeListener stopObservingLayer:m_sampleBufferDisplayLayer.get()];
</span><span class="cx">         [m_sampleBufferDisplayLayer stopRequestingMediaData];
</span><span class="cx">         [m_sampleBufferDisplayLayer flush];
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-        CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]);
-        [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL) {
-            // No-op.
-        }];
-        m_sampleBufferDisplayLayer = nullptr;
-#endif
</del><span class="cx">     }
</span><span class="cx"> 
</span><span class="cx">     renderingModeChanged();
</span><span class="lines">@@ -700,14 +524,12 @@
</span><span class="cx">         return;
</span><span class="cx"> 
</span><span class="cx">     m_playing = true;
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    if (!m_synchronizer.get().rate)
-        [m_synchronizer setRate:1 ]; // streamtime
-#else
</del><span class="cx">     if (!m_clock-&gt;isRunning())
</span><span class="cx">         m_clock-&gt;start();
</span><del>-#endif
</del><span class="cx"> 
</span><ins>+    for (const auto&amp; track : m_audioTrackMap.values())
+        track-&gt;play();
+
</ins><span class="cx">     m_haveEverPlayed = true;
</span><span class="cx">     scheduleDeferredTask([this] {
</span><span class="cx">         updateDisplayMode();
</span><span class="lines">@@ -725,6 +547,9 @@
</span><span class="cx">     m_pausedTime = currentMediaTime();
</span><span class="cx">     m_playing = false;
</span><span class="cx"> 
</span><ins>+    for (const auto&amp; track : m_audioTrackMap.values())
+        track-&gt;pause();
+
</ins><span class="cx">     updateDisplayMode();
</span><span class="cx">     updatePausedImage();
</span><span class="cx">     flushRenderers();
</span><span class="lines">@@ -743,11 +568,8 @@
</span><span class="cx">         return;
</span><span class="cx"> 
</span><span class="cx">     m_volume = volume;
</span><del>-
-#if USE(RENDER_SYNCHRONIZER)
-    for (auto&amp; renderer : m_audioRenderers.values())
-        [renderer setVolume:volume];
-#endif
</del><ins>+    for (const auto&amp; track : m_audioTrackMap.values())
+        track-&gt;setVolume(m_volume);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::setMuted(bool muted)
</span><span class="lines">@@ -758,11 +580,6 @@
</span><span class="cx">         return;
</span><span class="cx"> 
</span><span class="cx">     m_muted = muted;
</span><del>-    
-#if USE(RENDER_SYNCHRONIZER)
-    for (auto&amp; renderer : m_audioRenderers.values())
-        [renderer setMuted:muted];
-#endif
</del><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> bool MediaPlayerPrivateMediaStreamAVFObjC::hasVideo() const
</span><span class="lines">@@ -796,11 +613,7 @@
</span><span class="cx"> 
</span><span class="cx"> MediaTime MediaPlayerPrivateMediaStreamAVFObjC::streamTime() const
</span><span class="cx"> {
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    return toMediaTime(CMTimebaseGetTime([m_synchronizer timebase]));
-#else
</del><span class="cx">     return MediaTime::createWithDouble(m_clock-&gt;currentTime());
</span><del>-#endif
</del><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> MediaPlayer::NetworkState MediaPlayerPrivateMediaStreamAVFObjC::networkState() const
</span><span class="lines">@@ -925,19 +738,11 @@
</span><span class="cx">     if (!m_playing || streamTime().toDouble() &lt; 0)
</span><span class="cx">         return;
</span><span class="cx"> 
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-    if (!CMTimebaseGetEffectiveRate([m_synchronizer timebase]))
-        return;
-#endif
-
</del><span class="cx">     switch (track.type()) {
</span><span class="cx">     case RealtimeMediaSource::None:
</span><span class="cx">         // Do nothing.
</span><span class="cx">         break;
</span><span class="cx">     case RealtimeMediaSource::Audio:
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-        enqueueAudioSample(track, mediaSample);
-#endif
</del><span class="cx">         break;
</span><span class="cx">     case RealtimeMediaSource::Video:
</span><span class="cx">         if (&amp;track == m_activeVideoTrack.get())
</span><span class="lines">@@ -1037,36 +842,23 @@
</span><span class="cx"> {
</span><span class="cx">     MediaStreamTrackPrivateVector currentTracks = m_mediaStreamPrivate-&gt;tracks();
</span><span class="cx"> 
</span><del>-    Function&lt;void(RefPtr&lt;AudioTrackPrivateMediaStream&gt;, int, TrackState)&gt;  setAudioTrackState = [this](auto track, int index, TrackState state)
</del><ins>+    Function&lt;void(RefPtr&lt;AudioTrackPrivateMediaStreamCocoa&gt;, int, TrackState)&gt;  setAudioTrackState = [this](auto track, int index, TrackState state)
</ins><span class="cx">     {
</span><span class="cx">         switch (state) {
</span><span class="cx">         case TrackState::Remove:
</span><del>-            track-&gt;streamTrack().removeObserver(*this);
</del><span class="cx">             m_player-&gt;removeAudioTrack(*track);
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-            destroyAudioRenderer(track-&gt;id());
-#endif
</del><span class="cx">             break;
</span><span class="cx">         case TrackState::Add:
</span><del>-            track-&gt;streamTrack().addObserver(*this);
</del><span class="cx">             m_player-&gt;addAudioTrack(*track);
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-            createAudioRenderer(track-&gt;id());
-#endif
</del><span class="cx">             break;
</span><span class="cx">         case TrackState::Configure:
</span><span class="cx">             track-&gt;setTrackIndex(index);
</span><span class="cx">             bool enabled = track-&gt;streamTrack().enabled() &amp;&amp; !track-&gt;streamTrack().muted();
</span><span class="cx">             track-&gt;setEnabled(enabled);
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-            auto renderer = m_audioRenderers.get(track-&gt;id());
-            ASSERT(renderer);
-            renderer.get().muted = !enabled;
-#endif
</del><span class="cx">             break;
</span><span class="cx">         }
</span><span class="cx">     };
</span><del>-    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &amp;AudioTrackPrivateMediaStream::create, setAudioTrackState);
</del><ins>+    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &amp;AudioTrackPrivateMediaStreamCocoa::create, setAudioTrackState);
</ins><span class="cx"> 
</span><span class="cx">     Function&lt;void(RefPtr&lt;VideoTrackPrivateMediaStream&gt;, int, TrackState)&gt; setVideoTrackState = [&amp;](auto track, int index, TrackState state)
</span><span class="cx">     {
</span></span></pre></div>
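<p>In the player change above, MediaPlayerPrivateMediaStreamAVFObjC stops driving audio through the render-synchronizer path; play(), pause(), and setVolume() now iterate the audio track map and forward the call to each track. A minimal standalone sketch of that fan-out, with std::map and a plain AudioTrack type standing in for WebKit's HashMap and AudioTrackPrivateMediaStreamCocoa (the names below are illustrative only, not WebKit API):</p>
<pre>
#include &lt;map&gt;
#include &lt;memory&gt;
#include &lt;string&gt;

// Illustrative stand-in for AudioTrackPrivateMediaStreamCocoa.
struct AudioTrack {
    void play() { }
    void pause() { }
    void setVolume(float) { }
};

class Player {
public:
    void play()
    {
        for (auto&amp; pair : m_audioTrackMap)
            pair.second-&gt;play();
    }

    void pause()
    {
        for (auto&amp; pair : m_audioTrackMap)
            pair.second-&gt;pause();
    }

    void setVolume(float volume)
    {
        m_volume = volume;
        for (auto&amp; pair : m_audioTrackMap)
            pair.second-&gt;setVolume(volume);
    }

private:
    std::map&lt;std::string, std::shared_ptr&lt;AudioTrack&gt;&gt; m_audioTrackMap;
    float m_volume { 1 };
};
</pre>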
<a id="trunkSourceWebCoreplatformmediastreamAudioTrackPrivateMediaStreamh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -32,7 +32,7 @@
</span><span class="cx"> 
</span><span class="cx"> namespace WebCore {
</span><span class="cx"> 
</span><del>-class AudioTrackPrivateMediaStream final : public AudioTrackPrivate {
</del><ins>+class AudioTrackPrivateMediaStream : public AudioTrackPrivate {
</ins><span class="cx">     WTF_MAKE_NONCOPYABLE(AudioTrackPrivateMediaStream)
</span><span class="cx"> public:
</span><span class="cx">     static RefPtr&lt;AudioTrackPrivateMediaStream&gt; create(MediaStreamTrackPrivate&amp; streamTrack)
</span><span class="lines">@@ -53,7 +53,7 @@
</span><span class="cx">     MediaTime timelineOffset() const { return m_timelineOffset; }
</span><span class="cx">     void setTimelineOffset(const MediaTime&amp; offset) { m_timelineOffset = offset; }
</span><span class="cx"> 
</span><del>-private:
</del><ins>+protected:
</ins><span class="cx">     AudioTrackPrivateMediaStream(MediaStreamTrackPrivate&amp; track)
</span><span class="cx">         : m_streamTrack(track)
</span><span class="cx">         , m_id(track.id())
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivatecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -52,12 +52,12 @@
</span><span class="cx">     , m_isEnabled(true)
</span><span class="cx">     , m_isEnded(false)
</span><span class="cx"> {
</span><del>-    m_source-&gt;addObserver(this);
</del><ins>+    m_source-&gt;addObserver(*this);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> MediaStreamTrackPrivate::~MediaStreamTrackPrivate()
</span><span class="cx"> {
</span><del>-    m_source-&gt;removeObserver(this);
</del><ins>+    m_source-&gt;removeObserver(*this);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> void MediaStreamTrackPrivate::addObserver(MediaStreamTrackPrivate::Observer&amp; observer)
</span><span class="lines">@@ -198,7 +198,7 @@
</span><span class="cx">     return !m_isEnded;
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void MediaStreamTrackPrivate::sourceHasMoreMediaData(MediaSample&amp; mediaSample)
</del><ins>+void MediaStreamTrackPrivate::videoSampleAvailable(MediaSample&amp; mediaSample)
</ins><span class="cx"> {
</span><span class="cx">     mediaSample.setTrackID(id());
</span><span class="cx">     for (auto&amp; observer : m_observers)
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivateh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -100,7 +100,7 @@
</span><span class="cx">     void sourceMutedChanged() final;
</span><span class="cx">     void sourceSettingsChanged() final;
</span><span class="cx">     bool preventSourceFromStopping() final;
</span><del>-    void sourceHasMoreMediaData(MediaSample&amp;) final;
</del><ins>+    void videoSampleAvailable(MediaSample&amp;) final;
</ins><span class="cx"> 
</span><span class="cx">     Vector&lt;Observer*&gt; m_observers;
</span><span class="cx">     Ref&lt;RealtimeMediaSource&gt; m_source;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamRealtimeMediaSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -67,16 +67,16 @@
</span><span class="cx">     m_remote = false;
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RealtimeMediaSource::addObserver(RealtimeMediaSource::Observer* observer)
</del><ins>+void RealtimeMediaSource::addObserver(RealtimeMediaSource::Observer&amp; observer)
</ins><span class="cx"> {
</span><del>-    m_observers.append(observer);
</del><ins>+    m_observers.append(&amp;observer);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RealtimeMediaSource::removeObserver(RealtimeMediaSource::Observer* observer)
</del><ins>+void RealtimeMediaSource::removeObserver(RealtimeMediaSource::Observer&amp; observer)
</ins><span class="cx"> {
</span><del>-    size_t pos = m_observers.find(observer);
-    if (pos != notFound)
-        m_observers.remove(pos);
</del><ins>+    m_observers.removeFirstMatching([&amp;observer](auto* anObserver) {
+        return anObserver == &amp;observer;
+    });
</ins><span class="cx"> 
</span><span class="cx">     if (!m_observers.size())
</span><span class="cx">         stop();
</span><span class="lines">@@ -112,12 +112,19 @@
</span><span class="cx">     });
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RealtimeMediaSource::mediaDataUpdated(MediaSample&amp; mediaSample)
</del><ins>+void RealtimeMediaSource::videoSampleAvailable(MediaSample&amp; mediaSample)
</ins><span class="cx"> {
</span><del>-    for (auto&amp; observer : m_observers)
-        observer-&gt;sourceHasMoreMediaData(mediaSample);
</del><ins>+    ASSERT(isMainThread());
+    for (const auto&amp; observer : m_observers)
+        observer-&gt;videoSampleAvailable(mediaSample);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><ins>+void RealtimeMediaSource::audioSamplesAvailable(const MediaTime&amp; time, void* audioData, const AudioStreamDescription&amp; description, size_t numberOfFrames)
+{
+    for (const auto&amp; observer : m_observers)
+        observer-&gt;audioSamplesAvailable(time, audioData, description, numberOfFrames);
+}
+
</ins><span class="cx"> bool RealtimeMediaSource::readonly() const
</span><span class="cx"> {
</span><span class="cx">     return m_readonly;
</span><span class="lines">@@ -130,7 +137,7 @@
</span><span class="cx"> 
</span><span class="cx">     m_stopped = true;
</span><span class="cx"> 
</span><del>-    for (auto* observer : m_observers) {
</del><ins>+    for (const auto&amp; observer : m_observers) {
</ins><span class="cx">         if (observer != callingObserver)
</span><span class="cx">             observer-&gt;sourceStopped();
</span><span class="cx">     }
</span><span class="lines">@@ -143,7 +150,7 @@
</span><span class="cx">     if (stopped())
</span><span class="cx">         return;
</span><span class="cx"> 
</span><del>-    for (auto* observer : m_observers) {
</del><ins>+    for (const auto&amp; observer : m_observers) {
</ins><span class="cx">         if (observer-&gt;preventSourceFromStopping())
</span><span class="cx">             return;
</span><span class="cx">     }
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamRealtimeMediaSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -47,8 +47,13 @@
</span><span class="cx"> #include &lt;wtf/WeakPtr.h&gt;
</span><span class="cx"> #include &lt;wtf/text/WTFString.h&gt;
</span><span class="cx"> 
</span><ins>+namespace WTF {
+class MediaTime;
+}
+
</ins><span class="cx"> namespace WebCore {
</span><span class="cx"> 
</span><ins>+class AudioStreamDescription;
</ins><span class="cx"> class FloatRect;
</span><span class="cx"> class GraphicsContext;
</span><span class="cx"> class MediaStreamPrivate;
</span><span class="lines">@@ -68,8 +73,11 @@
</span><span class="cx">         // Observer state queries.
</span><span class="cx">         virtual bool preventSourceFromStopping() = 0;
</span><span class="cx">         
</span><del>-        // Media data changes.
-        virtual void sourceHasMoreMediaData(MediaSample&amp;) = 0;
</del><ins>+        // Called on the main thread.
+        virtual void videoSampleAvailable(MediaSample&amp;) { }
+
+        // May be called on a background thread.
+        virtual void audioSamplesAvailable(const MediaTime&amp;, void* /*audioData*/, const AudioStreamDescription&amp;, size_t /*numberOfFrames*/) { }
</ins><span class="cx">     };
</span><span class="cx"> 
</span><span class="cx">     virtual ~RealtimeMediaSource() { }
</span><span class="lines">@@ -99,7 +107,9 @@
</span><span class="cx">     virtual bool supportsConstraints(const MediaConstraints&amp;, String&amp;);
</span><span class="cx"> 
</span><span class="cx">     virtual void settingsDidChange();
</span><del>-    void mediaDataUpdated(MediaSample&amp;);
</del><ins>+
+    void videoSampleAvailable(MediaSample&amp;);
+    void audioSamplesAvailable(const MediaTime&amp;, void*, const AudioStreamDescription&amp;, size_t);
</ins><span class="cx">     
</span><span class="cx">     bool stopped() const { return m_stopped; }
</span><span class="cx"> 
</span><span class="lines">@@ -112,8 +122,8 @@
</span><span class="cx">     virtual bool remote() const { return m_remote; }
</span><span class="cx">     virtual void setRemote(bool remote) { m_remote = remote; }
</span><span class="cx"> 
</span><del>-    void addObserver(Observer*);
-    void removeObserver(Observer*);
</del><ins>+    void addObserver(Observer&amp;);
+    void removeObserver(Observer&amp;);
</ins><span class="cx"> 
</span><span class="cx">     virtual void startProducingData() { }
</span><span class="cx">     virtual void stopProducingData() { }
</span></span></pre></div>
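<p>The RealtimeMediaSource changes above split the single sourceHasMoreMediaData() callback into videoSampleAvailable(), called on the main thread, and audioSamplesAvailable(), which may be called on a background thread, and switch addObserver()/removeObserver() to take references. A minimal standalone sketch of that observer fan-out, assuming std::vector in place of WTF::Vector, a double in place of MediaTime, and opaque stand-ins for MediaSample and AudioStreamDescription:</p>
<pre>
#include &lt;algorithm&gt;
#include &lt;cstddef&gt;
#include &lt;vector&gt;

struct MediaSample;            // opaque stand-in
struct AudioStreamDescription; // opaque stand-in

class Source {
public:
    class Observer {
    public:
        virtual ~Observer() = default;
        // Called on the main thread.
        virtual void videoSampleAvailable(MediaSample&amp;) { }
        // May be called on a background thread.
        virtual void audioSamplesAvailable(double /*time*/, void* /*audioData*/, const AudioStreamDescription&amp;, size_t /*numberOfFrames*/) { }
    };

    void addObserver(Observer&amp; observer) { m_observers.push_back(&amp;observer); }

    void removeObserver(Observer&amp; observer)
    {
        // Equivalent of Vector::removeFirstMatching(): drop the first matching pointer.
        auto it = std::find(m_observers.begin(), m_observers.end(), &amp;observer);
        if (it != m_observers.end())
            m_observers.erase(it);
        if (m_observers.empty())
            stop();
    }

    void audioSamplesAvailable(double time, void* audioData, const AudioStreamDescription&amp; description, size_t numberOfFrames)
    {
        for (auto* observer : m_observers)
            observer-&gt;audioSamplesAvailable(time, audioData, description, numberOfFrames);
    }

private:
    void stop() { /* stop producing data once the last observer is gone */ }

    std::vector&lt;Observer*&gt; m_observers;
};
</pre>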
<a id="trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -30,8 +30,10 @@
</span><span class="cx"> 
</span><span class="cx"> #include &quot;AVMediaCaptureSource.h&quot;
</span><span class="cx"> #include &quot;AudioCaptureSourceProviderObjC.h&quot;
</span><ins>+#include &quot;CAAudioStreamDescription.h&quot;
</ins><span class="cx"> #include &lt;wtf/Lock.h&gt;
</span><span class="cx"> 
</span><ins>+typedef struct AudioBufferList AudioBufferList;
</ins><span class="cx"> typedef struct AudioStreamBasicDescription AudioStreamBasicDescription;
</span><span class="cx"> typedef const struct opaqueCMFormatDescription *CMFormatDescriptionRef;
</span><span class="cx"> 
</span><span class="lines">@@ -64,9 +66,11 @@
</span><span class="cx">     AudioSourceProvider* audioSourceProvider() override;
</span><span class="cx"> 
</span><span class="cx">     RetainPtr&lt;AVCaptureConnection&gt; m_audioConnection;
</span><ins>+    size_t m_listBufferSize { 0 };
+    std::unique_ptr&lt;AudioBufferList&gt; m_list;
</ins><span class="cx"> 
</span><span class="cx">     RefPtr&lt;WebAudioSourceProviderAVFObjC&gt; m_audioSourceProvider;
</span><del>-    std::unique_ptr&lt;AudioStreamBasicDescription&gt; m_inputDescription;
</del><ins>+    std::unique_ptr&lt;CAAudioStreamDescription&gt; m_inputDescription;
</ins><span class="cx">     Vector&lt;AudioSourceObserverObjC*&gt; m_observers;
</span><span class="cx">     Lock m_lock;
</span><span class="cx"> };
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourcemm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -28,7 +28,9 @@
</span><span class="cx"> 
</span><span class="cx"> #if ENABLE(MEDIA_STREAM) &amp;&amp; USE(AVFOUNDATION)
</span><span class="cx"> 
</span><ins>+#import &quot;AudioSampleBufferList.h&quot;
</ins><span class="cx"> #import &quot;AudioSourceObserverObjC.h&quot;
</span><ins>+#import &quot;CAAudioStreamDescription.h&quot;
</ins><span class="cx"> #import &quot;Logging.h&quot;
</span><span class="cx"> #import &quot;MediaConstraints.h&quot;
</span><span class="cx"> #import &quot;MediaSampleAVFObjC.h&quot;
</span><span class="lines">@@ -91,7 +93,6 @@
</span><span class="cx"> AVAudioCaptureSource::AVAudioCaptureSource(AVCaptureDeviceTypedef* device, const AtomicString&amp; id)
</span><span class="cx">     : AVMediaCaptureSource(device, id, RealtimeMediaSource::Audio)
</span><span class="cx"> {
</span><del>-    m_inputDescription = std::make_unique&lt;AudioStreamBasicDescription&gt;();
</del><span class="cx"> }
</span><span class="cx">     
</span><span class="cx"> AVAudioCaptureSource::~AVAudioCaptureSource()
</span><span class="lines">@@ -120,8 +121,8 @@
</span><span class="cx"> {
</span><span class="cx">     LockHolder lock(m_lock);
</span><span class="cx">     m_observers.append(&amp;observer);
</span><del>-    if (m_inputDescription-&gt;mSampleRate)
-        observer.prepare(m_inputDescription.get());
</del><ins>+    if (m_inputDescription)
+        observer.prepare(&amp;m_inputDescription-&gt;streamDescription());
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> void AVAudioCaptureSource::removeObserver(AudioSourceObserverObjC&amp; observer)
</span><span class="lines">@@ -162,7 +163,7 @@
</span><span class="cx">         LockHolder lock(m_lock);
</span><span class="cx"> 
</span><span class="cx">         m_audioConnection = nullptr;
</span><del>-        m_inputDescription = std::make_unique&lt;AudioStreamBasicDescription&gt;();
</del><ins>+        m_inputDescription = nullptr;
</ins><span class="cx"> 
</span><span class="cx">         for (auto&amp; observer : m_observers)
</span><span class="cx">             observer-&gt;unprepare();
</span><span class="lines">@@ -174,23 +175,6 @@
</span><span class="cx">     m_audioSourceProvider = nullptr;
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-static bool operator==(const AudioStreamBasicDescription&amp; a, const AudioStreamBasicDescription&amp; b)
-{
-    return a.mSampleRate == b.mSampleRate
-        &amp;&amp; a.mFormatID == b.mFormatID
-        &amp;&amp; a.mFormatFlags == b.mFormatFlags
-        &amp;&amp; a.mBytesPerPacket == b.mBytesPerPacket
-        &amp;&amp; a.mFramesPerPacket == b.mFramesPerPacket
-        &amp;&amp; a.mBytesPerFrame == b.mBytesPerFrame
-        &amp;&amp; a.mChannelsPerFrame == b.mChannelsPerFrame
-        &amp;&amp; a.mBitsPerChannel == b.mBitsPerChannel;
-}
-
-static bool operator!=(const AudioStreamBasicDescription&amp; a, const AudioStreamBasicDescription&amp; b)
-{
-    return !(a == b);
-}
-
</del><span class="cx"> void AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
</span><span class="cx"> {
</span><span class="cx">     if (muted())
</span><span class="lines">@@ -200,11 +184,6 @@
</span><span class="cx">     if (!formatDescription)
</span><span class="cx">         return;
</span><span class="cx"> 
</span><del>-    RetainPtr&lt;CMSampleBufferRef&gt; buffer = sampleBuffer;
-    scheduleDeferredTask([this, buffer] {
-        mediaDataUpdated(MediaSampleAVFObjC::create(buffer.get()));
-    });
-
</del><span class="cx">     std::unique_lock&lt;Lock&gt; lock(m_lock, std::try_to_lock);
</span><span class="cx">     if (!lock.owns_lock()) {
</span><span class="cx">         // Failed to acquire the lock, just return instead of blocking.
</span><span class="lines">@@ -211,16 +190,31 @@
</span><span class="cx">         return;
</span><span class="cx">     }
</span><span class="cx"> 
</span><ins>+    const AudioStreamBasicDescription* streamDescription = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
+    if (!m_inputDescription || *m_inputDescription != *streamDescription) {
+        m_inputDescription = std::make_unique&lt;CAAudioStreamDescription&gt;(*streamDescription);
+        m_listBufferSize = AudioSampleBufferList::audioBufferListSizeForStream(*m_inputDescription.get());
+        m_list = std::unique_ptr&lt;AudioBufferList&gt;(static_cast&lt;AudioBufferList*&gt;(::operator new (m_listBufferSize)));
+        memset(m_list.get(), 0, m_listBufferSize);
+        m_list-&gt;mNumberBuffers = m_inputDescription-&gt;numberOfChannelStreams();
+
+        if (!m_observers.isEmpty()) {
+            for (auto&amp; observer : m_observers)
+                observer-&gt;prepare(streamDescription);
+        }
+    }
+
+    CMItemCount frameCount = CMSampleBufferGetNumSamples(sampleBuffer);
+    CMBlockBufferRef buffer = nil;
+    OSStatus err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, nullptr, m_list.get(), m_listBufferSize, kCFAllocatorSystemDefault, kCFAllocatorSystemDefault, kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &amp;buffer);
+    if (!err)
+        audioSamplesAvailable(toMediaTime(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)), m_list-&gt;mBuffers[0].mData, CAAudioStreamDescription(*streamDescription), frameCount);
+    else
+        LOG_ERROR(&quot;AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection(%p) - CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer returned error %d (%.4s)&quot;, this, (int)err, (char*)&amp;err);
+
</ins><span class="cx">     if (m_observers.isEmpty())
</span><span class="cx">         return;
</span><span class="cx"> 
</span><del>-    const AudioStreamBasicDescription* streamDescription = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
-    if (*m_inputDescription != *streamDescription) {
-        m_inputDescription = std::make_unique&lt;AudioStreamBasicDescription&gt;(*streamDescription);
-        for (auto&amp; observer : m_observers)
-            observer-&gt;prepare(m_inputDescription.get());
-    }
-
</del><span class="cx">     for (auto&amp; observer : m_observers)
</span><span class="cx">         observer-&gt;process(formatDescription, sampleBuffer);
</span><span class="cx"> }
</span></span></pre></div>
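<p>Above, the audio capture callback no longer repackages each buffer as a MediaSampleAVFObjC on the main thread; it copies the CMSampleBuffer's audio into a heap-allocated AudioBufferList via CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer() and hands it to audioSamplesAvailable(). The sizing helper audioBufferListSizeForStream() is a WebKit utility whose implementation is not shown in this change; the sketch below uses the conventional Core Audio sizing for a variable-length AudioBufferList and mirrors the allocation pattern used in the patch (Apple platform headers assumed):</p>
<pre>
#include &lt;CoreAudio/CoreAudioTypes.h&gt;
#include &lt;cstddef&gt;
#include &lt;cstring&gt;
#include &lt;memory&gt;

// offsetof(AudioBufferList, mBuffers) covers the fixed header; each channel
// stream then needs one AudioBuffer entry in the flexible-length tail.
static size_t audioBufferListSize(UInt32 numberOfBuffers)
{
    return offsetof(AudioBufferList, mBuffers) + numberOfBuffers * sizeof(AudioBuffer);
}

static std::unique_ptr&lt;AudioBufferList&gt; allocateAudioBufferList(UInt32 numberOfBuffers)
{
    size_t size = audioBufferListSize(numberOfBuffers);
    // Same pattern as the patch: raw storage from the generic operator new,
    // zeroed, then used as an AudioBufferList.
    auto* list = static_cast&lt;AudioBufferList*&gt;(::operator new (size));
    memset(list, 0, size);
    list-&gt;mNumberBuffers = numberOfBuffers;
    return std::unique_ptr&lt;AudioBufferList&gt;(list);
}
</pre>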
<a id="trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourcemm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -423,7 +423,7 @@
</span><span class="cx">     if (settingsChanged)
</span><span class="cx">         settingsDidChange();
</span><span class="cx"> 
</span><del>-    mediaDataUpdated(MediaSampleAVFObjC::create(m_buffer.get()));
</del><ins>+    videoSampleAvailable(MediaSampleAVFObjC::create(m_buffer.get()));
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoacpp"></a>
<div class="addfile"><h4>Added: trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp (0 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp                                (rev 0)
+++ trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -0,0 +1,254 @@
</span><ins>+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#include &quot;config.h&quot;
+#include &quot;AudioTrackPrivateMediaStreamCocoa.h&quot;
+
+#include &quot;AudioSampleBufferList.h&quot;
+#include &quot;AudioSampleDataSource.h&quot;
+#include &quot;AudioSession.h&quot;
+#include &quot;CAAudioStreamDescription.h&quot;
+#include &quot;Logging.h&quot;
+
+#include &quot;CoreMediaSoftLink.h&quot;
+
+#if ENABLE(VIDEO_TRACK)
+
+namespace WebCore {
+
+const int renderBufferSize = 128;
+
+AudioTrackPrivateMediaStreamCocoa::AudioTrackPrivateMediaStreamCocoa(MediaStreamTrackPrivate&amp; track)
+    : AudioTrackPrivateMediaStream(track)
+{
+    track.source().addObserver(*this);
+}
+
+AudioTrackPrivateMediaStreamCocoa::~AudioTrackPrivateMediaStreamCocoa()
+{
+    std::lock_guard&lt;Lock&gt; lock(m_internalStateLock);
+
+    streamTrack().source().removeObserver(*this);
+
+    if (m_dataSource)
+        m_dataSource-&gt;setPaused(true);
+
+    if (m_remoteIOUnit) {
+        AudioOutputUnitStop(m_remoteIOUnit);
+        AudioComponentInstanceDispose(m_remoteIOUnit);
+        m_remoteIOUnit = nullptr;
+    }
+
+    m_dataSource = nullptr;
+    m_inputDescription = nullptr;
+    m_outputDescription = nullptr;
+}
+
+void AudioTrackPrivateMediaStreamCocoa::playInternal()
+{
+    ASSERT(m_internalStateLock.isHeld());
+
+    if (m_isPlaying)
+        return;
+
+    if (m_remoteIOUnit) {
+        ASSERT(m_dataSource);
+        m_dataSource-&gt;setPaused(false);
+        if (!AudioOutputUnitStart(m_remoteIOUnit))
+            m_isPlaying = true;
+    }
+
+    m_autoPlay = !m_isPlaying;
+}
+
+void AudioTrackPrivateMediaStreamCocoa::play()
+{
+    std::lock_guard&lt;Lock&gt; lock(m_internalStateLock);
+    playInternal();
+}
+
+void AudioTrackPrivateMediaStreamCocoa::pause()
+{
+    std::lock_guard&lt;Lock&gt; lock(m_internalStateLock);
+
+    m_isPlaying = false;
+    m_autoPlay = false;
+
+    if (m_remoteIOUnit)
+        AudioOutputUnitStop(m_remoteIOUnit);
+    if (m_dataSource)
+        m_dataSource-&gt;setPaused(true);
+}
+
+void AudioTrackPrivateMediaStreamCocoa::setVolume(float volume)
+{
+    m_volume = volume;
+    if (m_dataSource)
+        m_dataSource-&gt;setVolume(m_volume);
+}
+
+OSStatus AudioTrackPrivateMediaStreamCocoa::setupAudioUnit()
+{
+    ASSERT(m_internalStateLock.isHeld());
+
+    AudioComponentDescription ioUnitDescription { kAudioUnitType_Output, 0, kAudioUnitManufacturer_Apple, 0, 0 };
+#if PLATFORM(IOS)
+    ioUnitDescription.componentSubType = kAudioUnitSubType_RemoteIO;
+#else
+    ioUnitDescription.componentSubType = kAudioUnitSubType_DefaultOutput;
+#endif
+
+    AudioComponent ioComponent = AudioComponentFindNext(nullptr, &amp;ioUnitDescription);
+    ASSERT(ioComponent);
+    if (!ioComponent) {
+        LOG(Media, &quot;AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to find remote IO unit component&quot;, this);
+        return -1;
+    }
+
+    OSStatus err = AudioComponentInstanceNew(ioComponent, &amp;m_remoteIOUnit);
+    if (err) {
+        LOG(Media, &quot;AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to open remote IO unit, error %d (%.4s)&quot;, this, (int)err, (char*)&amp;err);
+        return -1;
+    }
+
+#if PLATFORM(IOS)
+    UInt32 param = 1;
+    err = AudioUnitSetProperty(m_remoteIOUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0, &amp;param, sizeof(param));
+    if (err) {
+        LOG(Media, &quot;AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to enable remote IO unit output, error %d (%.4s)&quot;, this, (int)err, (char*)&amp;err);
+        return err;
+    }
+#endif
+
+    AURenderCallbackStruct callback = { inputProc, this };
+    err = AudioUnitSetProperty(m_remoteIOUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, 0, &amp;callback, sizeof(callback));
+    if (err) {
+        LOG(Media, &quot;AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to set remote IO unit render callback, error %d (%.4s)&quot;, this, (int)err, (char*)&amp;err);
+        return err;
+    }
+
+    AudioStreamBasicDescription outputDescription = { };
+    UInt32 size = sizeof(outputDescription);
+    err = AudioUnitGetProperty(m_remoteIOUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &amp;outputDescription, &amp;size);
+    if (err) {
+        LOG(Media, &quot;AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to get input stream format, error %d (%.4s)&quot;, this, (int)err, (char*)&amp;err);
+        return err;
+    }
+
+    outputDescription = m_inputDescription-&gt;streamDescription();
+    outputDescription.mSampleRate = AudioSession::sharedSession().sampleRate();
+
+    err = AudioUnitSetProperty(m_remoteIOUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &amp;outputDescription, sizeof(outputDescription));
+    if (err) {
+        LOG(Media, &quot;AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to set input stream format, error %d (%.4s)&quot;, this, (int)err, (char*)&amp;err);
+        return err;
+    }
+    m_outputDescription = std::make_unique&lt;CAAudioStreamDescription&gt;(outputDescription);
+
+    err = AudioUnitInitialize(m_remoteIOUnit);
+    if (err) {
+        LOG(Media, &quot;AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) AudioUnitInitialize() failed, error %d (%.4s)&quot;, this, (int)err, (char*)&amp;err);
+        return err;
+    }
+
+    AudioSession::sharedSession().setPreferredBufferSize(renderBufferSize);
+
+    return err;
+}
+
+void AudioTrackPrivateMediaStreamCocoa::audioSamplesAvailable(const MediaTime&amp; sampleTime, void* audioData, const AudioStreamDescription&amp; description, size_t sampleCount)
+{
+    ASSERT(description.platformDescription().type == PlatformDescription::CAAudioStreamBasicType);
+
+    std::lock_guard&lt;Lock&gt; lock(m_internalStateLock);
+
+    CAAudioStreamDescription streamDescription = toCAAudioStreamDescription(description);
+    if (!m_inputDescription || *m_inputDescription != description) {
+
+        m_inputDescription = nullptr;
+        m_outputDescription = nullptr;
+
+        if (m_remoteIOUnit) {
+            AudioOutputUnitStop(m_remoteIOUnit);
+            AudioComponentInstanceDispose(m_remoteIOUnit);
+            m_remoteIOUnit = nullptr;
+        }
+
+        m_inputDescription = std::make_unique&lt;CAAudioStreamDescription&gt;(streamDescription);
+        if (setupAudioUnit()) {
+            m_inputDescription = nullptr;
+            return;
+        }
+
+        if (!m_dataSource)
+            m_dataSource = AudioSampleDataSource::create(description.sampleRate() * 2);
+        if (!m_dataSource)
+            return;
+
+        if (m_dataSource-&gt;setInputFormat(streamDescription))
+            return;
+        if (m_dataSource-&gt;setOutputFormat(*m_outputDescription.get()))
+            return;
+
+        m_dataSource-&gt;setVolume(m_volume);
+    }
+
+    m_dataSource-&gt;pushSamples(m_inputDescription-&gt;streamDescription(), sampleTime, audioData, sampleCount);
+
+    if (m_autoPlay)
+        playInternal();
+}
+
+void AudioTrackPrivateMediaStreamCocoa::sourceStopped()
+{
+    pause();
+}
+
+OSStatus AudioTrackPrivateMediaStreamCocoa::render(UInt32 sampleCount, AudioBufferList&amp; ioData, UInt32 /*inBusNumber*/, const AudioTimeStamp&amp; timeStamp, AudioUnitRenderActionFlags&amp; actionFlags)
+{
+    std::unique_lock&lt;Lock&gt; lock(m_internalStateLock, std::try_to_lock);
+    if (!lock.owns_lock())
+        return kAudioConverterErr_UnspecifiedError;
+
+    if (!m_isPlaying || m_muted || !m_dataSource || streamTrack().muted() || streamTrack().ended() || !streamTrack().enabled()) {
+        AudioSampleBufferList::zeroABL(ioData, static_cast&lt;size_t&gt;(sampleCount));
+        actionFlags = kAudioUnitRenderAction_OutputIsSilence;
+        return 0;
+    }
+
+    m_dataSource-&gt;pullSamples(ioData, static_cast&lt;size_t&gt;(sampleCount), timeStamp.mSampleTime, timeStamp.mHostTime, AudioSampleDataSource::Copy);
+
+    return 0;
+}
+
+OSStatus AudioTrackPrivateMediaStreamCocoa::inputProc(void* userData, AudioUnitRenderActionFlags* actionFlags, const AudioTimeStamp* timeStamp, UInt32 inBusNumber, UInt32 sampleCount, AudioBufferList* ioData)
+{
+    return static_cast&lt;AudioTrackPrivateMediaStreamCocoa*&gt;(userData)-&gt;render(sampleCount, *ioData, inBusNumber, *timeStamp, *actionFlags);
+}
+
+
+} // namespace WebCore
+
+#endif // ENABLE(VIDEO_TRACK)
</ins></span></pre></div>
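<p>render() above runs on the output unit's I/O thread, so it must never block: it takes the state lock with try_to_lock and emits silence when the lock is contended or the track has nothing to play. A minimal standalone sketch of that non-blocking pattern, with std::mutex standing in for WTF::Lock and an illustrative Buffer/pullSamples() in place of the AudioBufferList plumbing:</p>
<pre>
#include &lt;cstddef&gt;
#include &lt;cstring&gt;
#include &lt;mutex&gt;

struct Buffer {
    float* data { nullptr };
    size_t frameCount { 0 };
};

class Renderer {
public:
    int render(Buffer&amp; output)
    {
        std::unique_lock&lt;std::mutex&gt; lock(m_stateLock, std::try_to_lock);
        if (!lock.owns_lock() || !m_isPlaying) {
            // Real-time safe fallback: never wait on the lock, just emit silence.
            memset(output.data, 0, output.frameCount * sizeof(float));
            return 0;
        }
        pullSamples(output);
        return 0;
    }

    void setPlaying(bool playing)
    {
        std::lock_guard&lt;std::mutex&gt; lock(m_stateLock);
        m_isPlaying = playing;
    }

private:
    void pullSamples(Buffer&amp;) { /* copy queued samples into the output here */ }

    std::mutex m_stateLock;
    bool m_isPlaying { false };
};
</pre>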
<a id="trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoah"></a>
<div class="addfile"><h4>Added: trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h (0 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h                                (rev 0)
+++ trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -0,0 +1,94 @@
</span><ins>+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#if ENABLE(VIDEO_TRACK) &amp;&amp; ENABLE(MEDIA_STREAM)
+
+#include &quot;AudioSourceObserverObjC.h&quot;
+#include &quot;AudioTrackPrivateMediaStream.h&quot;
+#include &lt;AudioToolbox/AudioToolbox.h&gt;
+#include &lt;CoreAudio/CoreAudioTypes.h&gt;
+#include &lt;wtf/Lock.h&gt;
+
+namespace WebCore {
+
+class AudioSampleDataSource;
+class AudioSampleBufferList;
+class CAAudioStreamDescription;
+
+class AudioTrackPrivateMediaStreamCocoa final : public AudioTrackPrivateMediaStream, private RealtimeMediaSource::Observer {
+    WTF_MAKE_NONCOPYABLE(AudioTrackPrivateMediaStreamCocoa)
+public:
+    static RefPtr&lt;AudioTrackPrivateMediaStreamCocoa&gt; create(MediaStreamTrackPrivate&amp; streamTrack)
+    {
+        return adoptRef(*new AudioTrackPrivateMediaStreamCocoa(streamTrack));
+    }
+
+    void play();
+    void pause();
+    bool isPlaying() { return m_isPlaying; }
+
+    void setVolume(float);
+    float volume() const { return m_volume; }
+
+    void setMuted(bool muted) { m_muted = muted; }
+    bool muted() const { return m_muted; }
+
+private:
+    AudioTrackPrivateMediaStreamCocoa(MediaStreamTrackPrivate&amp;);
+    ~AudioTrackPrivateMediaStreamCocoa();
+
+    // RealtimeMediaSource::Observer
+    void sourceStopped() final;
+    void sourceMutedChanged()  final { }
+    void sourceSettingsChanged() final { }
+    bool preventSourceFromStopping() final { return false; }
+    void audioSamplesAvailable(const MediaTime&amp;, void*, const AudioStreamDescription&amp;, size_t) final;
+
+    static OSStatus inputProc(void*, AudioUnitRenderActionFlags*, const AudioTimeStamp*, UInt32 inBusNumber, UInt32 numberOfFrames, AudioBufferList*);
+    OSStatus render(UInt32 sampleCount, AudioBufferList&amp;, UInt32 inBusNumber, const AudioTimeStamp&amp;, AudioUnitRenderActionFlags&amp;);
+
+    OSStatus setupAudioUnit();
+    void cleanup();
+    void zeroBufferList(AudioBufferList&amp;, size_t);
+    void playInternal();
+
+    AudioComponentInstance m_remoteIOUnit { nullptr };
+    std::unique_ptr&lt;CAAudioStreamDescription&gt; m_inputDescription;
+    std::unique_ptr&lt;CAAudioStreamDescription&gt; m_outputDescription;
+
+    RefPtr&lt;AudioSampleDataSource&gt; m_dataSource;
+
+    Lock m_internalStateLock;
+    float m_volume { 1 };
+    bool m_isPlaying { false };
+    bool m_autoPlay { false };
+    bool m_muted { false };
+};
+
+}
+
+#endif // ENABLE(VIDEO_TRACK) &amp;&amp; ENABLE(MEDIA_STREAM)
</ins></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMach"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -72,7 +72,7 @@
</span><span class="cx"> 
</span><span class="cx">     uint32_t m_maximiumFrameCount;
</span><span class="cx">     uint32_t m_sampleRate { 44100 };
</span><del>-    double m_bytesPerFrame { sizeof(Float32) };
</del><ins>+    uint64_t m_bytesEmitted { 0 };
</ins><span class="cx"> 
</span><span class="cx">     RetainPtr&lt;CMFormatDescriptionRef&gt; m_formatDescription;
</span><span class="cx">     AudioStreamBasicDescription m_streamFormat;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -32,6 +32,8 @@
</span><span class="cx"> #import &quot;MockRealtimeAudioSourceMac.h&quot;
</span><span class="cx"> 
</span><span class="cx"> #if ENABLE(MEDIA_STREAM)
</span><ins>+#import &quot;AudioSampleBufferList.h&quot;
+#import &quot;CAAudioStreamDescription.h&quot;
</ins><span class="cx"> #import &quot;MediaConstraints.h&quot;
</span><span class="cx"> #import &quot;MediaSampleAVFObjC.h&quot;
</span><span class="cx"> #import &quot;NotImplemented.h&quot;
</span><span class="lines">@@ -49,6 +51,11 @@
</span><span class="cx"> 
</span><span class="cx"> namespace WebCore {
</span><span class="cx"> 
</span><ins>+static inline size_t alignTo16Bytes(size_t size)
+{
+    return (size + 15) &amp; ~15;
+}
+
</ins><span class="cx"> RefPtr&lt;MockRealtimeAudioSource&gt; MockRealtimeAudioSource::create(const String&amp; name, const MediaConstraints* constraints)
</span><span class="cx"> {
</span><span class="cx">     auto source = adoptRef(new MockRealtimeAudioSourceMac(name));
</span><span class="lines">@@ -92,7 +99,11 @@
</span><span class="cx"> {
</span><span class="cx">     ASSERT(m_formatDescription);
</span><span class="cx"> 
</span><del>-    CMTime startTime = CMTimeMake(elapsedTime() * m_sampleRate, m_sampleRate);
</del><ins>+    CMTime startTime = CMTimeMake(m_bytesEmitted, m_sampleRate);
+    m_bytesEmitted += frameCount;
+
+    audioSamplesAvailable(toMediaTime(startTime), m_audioBufferList-&gt;mBuffers[0].mData, CAAudioStreamDescription(m_streamFormat), frameCount);
+
</ins><span class="cx">     CMSampleBufferRef sampleBuffer;
</span><span class="cx">     OSStatus result = CMAudioSampleBufferCreateWithPacketDescriptions(nullptr, nullptr, true, nullptr, nullptr, m_formatDescription.get(), frameCount, startTime, nullptr, &amp;sampleBuffer);
</span><span class="cx">     ASSERT(sampleBuffer);
</span><span class="lines">@@ -108,9 +119,7 @@
</span><span class="cx">     result = CMSampleBufferSetDataReady(sampleBuffer);
</span><span class="cx">     ASSERT(!result);
</span><span class="cx"> 
</span><del>-    mediaDataUpdated(MediaSampleAVFObjC::create(sampleBuffer));
-
-    for (auto&amp; observer : m_observers)
</del><ins>+    for (const auto&amp; observer : m_observers)
</ins><span class="cx">         observer-&gt;process(m_formatDescription.get(), sampleBuffer);
</span><span class="cx"> }
</span><span class="cx"> 
</span><span class="lines">@@ -119,10 +128,24 @@
</span><span class="cx">     m_maximiumFrameCount = WTF::roundUpToPowerOfTwo(renderInterval() / 1000. * m_sampleRate * 2);
</span><span class="cx">     ASSERT(m_maximiumFrameCount);
</span><span class="cx"> 
</span><ins>+    const int bytesPerFloat = sizeof(Float32);
+    const int bitsPerByte = 8;
+    int channelCount = 1;
+    m_streamFormat = { };
+    m_streamFormat.mSampleRate = m_sampleRate;
+    m_streamFormat.mFormatID = kAudioFormatLinearPCM;
+    m_streamFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked;
+    m_streamFormat.mBytesPerPacket = bytesPerFloat * channelCount;
+    m_streamFormat.mFramesPerPacket = 1;
+    m_streamFormat.mBytesPerFrame = bytesPerFloat * channelCount;
+    m_streamFormat.mChannelsPerFrame = channelCount;
+    m_streamFormat.mBitsPerChannel = bitsPerByte * bytesPerFloat;
+
</ins><span class="cx">     // AudioBufferList is a variable-length struct, so create on the heap with a generic new() operator
</span><span class="cx">     // with a custom size, and initialize the struct manually.
</span><del>-    uint32_t bufferDataSize = m_bytesPerFrame * m_maximiumFrameCount;
-    uint32_t baseSize = offsetof(AudioBufferList, mBuffers) + sizeof(AudioBuffer);
</del><ins>+    uint32_t bufferDataSize = m_streamFormat.mBytesPerFrame * m_maximiumFrameCount;
+    uint32_t baseSize = AudioSampleBufferList::audioBufferListSizeForStream(m_streamFormat);
+
</ins><span class="cx">     uint64_t bufferListSize = baseSize + bufferDataSize;
</span><span class="cx">     ASSERT(bufferListSize &lt;= SIZE_MAX);
</span><span class="cx">     if (bufferListSize &gt; SIZE_MAX)
</span><span class="lines">@@ -132,24 +155,9 @@
</span><span class="cx">     m_audioBufferList = std::unique_ptr&lt;AudioBufferList&gt;(static_cast&lt;AudioBufferList*&gt;(::operator new (m_audioBufferListBufferSize)));
</span><span class="cx">     memset(m_audioBufferList.get(), 0, m_audioBufferListBufferSize);
</span><span class="cx"> 
</span><del>-    m_audioBufferList-&gt;mNumberBuffers = 1;
-    auto&amp; buffer = m_audioBufferList-&gt;mBuffers[0];
-    buffer.mNumberChannels = 1;
-    buffer.mDataByteSize = bufferDataSize;
-    buffer.mData = reinterpret_cast&lt;uint8_t*&gt;(m_audioBufferList.get()) + baseSize;
</del><ins>+    uint8_t* bufferData = reinterpret_cast&lt;uint8_t*&gt;(m_audioBufferList.get()) + baseSize;
+    AudioSampleBufferList::configureBufferListForStream(*m_audioBufferList.get(), m_streamFormat, bufferData, bufferDataSize);
</ins><span class="cx"> 
</span><del>-    const int bytesPerFloat = sizeof(Float32);
-    const int bitsPerByte = 8;
-    m_streamFormat = { };
-    m_streamFormat.mSampleRate = m_sampleRate;
-    m_streamFormat.mFormatID = kAudioFormatLinearPCM;
-    m_streamFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked | kAudioFormatFlagIsNonInterleaved;
-    m_streamFormat.mBytesPerPacket = bytesPerFloat;
-    m_streamFormat.mFramesPerPacket = 1;
-    m_streamFormat.mBytesPerFrame = bytesPerFloat;
-    m_streamFormat.mChannelsPerFrame = 1;
-    m_streamFormat.mBitsPerChannel = bitsPerByte * bytesPerFloat;
-
</del><span class="cx">     CMFormatDescriptionRef formatDescription;
</span><span class="cx">     CMAudioFormatDescriptionCreate(NULL, &amp;m_streamFormat, 0, NULL, 0, NULL, NULL, &amp;formatDescription);
</span><span class="cx">     m_formatDescription = adoptCF(formatDescription);
</span><span class="lines">@@ -162,11 +170,12 @@
</span><span class="cx"> {
</span><span class="cx">     static double theta;
</span><span class="cx">     static const double frequencies[] = { 1500., 500. };
</span><ins>+    static const double tau = 2 * M_PI;
</ins><span class="cx"> 
</span><span class="cx">     if (!m_audioBufferList)
</span><span class="cx">         reconfigure();
</span><span class="cx"> 
</span><del>-    uint32_t totalFrameCount = delta * m_sampleRate;
</del><ins>+    uint32_t totalFrameCount = alignTo16Bytes(delta * m_sampleRate);
</ins><span class="cx">     uint32_t frameCount = std::min(totalFrameCount, m_maximiumFrameCount);
</span><span class="cx">     double elapsed = elapsedTime();
</span><span class="cx">     while (frameCount) {
</span><span class="lines">@@ -180,7 +189,7 @@
</span><span class="cx">             case 0:
</span><span class="cx">             case 14: {
</span><span class="cx">                 int index = fmod(elapsed, 1) * 2;
</span><del>-                increment = 2.0 * M_PI * frequencies[index] / m_sampleRate;
</del><ins>+                increment = tau * frequencies[index] / m_sampleRate;
</ins><span class="cx">                 silent = false;
</span><span class="cx">                 break;
</span><span class="cx">             }
</span><span class="lines">@@ -193,10 +202,12 @@
</span><span class="cx">                 continue;
</span><span class="cx">             }
</span><span class="cx"> 
</span><del>-            buffer[frame] = sin(theta) * 0.25;
-            theta += increment;
-            if (theta &gt; 2.0 * M_PI)
-                theta -= 2.0 * M_PI;
</del><ins>+            float tone = sin(theta) * 0.25;
+            buffer[frame] = tone;
+
+            theta += increment;
+            if (theta &gt; tau)
+                theta -= tau;
</ins><span class="cx">             elapsed += 1 / m_sampleRate;
</span><span class="cx">         }
</span><span class="cx"> 
</span></span></pre></div>
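<p>The mock source's render loop above advances a sine phase by tau * frequency / sampleRate per frame and wraps it at tau, writing samples at amplitude 0.25. A standalone sketch of just that oscillator loop, with the buffer handling and the beep schedule omitted:</p>
<pre>
#include &lt;cmath&gt;
#include &lt;vector&gt;

// Fill `buffer` with a sine tone; `theta` carries the phase across calls.
static void renderTone(std::vector&lt;float&gt;&amp; buffer, double frequency, double sampleRate, double&amp; theta)
{
    const double tau = 2 * M_PI; // M_PI comes with &lt;cmath&gt; on Apple/clang toolchains
    const double increment = tau * frequency / sampleRate;
    for (auto&amp; sample : buffer) {
        sample = static_cast&lt;float&gt;(std::sin(theta) * 0.25);
        theta += increment;
        if (theta &gt; tau)
            theta -= tau;
    }
}
</pre>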
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeVideoSourceMacmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -125,7 +125,7 @@
</span><span class="cx">     auto pixelBuffer = pixelBufferFromCGImage(imageBuffer()-&gt;copyImage()-&gt;nativeImage().get());
</span><span class="cx">     auto sampleBuffer = CMSampleBufferFromPixelBuffer(pixelBuffer.get());
</span><span class="cx">     
</span><del>-    mediaDataUpdated(MediaSampleAVFObjC::create(sampleBuffer.get()));
</del><ins>+    videoSampleAvailable(MediaSampleAVFObjC::create(sampleBuffer.get()));
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> } // namespace WebCore
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -23,8 +23,7 @@
</span><span class="cx">  * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
</span><span class="cx">  */
</span><span class="cx"> 
</span><del>-#ifndef WebAudioSourceProviderAVFObjC_h
-#define WebAudioSourceProviderAVFObjC_h
</del><ins>+#pragma once
</ins><span class="cx"> 
</span><span class="cx"> #if ENABLE(WEB_AUDIO) &amp;&amp; ENABLE(MEDIA_STREAM)
</span><span class="cx"> 
</span><span class="lines">@@ -31,6 +30,7 @@
</span><span class="cx"> #include &quot;AudioCaptureSourceProviderObjC.h&quot;
</span><span class="cx"> #include &quot;AudioSourceObserverObjC.h&quot;
</span><span class="cx"> #include &quot;AudioSourceProvider.h&quot;
</span><ins>+#include &lt;wtf/Lock.h&gt;
</ins><span class="cx"> #include &lt;wtf/RefCounted.h&gt;
</span><span class="cx"> #include &lt;wtf/RefPtr.h&gt;
</span><span class="cx"> 
</span><span class="lines">@@ -68,11 +68,11 @@
</span><span class="cx">     std::unique_ptr&lt;AudioStreamBasicDescription&gt; m_outputDescription;
</span><span class="cx">     std::unique_ptr&lt;CARingBuffer&gt; m_ringBuffer;
</span><span class="cx"> 
</span><del>-    uint64_t m_writeAheadCount { 0 };
</del><span class="cx">     uint64_t m_writeCount { 0 };
</span><span class="cx">     uint64_t m_readCount { 0 };
</span><span class="cx">     AudioSourceProviderClient* m_client { nullptr };
</span><span class="cx">     AudioCaptureSourceProviderObjC* m_captureSource { nullptr };
</span><ins>+    Lock m_mutex;
</ins><span class="cx">     bool m_connected { false };
</span><span class="cx"> };
</span><span class="cx">     
</span><span class="lines">@@ -79,5 +79,3 @@
</span><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> #endif
</span><del>-
-#endif
</del></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -1,5 +1,5 @@
</span><span class="cx"> /*
</span><del>- * Copyright (C) 2015 Apple Inc. All rights reserved.
</del><ins>+ * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
</ins><span class="cx">  *
</span><span class="cx">  * Redistribution and use in source and binary forms, with or without
</span><span class="cx">  * modification, are permitted provided that the following conditions
</span><span class="lines">@@ -65,6 +65,8 @@
</span><span class="cx"> 
</span><span class="cx"> WebAudioSourceProviderAVFObjC::~WebAudioSourceProviderAVFObjC()
</span><span class="cx"> {
</span><ins>+    std::lock_guard&lt;Lock&gt; lock(m_mutex);
+
</ins><span class="cx">     if (m_converter) {
</span><span class="cx">         // FIXME: make and use a smart pointer for AudioConverter
</span><span class="cx">         AudioConverterDispose(m_converter);
</span><span class="lines">@@ -76,7 +78,8 @@
</span><span class="cx"> 
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::provideInput(AudioBus* bus, size_t framesToProcess)
</span><span class="cx"> {
</span><del>-    if (!m_ringBuffer) {
</del><ins>+    std::unique_lock&lt;Lock&gt; lock(m_mutex, std::try_to_lock);
+    if (!lock.owns_lock() || !m_ringBuffer) {
</ins><span class="cx">         bus-&gt;zero();
</span><span class="cx">         return;
</span><span class="cx">     }
</span><span class="lines">@@ -85,12 +88,12 @@
</span><span class="cx">     uint64_t endFrame = 0;
</span><span class="cx">     m_ringBuffer-&gt;getCurrentFrameBounds(startFrame, endFrame);
</span><span class="cx"> 
</span><del>-    if (m_writeCount &lt;= m_readCount + m_writeAheadCount) {
</del><ins>+    if (m_writeCount &lt;= m_readCount) {
</ins><span class="cx">         bus-&gt;zero();
</span><span class="cx">         return;
</span><span class="cx">     }
</span><span class="cx"> 
</span><del>-    uint64_t framesAvailable = endFrame - (m_readCount + m_writeAheadCount);
</del><ins>+    uint64_t framesAvailable = endFrame - m_readCount;
</ins><span class="cx">     if (framesAvailable &lt; framesToProcess) {
</span><span class="cx">         framesToProcess = static_cast&lt;size_t&gt;(framesAvailable);
</span><span class="cx">         bus-&gt;zero();
</span><span class="lines">@@ -136,6 +139,8 @@
</span><span class="cx"> 
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::prepare(const AudioStreamBasicDescription* format)
</span><span class="cx"> {
</span><ins>+    std::lock_guard&lt;Lock&gt; lock(m_mutex);
+
</ins><span class="cx">     LOG(Media, &quot;WebAudioSourceProviderAVFObjC::prepare(%p)&quot;, this);
</span><span class="cx"> 
</span><span class="cx">     m_inputDescription = std::make_unique&lt;AudioStreamBasicDescription&gt;(*format);
</span><span class="lines">@@ -200,6 +205,8 @@
</span><span class="cx"> 
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::unprepare()
</span><span class="cx"> {
</span><ins>+    std::lock_guard&lt;Lock&gt; lock(m_mutex);
+
</ins><span class="cx">     m_inputDescription = nullptr;
</span><span class="cx">     m_outputDescription = nullptr;
</span><span class="cx">     m_ringBuffer = nullptr;
</span><span class="lines">@@ -214,6 +221,8 @@
</span><span class="cx"> 
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::process(CMFormatDescriptionRef, CMSampleBufferRef sampleBuffer)
</span><span class="cx"> {
</span><ins>+    std::lock_guard&lt;Lock&gt; lock(m_mutex);
+
</ins><span class="cx">     if (!m_ringBuffer)
</span><span class="cx">         return;
</span><span class="cx"> 
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmockMockRealtimeAudioSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -56,7 +56,7 @@
</span><span class="cx">     virtual void render(double) { }
</span><span class="cx"> 
</span><span class="cx">     double elapsedTime();
</span><del>-    static int renderInterval() { return 125; }
</del><ins>+    static int renderInterval() { return 60; }
</ins><span class="cx"> 
</span><span class="cx"> private:
</span><span class="cx"> 
</span></span></pre>
</div>
</div>

</body>
</html>