<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[211728] trunk/Source/WebCore</title>
</head>
<body>
<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; }
#msg dl a { font-weight: bold}
#msg dl a:link { color:#fc3; }
#msg dl a:active { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/211728">211728</a></dd>
<dt>Author</dt> <dd>eric.carlson@apple.com</dd>
<dt>Date</dt> <dd>2017-02-06 09:22:27 -0800 (Mon, 06 Feb 2017)</dd>
</dl>
<h3>Log Message</h3>
<pre>[MediaStream Mac] Stop using AVSampleBufferAudioRenderer
https://bugs.webkit.org/show_bug.cgi?id=167821

Reviewed by Jer Noble.

* WebCore.xcodeproj/project.pbxproj: Add new files.

* platform/audio/mac/AudioSampleDataSource.cpp:
(WebCore::AudioSampleDataSource::pullSamplesInternal): Don't assume the first timestamp from the
render proc after a pause is zero.

Stop using an audio renderer for each audio track. With no audio renderers, we no longer need
an AVSampleBufferRenderSynchronizer.
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
(-[WebAVSampleBufferStatusChangeListener invalidate]): No more audio renderers.
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC): Pause
audio tracks explicitly.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider): Remove the existing code;
it was incorrect and not thread-safe.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers): No more audio renderers.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): No more render synchronizer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): Start each audio track.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): Pause each audio track.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setVolume): Pass the command to each audio track.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setMuted): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::streamTime): No more render synchronizer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): Don't handle audio samples.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateTracks): Update for audio track class change. No
more render synchronizer.
(-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]): Deleted.
(-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange): Deleted.

* platform/mediastream/AudioTrackPrivateMediaStream.h:

* platform/mediastream/MediaStreamTrackPrivate.cpp:
(WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate): add/removeObserver takes a reference,
not a pointer.
(WebCore::MediaStreamTrackPrivate::~MediaStreamTrackPrivate): Ditto.
(WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Renamed from sourceHasMoreMediaData.
(WebCore::MediaStreamTrackPrivate::sourceHasMoreMediaData): Deleted.
* platform/mediastream/MediaStreamTrackPrivate.h:

* platform/mediastream/RealtimeMediaSource.cpp:
(WebCore::RealtimeMediaSource::addObserver): Take a reference, not a pointer.
(WebCore::RealtimeMediaSource::removeObserver): Ditto.
(WebCore::RealtimeMediaSource::videoSampleAvailable): Renamed from mediaDataUpdated.
(WebCore::RealtimeMediaSource::audioSamplesAvailable): New.
(WebCore::RealtimeMediaSource::stop): Drive-by cleanup.
(WebCore::RealtimeMediaSource::requestStop): Ditto.
(WebCore::RealtimeMediaSource::mediaDataUpdated): Deleted.
* platform/mediastream/RealtimeMediaSource.h:

* platform/mediastream/mac/AVAudioCaptureSource.h:
* platform/mediastream/mac/AVAudioCaptureSource.mm:
(WebCore::AVAudioCaptureSource::AVAudioCaptureSource):
(WebCore::AVAudioCaptureSource::addObserver):
(WebCore::AVAudioCaptureSource::shutdownCaptureSession):
(WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection):
(WebCore::operator==): Deleted.
(WebCore::operator!=): Deleted.

* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::processNewFrame): Call videoSampleAvailable, not mediaDataUpdated.

Render audio with a CoreAudio output unit.
* platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp: Added.
(WebCore::AudioTrackPrivateMediaStreamCocoa::AudioTrackPrivateMediaStreamCocoa):
(WebCore::AudioTrackPrivateMediaStreamCocoa::~AudioTrackPrivateMediaStreamCocoa):
(WebCore::AudioTrackPrivateMediaStreamCocoa::playInternal):
(WebCore::AudioTrackPrivateMediaStreamCocoa::play):
(WebCore::AudioTrackPrivateMediaStreamCocoa::pause):
(WebCore::AudioTrackPrivateMediaStreamCocoa::setVolume):
(WebCore::AudioTrackPrivateMediaStreamCocoa::setupAudioUnit):
(WebCore::AudioTrackPrivateMediaStreamCocoa::audioSamplesAvailable):
(WebCore::AudioTrackPrivateMediaStreamCocoa::sourceStopped):
(WebCore::AudioTrackPrivateMediaStreamCocoa::render):
(WebCore::AudioTrackPrivateMediaStreamCocoa::inputProc):
* platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h: Added.

* platform/mediastream/mac/MockRealtimeAudioSourceMac.h:
* platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
(WebCore::alignTo16Bytes):
(WebCore::MockRealtimeAudioSourceMac::emitSampleBuffers):
(WebCore::MockRealtimeAudioSourceMac::reconfigure): Minor cleanup.
(WebCore::MockRealtimeAudioSourceMac::render): Ditto.

* platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
(WebCore::MockRealtimeVideoSourceMac::updateSampleBuffer): Call videoSampleAvailable, not mediaDataUpdated.

* platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h:
* platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm:
(WebCore::WebAudioSourceProviderAVFObjC::~WebAudioSourceProviderAVFObjC):
(WebCore::WebAudioSourceProviderAVFObjC::provideInput): Use a mutex. Get rid of m_writeAheadCount;
it is always 0.
(WebCore::WebAudioSourceProviderAVFObjC::prepare): Use a lock.
(WebCore::WebAudioSourceProviderAVFObjC::unprepare): Ditto.
(WebCore::WebAudioSourceProviderAVFObjC::process): Ditto.
* platform/mock/MockRealtimeAudioSource.h:
(WebCore::MockRealtimeAudioSource::renderInterval): Decrease the render interval.</pre>
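The WebAudioSourceProviderAVFObjC changes above guard the shared sample data with a mutex across provideInput(), prepare(), unprepare(), and process(). The sketch below illustrates the general shape of that kind of synchronization in plain C++: a producer-side process() and a consumer-side provideInput() sharing one buffer under a single mutex. The class and method names are placeholders, not the WebKit API, and the use of try_to_lock on the consumer side (so a real-time render thread never blocks and emits silence on contention) is a common real-time-audio pattern, not necessarily what WebKit does.

```cpp
#include <algorithm>
#include <cstddef>
#include <mutex>
#include <vector>

// Illustrative sketch only: a shared sample buffer guarded by one mutex.
// SampleProvider, process(), and provideInput() are hypothetical names.
class SampleProvider {
public:
    // Called on a producer thread: append newly captured samples.
    void process(const std::vector<float>& samples)
    {
        std::lock_guard<std::mutex> lock(m_mutex);
        m_buffer.insert(m_buffer.end(), samples.begin(), samples.end());
    }

    // Called on the render thread: copy out up to frameCount samples.
    // try_to_lock keeps this call non-blocking; on contention it writes
    // silence for this quantum and reports zero frames delivered.
    size_t provideInput(float* output, size_t frameCount)
    {
        std::unique_lock<std::mutex> lock(m_mutex, std::try_to_lock);
        if (!lock.owns_lock()) {
            std::fill(output, output + frameCount, 0.0f);
            return 0;
        }
        size_t count = std::min(frameCount, m_buffer.size());
        std::copy(m_buffer.begin(), m_buffer.begin() + count, output);
        m_buffer.erase(m_buffer.begin(), m_buffer.begin() + count);
        return count;
    }

private:
    std::mutex m_mutex;
    std::vector<float> m_buffer;
};
```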
<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreWebCorexcodeprojprojectpbxproj">trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcecpp">trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCh">trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCmm">trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamAudioTrackPrivateMediaStreamh">trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivatecpp">trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivateh">trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamRealtimeMediaSourcecpp">trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreamRealtimeMediaSourceh">trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourceh">trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourcemm">trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourcemm">trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMach">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacMockRealtimeVideoSourceMacmm">trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCh">trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCmm">trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformmockMockRealtimeAudioSourceh">trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h</a></li>
</ul>
<h3>Added Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoacpp">trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoah">trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h</a></li>
</ul>
</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/ChangeLog        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -1,3 +1,115 @@
</span><ins>+2017-02-06 Eric Carlson <eric.carlson@apple.com>
+
+ [MediaStream Mac] Stop using AVSampleBufferAudioRenderer
+ https://bugs.webkit.org/show_bug.cgi?id=167821
+
+ Reviewed by Jer Noble.
+
+ * WebCore.xcodeproj/project.pbxproj: Add new files.
+
+ * platform/audio/mac/AudioSampleDataSource.cpp:
+ (WebCore::AudioSampleDataSource::pullSamplesInternal): Don't assume the first timestamp from the
+ render proc after a pause is zero.
+
+ Stop using an audio renderer for each audio track. With no audio renderers, we no longer need
+ an AVSampleBufferRenderSynchronizer.
+ * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+ * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+ (-[WebAVSampleBufferStatusChangeListener invalidate]): No more audio renderers.
+ (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]): Ditto.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC): Ditto.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC): Pause
+ audio tracks explicitly.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider): Remove the existing code;
+ it was incorrect and not thread-safe.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers): No more audio renderers.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): No more render synchronizer.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Ditto.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): Start each audio track.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): Pause each audio track.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setVolume): Pass the command to each audio track.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setMuted): Ditto.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::streamTime): No more render synchronizer.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): Don't handle audio samples.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateTracks): Update for audio track class change. No
+ more render synchronizer.
+ (-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]): Deleted.
+ (-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]): Deleted.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample): Deleted.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData): Deleted.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer): Deleted.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer): Deleted.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers): Deleted.
+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange): Deleted.
+
+ * platform/mediastream/AudioTrackPrivateMediaStream.h:
+
+ * platform/mediastream/MediaStreamTrackPrivate.cpp:
+ (WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate): add/removeObserver takes a reference,
+ not a pointer.
+ (WebCore::MediaStreamTrackPrivate::~MediaStreamTrackPrivate): Ditto.
+ (WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Renamed from sourceHasMoreMediaData.
+ (WebCore::MediaStreamTrackPrivate::sourceHasMoreMediaData): Deleted.
+ * platform/mediastream/MediaStreamTrackPrivate.h:
+
+ * platform/mediastream/RealtimeMediaSource.cpp:
+ (WebCore::RealtimeMediaSource::addObserver): Take a reference, not a pointer.
+ (WebCore::RealtimeMediaSource::removeObserver): Ditto.
+ (WebCore::RealtimeMediaSource::videoSampleAvailable): Renamed from mediaDataUpdated.
+ (WebCore::RealtimeMediaSource::audioSamplesAvailable): New.
+ (WebCore::RealtimeMediaSource::stop): Drive-by cleanup.
+ (WebCore::RealtimeMediaSource::requestStop): Ditto.
+ (WebCore::RealtimeMediaSource::mediaDataUpdated): Deleted.
+ * platform/mediastream/RealtimeMediaSource.h:
+
+ * platform/mediastream/mac/AVAudioCaptureSource.h:
+ * platform/mediastream/mac/AVAudioCaptureSource.mm:
+ (WebCore::AVAudioCaptureSource::AVAudioCaptureSource):
+ (WebCore::AVAudioCaptureSource::addObserver):
+ (WebCore::AVAudioCaptureSource::shutdownCaptureSession):
+ (WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection):
+ (WebCore::operator==): Deleted.
+ (WebCore::operator!=): Deleted.
+
+ * platform/mediastream/mac/AVVideoCaptureSource.mm:
+ (WebCore::AVVideoCaptureSource::processNewFrame): Call videoSampleAvailable, not mediaDataUpdated.
+
+ Render audio with a CoreAudio output unit.
+ * platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp: Added.
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::AudioTrackPrivateMediaStreamCocoa):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::~AudioTrackPrivateMediaStreamCocoa):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::playInternal):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::play):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::pause):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::setVolume):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::setupAudioUnit):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::audioSamplesAvailable):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::sourceStopped):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::render):
+ (WebCore::AudioTrackPrivateMediaStreamCocoa::inputProc):
+ * platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h: Added.
+
+ * platform/mediastream/mac/MockRealtimeAudioSourceMac.h:
+ * platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
+ (WebCore::alignTo16Bytes):
+ (WebCore::MockRealtimeAudioSourceMac::emitSampleBuffers):
+ (WebCore::MockRealtimeAudioSourceMac::reconfigure): Minor cleanup.
+ (WebCore::MockRealtimeAudioSourceMac::render): Ditto.
+
+ * platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
+ (WebCore::MockRealtimeVideoSourceMac::updateSampleBuffer): Call videoSampleAvailable, not mediaDataUpdated.
+
+ * platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h:
+ * platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm:
+ (WebCore::WebAudioSourceProviderAVFObjC::~WebAudioSourceProviderAVFObjC):
+ (WebCore::WebAudioSourceProviderAVFObjC::provideInput): Use a mutex. Get rid of m_writeAheadCount;
+ it is always 0.
+ (WebCore::WebAudioSourceProviderAVFObjC::prepare): Use a lock.
+ (WebCore::WebAudioSourceProviderAVFObjC::unprepare): Ditto.
+ (WebCore::WebAudioSourceProviderAVFObjC::process): Ditto.
+ * platform/mock/MockRealtimeAudioSource.h:
+ (WebCore::MockRealtimeAudioSource::renderInterval): Decrease the render interval.
+
</ins><span class="cx"> 2017-02-06 Antoine Quint <graouts@apple.com>
</span><span class="cx">
</span><span class="cx"> [Modern Media Controls] Add a backdrop filter to the start button on macOS
</span></span></pre></div>
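Several entries in the ChangeLog above note that addObserver/removeObserver now take a reference rather than a pointer. The sketch below illustrates why that API shape is attractive: a reference cannot be null at the call site, so the registration path needs no null checks. The Observer and Source classes here are hypothetical stand-ins, not the WebKit MediaStreamTrackPrivate/RealtimeMediaSource classes.

```cpp
#include <set>

// Illustrative sketch: reference-taking observer registration.
class Observer {
public:
    virtual ~Observer() = default;
    virtual void videoSampleAvailable() { }
};

class Source {
public:
    // Taking Observer& makes null registration impossible at the call site.
    void addObserver(Observer& observer) { m_observers.insert(&observer); }
    void removeObserver(Observer& observer) { m_observers.erase(&observer); }

    void notifyVideoSampleAvailable()
    {
        for (auto* observer : m_observers)
            observer->videoSampleAvailable();
    }

private:
    // Stored as pointers internally; the reference-taking API guarantees
    // each one was non-null when registered.
    std::set<Observer*> m_observers;
};
```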
<a id="trunkSourceWebCoreWebCorexcodeprojprojectpbxproj"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -159,6 +159,7 @@
</span><span class="cx">                 07638A9A1884487200E15A1B /* MediaSessionManagerIOS.mm in Sources */ = {isa = PBXBuildFile; fileRef = 07638A981884487200E15A1B /* MediaSessionManagerIOS.mm */; };
</span><span class="cx">                 076970861463AD8700F502CF /* TextTrackList.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 076970841463AD8700F502CF /* TextTrackList.cpp */; };
</span><span class="cx">                 076970871463AD8700F502CF /* TextTrackList.h in Headers */ = {isa = PBXBuildFile; fileRef = 076970851463AD8700F502CF /* TextTrackList.h */; };
</span><ins>+                076EC1331E44F56D00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 076EC1321E44F2CB00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp */; };
</ins><span class="cx">                 076F0D0E12B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.h in Headers */ = {isa = PBXBuildFile; fileRef = 076F0D0A12B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.h */; };
</span><span class="cx">                 07707CB01E205EE300005BF7 /* AudioSourceObserverObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = 07707CAF1E205EC400005BF7 /* AudioSourceObserverObjC.h */; };
</span><span class="cx">                 077664FC183E6B5C00133B92 /* JSQuickTimePluginReplacement.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 077664FA183E6B5C00133B92 /* JSQuickTimePluginReplacement.cpp */; };
</span><span class="lines">@@ -284,7 +285,6 @@
</span><span class="cx">                 07B7116F1D899E63009F0FFB /* CaptureDeviceManager.h in Headers */ = {isa = PBXBuildFile; fileRef = 07B7116C1D899E63009F0FFB /* CaptureDeviceManager.h */; };
</span><span class="cx">                 07C046C31E42508B007201E7 /* CAAudioStreamDescription.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 073B87571E40DCFD0071C0EC /* CAAudioStreamDescription.cpp */; };
</span><span class="cx">                 07C046C41E42508B007201E7 /* CAAudioStreamDescription.h in Headers */ = {isa = PBXBuildFile; fileRef = 073B87581E40DCFD0071C0EC /* CAAudioStreamDescription.h */; settings = {ATTRIBUTES = (Private, ); }; };
</span><del>-                07C046C71E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 07C046C51E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp */; };
</del><span class="cx">                 07C046C81E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C046C61E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.h */; };
</span><span class="cx">                 07C046CB1E426413007201E7 /* AudioStreamDescription.h in Headers */ = {isa = PBXBuildFile; fileRef = 073B87561E40DCE50071C0EC /* AudioStreamDescription.h */; settings = {ATTRIBUTES = (Private, ); }; };
</span><span class="cx">                 07C1C0E21BFB600100BD2256 /* MediaTrackSupportedConstraints.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C1C0E01BFB600100BD2256 /* MediaTrackSupportedConstraints.h */; };
</span><span class="lines">@@ -7256,6 +7256,7 @@
</span><span class="cx">                 07638A981884487200E15A1B /* MediaSessionManagerIOS.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = MediaSessionManagerIOS.mm; sourceTree = "<group>"; };
</span><span class="cx">                 076970841463AD8700F502CF /* TextTrackList.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = TextTrackList.cpp; sourceTree = "<group>"; };
</span><span class="cx">                 076970851463AD8700F502CF /* TextTrackList.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = TextTrackList.h; sourceTree = "<group>"; };
</span><ins>+                076EC1321E44F2CB00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = AudioTrackPrivateMediaStreamCocoa.cpp; sourceTree = "<group>"; };
</ins><span class="cx">                 076F0D0912B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = MediaPlayerPrivateAVFoundation.cpp; sourceTree = "<group>"; };
</span><span class="cx">                 076F0D0A12B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MediaPlayerPrivateAVFoundation.h; sourceTree = "<group>"; };
</span><span class="cx">                 07707CAF1E205EC400005BF7 /* AudioSourceObserverObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioSourceObserverObjC.h; sourceTree = "<group>"; };
</span><span class="lines">@@ -7337,6 +7338,7 @@
</span><span class="cx">                 07B7116A1D899E63009F0FFB /* CaptureDevice.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CaptureDevice.h; sourceTree = "<group>"; };
</span><span class="cx">                 07B7116B1D899E63009F0FFB /* CaptureDeviceManager.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = CaptureDeviceManager.cpp; sourceTree = "<group>"; };
</span><span class="cx">                 07B7116C1D899E63009F0FFB /* CaptureDeviceManager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CaptureDeviceManager.h; sourceTree = "<group>"; };
</span><ins>+                07C046C61E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioTrackPrivateMediaStreamCocoa.h; sourceTree = "<group>"; };
</ins><span class="cx">                 07C1C0E01BFB600100BD2256 /* MediaTrackSupportedConstraints.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MediaTrackSupportedConstraints.h; sourceTree = "<group>"; };
</span><span class="cx">                 07C1C0E11BFB600100BD2256 /* MediaTrackSupportedConstraints.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = MediaTrackSupportedConstraints.idl; sourceTree = "<group>"; };
</span><span class="cx">                 07C1C0E41BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RealtimeMediaSourceSupportedConstraints.h; sourceTree = "<group>"; };
</span><span class="lines">@@ -15402,6 +15404,7 @@
</span><span class="cx">                 0729B14D17CFCCA0004F1D60 /* mac */ = {
</span><span class="cx">                         isa = PBXGroup;
</span><span class="cx">                         children = (
</span><ins>+                                076EC1321E44F2CB00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp */,
</ins><span class="cx">                                 5CDD83391E4324BB00621E92 /* RealtimeIncomingVideoSource.cpp */,
</span><span class="cx">                                 5CDD833A1E4324BB00621E92 /* RealtimeIncomingVideoSource.h */,
</span><span class="cx">                                 5CDD833B1E4324BB00621E92 /* RealtimeOutgoingVideoSource.cpp */,
</span><span class="lines">@@ -15408,7 +15411,6 @@
</span><span class="cx">                                 5CDD833C1E4324BB00621E92 /* RealtimeOutgoingVideoSource.h */,
</span><span class="cx">                                 07707CB11E20649C00005BF7 /* AudioCaptureSourceProviderObjC.h */,
</span><span class="cx">                                 07707CAF1E205EC400005BF7 /* AudioSourceObserverObjC.h */,
</span><del>-                                07C046C51E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp */,
</del><span class="cx">                                 07C046C61E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.h */,
</span><span class="cx">                                 070363D8181A1CDC00C074A5 /* AVAudioCaptureSource.h */,
</span><span class="cx">                                 070363D9181A1CDC00C074A5 /* AVAudioCaptureSource.mm */,
</span><span class="lines">@@ -25245,6 +25247,7 @@
</span><span class="cx">                                 CDE3A85417F5FCE600C5BE20 /* AudioTrackPrivateAVF.h in Headers */,
</span><span class="cx">                                 CDE3A85817F6020400C5BE20 /* AudioTrackPrivateAVFObjC.h in Headers */,
</span><span class="cx">                                 CD54A763180F9F7000B076C9 /* AudioTrackPrivateMediaSourceAVFObjC.h in Headers */,
</span><ins>+                                07C046C81E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.h in Headers */,
</ins><span class="cx">                                 07D6A4F81BF2307D00174146 /* AudioTrackPrivateMediaStream.h in Headers */,
</span><span class="cx">                                 FD31608B12B026F700C1A359 /* AudioUtilities.h in Headers */,
</span><span class="cx">                                 7EE6846012D26E3800E79415 /* AuthenticationCF.h in Headers */,
</span><span class="lines">@@ -31870,6 +31873,7 @@
</span><span class="cx">                                 7C39C3741DDBB8D300FEFB29 /* SVGTransformListValues.cpp in Sources */,
</span><span class="cx">                                 7CE58D571DD7D96D00128552 /* SVGTransformValue.cpp in Sources */,
</span><span class="cx">                                 B2227AE10D00BF220071B782 /* SVGTRefElement.cpp in Sources */,
</span><ins>+                                076EC1331E44F56D00E5D813 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */,
</ins><span class="cx">                                 B2227AE40D00BF220071B782 /* SVGTSpanElement.cpp in Sources */,
</span><span class="cx">                                 B2227AE90D00BF220071B782 /* SVGURIReference.cpp in Sources */,
</span><span class="cx">                                 B2227AEC0D00BF220071B782 /* SVGUseElement.cpp in Sources */,
</span><span class="lines">@@ -32119,7 +32123,6 @@
</span><span class="cx">                                 49C7B9E51042D32F0009D447 /* WebGLTexture.cpp in Sources */,
</span><span class="cx">                                 6F995A231A7078B100A735F4 /* WebGLTransformFeedback.cpp in Sources */,
</span><span class="cx">                                 0C3F1F5A10C8871200D72CE1 /* WebGLUniformLocation.cpp in Sources */,
</span><del>-                                07C046C71E425155007201E7 /* AudioTrackPrivateMediaStreamCocoa.cpp in Sources */,
</del><span class="cx">                                 6F995A251A7078B100A735F4 /* WebGLVertexArrayObject.cpp in Sources */,
</span><span class="cx">                                 6F222B761AB52D8A0094651A /* WebGLVertexArrayObjectBase.cpp in Sources */,
</span><span class="cx">                                 77A17A7712F28642004E02F6 /* WebGLVertexArrayObjectOES.cpp in Sources */,
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiomacAudioSampleDataSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -255,14 +255,13 @@
</span><span class="cx"> const double tenMS = .01;
</span><span class="cx"> const double fiveMS = .005;
</span><span class="cx"> double sampleRate = m_outputDescription->sampleRate();
</span><ins>+ m_outputSampleOffset = timeStamp + m_timeStamp;
</ins><span class="cx"> if (buffered > sampleRate * twentyMS)
</span><del>- m_outputSampleOffset = m_timeStamp - sampleRate * twentyMS;
</del><ins>+ m_outputSampleOffset -= sampleRate * twentyMS;
</ins><span class="cx"> else if (buffered > sampleRate * tenMS)
</span><del>- m_outputSampleOffset = m_timeStamp - sampleRate * tenMS;
</del><ins>+ m_outputSampleOffset -= sampleRate * tenMS;
</ins><span class="cx"> else if (buffered > sampleRate * fiveMS)
</span><del>- m_outputSampleOffset = m_timeStamp - sampleRate * fiveMS;
- else
- m_outputSampleOffset = m_timeStamp;
</del><ins>+ m_outputSampleOffset -= sampleRate * fiveMS;
</ins><span class="cx">
</span><span class="cx"> m_transitioningFromPaused = false;
</span><span class="cx"> }
</span></span></pre></div>
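The hunk above changes how the output sample offset is seeded after a pause: instead of choosing one of four absolute values, it starts from the sample's absolute position (`timeStamp + m_timeStamp`) and then pulls the offset earlier by 20, 10, or 5 ms worth of frames depending on how much audio is already buffered. A minimal standalone sketch of that arithmetic (the free function and parameter names are stand-ins for the member state in the diff, not WebKit API):

```cpp
#include <cassert>
#include <cmath>

// Sketch of the adjusted offset computation from the hunk above.
// 'timeStamp' is the incoming sample time, 'baseTimeStamp' stands in for
// m_timeStamp, 'buffered' is the number of frames already queued, and the
// subtracted amount shrinks as the buffer drains.
double computeOutputSampleOffset(double timeStamp, double baseTimeStamp, double buffered, double sampleRate)
{
    const double twentyMS = 0.02;
    const double tenMS = 0.01;
    const double fiveMS = 0.005;

    double offset = timeStamp + baseTimeStamp;
    if (buffered > sampleRate * twentyMS)
        offset -= sampleRate * twentyMS;       // plenty buffered: start 20 ms early
    else if (buffered > sampleRate * tenMS)
        offset -= sampleRate * tenMS;          // moderately buffered: 10 ms early
    else if (buffered > sampleRate * fiveMS)
        offset -= sampleRate * fiveMS;         // barely buffered: 5 ms early
    // else: nothing buffered, keep the unadjusted offset (the deleted
    // 'else' branch becomes implicit in the new form).
    return offset;
}
```

The rewrite makes the fall-through case free: when less than 5 ms is buffered, the offset is simply left at `timeStamp + baseTimeStamp`, which is why the old `else` branch could be deleted.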
<a id="trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -45,7 +45,7 @@
</span><span class="cx">
</span><span class="cx"> namespace WebCore {
</span><span class="cx">
</span><del>-class AudioTrackPrivateMediaStream;
</del><ins>+class AudioTrackPrivateMediaStreamCocoa;
</ins><span class="cx"> class AVVideoCaptureSource;
</span><span class="cx"> class Clock;
</span><span class="cx"> class MediaSourcePrivateClient;
</span><span class="lines">@@ -55,10 +55,6 @@
</span><span class="cx"> class VideoFullscreenLayerManager;
</span><span class="cx"> #endif
</span><span class="cx">
</span><del>-#if __has_include(<AVFoundation/AVSampleBufferRenderSynchronizer.h>)
-#define USE_RENDER_SYNCHRONIZER 1
-#endif
-
</del><span class="cx"> class MediaPlayerPrivateMediaStreamAVFObjC final : public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer {
</span><span class="cx"> public:
</span><span class="cx"> explicit MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer*);
</span><span class="lines">@@ -81,7 +77,6 @@
</span><span class="cx"> void ensureLayer();
</span><span class="cx"> void destroyLayer();
</span><span class="cx">
</span><del>- void rendererStatusDidChange(AVSampleBufferAudioRenderer*, NSNumber*);
</del><span class="cx"> void layerStatusDidChange(AVSampleBufferDisplayLayer*, NSNumber*);
</span><span class="cx">
</span><span class="cx"> private:
</span><span class="lines">@@ -144,13 +139,6 @@
</span><span class="cx"> void flushAndRemoveVideoSampleBuffers();
</span><span class="cx"> void requestNotificationWhenReadyForVideoData();
</span><span class="cx">
</span><del>- void enqueueAudioSample(MediaStreamTrackPrivate&, MediaSample&);
- void createAudioRenderer(AtomicString);
- void destroyAudioRenderer(AVSampleBufferAudioRenderer*);
- void destroyAudioRenderer(AtomicString);
- void destroyAudioRenderers();
- void requestNotificationWhenReadyForAudioData(AtomicString);
-
</del><span class="cx"> void paint(GraphicsContext&, const FloatRect&) override;
</span><span class="cx"> void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
</span><span class="cx"> bool metaDataAvailable() const { return m_mediaStreamPrivate && m_readyState >= MediaPlayer::HaveMetadata; }
</span><span class="lines">@@ -210,9 +198,7 @@
</span><span class="cx">
</span><span class="cx"> MediaTime streamTime() const;
</span><span class="cx">
</span><del>-#if USE(RENDER_SYNCHRONIZER)
</del><span class="cx"> AudioSourceProvider* audioSourceProvider() final;
</span><del>-#endif
</del><span class="cx">
</span><span class="cx"> MediaPlayer* m_player { nullptr };
</span><span class="cx"> WeakPtrFactory<MediaPlayerPrivateMediaStreamAVFObjC> m_weakPtrFactory;
</span><span class="lines">@@ -222,22 +208,14 @@
</span><span class="cx">
</span><span class="cx"> RetainPtr<WebAVSampleBufferStatusChangeListener> m_statusChangeListener;
</span><span class="cx"> RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- HashMap<String, RetainPtr<AVSampleBufferAudioRenderer>> m_audioRenderers;
- RetainPtr<AVSampleBufferRenderSynchronizer> m_synchronizer;
-#else
</del><span class="cx"> std::unique_ptr<Clock> m_clock;
</span><del>-#endif
</del><span class="cx">
</span><span class="cx"> MediaTime m_pausedTime;
</span><span class="cx"> RetainPtr<CGImageRef> m_pausedImage;
</span><span class="cx">
</span><del>- HashMap<String, RefPtr<AudioTrackPrivateMediaStream>> m_audioTrackMap;
</del><ins>+ HashMap<String, RefPtr<AudioTrackPrivateMediaStreamCocoa>> m_audioTrackMap;
</ins><span class="cx"> HashMap<String, RefPtr<VideoTrackPrivateMediaStream>> m_videoTrackMap;
</span><span class="cx"> PendingSampleQueue m_pendingVideoSampleQueue;
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- PendingSampleQueue m_pendingAudioSampleQueue;
-#endif
</del><span class="cx">
</span><span class="cx"> MediaPlayer::NetworkState m_networkState { MediaPlayer::Empty };
</span><span class="cx"> MediaPlayer::ReadyState m_readyState { MediaPlayer::HaveNothing };
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformgraphicsavfoundationobjcMediaPlayerPrivateMediaStreamAVFObjCmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -29,7 +29,7 @@
</span><span class="cx"> #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
</span><span class="cx">
</span><span class="cx"> #import "AVFoundationSPI.h"
</span><del>-#import "AudioTrackPrivateMediaStream.h"
</del><ins>+#import "AudioTrackPrivateMediaStreamCocoa.h"
</ins><span class="cx"> #import "Clock.h"
</span><span class="cx"> #import "CoreMediaSoftLink.h"
</span><span class="cx"> #import "GraphicsContext.h"
</span><span class="lines">@@ -52,7 +52,6 @@
</span><span class="cx">
</span><span class="cx"> SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
</span><span class="cx">
</span><del>-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferAudioRenderer)
</del><span class="cx"> SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
</span><span class="cx"> SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer)
</span><span class="cx">
</span><span class="lines">@@ -67,7 +66,6 @@
</span><span class="cx"> @interface WebAVSampleBufferStatusChangeListener : NSObject {
</span><span class="cx"> MediaPlayerPrivateMediaStreamAVFObjC* _parent;
</span><span class="cx"> Vector<RetainPtr<AVSampleBufferDisplayLayer>> _layers;
</span><del>- Vector<RetainPtr<AVSampleBufferAudioRenderer>> _renderers;
</del><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> - (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback;
</span><span class="lines">@@ -74,8 +72,6 @@
</span><span class="cx"> - (void)invalidate;
</span><span class="cx"> - (void)beginObservingLayer:(AVSampleBufferDisplayLayer *)layer;
</span><span class="cx"> - (void)stopObservingLayer:(AVSampleBufferDisplayLayer *)layer;
</span><del>-- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
-- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
</del><span class="cx"> @end
</span><span class="cx">
</span><span class="cx"> @implementation WebAVSampleBufferStatusChangeListener
</span><span class="lines">@@ -101,10 +97,6 @@
</span><span class="cx"> [layer removeObserver:self forKeyPath:@"status"];
</span><span class="cx"> _layers.clear();
</span><span class="cx">
</span><del>- for (auto& renderer : _renderers)
- [renderer removeObserver:self forKeyPath:@"status"];
- _renderers.clear();
-
</del><span class="cx"> [[NSNotificationCenter defaultCenter] removeObserver:self];
</span><span class="cx">
</span><span class="cx"> _parent = nullptr;
</span><span class="lines">@@ -128,24 +120,6 @@
</span><span class="cx"> _layers.remove(_layers.find(layer));
</span><span class="cx"> }
</span><span class="cx">
</span><del>-- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
-{
- ASSERT(_parent);
- ASSERT(!_renderers.contains(renderer));
-
- _renderers.append(renderer);
- [renderer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr];
-}
-
-- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
-{
- ASSERT(_parent);
- ASSERT(_renderers.contains(renderer));
-
- [renderer removeObserver:self forKeyPath:@"status"];
- _renderers.remove(_renderers.find(renderer));
-}
-
</del><span class="cx"> - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
</span><span class="cx"> {
</span><span class="cx"> UNUSED_PARAM(context);
</span><span class="lines">@@ -167,19 +141,6 @@
</span><span class="cx"> protectedSelf->_parent->layerStatusDidChange(layer.get(), status.get());
</span><span class="cx"> });
</span><span class="cx">
</span><del>- } else if ([object isKindOfClass:getAVSampleBufferAudioRendererClass()]) {
- RetainPtr<AVSampleBufferAudioRenderer> renderer = (AVSampleBufferAudioRenderer *)object;
- RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
-
- ASSERT(_renderers.contains(renderer.get()));
- ASSERT([keyPath isEqualToString:@"status"]);
-
- callOnMainThread([protectedSelf = WTFMove(protectedSelf), renderer = WTFMove(renderer), status = WTFMove(status)] {
- if (!protectedSelf->_parent)
- return;
-
- protectedSelf->_parent->rendererStatusDidChange(renderer.get(), status.get());
- });
</del><span class="cx"> } else
</span><span class="cx"> ASSERT_NOT_REACHED();
</span><span class="cx"> }
</span><span class="lines">@@ -196,11 +157,7 @@
</span><span class="cx"> : m_player(player)
</span><span class="cx"> , m_weakPtrFactory(this)
</span><span class="cx"> , m_statusChangeListener(adoptNS([[WebAVSampleBufferStatusChangeListener alloc] initWithParent:this]))
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- , m_synchronizer(adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]))
-#else
</del><span class="cx"> , m_clock(Clock::create())
</span><del>-#endif
</del><span class="cx"> #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
</span><span class="cx"> , m_videoFullscreenLayerManager(VideoFullscreenLayerManager::create())
</span><span class="cx"> #endif
</span><span class="lines">@@ -211,6 +168,9 @@
</span><span class="cx"> MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC()
</span><span class="cx"> {
</span><span class="cx"> LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC(%p)", this);
</span><ins>+ for (const auto& track : m_audioTrackMap.values())
+ track->pause();
+
</ins><span class="cx"> if (m_mediaStreamPrivate) {
</span><span class="cx"> m_mediaStreamPrivate->removeObserver(*this);
</span><span class="cx">
</span><span class="lines">@@ -219,9 +179,6 @@
</span><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> destroyLayer();
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- destroyAudioRenderers();
-#endif
</del><span class="cx">
</span><span class="cx"> m_audioTrackMap.clear();
</span><span class="cx"> m_videoTrackMap.clear();
</span><span class="lines">@@ -315,33 +272,6 @@
</span><span class="cx"> return timelineOffset;
</span><span class="cx"> }
</span><span class="cx">
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample(MediaStreamTrackPrivate& track, MediaSample& sample)
-{
- ASSERT(m_audioTrackMap.contains(track.id()));
- ASSERT(m_audioRenderers.contains(sample.trackID()));
-
- auto audioTrack = m_audioTrackMap.get(track.id());
- MediaTime timelineOffset = audioTrack->timelineOffset();
- if (timelineOffset == MediaTime::invalidTime()) {
- timelineOffset = calculateTimelineOffset(sample, rendererLatency);
- audioTrack->setTimelineOffset(timelineOffset);
- LOG(MediaCaptureSamples, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample: timeline offset for track %s set to %s", track.id().utf8().data(), toString(timelineOffset).utf8().data());
- }
-
- updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample");
-
- auto renderer = m_audioRenderers.get(sample.trackID());
- if (![renderer isReadyForMoreMediaData]) {
- addSampleToPendingQueue(m_pendingAudioSampleQueue, sample);
- requestNotificationWhenReadyForAudioData(sample.trackID());
- return;
- }
-
- [renderer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
-}
-#endif
-
</del><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPrivate& track, MediaSample& sample)
</span><span class="cx"> {
</span><span class="cx"> ASSERT(m_videoTrackMap.contains(track.id()));
</span><span class="lines">@@ -400,102 +330,12 @@
</span><span class="cx"> }];
</span><span class="cx"> }
</span><span class="cx">
</span><del>-#if USE(RENDER_SYNCHRONIZER)
-void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData(AtomicString trackID)
-{
- if (!m_audioRenderers.contains(trackID))
- return;
-
- auto renderer = m_audioRenderers.get(trackID);
- [renderer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
- [renderer stopRequestingMediaData];
-
- auto audioTrack = m_audioTrackMap.get(trackID);
- while (!m_pendingAudioSampleQueue.isEmpty()) {
- if (![renderer isReadyForMoreMediaData]) {
- requestNotificationWhenReadyForAudioData(trackID);
- return;
- }
-
- auto sample = m_pendingAudioSampleQueue.takeFirst();
- enqueueAudioSample(audioTrack->streamTrack(), sample.get());
- }
- }];
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer(AtomicString trackID)
-{
- ASSERT(!m_audioRenderers.contains(trackID));
- auto renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]);
- [renderer setAudioTimePitchAlgorithm:(m_player->preservesPitch() ? AVAudioTimePitchAlgorithmSpectral : AVAudioTimePitchAlgorithmVarispeed)];
- m_audioRenderers.set(trackID, renderer);
- [m_synchronizer addRenderer:renderer.get()];
- [m_statusChangeListener beginObservingRenderer:renderer.get()];
- if (m_audioRenderers.size() == 1)
- renderingModeChanged();
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AVSampleBufferAudioRenderer* renderer)
-{
- [m_statusChangeListener stopObservingRenderer:renderer];
- [renderer flush];
- [renderer stopRequestingMediaData];
-
- CMTime now = CMTimebaseGetTime([m_synchronizer timebase]);
- [m_synchronizer removeRenderer:renderer atTime:now withCompletionHandler:^(BOOL) { }];
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AtomicString trackID)
-{
- if (!m_audioRenderers.contains(trackID))
- return;
-
- destroyAudioRenderer(m_audioRenderers.get(trackID).get());
- m_audioRenderers.remove(trackID);
- if (!m_audioRenderers.size())
- renderingModeChanged();
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers()
-{
- m_pendingAudioSampleQueue.clear();
- for (auto& renderer : m_audioRenderers.values())
- destroyAudioRenderer(renderer.get());
- m_audioRenderers.clear();
-}
-
</del><span class="cx"> AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider()
</span><span class="cx"> {
</span><span class="cx"> // FIXME: This should return a mix of all audio tracks - https://bugs.webkit.org/show_bug.cgi?id=160305
</span><del>- for (const auto& track : m_audioTrackMap.values()) {
- if (track->streamTrack().ended() || !track->streamTrack().enabled() || track->streamTrack().muted())
- continue;
-
- return track->streamTrack().audioSourceProvider();
- }
</del><span class="cx"> return nullptr;
</span><span class="cx"> }
</span><del>-#endif
</del><span class="cx">
</span><del>-void MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange(AVSampleBufferAudioRenderer* renderer, NSNumber* status)
-{
-#if USE(RENDER_SYNCHRONIZER)
- String trackID;
- for (auto& pair : m_audioRenderers) {
- if (pair.value == renderer) {
- trackID = pair.key;
- break;
- }
- }
- ASSERT(!trackID.isEmpty());
- if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering)
- m_audioTrackMap.get(trackID)->setTimelineOffset(MediaTime::invalidTime());
-#else
- UNUSED_PARAM(renderer);
- UNUSED_PARAM(status);
-#endif
-}
-
</del><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer, NSNumber* status)
</span><span class="cx"> {
</span><span class="cx"> if (status.integerValue != AVQueuedSampleBufferRenderingStatusRendering)
</span><span class="lines">@@ -513,11 +353,6 @@
</span><span class="cx"> {
</span><span class="cx"> if (m_sampleBufferDisplayLayer)
</span><span class="cx"> [m_sampleBufferDisplayLayer flush];
</span><del>-
-#if USE(RENDER_SYNCHRONIZER)
- for (auto& renderer : m_audioRenderers.values())
- [renderer flush];
-#endif
</del><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> bool MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer() const
</span><span class="lines">@@ -549,10 +384,6 @@
</span><span class="cx"> m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
</span><span class="cx"> [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
</span><span class="cx">
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
-#endif
-
</del><span class="cx"> renderingModeChanged();
</span><span class="cx">
</span><span class="cx"> #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
</span><span class="lines">@@ -570,13 +401,6 @@
</span><span class="cx"> [m_statusChangeListener stopObservingLayer:m_sampleBufferDisplayLayer.get()];
</span><span class="cx"> [m_sampleBufferDisplayLayer stopRequestingMediaData];
</span><span class="cx"> [m_sampleBufferDisplayLayer flush];
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]);
- [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL) {
- // No-op.
- }];
- m_sampleBufferDisplayLayer = nullptr;
-#endif
</del><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> renderingModeChanged();
</span><span class="lines">@@ -700,14 +524,12 @@
</span><span class="cx"> return;
</span><span class="cx">
</span><span class="cx"> m_playing = true;
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- if (!m_synchronizer.get().rate)
- [m_synchronizer setRate:1 ]; // streamtime
-#else
</del><span class="cx"> if (!m_clock->isRunning())
</span><span class="cx"> m_clock->start();
</span><del>-#endif
</del><span class="cx">
</span><ins>+ for (const auto& track : m_audioTrackMap.values())
+ track->play();
+
</ins><span class="cx"> m_haveEverPlayed = true;
</span><span class="cx"> scheduleDeferredTask([this] {
</span><span class="cx"> updateDisplayMode();
</span><span class="lines">@@ -725,6 +547,9 @@
</span><span class="cx"> m_pausedTime = currentMediaTime();
</span><span class="cx"> m_playing = false;
</span><span class="cx">
</span><ins>+ for (const auto& track : m_audioTrackMap.values())
+ track->pause();
+
</ins><span class="cx"> updateDisplayMode();
</span><span class="cx"> updatePausedImage();
</span><span class="cx"> flushRenderers();
</span><span class="lines">@@ -743,11 +568,8 @@
</span><span class="cx"> return;
</span><span class="cx">
</span><span class="cx"> m_volume = volume;
</span><del>-
-#if USE(RENDER_SYNCHRONIZER)
- for (auto& renderer : m_audioRenderers.values())
- [renderer setVolume:volume];
-#endif
</del><ins>+ for (const auto& track : m_audioTrackMap.values())
+ track->setVolume(m_volume);
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> void MediaPlayerPrivateMediaStreamAVFObjC::setMuted(bool muted)
</span><span class="lines">@@ -758,11 +580,6 @@
</span><span class="cx"> return;
</span><span class="cx">
</span><span class="cx"> m_muted = muted;
</span><del>-
-#if USE(RENDER_SYNCHRONIZER)
- for (auto& renderer : m_audioRenderers.values())
- [renderer setMuted:muted];
-#endif
</del><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> bool MediaPlayerPrivateMediaStreamAVFObjC::hasVideo() const
</span><span class="lines">@@ -796,11 +613,7 @@
</span><span class="cx">
</span><span class="cx"> MediaTime MediaPlayerPrivateMediaStreamAVFObjC::streamTime() const
</span><span class="cx"> {
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- return toMediaTime(CMTimebaseGetTime([m_synchronizer timebase]));
-#else
</del><span class="cx"> return MediaTime::createWithDouble(m_clock->currentTime());
</span><del>-#endif
</del><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> MediaPlayer::NetworkState MediaPlayerPrivateMediaStreamAVFObjC::networkState() const
</span><span class="lines">@@ -925,19 +738,11 @@
</span><span class="cx"> if (!m_playing || streamTime().toDouble() < 0)
</span><span class="cx"> return;
</span><span class="cx">
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- if (!CMTimebaseGetEffectiveRate([m_synchronizer timebase]))
- return;
-#endif
-
</del><span class="cx"> switch (track.type()) {
</span><span class="cx"> case RealtimeMediaSource::None:
</span><span class="cx"> // Do nothing.
</span><span class="cx"> break;
</span><span class="cx"> case RealtimeMediaSource::Audio:
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- enqueueAudioSample(track, mediaSample);
-#endif
</del><span class="cx"> break;
</span><span class="cx"> case RealtimeMediaSource::Video:
</span><span class="cx"> if (&track == m_activeVideoTrack.get())
</span><span class="lines">@@ -1037,36 +842,23 @@
</span><span class="cx"> {
</span><span class="cx"> MediaStreamTrackPrivateVector currentTracks = m_mediaStreamPrivate->tracks();
</span><span class="cx">
</span><del>- Function<void(RefPtr<AudioTrackPrivateMediaStream>, int, TrackState)> setAudioTrackState = [this](auto track, int index, TrackState state)
</del><ins>+ Function<void(RefPtr<AudioTrackPrivateMediaStreamCocoa>, int, TrackState)> setAudioTrackState = [this](auto track, int index, TrackState state)
</ins><span class="cx"> {
</span><span class="cx"> switch (state) {
</span><span class="cx"> case TrackState::Remove:
</span><del>- track->streamTrack().removeObserver(*this);
</del><span class="cx"> m_player->removeAudioTrack(*track);
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- destroyAudioRenderer(track->id());
-#endif
</del><span class="cx"> break;
</span><span class="cx"> case TrackState::Add:
</span><del>- track->streamTrack().addObserver(*this);
</del><span class="cx"> m_player->addAudioTrack(*track);
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- createAudioRenderer(track->id());
-#endif
</del><span class="cx"> break;
</span><span class="cx"> case TrackState::Configure:
</span><span class="cx"> track->setTrackIndex(index);
</span><span class="cx"> bool enabled = track->streamTrack().enabled() && !track->streamTrack().muted();
</span><span class="cx"> track->setEnabled(enabled);
</span><del>-#if USE(RENDER_SYNCHRONIZER)
- auto renderer = m_audioRenderers.get(track->id());
- ASSERT(renderer);
- renderer.get().muted = !enabled;
-#endif
</del><span class="cx"> break;
</span><span class="cx"> }
</span><span class="cx"> };
</span><del>- updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, setAudioTrackState);
</del><ins>+ updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStreamCocoa::create, setAudioTrackState);
</ins><span class="cx">
</span><span class="cx"> Function<void(RefPtr<VideoTrackPrivateMediaStream>, int, TrackState)> setVideoTrackState = [&](auto track, int index, TrackState state)
</span><span class="cx"> {
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamAudioTrackPrivateMediaStreamh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -32,7 +32,7 @@
</span><span class="cx">
</span><span class="cx"> namespace WebCore {
</span><span class="cx">
</span><del>-class AudioTrackPrivateMediaStream final : public AudioTrackPrivate {
</del><ins>+class AudioTrackPrivateMediaStream : public AudioTrackPrivate {
</ins><span class="cx"> WTF_MAKE_NONCOPYABLE(AudioTrackPrivateMediaStream)
</span><span class="cx"> public:
</span><span class="cx"> static RefPtr<AudioTrackPrivateMediaStream> create(MediaStreamTrackPrivate& streamTrack)
</span><span class="lines">@@ -53,7 +53,7 @@
</span><span class="cx"> MediaTime timelineOffset() const { return m_timelineOffset; }
</span><span class="cx"> void setTimelineOffset(const MediaTime& offset) { m_timelineOffset = offset; }
</span><span class="cx">
</span><del>-private:
</del><ins>+protected:
</ins><span class="cx"> AudioTrackPrivateMediaStream(MediaStreamTrackPrivate& track)
</span><span class="cx"> : m_streamTrack(track)
</span><span class="cx"> , m_id(track.id())
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivatecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -52,12 +52,12 @@
</span><span class="cx"> , m_isEnabled(true)
</span><span class="cx"> , m_isEnded(false)
</span><span class="cx"> {
</span><del>- m_source->addObserver(this);
</del><ins>+ m_source->addObserver(*this);
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> MediaStreamTrackPrivate::~MediaStreamTrackPrivate()
</span><span class="cx"> {
</span><del>- m_source->removeObserver(this);
</del><ins>+ m_source->removeObserver(*this);
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> void MediaStreamTrackPrivate::addObserver(MediaStreamTrackPrivate::Observer& observer)
</span><span class="lines">@@ -198,7 +198,7 @@
</span><span class="cx"> return !m_isEnded;
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void MediaStreamTrackPrivate::sourceHasMoreMediaData(MediaSample& mediaSample)
</del><ins>+void MediaStreamTrackPrivate::videoSampleAvailable(MediaSample& mediaSample)
</ins><span class="cx"> {
</span><span class="cx"> mediaSample.setTrackID(id());
</span><span class="cx"> for (auto& observer : m_observers)
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamMediaStreamTrackPrivateh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -100,7 +100,7 @@
</span><span class="cx"> void sourceMutedChanged() final;
</span><span class="cx"> void sourceSettingsChanged() final;
</span><span class="cx"> bool preventSourceFromStopping() final;
</span><del>- void sourceHasMoreMediaData(MediaSample&) final;
</del><ins>+ void videoSampleAvailable(MediaSample&) final;
</ins><span class="cx">
</span><span class="cx"> Vector<Observer*> m_observers;
</span><span class="cx"> Ref<RealtimeMediaSource> m_source;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamRealtimeMediaSourcecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -67,16 +67,16 @@
</span><span class="cx"> m_remote = false;
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void RealtimeMediaSource::addObserver(RealtimeMediaSource::Observer* observer)
</del><ins>+void RealtimeMediaSource::addObserver(RealtimeMediaSource::Observer& observer)
</ins><span class="cx"> {
</span><del>- m_observers.append(observer);
</del><ins>+ m_observers.append(&observer);
</ins><span class="cx"> }
</span><span class="cx">
</span><del>-void RealtimeMediaSource::removeObserver(RealtimeMediaSource::Observer* observer)
</del><ins>+void RealtimeMediaSource::removeObserver(RealtimeMediaSource::Observer& observer)
</ins><span class="cx"> {
</span><del>- size_t pos = m_observers.find(observer);
- if (pos != notFound)
- m_observers.remove(pos);
</del><ins>+ m_observers.removeFirstMatching([&observer](auto* anObserver) {
+ return anObserver == &observer;
+ });
</ins><span class="cx">
</span><span class="cx"> if (!m_observers.size())
</span><span class="cx"> stop();
</span><span class="lines">@@ -112,12 +112,19 @@
</span><span class="cx"> });
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void RealtimeMediaSource::mediaDataUpdated(MediaSample& mediaSample)
</del><ins>+void RealtimeMediaSource::videoSampleAvailable(MediaSample& mediaSample)
</ins><span class="cx"> {
</span><del>- for (auto& observer : m_observers)
- observer->sourceHasMoreMediaData(mediaSample);
</del><ins>+ ASSERT(isMainThread());
+ for (const auto& observer : m_observers)
+ observer->videoSampleAvailable(mediaSample);
</ins><span class="cx"> }
</span><span class="cx">
</span><ins>+void RealtimeMediaSource::audioSamplesAvailable(const MediaTime& time, void* audioData, const AudioStreamDescription& description, size_t numberOfFrames)
+{
+ for (const auto& observer : m_observers)
+ observer->audioSamplesAvailable(time, audioData, description, numberOfFrames);
+}
+
</ins><span class="cx"> bool RealtimeMediaSource::readonly() const
</span><span class="cx"> {
</span><span class="cx"> return m_readonly;
</span><span class="lines">@@ -130,7 +137,7 @@
</span><span class="cx">
</span><span class="cx"> m_stopped = true;
</span><span class="cx">
</span><del>- for (auto* observer : m_observers) {
</del><ins>+ for (const auto& observer : m_observers) {
</ins><span class="cx"> if (observer != callingObserver)
</span><span class="cx"> observer->sourceStopped();
</span><span class="cx"> }
</span><span class="lines">@@ -143,7 +150,7 @@
</span><span class="cx"> if (stopped())
</span><span class="cx"> return;
</span><span class="cx">
</span><del>- for (auto* observer : m_observers) {
</del><ins>+ for (const auto& observer : m_observers) {
</ins><span class="cx"> if (observer->preventSourceFromStopping())
</span><span class="cx"> return;
</span><span class="cx"> }
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreamRealtimeMediaSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -47,8 +47,13 @@
</span><span class="cx"> #include <wtf/WeakPtr.h>
</span><span class="cx"> #include <wtf/text/WTFString.h>
</span><span class="cx">
</span><ins>+namespace WTF {
+class MediaTime;
+}
+
</ins><span class="cx"> namespace WebCore {
</span><span class="cx">
</span><ins>+class AudioStreamDescription;
</ins><span class="cx"> class FloatRect;
</span><span class="cx"> class GraphicsContext;
</span><span class="cx"> class MediaStreamPrivate;
</span><span class="lines">@@ -68,8 +73,11 @@
</span><span class="cx"> // Observer state queries.
</span><span class="cx"> virtual bool preventSourceFromStopping() = 0;
</span><span class="cx">
</span><del>- // Media data changes.
- virtual void sourceHasMoreMediaData(MediaSample&) = 0;
</del><ins>+ // Called on the main thread.
+ virtual void videoSampleAvailable(MediaSample&) { }
+
+ // May be called on a background thread.
+ virtual void audioSamplesAvailable(const MediaTime&, void* /*audioData*/, const AudioStreamDescription&, size_t /*numberOfFrames*/) { }
</ins><span class="cx"> };
</span><span class="cx">
</span><span class="cx"> virtual ~RealtimeMediaSource() { }
</span><span class="lines">@@ -99,7 +107,9 @@
</span><span class="cx"> virtual bool supportsConstraints(const MediaConstraints&, String&);
</span><span class="cx">
</span><span class="cx"> virtual void settingsDidChange();
</span><del>- void mediaDataUpdated(MediaSample&);
</del><ins>+
+ void videoSampleAvailable(MediaSample&);
+ void audioSamplesAvailable(const MediaTime&, void*, const AudioStreamDescription&, size_t);
</ins><span class="cx">
</span><span class="cx"> bool stopped() const { return m_stopped; }
</span><span class="cx">
</span><span class="lines">@@ -112,8 +122,8 @@
</span><span class="cx"> virtual bool remote() const { return m_remote; }
</span><span class="cx"> virtual void setRemote(bool remote) { m_remote = remote; }
</span><span class="cx">
</span><del>- void addObserver(Observer*);
- void removeObserver(Observer*);
</del><ins>+ void addObserver(Observer&);
+ void removeObserver(Observer&);
</ins><span class="cx">
</span><span class="cx"> virtual void startProducingData() { }
</span><span class="cx"> virtual void stopProducingData() { }
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -30,8 +30,10 @@
</span><span class="cx">
</span><span class="cx"> #include "AVMediaCaptureSource.h"
</span><span class="cx"> #include "AudioCaptureSourceProviderObjC.h"
</span><ins>+#include "CAAudioStreamDescription.h"
</ins><span class="cx"> #include <wtf/Lock.h>
</span><span class="cx">
</span><ins>+typedef struct AudioBufferList AudioBufferList;
</ins><span class="cx"> typedef struct AudioStreamBasicDescription AudioStreamBasicDescription;
</span><span class="cx"> typedef const struct opaqueCMFormatDescription *CMFormatDescriptionRef;
</span><span class="cx">
</span><span class="lines">@@ -64,9 +66,11 @@
</span><span class="cx"> AudioSourceProvider* audioSourceProvider() override;
</span><span class="cx">
</span><span class="cx"> RetainPtr<AVCaptureConnection> m_audioConnection;
</span><ins>+ size_t m_listBufferSize { 0 };
+ std::unique_ptr<AudioBufferList> m_list;
</ins><span class="cx">
</span><span class="cx"> RefPtr<WebAudioSourceProviderAVFObjC> m_audioSourceProvider;
</span><del>- std::unique_ptr<AudioStreamBasicDescription> m_inputDescription;
</del><ins>+ std::unique_ptr<CAAudioStreamDescription> m_inputDescription;
</ins><span class="cx"> Vector<AudioSourceObserverObjC*> m_observers;
</span><span class="cx"> Lock m_lock;
</span><span class="cx"> };
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAVAudioCaptureSourcemm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -28,7 +28,9 @@
</span><span class="cx">
</span><span class="cx"> #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
</span><span class="cx">
</span><ins>+#import "AudioSampleBufferList.h"
</ins><span class="cx"> #import "AudioSourceObserverObjC.h"
</span><ins>+#import "CAAudioStreamDescription.h"
</ins><span class="cx"> #import "Logging.h"
</span><span class="cx"> #import "MediaConstraints.h"
</span><span class="cx"> #import "MediaSampleAVFObjC.h"
</span><span class="lines">@@ -91,7 +93,6 @@
</span><span class="cx"> AVAudioCaptureSource::AVAudioCaptureSource(AVCaptureDeviceTypedef* device, const AtomicString& id)
</span><span class="cx"> : AVMediaCaptureSource(device, id, RealtimeMediaSource::Audio)
</span><span class="cx"> {
</span><del>- m_inputDescription = std::make_unique<AudioStreamBasicDescription>();
</del><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> AVAudioCaptureSource::~AVAudioCaptureSource()
</span><span class="lines">@@ -120,8 +121,8 @@
</span><span class="cx"> {
</span><span class="cx"> LockHolder lock(m_lock);
</span><span class="cx"> m_observers.append(&observer);
</span><del>- if (m_inputDescription->mSampleRate)
- observer.prepare(m_inputDescription.get());
</del><ins>+ if (m_inputDescription)
+ observer.prepare(&m_inputDescription->streamDescription());
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> void AVAudioCaptureSource::removeObserver(AudioSourceObserverObjC& observer)
</span><span class="lines">@@ -162,7 +163,7 @@
</span><span class="cx"> LockHolder lock(m_lock);
</span><span class="cx">
</span><span class="cx"> m_audioConnection = nullptr;
</span><del>- m_inputDescription = std::make_unique<AudioStreamBasicDescription>();
</del><ins>+ m_inputDescription = nullptr;
</ins><span class="cx">
</span><span class="cx"> for (auto& observer : m_observers)
</span><span class="cx"> observer->unprepare();
</span><span class="lines">@@ -174,23 +175,6 @@
</span><span class="cx"> m_audioSourceProvider = nullptr;
</span><span class="cx"> }
</span><span class="cx">
</span><del>-static bool operator==(const AudioStreamBasicDescription& a, const AudioStreamBasicDescription& b)
-{
- return a.mSampleRate == b.mSampleRate
- && a.mFormatID == b.mFormatID
- && a.mFormatFlags == b.mFormatFlags
- && a.mBytesPerPacket == b.mBytesPerPacket
- && a.mFramesPerPacket == b.mFramesPerPacket
- && a.mBytesPerFrame == b.mBytesPerFrame
- && a.mChannelsPerFrame == b.mChannelsPerFrame
- && a.mBitsPerChannel == b.mBitsPerChannel;
-}
-
-static bool operator!=(const AudioStreamBasicDescription& a, const AudioStreamBasicDescription& b)
-{
- return !(a == b);
-}
-
</del><span class="cx"> void AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
</span><span class="cx"> {
</span><span class="cx"> if (muted())
</span><span class="lines">@@ -200,11 +184,6 @@
</span><span class="cx"> if (!formatDescription)
</span><span class="cx"> return;
</span><span class="cx">
</span><del>- RetainPtr<CMSampleBufferRef> buffer = sampleBuffer;
- scheduleDeferredTask([this, buffer] {
- mediaDataUpdated(MediaSampleAVFObjC::create(buffer.get()));
- });
-
</del><span class="cx"> std::unique_lock<Lock> lock(m_lock, std::try_to_lock);
</span><span class="cx"> if (!lock.owns_lock()) {
</span><span class="cx"> // Failed to acquire the lock, just return instead of blocking.
</span><span class="lines">@@ -211,16 +190,31 @@
</span><span class="cx"> return;
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+ const AudioStreamBasicDescription* streamDescription = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
+ if (!m_inputDescription || *m_inputDescription != *streamDescription) {
+ m_inputDescription = std::make_unique<CAAudioStreamDescription>(*streamDescription);
+ m_listBufferSize = AudioSampleBufferList::audioBufferListSizeForStream(*m_inputDescription.get());
+ m_list = std::unique_ptr<AudioBufferList>(static_cast<AudioBufferList*>(::operator new (m_listBufferSize)));
+ memset(m_list.get(), 0, m_listBufferSize);
+ m_list->mNumberBuffers = m_inputDescription->numberOfChannelStreams();
+
+ if (!m_observers.isEmpty()) {
+ for (auto& observer : m_observers)
+ observer->prepare(streamDescription);
+ }
+ }
+
+ CMItemCount frameCount = CMSampleBufferGetNumSamples(sampleBuffer);
+ CMBlockBufferRef buffer = nil;
+ OSStatus err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, nullptr, m_list.get(), m_listBufferSize, kCFAllocatorSystemDefault, kCFAllocatorSystemDefault, kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &buffer);
+ if (!err)
+ audioSamplesAvailable(toMediaTime(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)), m_list->mBuffers[0].mData, CAAudioStreamDescription(*streamDescription), frameCount);
+ else
+ LOG_ERROR("AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection(%p) - CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer returned error %d (%.4s)", this, (int)err, (char*)&err);
+
</ins><span class="cx"> if (m_observers.isEmpty())
</span><span class="cx"> return;
</span><span class="cx">
</span><del>- const AudioStreamBasicDescription* streamDescription = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
- if (*m_inputDescription != *streamDescription) {
- m_inputDescription = std::make_unique<AudioStreamBasicDescription>(*streamDescription);
- for (auto& observer : m_observers)
- observer->prepare(m_inputDescription.get());
- }
-
</del><span class="cx"> for (auto& observer : m_observers)
</span><span class="cx"> observer->process(formatDescription, sampleBuffer);
</span><span class="cx"> }
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAVVideoCaptureSourcemm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -423,7 +423,7 @@
</span><span class="cx"> if (settingsChanged)
</span><span class="cx"> settingsDidChange();
</span><span class="cx">
</span><del>- mediaDataUpdated(MediaSampleAVFObjC::create(m_buffer.get()));
</del><ins>+ videoSampleAvailable(MediaSampleAVFObjC::create(m_buffer.get()));
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoacpp"></a>
<div class="addfile"><h4>Added: trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp (0 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp         (rev 0)
+++ trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.cpp        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -0,0 +1,254 @@
</span><ins>+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ * notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ * notice, this list of conditions and the following disclaimer in the
+ * documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#include "config.h"
+#include "AudioTrackPrivateMediaStreamCocoa.h"
+
+#include "AudioSampleBufferList.h"
+#include "AudioSampleDataSource.h"
+#include "AudioSession.h"
+#include "CAAudioStreamDescription.h"
+#include "Logging.h"
+
+#include "CoreMediaSoftLink.h"
+
+#if ENABLE(VIDEO_TRACK)
+
+namespace WebCore {
+
+const int renderBufferSize = 128;
+
+AudioTrackPrivateMediaStreamCocoa::AudioTrackPrivateMediaStreamCocoa(MediaStreamTrackPrivate& track)
+ : AudioTrackPrivateMediaStream(track)
+{
+ track.source().addObserver(*this);
+}
+
+AudioTrackPrivateMediaStreamCocoa::~AudioTrackPrivateMediaStreamCocoa()
+{
+ std::lock_guard<Lock> lock(m_internalStateLock);
+
+ streamTrack().source().removeObserver(*this);
+
+ if (m_dataSource)
+ m_dataSource->setPaused(true);
+
+ if (m_remoteIOUnit) {
+ AudioOutputUnitStop(m_remoteIOUnit);
+ AudioComponentInstanceDispose(m_remoteIOUnit);
+ m_remoteIOUnit = nullptr;
+ }
+
+ m_dataSource = nullptr;
+ m_inputDescription = nullptr;
+ m_outputDescription = nullptr;
+}
+
+void AudioTrackPrivateMediaStreamCocoa::playInternal()
+{
+ ASSERT(m_internalStateLock.isHeld());
+
+ if (m_isPlaying)
+ return;
+
+ if (m_remoteIOUnit) {
+ ASSERT(m_dataSource);
+ m_dataSource->setPaused(false);
+ if (!AudioOutputUnitStart(m_remoteIOUnit))
+ m_isPlaying = true;
+ }
+
+ m_autoPlay = !m_isPlaying;
+}
+
+void AudioTrackPrivateMediaStreamCocoa::play()
+{
+ std::lock_guard<Lock> lock(m_internalStateLock);
+ playInternal();
+}
+
+void AudioTrackPrivateMediaStreamCocoa::pause()
+{
+ std::lock_guard<Lock> lock(m_internalStateLock);
+
+ m_isPlaying = false;
+ m_autoPlay = false;
+
+ if (m_remoteIOUnit)
+ AudioOutputUnitStop(m_remoteIOUnit);
+ if (m_dataSource)
+ m_dataSource->setPaused(true);
+}
+
+void AudioTrackPrivateMediaStreamCocoa::setVolume(float volume)
+{
+ m_volume = volume;
+ if (m_dataSource)
+ m_dataSource->setVolume(m_volume);
+}
+
+OSStatus AudioTrackPrivateMediaStreamCocoa::setupAudioUnit()
+{
+ ASSERT(m_internalStateLock.isHeld());
+
+ AudioComponentDescription ioUnitDescription { kAudioUnitType_Output, 0, kAudioUnitManufacturer_Apple, 0, 0 };
+#if PLATFORM(IOS)
+ ioUnitDescription.componentSubType = kAudioUnitSubType_RemoteIO;
+#else
+ ioUnitDescription.componentSubType = kAudioUnitSubType_DefaultOutput;
+#endif
+
+ AudioComponent ioComponent = AudioComponentFindNext(nullptr, &ioUnitDescription);
+ ASSERT(ioComponent);
+ if (!ioComponent) {
+ LOG(Media, "AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to find remote IO unit component", this);
+ return -1;
+ }
+
+ OSStatus err = AudioComponentInstanceNew(ioComponent, &m_remoteIOUnit);
+ if (err) {
+ LOG(Media, "AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to open remote IO unit, error %d (%.4s)", this, (int)err, (char*)&err);
+ return -1;
+ }
+
+#if PLATFORM(IOS)
+ UInt32 param = 1;
+ err = AudioUnitSetProperty(m_remoteIOUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0, &param, sizeof(param));
+ if (err) {
+ LOG(Media, "AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to enable remote IO unit output, error %d (%.4s)", this, (int)err, (char*)&err);
+ return err;
+ }
+#endif
+
+ AURenderCallbackStruct callback = { inputProc, this };
+ err = AudioUnitSetProperty(m_remoteIOUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, 0, &callback, sizeof(callback));
+ if (err) {
+ LOG(Media, "AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to set remote IO unit render callback, error %d (%.4s)", this, (int)err, (char*)&err);
+ return err;
+ }
+
+ AudioStreamBasicDescription outputDescription = { };
+ UInt32 size = sizeof(outputDescription);
+ err = AudioUnitGetProperty(m_remoteIOUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &outputDescription, &size);
+ if (err) {
+ LOG(Media, "AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to get input stream format, error %d (%.4s)", this, (int)err, (char*)&err);
+ return err;
+ }
+
+ outputDescription = m_inputDescription->streamDescription();
+ outputDescription.mSampleRate = AudioSession::sharedSession().sampleRate();
+
+ err = AudioUnitSetProperty(m_remoteIOUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &outputDescription, sizeof(outputDescription));
+ if (err) {
+ LOG(Media, "AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) unable to set input stream format, error %d (%.4s)", this, (int)err, (char*)&err);
+ return err;
+ }
+ m_outputDescription = std::make_unique<CAAudioStreamDescription>(outputDescription);
+
+ err = AudioUnitInitialize(m_remoteIOUnit);
+ if (err) {
+ LOG(Media, "AudioTrackPrivateMediaStreamCocoa::setupAudioUnit(%p) AudioUnitInitialize() failed, error %d (%.4s)", this, (int)err, (char*)&err);
+ return err;
+ }
+
+ AudioSession::sharedSession().setPreferredBufferSize(renderBufferSize);
+
+ return err;
+}
+
+void AudioTrackPrivateMediaStreamCocoa::audioSamplesAvailable(const MediaTime& sampleTime, void* audioData, const AudioStreamDescription& description, size_t sampleCount)
+{
+ ASSERT(description.platformDescription().type == PlatformDescription::CAAudioStreamBasicType);
+
+ std::lock_guard<Lock> lock(m_internalStateLock);
+
+ CAAudioStreamDescription streamDescription = toCAAudioStreamDescription(description);
+ if (!m_inputDescription || *m_inputDescription != description) {
+
+ m_inputDescription = nullptr;
+ m_outputDescription = nullptr;
+
+ if (m_remoteIOUnit) {
+ AudioOutputUnitStop(m_remoteIOUnit);
+ AudioComponentInstanceDispose(m_remoteIOUnit);
+ m_remoteIOUnit = nullptr;
+ }
+
+ m_inputDescription = std::make_unique<CAAudioStreamDescription>(streamDescription);
+ if (setupAudioUnit()) {
+ m_inputDescription = nullptr;
+ return;
+ }
+
+ if (!m_dataSource)
+ m_dataSource = AudioSampleDataSource::create(description.sampleRate() * 2);
+ if (!m_dataSource)
+ return;
+
+ if (m_dataSource->setInputFormat(streamDescription))
+ return;
+ if (m_dataSource->setOutputFormat(*m_outputDescription.get()))
+ return;
+
+ m_dataSource->setVolume(m_volume);
+ }
+
+ m_dataSource->pushSamples(m_inputDescription->streamDescription(), sampleTime, audioData, sampleCount);
+
+ if (m_autoPlay)
+ playInternal();
+}
+
+void AudioTrackPrivateMediaStreamCocoa::sourceStopped()
+{
+ pause();
+}
+
+OSStatus AudioTrackPrivateMediaStreamCocoa::render(UInt32 sampleCount, AudioBufferList& ioData, UInt32 /*inBusNumber*/, const AudioTimeStamp& timeStamp, AudioUnitRenderActionFlags& actionFlags)
+{
+ std::unique_lock<Lock> lock(m_internalStateLock, std::try_to_lock);
+ if (!lock.owns_lock())
+ return kAudioConverterErr_UnspecifiedError;
+
+ if (!m_isPlaying || m_muted || !m_dataSource || streamTrack().muted() || streamTrack().ended() || !streamTrack().enabled()) {
+ AudioSampleBufferList::zeroABL(ioData, static_cast<size_t>(sampleCount));
+ actionFlags = kAudioUnitRenderAction_OutputIsSilence;
+ return 0;
+ }
+
+ m_dataSource->pullSamples(ioData, static_cast<size_t>(sampleCount), timeStamp.mSampleTime, timeStamp.mHostTime, AudioSampleDataSource::Copy);
+
+ return 0;
+}
+
+OSStatus AudioTrackPrivateMediaStreamCocoa::inputProc(void* userData, AudioUnitRenderActionFlags* actionFlags, const AudioTimeStamp* timeStamp, UInt32 inBusNumber, UInt32 sampleCount, AudioBufferList* ioData)
+{
+ return static_cast<AudioTrackPrivateMediaStreamCocoa*>(userData)->render(sampleCount, *ioData, inBusNumber, *timeStamp, *actionFlags);
+}
+
+
+} // namespace WebCore
+
+#endif // ENABLE(VIDEO_TRACK)
</ins></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacAudioTrackPrivateMediaStreamCocoah"></a>
<div class="addfile"><h4>Added: trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h (0 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h         (rev 0)
+++ trunk/Source/WebCore/platform/mediastream/mac/AudioTrackPrivateMediaStreamCocoa.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -0,0 +1,94 @@
</span><ins>+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ * notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ * notice, this list of conditions and the following disclaimer in the
+ * documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#if ENABLE(VIDEO_TRACK) && ENABLE(MEDIA_STREAM)
+
+#include "AudioSourceObserverObjC.h"
+#include "AudioTrackPrivateMediaStream.h"
+#include <AudioToolbox/AudioToolbox.h>
+#include <CoreAudio/CoreAudioTypes.h>
+#include <wtf/Lock.h>
+
+namespace WebCore {
+
+class AudioSampleDataSource;
+class AudioSampleBufferList;
+class CAAudioStreamDescription;
+
+class AudioTrackPrivateMediaStreamCocoa final : public AudioTrackPrivateMediaStream, private RealtimeMediaSource::Observer {
+ WTF_MAKE_NONCOPYABLE(AudioTrackPrivateMediaStreamCocoa)
+public:
+ static RefPtr<AudioTrackPrivateMediaStreamCocoa> create(MediaStreamTrackPrivate& streamTrack)
+ {
+ return adoptRef(*new AudioTrackPrivateMediaStreamCocoa(streamTrack));
+ }
+
+ void play();
+ void pause();
+ bool isPlaying() { return m_isPlaying; }
+
+ void setVolume(float);
+ float volume() const { return m_volume; }
+
+ void setMuted(bool muted) { m_muted = muted; }
+ bool muted() const { return m_muted; }
+
+private:
+ AudioTrackPrivateMediaStreamCocoa(MediaStreamTrackPrivate&);
+ ~AudioTrackPrivateMediaStreamCocoa();
+
+ // RealtimeMediaSource::Observer
+ void sourceStopped() final;
+ void sourceMutedChanged() final { }
+ void sourceSettingsChanged() final { }
+ bool preventSourceFromStopping() final { return false; }
+ void audioSamplesAvailable(const MediaTime&, void*, const AudioStreamDescription&, size_t) final;
+
+ static OSStatus inputProc(void*, AudioUnitRenderActionFlags*, const AudioTimeStamp*, UInt32 inBusNumber, UInt32 numberOfFrames, AudioBufferList*);
+ OSStatus render(UInt32 sampleCount, AudioBufferList&, UInt32 inBusNumber, const AudioTimeStamp&, AudioUnitRenderActionFlags&);
+
+ OSStatus setupAudioUnit();
+ void cleanup();
+ void zeroBufferList(AudioBufferList&, size_t);
+ void playInternal();
+
+ AudioComponentInstance m_remoteIOUnit { nullptr };
+ std::unique_ptr<CAAudioStreamDescription> m_inputDescription;
+ std::unique_ptr<CAAudioStreamDescription> m_outputDescription;
+
+ RefPtr<AudioSampleDataSource> m_dataSource;
+
+ Lock m_internalStateLock;
+ float m_volume { 1 };
+ bool m_isPlaying { false };
+ bool m_autoPlay { false };
+ bool m_muted { false };
+};
+
+}
+
+#endif // ENABLE(VIDEO_TRACK) && ENABLE(MEDIA_STREAM)
</ins></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMach"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -72,7 +72,7 @@
</span><span class="cx">
</span><span class="cx"> uint32_t m_maximiumFrameCount;
</span><span class="cx"> uint32_t m_sampleRate { 44100 };
</span><del>- double m_bytesPerFrame { sizeof(Float32) };
</del><ins>+ uint64_t m_bytesEmitted { 0 };
</ins><span class="cx">
</span><span class="cx"> RetainPtr<CMFormatDescriptionRef> m_formatDescription;
</span><span class="cx"> AudioStreamBasicDescription m_streamFormat;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeAudioSourceMacmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -32,6 +32,8 @@
</span><span class="cx"> #import "MockRealtimeAudioSourceMac.h"
</span><span class="cx">
</span><span class="cx"> #if ENABLE(MEDIA_STREAM)
</span><ins>+#import "AudioSampleBufferList.h"
+#import "CAAudioStreamDescription.h"
</ins><span class="cx"> #import "MediaConstraints.h"
</span><span class="cx"> #import "MediaSampleAVFObjC.h"
</span><span class="cx"> #import "NotImplemented.h"
</span><span class="lines">@@ -49,6 +51,11 @@
</span><span class="cx">
</span><span class="cx"> namespace WebCore {
</span><span class="cx">
</span><ins>+static inline size_t alignTo16Bytes(size_t size)
+{
+ return (size + 15) & ~15;
+}
+
</ins><span class="cx"> RefPtr<MockRealtimeAudioSource> MockRealtimeAudioSource::create(const String& name, const MediaConstraints* constraints)
</span><span class="cx"> {
</span><span class="cx"> auto source = adoptRef(new MockRealtimeAudioSourceMac(name));
</span><span class="lines">@@ -92,7 +99,11 @@
</span><span class="cx"> {
</span><span class="cx"> ASSERT(m_formatDescription);
</span><span class="cx">
</span><del>- CMTime startTime = CMTimeMake(elapsedTime() * m_sampleRate, m_sampleRate);
</del><ins>+ CMTime startTime = CMTimeMake(m_bytesEmitted, m_sampleRate);
+ m_bytesEmitted += frameCount;
+
+ audioSamplesAvailable(toMediaTime(startTime), m_audioBufferList->mBuffers[0].mData, CAAudioStreamDescription(m_streamFormat), frameCount);
+
</ins><span class="cx"> CMSampleBufferRef sampleBuffer;
</span><span class="cx"> OSStatus result = CMAudioSampleBufferCreateWithPacketDescriptions(nullptr, nullptr, true, nullptr, nullptr, m_formatDescription.get(), frameCount, startTime, nullptr, &sampleBuffer);
</span><span class="cx"> ASSERT(sampleBuffer);
</span><span class="lines">@@ -108,9 +119,7 @@
</span><span class="cx"> result = CMSampleBufferSetDataReady(sampleBuffer);
</span><span class="cx"> ASSERT(!result);
</span><span class="cx">
</span><del>- mediaDataUpdated(MediaSampleAVFObjC::create(sampleBuffer));
-
- for (auto& observer : m_observers)
</del><ins>+ for (const auto& observer : m_observers)
</ins><span class="cx"> observer->process(m_formatDescription.get(), sampleBuffer);
</span><span class="cx"> }
</span><span class="cx">
</span><span class="lines">@@ -119,10 +128,24 @@
</span><span class="cx"> m_maximiumFrameCount = WTF::roundUpToPowerOfTwo(renderInterval() / 1000. * m_sampleRate * 2);
</span><span class="cx"> ASSERT(m_maximiumFrameCount);
</span><span class="cx">
</span><ins>+ const int bytesPerFloat = sizeof(Float32);
+ const int bitsPerByte = 8;
+ int channelCount = 1;
+ m_streamFormat = { };
+ m_streamFormat.mSampleRate = m_sampleRate;
+ m_streamFormat.mFormatID = kAudioFormatLinearPCM;
+ m_streamFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked;
+ m_streamFormat.mBytesPerPacket = bytesPerFloat * channelCount;
+ m_streamFormat.mFramesPerPacket = 1;
+ m_streamFormat.mBytesPerFrame = bytesPerFloat * channelCount;
+ m_streamFormat.mChannelsPerFrame = channelCount;
+ m_streamFormat.mBitsPerChannel = bitsPerByte * bytesPerFloat;
+
</ins><span class="cx"> // AudioBufferList is a variable-length struct, so create on the heap with a generic new() operator
</span><span class="cx"> // with a custom size, and initialize the struct manually.
</span><del>- uint32_t bufferDataSize = m_bytesPerFrame * m_maximiumFrameCount;
- uint32_t baseSize = offsetof(AudioBufferList, mBuffers) + sizeof(AudioBuffer);
</del><ins>+ uint32_t bufferDataSize = m_streamFormat.mBytesPerFrame * m_maximiumFrameCount;
+ uint32_t baseSize = AudioSampleBufferList::audioBufferListSizeForStream(m_streamFormat);
+
</ins><span class="cx"> uint64_t bufferListSize = baseSize + bufferDataSize;
</span><span class="cx"> ASSERT(bufferListSize <= SIZE_MAX);
</span><span class="cx"> if (bufferListSize > SIZE_MAX)
</span><span class="lines">@@ -132,24 +155,9 @@
</span><span class="cx"> m_audioBufferList = std::unique_ptr<AudioBufferList>(static_cast<AudioBufferList*>(::operator new (m_audioBufferListBufferSize)));
</span><span class="cx"> memset(m_audioBufferList.get(), 0, m_audioBufferListBufferSize);
</span><span class="cx">
</span><del>- m_audioBufferList->mNumberBuffers = 1;
- auto& buffer = m_audioBufferList->mBuffers[0];
- buffer.mNumberChannels = 1;
- buffer.mDataByteSize = bufferDataSize;
- buffer.mData = reinterpret_cast<uint8_t*>(m_audioBufferList.get()) + baseSize;
</del><ins>+ uint8_t* bufferData = reinterpret_cast<uint8_t*>(m_audioBufferList.get()) + baseSize;
+ AudioSampleBufferList::configureBufferListForStream(*m_audioBufferList.get(), m_streamFormat, bufferData, bufferDataSize);
</ins><span class="cx">
</span><del>- const int bytesPerFloat = sizeof(Float32);
- const int bitsPerByte = 8;
- m_streamFormat = { };
- m_streamFormat.mSampleRate = m_sampleRate;
- m_streamFormat.mFormatID = kAudioFormatLinearPCM;
- m_streamFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked | kAudioFormatFlagIsNonInterleaved;
- m_streamFormat.mBytesPerPacket = bytesPerFloat;
- m_streamFormat.mFramesPerPacket = 1;
- m_streamFormat.mBytesPerFrame = bytesPerFloat;
- m_streamFormat.mChannelsPerFrame = 1;
- m_streamFormat.mBitsPerChannel = bitsPerByte * bytesPerFloat;
-
</del><span class="cx"> CMFormatDescriptionRef formatDescription;
</span><span class="cx"> CMAudioFormatDescriptionCreate(NULL, &m_streamFormat, 0, NULL, 0, NULL, NULL, &formatDescription);
</span><span class="cx"> m_formatDescription = adoptCF(formatDescription);
</span><span class="lines">@@ -162,11 +170,12 @@
</span><span class="cx"> {
</span><span class="cx"> static double theta;
</span><span class="cx"> static const double frequencies[] = { 1500., 500. };
</span><ins>+ static const double tau = 2 * M_PI;
</ins><span class="cx">
</span><span class="cx"> if (!m_audioBufferList)
</span><span class="cx"> reconfigure();
</span><span class="cx">
</span><del>- uint32_t totalFrameCount = delta * m_sampleRate;
</del><ins>+ uint32_t totalFrameCount = alignTo16Bytes(delta * m_sampleRate);
</ins><span class="cx"> uint32_t frameCount = std::min(totalFrameCount, m_maximiumFrameCount);
</span><span class="cx"> double elapsed = elapsedTime();
</span><span class="cx"> while (frameCount) {
</span><span class="lines">@@ -180,7 +189,7 @@
</span><span class="cx"> case 0:
</span><span class="cx"> case 14: {
</span><span class="cx"> int index = fmod(elapsed, 1) * 2;
</span><del>- increment = 2.0 * M_PI * frequencies[index] / m_sampleRate;
</del><ins>+ increment = tau * frequencies[index] / m_sampleRate;
</ins><span class="cx"> silent = false;
</span><span class="cx"> break;
</span><span class="cx"> }
</span><span class="lines">@@ -193,10 +202,12 @@
</span><span class="cx"> continue;
</span><span class="cx"> }
</span><span class="cx">
</span><del>- buffer[frame] = sin(theta) * 0.25;
- theta += increment;
- if (theta > 2.0 * M_PI)
- theta -= 2.0 * M_PI;
</del><ins>+ float tone = sin(theta) * 0.25;
+ buffer[frame] = tone;
+
+ theta += increment;
+ if (theta > tau)
+ theta -= tau;
</ins><span class="cx"> elapsed += 1 / m_sampleRate;
</span><span class="cx"> }
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacMockRealtimeVideoSourceMacmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -125,7 +125,7 @@
</span><span class="cx"> auto pixelBuffer = pixelBufferFromCGImage(imageBuffer()->copyImage()->nativeImage().get());
</span><span class="cx"> auto sampleBuffer = CMSampleBufferFromPixelBuffer(pixelBuffer.get());
</span><span class="cx">
</span><del>- mediaDataUpdated(MediaSampleAVFObjC::create(sampleBuffer.get()));
</del><ins>+ videoSampleAvailable(MediaSampleAVFObjC::create(sampleBuffer.get()));
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> } // namespace WebCore
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -23,8 +23,7 @@
</span><span class="cx"> * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
</span><span class="cx"> */
</span><span class="cx">
</span><del>-#ifndef WebAudioSourceProviderAVFObjC_h
-#define WebAudioSourceProviderAVFObjC_h
</del><ins>+#pragma once
</ins><span class="cx">
</span><span class="cx"> #if ENABLE(WEB_AUDIO) && ENABLE(MEDIA_STREAM)
</span><span class="cx">
</span><span class="lines">@@ -31,6 +30,7 @@
</span><span class="cx"> #include "AudioCaptureSourceProviderObjC.h"
</span><span class="cx"> #include "AudioSourceObserverObjC.h"
</span><span class="cx"> #include "AudioSourceProvider.h"
</span><ins>+#include <wtf/Lock.h>
</ins><span class="cx"> #include <wtf/RefCounted.h>
</span><span class="cx"> #include <wtf/RefPtr.h>
</span><span class="cx">
</span><span class="lines">@@ -68,11 +68,11 @@
</span><span class="cx"> std::unique_ptr<AudioStreamBasicDescription> m_outputDescription;
</span><span class="cx"> std::unique_ptr<CARingBuffer> m_ringBuffer;
</span><span class="cx">
</span><del>- uint64_t m_writeAheadCount { 0 };
</del><span class="cx"> uint64_t m_writeCount { 0 };
</span><span class="cx"> uint64_t m_readCount { 0 };
</span><span class="cx"> AudioSourceProviderClient* m_client { nullptr };
</span><span class="cx"> AudioCaptureSourceProviderObjC* m_captureSource { nullptr };
</span><ins>+ Lock m_mutex;
</ins><span class="cx"> bool m_connected { false };
</span><span class="cx"> };
</span><span class="cx">
</span><span class="lines">@@ -79,5 +79,3 @@
</span><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> #endif
</span><del>-
-#endif
</del></span></pre></div>
<a id="trunkSourceWebCoreplatformmediastreammacWebAudioSourceProviderAVFObjCmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mediastream/mac/WebAudioSourceProviderAVFObjC.mm        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -1,5 +1,5 @@
</span><span class="cx"> /*
</span><del>- * Copyright (C) 2015 Apple Inc. All rights reserved.
</del><ins>+ * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
</ins><span class="cx"> *
</span><span class="cx"> * Redistribution and use in source and binary forms, with or without
</span><span class="cx"> * modification, are permitted provided that the following conditions
</span><span class="lines">@@ -65,6 +65,8 @@
</span><span class="cx">
</span><span class="cx"> WebAudioSourceProviderAVFObjC::~WebAudioSourceProviderAVFObjC()
</span><span class="cx"> {
</span><ins>+ std::lock_guard<Lock> lock(m_mutex);
+
</ins><span class="cx"> if (m_converter) {
</span><span class="cx"> // FIXME: make and use a smart pointer for AudioConverter
</span><span class="cx"> AudioConverterDispose(m_converter);
</span><span class="lines">@@ -76,7 +78,8 @@
</span><span class="cx">
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::provideInput(AudioBus* bus, size_t framesToProcess)
</span><span class="cx"> {
</span><del>- if (!m_ringBuffer) {
</del><ins>+ std::unique_lock<Lock> lock(m_mutex, std::try_to_lock);
+ if (!lock.owns_lock() || !m_ringBuffer) {
</ins><span class="cx"> bus->zero();
</span><span class="cx"> return;
</span><span class="cx"> }
</span><span class="lines">@@ -85,12 +88,12 @@
</span><span class="cx"> uint64_t endFrame = 0;
</span><span class="cx"> m_ringBuffer->getCurrentFrameBounds(startFrame, endFrame);
</span><span class="cx">
</span><del>- if (m_writeCount <= m_readCount + m_writeAheadCount) {
</del><ins>+ if (m_writeCount <= m_readCount) {
</ins><span class="cx"> bus->zero();
</span><span class="cx"> return;
</span><span class="cx"> }
</span><span class="cx">
</span><del>- uint64_t framesAvailable = endFrame - (m_readCount + m_writeAheadCount);
</del><ins>+ uint64_t framesAvailable = endFrame - m_readCount;
</ins><span class="cx"> if (framesAvailable < framesToProcess) {
</span><span class="cx"> framesToProcess = static_cast<size_t>(framesAvailable);
</span><span class="cx"> bus->zero();
</span><span class="lines">@@ -136,6 +139,8 @@
</span><span class="cx">
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::prepare(const AudioStreamBasicDescription* format)
</span><span class="cx"> {
</span><ins>+ std::lock_guard<Lock> lock(m_mutex);
+
</ins><span class="cx"> LOG(Media, "WebAudioSourceProviderAVFObjC::prepare(%p)", this);
</span><span class="cx">
</span><span class="cx"> m_inputDescription = std::make_unique<AudioStreamBasicDescription>(*format);
</span><span class="lines">@@ -200,6 +205,8 @@
</span><span class="cx">
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::unprepare()
</span><span class="cx"> {
</span><ins>+ std::lock_guard<Lock> lock(m_mutex);
+
</ins><span class="cx"> m_inputDescription = nullptr;
</span><span class="cx"> m_outputDescription = nullptr;
</span><span class="cx"> m_ringBuffer = nullptr;
</span><span class="lines">@@ -214,6 +221,8 @@
</span><span class="cx">
</span><span class="cx"> void WebAudioSourceProviderAVFObjC::process(CMFormatDescriptionRef, CMSampleBufferRef sampleBuffer)
</span><span class="cx"> {
</span><ins>+ std::lock_guard<Lock> lock(m_mutex);
+
</ins><span class="cx"> if (!m_ringBuffer)
</span><span class="cx"> return;
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformmockMockRealtimeAudioSourceh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h (211727 => 211728)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h        2017-02-06 17:07:24 UTC (rev 211727)
+++ trunk/Source/WebCore/platform/mock/MockRealtimeAudioSource.h        2017-02-06 17:22:27 UTC (rev 211728)
</span><span class="lines">@@ -56,7 +56,7 @@
</span><span class="cx"> virtual void render(double) { }
</span><span class="cx">
</span><span class="cx"> double elapsedTime();
</span><del>- static int renderInterval() { return 125; }
</del><ins>+ static int renderInterval() { return 60; }
</ins><span class="cx">
</span><span class="cx"> private:
</span><span class="cx">
</span></span></pre>
</div>
</div>
</body>
</html>