<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[182141] trunk</title>
</head>
<body>
<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; }
#msg dl a { font-weight: bold}
#msg dl a:link { color:#fc3; }
#msg dl a:active { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/182141">182141</a></dd>
<dt>Author</dt> <dd>jer.noble@apple.com</dd>
<dt>Date</dt> <dd>2015-03-30 09:15:00 -0700 (Mon, 30 Mar 2015)</dd>
</dl>
<h3>Log Message</h3>
<pre>[iOS] When Web Audio is interrupted by a phone call, it cannot be restarted.
https://bugs.webkit.org/show_bug.cgi?id=143190
Reviewed by Darin Adler.
Source/WebCore:
Tests: webaudio/audiocontext-state-interrupted.html
webaudio/audiocontext-state.html
Implement the following methods and properties from the Web Audio spec:
close(), suspend(), resume(), onstatechange.
AudioContext will take more responsibility for tracking state and interruptions (and
AudioDestination will give up that responsibility). This means AudioContext must be a
MediaSessionClient, and own its own MediaSession. In return, AudioDestinationIOS and
AudioDestinationMac relinquish both.
* Modules/webaudio/AudioContext.cpp:
(WebCore::AudioContext::AudioContext): Set default values in header.
(WebCore::AudioContext::uninitialize): Call setState() instead of setting m_state.
(WebCore::AudioContext::addReaction): Added. Append the callback to the appropriate vector for the state.
(WebCore::AudioContext::setState): Added. Fire events and resolve the appropriate reaction callbacks for the new state.
(WebCore::AudioContext::state): Return a string representing the context's state.
(WebCore::AudioContext::stop): Close the event queue.
(WebCore::AudioContext::startRendering): Call setState().
(WebCore::AudioContext::fireCompletionEvent): Call setState().
(WebCore::AudioContext::suspendContext): Added. Add reaction callback and call suspend() on the destination node.
(WebCore::AudioContext::resumeContext): Added. Add reaction callback and call resume() on the destination node.
(WebCore::AudioContext::closeContext): Added. Add reaction callback and call close() on the destination node.
(WebCore::AudioContext::suspendPlayback): Added. Suspend playback and set state to interrupted.
(WebCore::AudioContext::mayResumePlayback): Added. Conditionally resume playback.
* bindings/js/JSAudioContextCustom.cpp:
(WebCore::JSAudioContext::suspend): Added. Create and return a new Promise object.
(WebCore::JSAudioContext::resume): Ditto.
(WebCore::JSAudioContext::close): Ditto.
* Modules/webaudio/AudioContext.idl: Add new methods and properties.

Extensive organizational changes were made to AudioContext.h to group instance
variables together and add C++11 initializers in their declarations:

* Modules/webaudio/AudioContext.h:
(WebCore::AudioContext::mediaType): Moved from AudioDestinationNode.
(WebCore::AudioContext::presentationType): Ditto.
(WebCore::AudioContext::canReceiveRemoteControlCommands): Ditto.
(WebCore::AudioContext::didReceiveRemoteControlCommand): Ditto.
(WebCore::AudioContext::overrideBackgroundPlaybackRestriction): Ditto.

Other changes to support the new AudioContext methods:

* Modules/webaudio/AudioDestinationNode.h:
(WebCore::AudioDestinationNode::resume): Add empty default virtual method.
(WebCore::AudioDestinationNode::suspend): Ditto.
(WebCore::AudioDestinationNode::close): Ditto.
* Modules/webaudio/DefaultAudioDestinationNode.cpp:
(WebCore::DefaultAudioDestinationNode::resume): Added. Pass to AudioDestination.
(WebCore::DefaultAudioDestinationNode::suspend): Ditto.
(WebCore::DefaultAudioDestinationNode::close): Ditto.
* Modules/webaudio/DefaultAudioDestinationNode.h:
* bindings/js/JSDOMPromise.h:
(WebCore::DeferredWrapper::resolve): Add an overload for a nullptr resolve value.
* dom/EventNames.h: Added 'statechange'.
* dom/ScriptExecutionContext.h:
(WebCore::ScriptExecutionContext::Task::Task): Add a constructor which takes a void() callback.

Modify MediaSession, AudioSession, and MediaSessionManager to support the new
interruption behavior.

* html/HTMLMediaElement.cpp:
(WebCore::HTMLMediaElement::suspendPlayback): Renamed from pausePlayback().
(WebCore::HTMLMediaElement::mayResumePlayback): Renamed from resumePlayback().
* html/HTMLMediaElement.h:
* platform/audio/AudioSession.cpp:
(WebCore::AudioSession::tryToSetActive): Renamed from setActive. Return true by default.
(WebCore::AudioSession::setActive): Deleted.
* platform/audio/AudioSession.h:
* platform/audio/MediaSession.cpp:
(WebCore::MediaSession::beginInterruption): pausePlayback() was renamed to suspendPlayback().
(WebCore::MediaSession::endInterruption): Always notify the client, telling it whether to resume.
(WebCore::MediaSession::clientWillBeginPlayback): Bail early if reentrant. Check the (new)
return value of sessionWillBeginPlayback() and remember to resume once the interruption ends.
(WebCore::MediaSession::clientWillPausePlayback): Bail early if reentrant.
(WebCore::MediaSession::pauseSession): pausePlayback() was renamed to suspendPlayback().
* platform/audio/MediaSession.h:
* platform/audio/MediaSessionManager.cpp:
(WebCore::MediaSessionManager::sessionWillBeginPlayback): Return false if not allowed to break interruption or
if activating the audio session failed. Otherwise, end the interruption.
* platform/audio/MediaSessionManager.h:
* platform/audio/ios/AudioDestinationIOS.cpp:
(WebCore::AudioDestinationIOS::AudioDestinationIOS): m_mediaSession was removed.
(WebCore::AudioDestinationIOS::start): Ditto.
* platform/audio/ios/AudioDestinationIOS.h:
* platform/audio/ios/AudioSessionIOS.mm:
(WebCore::AudioSession::tryToSetActive): Renamed from setActive. Return false if the AVAudioSession could not be activated.
(WebCore::AudioSession::setActive): Deleted.
* platform/audio/ios/MediaSessionManagerIOS.h:
* platform/audio/ios/MediaSessionManagerIOS.mm:
(WebCore::MediaSessionManageriOS::sessionWillBeginPlayback): Do not update the now playing info if session playback was blocked.
* platform/audio/mac/AudioDestinationMac.cpp:
(WebCore::AudioDestinationMac::AudioDestinationMac): m_mediaSession was removed.
* platform/audio/mac/AudioDestinationMac.h:
* platform/audio/mac/AudioSessionMac.cpp:
(WebCore::AudioSession::tryToSetActive): Renamed from setActive(). Return true by default.
(WebCore::AudioSession::setActive): Deleted.
* platform/audio/mac/MediaSessionManagerMac.cpp:
(MediaSessionManager::updateSessionState): No longer attempt to activate the session, as this is done
in MediaSessionManager::sessionWillBeginPlayback().
* testing/Internals.cpp:
(WebCore::Internals::setMediaSessionRestrictions): Add "InterruptedPlaybackNotPermitted".

LayoutTests:

* webaudio/audiocontext-state-expected.txt: Added.
* webaudio/audiocontext-state-interrupted-expected.txt: Added.
* webaudio/audiocontext-state-interrupted.html: Added.
* webaudio/audiocontext-state.html: Added.</pre>
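The commit message describes a reaction-callback design for addReaction()/setState(): promises returned by suspend(), resume(), and close() are parked per target state and settled when setState() reaches that state. The following is a minimal JavaScript sketch of that pattern; FakeAudioContext and all of its members are hypothetical illustrations, not WebCore's actual C++ classes.

```javascript
// Hypothetical sketch of the addReaction()/setState() pattern described in
// the log message. FakeAudioContext is illustrative only; WebCore's real
// AudioContext is C++ and also consults its destination node and MediaSession.
class FakeAudioContext {
    constructor() {
        this.state = 'suspended';
        this.onstatechange = null;
        // One pending-reaction list per reachable state, as in addReaction().
        this.reactions = { suspended: [], running: [], closed: [] };
    }

    // Park a resolve callback until the context reaches the given state.
    addReaction(state, resolve) {
        this.reactions[state].push(resolve);
    }

    // Fire the statechange event and settle reactions parked for the new state.
    setState(state) {
        if (this.state === state)
            return; // parked reactions stay parked until a real transition
        this.state = state;
        if (this.onstatechange)
            this.onstatechange({ type: 'statechange' });
        this.reactions[state].splice(0).forEach(resolve => resolve());
    }

    suspend() {
        if (this.state === 'closed')
            return Promise.reject(new Error('context is closed'));
        return new Promise(resolve => {
            this.addReaction('suspended', resolve);
            this.setState('suspended'); // real code first asks the destination node
        });
    }

    resume() {
        if (this.state === 'closed')
            return Promise.reject(new Error('context is closed'));
        return new Promise(resolve => {
            this.addReaction('running', resolve);
            this.setState('running');
        });
    }

    close() {
        return new Promise(resolve => {
            this.addReaction('closed', resolve);
            this.setState('closed');
        });
    }
}
```

This mirrors the happy path exercised by audiocontext-state.html: resume, then suspend, then close, with resume() and suspend() rejecting once the context is closed.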
<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkLayoutTestsChangeLog">trunk/LayoutTests/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreModuleswebaudioAudioContextcpp">trunk/Source/WebCore/Modules/webaudio/AudioContext.cpp</a></li>
<li><a href="#trunkSourceWebCoreModuleswebaudioAudioContexth">trunk/Source/WebCore/Modules/webaudio/AudioContext.h</a></li>
<li><a href="#trunkSourceWebCoreModuleswebaudioAudioContextidl">trunk/Source/WebCore/Modules/webaudio/AudioContext.idl</a></li>
<li><a href="#trunkSourceWebCoreModuleswebaudioAudioDestinationNodeh">trunk/Source/WebCore/Modules/webaudio/AudioDestinationNode.h</a></li>
<li><a href="#trunkSourceWebCoreModuleswebaudioDefaultAudioDestinationNodecpp">trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.cpp</a></li>
<li><a href="#trunkSourceWebCoreModuleswebaudioDefaultAudioDestinationNodeh">trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.h</a></li>
<li><a href="#trunkSourceWebCorebindingsjsJSAudioContextCustomcpp">trunk/Source/WebCore/bindings/js/JSAudioContextCustom.cpp</a></li>
<li><a href="#trunkSourceWebCorebindingsjsJSCallbackDatah">trunk/Source/WebCore/bindings/js/JSCallbackData.h</a></li>
<li><a href="#trunkSourceWebCorebindingsjsJSDOMPromiseh">trunk/Source/WebCore/bindings/js/JSDOMPromise.h</a></li>
<li><a href="#trunkSourceWebCoredomEventNamesh">trunk/Source/WebCore/dom/EventNames.h</a></li>
<li><a href="#trunkSourceWebCoredomScriptExecutionContexth">trunk/Source/WebCore/dom/ScriptExecutionContext.h</a></li>
<li><a href="#trunkSourceWebCorehtmlHTMLMediaElementcpp">trunk/Source/WebCore/html/HTMLMediaElement.cpp</a></li>
<li><a href="#trunkSourceWebCorehtmlHTMLMediaElementh">trunk/Source/WebCore/html/HTMLMediaElement.h</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioAudioSessioncpp">trunk/Source/WebCore/platform/audio/AudioSession.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioAudioSessionh">trunk/Source/WebCore/platform/audio/AudioSession.h</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioMediaSessioncpp">trunk/Source/WebCore/platform/audio/MediaSession.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioMediaSessionh">trunk/Source/WebCore/platform/audio/MediaSession.h</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioMediaSessionManagercpp">trunk/Source/WebCore/platform/audio/MediaSessionManager.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioMediaSessionManagerh">trunk/Source/WebCore/platform/audio/MediaSessionManager.h</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioiosAudioDestinationIOScpp">trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioiosAudioDestinationIOSh">trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.h</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioiosAudioSessionIOSmm">trunk/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioiosMediaSessionManagerIOSh">trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.h</a></li>
<li><a href="#trunkSourceWebCoreplatformaudioiosMediaSessionManagerIOSmm">trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacAudioDestinationMaccpp">trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacAudioDestinationMach">trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.h</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacAudioSessionMaccpp">trunk/Source/WebCore/platform/audio/mac/AudioSessionMac.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiomacMediaSessionManagerMaccpp">trunk/Source/WebCore/platform/audio/mac/MediaSessionManagerMac.cpp</a></li>
<li><a href="#trunkSourceWebCoretestingInternalscpp">trunk/Source/WebCore/testing/Internals.cpp</a></li>
</ul>
<h3>Added Paths</h3>
<ul>
<li><a href="#trunkLayoutTestswebaudioaudiocontextstateexpectedtxt">trunk/LayoutTests/webaudio/audiocontext-state-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebaudioaudiocontextstateinterruptedexpectedtxt">trunk/LayoutTests/webaudio/audiocontext-state-interrupted-expected.txt</a></li>
<li><a href="#trunkLayoutTestswebaudioaudiocontextstateinterruptedhtml">trunk/LayoutTests/webaudio/audiocontext-state-interrupted.html</a></li>
<li><a href="#trunkLayoutTestswebaudioaudiocontextstatehtml">trunk/LayoutTests/webaudio/audiocontext-state.html</a></li>
</ul>
</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkLayoutTestsChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/ChangeLog (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/ChangeLog        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/LayoutTests/ChangeLog        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -1,3 +1,15 @@
</span><ins>+2015-03-30 Jer Noble &lt;jer.noble@apple.com&gt;
+
+ [iOS] When Web Audio is interrupted by a phone call, it cannot be restarted.
+ https://bugs.webkit.org/show_bug.cgi?id=143190
+
+ Reviewed by Darin Adler.
+
+ * webaudio/audiocontext-state-expected.txt: Added.
+ * webaudio/audiocontext-state-interrupted-expected.txt: Added.
+ * webaudio/audiocontext-state-interrupted.html: Added.
+ * webaudio/audiocontext-state.html: Added.
+
</ins><span class="cx"> 2015-03-30 Marcos Chavarría Teijeiro &lt;chavarria1991@gmail.com&gt;
</span><span class="cx">
</span><span class="cx"> Update expectations for delete-emoji test since the bug is fixed now.
</span></span></pre></div>
<a id="trunkLayoutTestswebaudioaudiocontextstateexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webaudio/audiocontext-state-expected.txt (0 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webaudio/audiocontext-state-expected.txt         (rev 0)
+++ trunk/LayoutTests/webaudio/audiocontext-state-expected.txt        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -0,0 +1,26 @@
</span><ins>+Basic tests for AudioNode API.
+
+On success, you will see a series of "PASS" messages, followed by "TEST COMPLETE".
+
+PASS context.state is "suspended"
+node.connect(context.destination)
+PASS context.state is "running"
+Calling context.suspend()
+PASS context.suspend() promise resolved
+PASS context.state is "suspended"
+Calling context.resume()
+PASS context.resume() promise resolved
+PASS context.state is "running"
+Calling context.close()
+PASS context.close() promise resolved
+PASS context.state is "closed"
+Calling context.resume() (should fail)
+PASS context.resume() promise rejected (correctly)
+PASS context.state is "closed"
+Calling context.suspend() (should fail)
+PASS context.suspend() promise rejected (correctly)
+PASS context.state is "closed"
+PASS successfullyParsed is true
+
+TEST COMPLETE
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebaudioaudiocontextstateinterruptedexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webaudio/audiocontext-state-interrupted-expected.txt (0 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webaudio/audiocontext-state-interrupted-expected.txt         (rev 0)
+++ trunk/LayoutTests/webaudio/audiocontext-state-interrupted-expected.txt        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -0,0 +1,50 @@
</span><ins>+Basic tests for AudioNode API.
+
+On success, you will see a series of "PASS" messages, followed by "TEST COMPLETE".
+
+PASS context.state is "suspended"
+node.connect(context.destination)
+EVENT statechange
+PASS context.state is "running"
+
+Test 1: resume() while interrupted is resolved after the interruption ends.
+internals.beginMediaSessionInterruption()
+EVENT statechange
+PASS context.state is "interrupted"
+internals.setMediaSessionRestrictions("WebAudio", "InterruptedPlaybackNotPermitted")
+Calling context.resume()
+Delaying 100ms
+PASS context.state is "interrupted"
+internals.endMediaSessionInterruption("MayResumePlaying")
+PASS context.resume() promise resolved
+PASS context.state is "running"
+
+Test 2: resume() while interrupted will cause interruption to end.
+internals.beginMediaSessionInterruption()
+EVENT statechange
+PASS context.state is "interrupted"
+internals.setMediaSessionRestrictions("WebAudio", "")
+Calling context.resume()
+PASS context.resume() promise resolved
+PASS context.state is "running"
+
+Test 3: running AudioContexts will not resume after an interruption ends.
+internals.beginMediaSessionInterruption()
+EVENT statechange
+PASS context.state is "interrupted"
+internals.endMediaSessionInterruption("")
+EVENT statechange
+PASS context.state is "suspended"
+
+Test 4: resume() while interrupted will not resume playback after an interruption.
+internals.beginMediaSessionInterruption()
+EVENT statechange
+PASS context.state is "interrupted"
+Calling context.resume()
+internals.endMediaSessionInterruption("")
+EVENT statechange
+PASS context.state is "suspended"
+PASS successfullyParsed is true
+
+TEST COMPLETE
+
</ins></span></pre></div>
<a id="trunkLayoutTestswebaudioaudiocontextstateinterruptedhtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webaudio/audiocontext-state-interrupted.html (0 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webaudio/audiocontext-state-interrupted.html         (rev 0)
+++ trunk/LayoutTests/webaudio/audiocontext-state-interrupted.html        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -0,0 +1,150 @@
</span><ins>+<!DOCTYPE html>
+
+<html>
+<head>
+<script src="../resources/js-test-pre.js"></script>
+<script type="text/javascript" src="resources/audio-testing.js"></script>
+</head>
+
+<body>
+<div id="description"></div>
+<div id="console"></div>
+
+<script>
+description('Basic tests for AudioNode API.');
+
+var context = null;
+var node = null;
+
+function runTest() {
+ if (window.testRunner) {
+ testRunner.dumpAsText();
+ testRunner.waitUntilDone();
+ }
+
+ window.jsTestIsAsync = true;
+
+ context = new webkitAudioContext();
+
+ shouldBe('context.state', '"suspended"');
+ context.onstatechange = beganRunning;
+
+ node = context.createBufferSource();
+ evalAndLog('node.connect(context.destination)');
+
+}
+
+function beganRunning(e) {
+ debug('EVENT ' + e.type);
+ shouldBe('context.state', '"running"');
+
+ debug('');
+ debug('Test 1: resume() while interrupted is resolved after the interruption ends.');
+
+ context.onstatechange = firstInterruptionStarted;
+ if (window.internals)
+ evalAndLog('internals.beginMediaSessionInterruption()');
+
+}
+
+function firstInterruptionStarted(e) {
+ debug('EVENT ' + e.type);
+ shouldBe('context.state', '"interrupted"');
+ if (window.internals)
+ evalAndLog('internals.setMediaSessionRestrictions("WebAudio", "InterruptedPlaybackNotPermitted")');
+
+ context.onstatechange = null;
+
+ debug('Calling context.resume()');
+ context.resume().then(firstResumeSucceeded);
+ debug('Delaying 100ms');
+ setTimeout(function() {
+ shouldBe('context.state', '"interrupted"');
+ if (window.internals)
+ evalAndLog('internals.endMediaSessionInterruption("MayResumePlaying")');
+ }, 100);
+}
+
+function firstResumeSucceeded() {
+ testPassed('context.resume() promise resolved');
+ shouldBe('context.state', '"running"');
+
+ debug('');
+ debug('Test 2: resume() while interrupted will cause interruption to end.');
+
+ context.onstatechange = secondInterruptionStarted;
+ if (window.internals)
+ evalAndLog('internals.beginMediaSessionInterruption()');
+}
+
+function secondInterruptionStarted(e) {
+ debug('EVENT ' + e.type);
+ shouldBe('context.state', '"interrupted"');
+ if (window.internals)
+ evalAndLog('internals.setMediaSessionRestrictions("WebAudio", "")');
+
+ context.onstatechange = null;
+
+ debug('Calling context.resume()');
+ context.resume().then(secondResumeSucceeded);
+}
+
+function secondResumeSucceeded() {
+ testPassed('context.resume() promise resolved');
+ shouldBe('context.state', '"running"');
+
+ debug('');
+ debug('Test 3: running AudioContexts will not resume after an interruption ends.');
+
+ context.onstatechange = thirdInterruptionStarted;
+ if (window.internals)
+ evalAndLog('internals.beginMediaSessionInterruption()');
+}
+
+function thirdInterruptionStarted(e) {
+ debug('EVENT ' + e.type);
+ shouldBe('context.state', '"interrupted"');
+
+ context.onstatechange = thirdInterruptionEnded;
+ if (window.internals)
+ evalAndLog('internals.endMediaSessionInterruption("")');
+}
+
+function thirdInterruptionEnded(e) {
+ debug('EVENT ' + e.type);
+ shouldBe('context.state', '"suspended"');
+
+ debug('');
+ debug('Test 4: resume() while interrupted will not resume playback after an interruption.');
+
+ context.onstatechange = fourthInterruptionStarted;
+ if (window.internals)
+ evalAndLog('internals.beginMediaSessionInterruption()');
+}
+
+function fourthInterruptionStarted(e) {
+ debug('EVENT ' + e.type);
+ shouldBe('context.state', '"interrupted"');
+
+ context.onstatechange = fourthInterruptionEnded;
+
+ debug('Calling context.resume()');
+ context.resume();
+
+ if (window.internals)
+ evalAndLog('internals.endMediaSessionInterruption("")');
+}
+
+function fourthInterruptionEnded(e) {
+ debug('EVENT ' + e.type);
+ shouldBe('context.state', '"suspended"');
+ finishJSTest();
+}
+
+runTest();
+
+</script>
+
+<script src="../resources/js-test-post.js"></script>
+</body>
+</html>
</ins></span></pre></div>
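The four interruption scenarios exercised by the test above can be summarized as a small state machine. This is a hedged sketch of the behavior the change log describes for MediaSession (beginInterruption/endInterruption, the InterruptedPlaybackNotPermitted restriction, and "remember to resume once the interruption ends"); the class and member names here are hypothetical, not WebCore's.

```javascript
// Hypothetical sketch of the interruption bookkeeping described in the
// change log. FakeInterruptibleContext is illustrative only; WebCore's
// MediaSession/MediaSessionManager implement this in C++.
class FakeInterruptibleContext {
    constructor() {
        this.state = 'running';
        this.interrupted = false;
        this.shouldResumeOnEnd = false; // "remember to resume once the interruption ends"
        this.interruptedPlaybackNotPermitted = true;
        this.pendingResumes = [];
    }

    beginInterruption() {
        this.interrupted = true;
        this.state = 'interrupted';
    }

    // resume() during an interruption either breaks the interruption (when
    // permitted) or parks its promise until the interruption ends.
    resume() {
        return new Promise(resolve => {
            if (!this.interrupted) {
                this.state = 'running';
                resolve();
            } else if (!this.interruptedPlaybackNotPermitted) {
                this.interrupted = false;
                this.state = 'running';
                resolve();
            } else {
                this.shouldResumeOnEnd = true;
                this.pendingResumes.push(resolve);
            }
        });
    }

    endInterruption(flags) {
        this.interrupted = false;
        if (flags === 'MayResumePlaying' && this.shouldResumeOnEnd) {
            this.state = 'running';
            this.shouldResumeOnEnd = false;
            this.pendingResumes.splice(0).forEach(resolve => resolve());
        } else {
            // A context that was not asked to resume ends up suspended (Tests 3 and 4).
            this.state = 'suspended';
        }
    }
}
```

Tests 1 through 4 above map onto this sketch directly: a restricted resume() stays pending until endInterruption("MayResumePlaying"); an unrestricted resume() ends the interruption itself; and with no permitted resume, the context lands in "suspended".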
<a id="trunkLayoutTestswebaudioaudiocontextstatehtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/webaudio/audiocontext-state.html (0 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/webaudio/audiocontext-state.html         (rev 0)
+++ trunk/LayoutTests/webaudio/audiocontext-state.html        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -0,0 +1,109 @@
</span><ins>+<!DOCTYPE html>
+
+<html>
+<head>
+<script src="../resources/js-test-pre.js"></script>
+<script type="text/javascript" src="resources/audio-testing.js"></script>
+</head>
+
+<body>
+<div id="description"></div>
+<div id="console"></div>
+
+<script>
+description('Basic tests for AudioNode API.');
+
+var context = null;
+var node = null;
+
+function runTest() {
+ if (window.testRunner) {
+ testRunner.dumpAsText();
+ testRunner.waitUntilDone();
+ }
+
+ window.jsTestIsAsync = true;
+
+ context = new webkitAudioContext();
+
+ shouldBe('context.state', '"suspended"');
+
+ node = context.createBufferSource();
+ evalAndLog('node.connect(context.destination)');
+
+ shouldBe('context.state', '"running"');
+
+ debug('Calling context.suspend()');
+ context.suspend().then(suspendSucceeded, suspendFailed);
+}
+
+function suspendFailed() {
+ testFailed('context.suspend() promise rejected');
+ finishJSTest();
+}
+
+function suspendSucceeded() {
+ testPassed('context.suspend() promise resolved');
+ shouldBe('context.state', '"suspended"');
+
+ debug('Calling context.resume()');
+ context.resume().then(resumeSucceeded, resumeFailed);
+}
+
+function resumeFailed() {
+ testFailed('context.resume() promise rejected');
+ finishJSTest();
+}
+
+function resumeSucceeded() {
+ testPassed('context.resume() promise resolved');
+ shouldBe('context.state', '"running"');
+
+ debug('Calling context.close()');
+ context.close().then(closeSucceeded, closeFailed);
+}
+
+function closeFailed() {
+ testFailed('context.close() promise rejected');
+ finishJSTest();
+}
+
+function closeSucceeded() {
+ testPassed('context.close() promise resolved');
+ shouldBe('context.state', '"closed"');
+
+ debug('Calling context.resume() (should fail)');
+ context.resume().then(resumeSucceededIncorrectly, resumeFailedCorrectly);
+}
+
+function resumeSucceededIncorrectly() {
+ testFailed('context.resume() promise resolved (should have rejected)');
+ finishJSTest();
+}
+
+function resumeFailedCorrectly() {
+ testPassed('context.resume() promise rejected (correctly)');
+ shouldBe('context.state', '"closed"');
+
+ debug('Calling context.suspend() (should fail)');
+ context.suspend().then(suspendSucceededIncorrectly, suspendFailedCorrectly);
+}
+
+function suspendSucceededIncorrectly() {
+ testFailed('context.suspend() promise resolved (should have rejected)');
+ finishJSTest();
+}
+
+function suspendFailedCorrectly() {
+ testPassed('context.suspend() promise rejected (correctly)');
+ shouldBe('context.state', '"closed"');
+ finishJSTest();
+}
+
+runTest();
+
+</script>
+
+<script src="../resources/js-test-post.js"></script>
+</body>
+</html>
</ins></span></pre></div>
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/ChangeLog        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -1,3 +1,113 @@
</span><ins>+2015-03-30 Jer Noble &lt;jer.noble@apple.com&gt;
+
+ [iOS] When Web Audio is interrupted by a phone call, it cannot be restarted.
+ https://bugs.webkit.org/show_bug.cgi?id=143190
+
+ Reviewed by Darin Adler.
+
+ Tests: webaudio/audiocontext-state-interrupted.html
+ webaudio/audiocontext-state.html
+
+ Implement the following methods and properties from the Web Audio spec:
+ close(), suspend(), resume(), onstatechange.
+
+ AudioContext will take more responsibility for tracking state and interruptions (and
+ AudioDestination will give up that responsibility). This means AudioContext must be a
+ MediaSessionClient, and own its own MediaSession. In return, AudioDestinationIOS and
+ AudioDestinationMac relinquish both.
+
+ * Modules/webaudio/AudioContext.cpp:
+ (WebCore::AudioContext::AudioContext): Set default values in header.
+ (WebCore::AudioContext::uninitialize): Call setState() instead of setting m_state.
+ (WebCore::AudioContext::addReaction): Added. Append the callback to the appropriate vector for the state.
+ (WebCore::AudioContext::setState): Added. Fire events and resolve the appropriate reaction callbacks for the new state.
+ (WebCore::AudioContext::state): Return a string representing the context's state.
+ (WebCore::AudioContext::stop): Close the event queue.
+ (WebCore::AudioContext::startRendering): Call setState().
+ (WebCore::AudioContext::fireCompletionEvent): Call setState().
+ (WebCore::AudioContext::suspendContext): Added. Add reaction callback and call suspend() on the destination node.
+ (WebCore::AudioContext::resumeContext): Added. Add reaction callback and call resume() on the destination node.
+ (WebCore::AudioContext::closeContext): Added. Add reaction callback and call close() on the destination node.
+ (WebCore::AudioContext::suspendPlayback): Added. Suspend playback and set state to interrupted.
+ (WebCore::AudioContext::mayResumePlayback): Added. Conditionally resume playback.
+ * bindings/js/JSAudioContextCustom.cpp:
+ (WebCore::JSAudioContext::suspend): Added. Create and return a new Promise object.
+ (WebCore::JSAudioContext::resume): Ditto.
+ (WebCore::JSAudioContext::close): Ditto.
+ * Modules/webaudio/AudioContext.idl: Add new methods and properties.
+
+ Extensive organizational changes were made to AudioContext.h to group instance
+ variables together and add C++11 initializers in their declarations:
+
+ * Modules/webaudio/AudioContext.h:
+ (WebCore::AudioContext::mediaType): Moved from AudioDestinationNode.
+ (WebCore::AudioContext::presentationType): Ditto.
+ (WebCore::AudioContext::canReceiveRemoteControlCommands): Ditto.
+ (WebCore::AudioContext::didReceiveRemoteControlCommand): Ditto.
+ (WebCore::AudioContext::overrideBackgroundPlaybackRestriction): Ditto.
+
+ Other changes to support the new AudioContext methods:
+
+ * Modules/webaudio/AudioDestinationNode.h:
+ (WebCore::AudioDestinationNode::resume): Add empty default virtual method.
+ (WebCore::AudioDestinationNode::suspend): Ditto.
+ (WebCore::AudioDestinationNode::close): Ditto.
+ * Modules/webaudio/DefaultAudioDestinationNode.cpp:
+ (WebCore::DefaultAudioDestinationNode::resume): Added. Pass to AudioDestination.
+ (WebCore::DefaultAudioDestinationNode::suspend): Ditto.
+ (WebCore::DefaultAudioDestinationNode::close): Ditto.
+ * Modules/webaudio/DefaultAudioDestinationNode.h:
+ * bindings/js/JSDOMPromise.h:
+ (WebCore::DeferredWrapper::resolve): Add an overload for a nullptr resolve value.
+ * dom/EventNames.h: Added 'statechange'.
+ * dom/ScriptExecutionContext.h:
+ (WebCore::ScriptExecutionContext::Task::Task): Add a constructor which takes a void() callback.
+
+ Modify MediaSession, AudioSession, and MediaSessionManager to support the new
+ interruption behavior.
+
+ * html/HTMLMediaElement.cpp:
+ (WebCore::HTMLMediaElement::suspendPlayback): Renamed from pausePlayback().
+ (WebCore::HTMLMediaElement::mayResumePlayback): Renamed from resumePlayback().
+ * html/HTMLMediaElement.h:
+ * platform/audio/AudioSession.cpp:
+ (WebCore::AudioSession::tryToSetActive): Renamed from setActive. Return true by default.
+ (WebCore::AudioSession::setActive): Deleted.
+ * platform/audio/AudioSession.h:
+ * platform/audio/MediaSession.cpp:
+ (WebCore::MediaSession::beginInterruption): pausePlayback() was renamed to suspendPlayback().
+ (WebCore::MediaSession::endInterruption): Always notify the client, telling it whether to resume.
+ (WebCore::MediaSession::clientWillBeginPlayback): Bail early if reentrant. Check the (new)
+ return value of sessionWillBeginPlayback() and remember to resume once the interruption ends.
+ (WebCore::MediaSession::clientWillPausePlayback): Bail early if reentrant.
+ (WebCore::MediaSession::pauseSession): pausePlayback() was renamed to suspendPlayback().
+ * platform/audio/MediaSession.h:
+ * platform/audio/MediaSessionManager.cpp:
+ (WebCore::MediaSessionManager::sessionWillBeginPlayback): Return false if the session is not allowed to begin
+ playback during an interruption or if activating the audio session failed. Otherwise, end the interruption.
+ * platform/audio/MediaSessionManager.h:
+ * platform/audio/ios/AudioDestinationIOS.cpp:
+ (WebCore::AudioDestinationIOS::AudioDestinationIOS): m_mediaSession was removed.
+ (WebCore::AudioDestinationIOS::start): Ditto.
+ * platform/audio/ios/AudioDestinationIOS.h:
+ * platform/audio/ios/AudioSessionIOS.mm:
+ (WebCore::AudioSession::tryToSetActive): Renamed from setActive. Return false if the AVAudioSession could not be activated.
+ (WebCore::AudioSession::setActive): Deleted.
+ * platform/audio/ios/MediaSessionManagerIOS.h:
+ * platform/audio/ios/MediaSessionManagerIOS.mm:
+ (WebCore::MediaSessionManageriOS::sessionWillBeginPlayback): Do not update the now playing info if session playback was blocked.
+ * platform/audio/mac/AudioDestinationMac.cpp:
+ (WebCore::AudioDestinationMac::AudioDestinationMac): m_mediaSession was removed.
+ * platform/audio/mac/AudioDestinationMac.h:
+ * platform/audio/mac/AudioSessionMac.cpp:
+ (WebCore::AudioSession::tryToSetActive): Renamed from setActive(). Return true by default.
+ (WebCore::AudioSession::setActive): Deleted.
+ * platform/audio/mac/MediaSessionManagerMac.cpp:
+ (MediaSessionManager::updateSessionState): No longer attempt to activate the session, as this is done
+ in MediaSessionManager::sessionWillBeginPlayback().
+ * testing/Internals.cpp:
+ (WebCore::Internals::setMediaSessionRestrictions): Add "InterruptedPlaybackNotPermitted".
+
</ins><span class="cx"> 2015-03-25 Xabier Rodriguez Calvar <calvaris@igalia.com> and Youenn Fablet <youenn.fablet@crf.canon.fr>
</span><span class="cx">
</span><span class="cx"> [Streams API] Error storage should be moved from source to stream/reader
</span></span></pre></div>
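The addReaction()/setState() mechanism described in the ChangeLog above (and added to AudioContext.cpp below) can be sketched in isolation roughly as follows. This is a minimal standalone illustration, not the WebCore code: `StateMachine` and its members stand in for `AudioContext`, and event dispatch is omitted.

```cpp
#include <cassert>
#include <cstddef>
#include <functional>
#include <utility>
#include <vector>

// Sketch of the addReaction()/setState() pattern: callbacks registered for a
// target state are queued, then fired exactly once when that state is entered.
enum class State { Suspended, Running, Interrupted, Closed };

class StateMachine {
public:
    void addReaction(State state, std::function<void()> reaction)
    {
        size_t index = static_cast<size_t>(state);
        if (index >= m_reactions.size())
            m_reactions.resize(index + 1);
        m_reactions[index].push_back(std::move(reaction));
    }

    void setState(State state)
    {
        if (m_state == state)
            return; // No transition: no statechange event, no reactions.
        m_state = state;

        size_t index = static_cast<size_t>(state);
        if (index >= m_reactions.size())
            return;

        // Swap the pending list out first so a reaction can safely register
        // further reactions without invalidating this iteration.
        std::vector<std::function<void()>> reactions;
        reactions.swap(m_reactions[index]);
        for (auto& reaction : reactions)
            reaction();
    }

    State state() const { return m_state; }

private:
    State m_state { State::Suspended };
    std::vector<std::vector<std::function<void()>>> m_reactions;
};
```

This is why, in the patch, a pending suspend() promise resolves when the context actually reaches State::Suspended: the success callback is parked via addReaction() and consumed by the first matching transition.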
<a id="trunkSourceWebCoreModuleswebaudioAudioContextcpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/Modules/webaudio/AudioContext.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/Modules/webaudio/AudioContext.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/Modules/webaudio/AudioContext.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -44,9 +44,11 @@
</span><span class="cx"> #include "DelayNode.h"
</span><span class="cx"> #include "Document.h"
</span><span class="cx"> #include "DynamicsCompressorNode.h"
</span><ins>+#include "EventNames.h"
</ins><span class="cx"> #include "ExceptionCode.h"
</span><span class="cx"> #include "FFTFrame.h"
</span><span class="cx"> #include "GainNode.h"
</span><ins>+#include "GenericEventQueue.h"
</ins><span class="cx"> #include "HRTFDatabaseLoader.h"
</span><span class="cx"> #include "HRTFPanner.h"
</span><span class="cx"> #include "OfflineAudioCompletionEvent.h"
</span><span class="lines">@@ -55,10 +57,11 @@
</span><span class="cx"> #include "Page.h"
</span><span class="cx"> #include "PannerNode.h"
</span><span class="cx"> #include "PeriodicWave.h"
</span><del>-#include <inspector/ScriptCallStack.h>
</del><span class="cx"> #include "ScriptController.h"
</span><span class="cx"> #include "ScriptProcessorNode.h"
</span><span class="cx"> #include "WaveShaperNode.h"
</span><ins>+#include <inspector/ScriptCallStack.h>
+#include <wtf/NeverDestroyed.h>
</ins><span class="cx">
</span><span class="cx"> #if ENABLE(MEDIA_STREAM)
</span><span class="cx"> #include "MediaStream.h"
</span><span class="lines">@@ -126,19 +129,9 @@
</span><span class="cx"> // Constructor for rendering to the audio hardware.
</span><span class="cx"> AudioContext::AudioContext(Document& document)
</span><span class="cx"> : ActiveDOMObject(&document)
</span><del>- , m_isStopScheduled(false)
- , m_isInitialized(false)
- , m_isAudioThreadFinished(false)
- , m_destinationNode(0)
- , m_isDeletionScheduled(false)
- , m_automaticPullNodesNeedUpdating(false)
- , m_connectionCount(0)
- , m_audioThread(0)
</del><ins>+ , m_mediaSession(MediaSession::create(*this))
+ , m_eventQueue(std::make_unique<GenericEventQueue>(*this))
</ins><span class="cx"> , m_graphOwnerThread(UndefinedThreadIdentifier)
</span><del>- , m_isOfflineContext(false)
- , m_activeSourceCount(0)
- , m_restrictions(NoRestrictions)
- , m_state(State::Suspended)
</del><span class="cx"> {
</span><span class="cx"> constructCommon();
</span><span class="cx">
</span><span class="lines">@@ -151,18 +144,10 @@
</span><span class="cx"> // Constructor for offline (non-realtime) rendering.
</span><span class="cx"> AudioContext::AudioContext(Document& document, unsigned numberOfChannels, size_t numberOfFrames, float sampleRate)
</span><span class="cx"> : ActiveDOMObject(&document)
</span><del>- , m_isStopScheduled(false)
- , m_isInitialized(false)
- , m_isAudioThreadFinished(false)
- , m_destinationNode(0)
- , m_automaticPullNodesNeedUpdating(false)
- , m_connectionCount(0)
- , m_audioThread(0)
</del><ins>+ , m_isOfflineContext(true)
+ , m_mediaSession(MediaSession::create(*this))
+ , m_eventQueue(std::make_unique<GenericEventQueue>(*this))
</ins><span class="cx"> , m_graphOwnerThread(UndefinedThreadIdentifier)
</span><del>- , m_isOfflineContext(true)
- , m_activeSourceCount(0)
- , m_restrictions(NoRestrictions)
- , m_state(State::Suspended)
</del><span class="cx"> {
</span><span class="cx"> constructCommon();
</span><span class="cx">
</span><span class="lines">@@ -277,7 +262,7 @@
</span><span class="cx"> --s_hardwareContextCount;
</span><span class="cx">
</span><span class="cx"> // Offline contexts move to 'Closed' state when dispatching the completion event.
</span><del>- m_state = State::Closed;
</del><ins>+ setState(State::Closed);
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> // Get rid of the sources which may still be playing.
</span><span class="lines">@@ -291,6 +276,56 @@
</span><span class="cx"> return m_isInitialized;
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+void AudioContext::addReaction(State state, std::function<void()> reaction)
+{
+ size_t stateIndex = static_cast<size_t>(state);
+ if (stateIndex >= m_stateReactions.size())
+ m_stateReactions.resize(stateIndex + 1);
+
+ m_stateReactions[stateIndex].append(reaction);
+}
+
+void AudioContext::setState(State state)
+{
+ if (m_state == state)
+ return;
+
+ m_state = state;
+ m_eventQueue->enqueueEvent(Event::create(eventNames().statechangeEvent, true, false));
+
+ size_t stateIndex = static_cast<size_t>(state);
+ if (stateIndex >= m_stateReactions.size())
+ return;
+
+ Vector<std::function<void()>> reactions;
+ m_stateReactions[stateIndex].swap(reactions);
+
+ for (auto& reaction : reactions)
+ reaction();
+}
+
+const AtomicString& AudioContext::state() const
+{
+ static NeverDestroyed<AtomicString> suspended("suspended");
+ static NeverDestroyed<AtomicString> running("running");
+ static NeverDestroyed<AtomicString> interrupted("interrupted");
+ static NeverDestroyed<AtomicString> closed("closed");
+
+ switch (m_state) {
+ case State::Suspended:
+ return suspended;
+ case State::Running:
+ return running;
+ case State::Interrupted:
+ return interrupted;
+ case State::Closed:
+ return closed;
+ }
+
+ ASSERT_NOT_REACHED();
+ return suspended;
+}
+
</ins><span class="cx"> void AudioContext::stopDispatch(void* userData)
</span><span class="cx"> {
</span><span class="cx"> AudioContext* context = reinterpret_cast<AudioContext*>(userData);
</span><span class="lines">@@ -311,6 +346,8 @@
</span><span class="cx">
</span><span class="cx"> document()->updateIsPlayingAudio();
</span><span class="cx">
</span><ins>+ m_eventQueue->close();
+
</ins><span class="cx"> // Don't call uninitialize() immediately here because the ScriptExecutionContext is in the middle
</span><span class="cx"> // of dealing with all of its ActiveDOMObjects at this point. uninitialize() can de-reference other
</span><span class="cx"> // ActiveDOMObjects so let's schedule uninitialize() to be called later.
</span><span class="lines">@@ -980,7 +1017,7 @@
</span><span class="cx"> removeBehaviorRestriction(AudioContext::RequirePageConsentForAudioStartRestriction);
</span><span class="cx"> }
</span><span class="cx"> destination()->startRendering();
</span><del>- m_state = State::Running;
</del><ins>+ setState(State::Running);
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> void AudioContext::mediaCanStart()
</span><span class="lines">@@ -1011,7 +1048,7 @@
</span><span class="cx"> return;
</span><span class="cx">
</span><span class="cx"> AudioBuffer* renderedBuffer = m_renderTarget.get();
</span><del>- m_state = State::Closed;
</del><ins>+ setState(State::Closed);
</ins><span class="cx">
</span><span class="cx"> ASSERT(renderedBuffer);
</span><span class="cx"> if (!renderedBuffer)
</span><span class="lines">@@ -1020,7 +1057,7 @@
</span><span class="cx"> // Avoid firing the event if the document has already gone away.
</span><span class="cx"> if (scriptExecutionContext()) {
</span><span class="cx"> // Call the offline rendering completion event listener.
</span><del>- dispatchEvent(OfflineAudioCompletionEvent::create(renderedBuffer));
</del><ins>+ m_eventQueue->enqueueEvent(OfflineAudioCompletionEvent::create(renderedBuffer));
</ins><span class="cx"> }
</span><span class="cx"> }
</span><span class="cx">
</span><span class="lines">@@ -1034,6 +1071,127 @@
</span><span class="cx"> --m_activeSourceCount;
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+void AudioContext::suspendContext(std::function<void()> successCallback, std::function<void()> failureCallback, ExceptionCode& ec)
+{
+ ASSERT(successCallback);
+ ASSERT(failureCallback);
+
+ if (isOfflineContext()) {
+ ec = INVALID_STATE_ERR;
+ return;
+ }
+
+ if (m_state == State::Suspended) {
+ scriptExecutionContext()->postTask(successCallback);
+ return;
+ }
+
+ if (m_state == State::Closed || m_state == State::Interrupted || !m_destinationNode) {
+ scriptExecutionContext()->postTask(failureCallback);
+ return;
+ }
+
+ addReaction(State::Suspended, successCallback);
+
+ if (!m_mediaSession->clientWillPausePlayback())
+ return;
+
+ RefPtr<AudioContext> strongThis(this);
+ m_destinationNode->suspend([strongThis] {
+ strongThis->setState(State::Suspended);
+ });
+}
+
+void AudioContext::resumeContext(std::function<void()> successCallback, std::function<void()> failureCallback, ExceptionCode& ec)
+{
+ ASSERT(successCallback);
+ ASSERT(failureCallback);
+
+ if (isOfflineContext()) {
+ ec = INVALID_STATE_ERR;
+ return;
+ }
+
+ if (m_state == State::Running) {
+ scriptExecutionContext()->postTask(successCallback);
+ return;
+ }
+
+ if (m_state == State::Closed || !m_destinationNode) {
+ scriptExecutionContext()->postTask(failureCallback);
+ return;
+ }
+
+ addReaction(State::Running, successCallback);
+
+ if (!m_mediaSession->clientWillBeginPlayback())
+ return;
+
+ RefPtr<AudioContext> strongThis(this);
+ m_destinationNode->resume([strongThis] {
+ strongThis->setState(State::Running);
+ });
+}
+
+void AudioContext::closeContext(std::function<void()> successCallback, std::function<void()>, ExceptionCode& ec)
+{
+ ASSERT(successCallback);
+
+ if (isOfflineContext()) {
+ ec = INVALID_STATE_ERR;
+ return;
+ }
+
+ if (m_state == State::Closed || !m_destinationNode) {
+ scriptExecutionContext()->postTask(successCallback);
+ return;
+ }
+
+ addReaction(State::Closed, successCallback);
+
+ RefPtr<AudioContext> strongThis(this);
+ m_destinationNode->close([strongThis, successCallback] {
+ strongThis->setState(State::Closed);
+ strongThis->uninitialize();
+ });
+}
+
+
+void AudioContext::suspendPlayback()
+{
+ if (!m_destinationNode || m_state == State::Closed)
+ return;
+
+ if (m_state == State::Suspended) {
+ if (m_mediaSession->state() == MediaSession::Interrupted)
+ setState(State::Interrupted);
+ return;
+ }
+
+ RefPtr<AudioContext> strongThis(this);
+ m_destinationNode->suspend([strongThis] {
+ bool interrupted = strongThis->m_mediaSession->state() == MediaSession::Interrupted;
+ strongThis->setState(interrupted ? State::Interrupted : State::Suspended);
+ });
+}
+
+void AudioContext::mayResumePlayback(bool shouldResume)
+{
+ if (!m_destinationNode || m_state == State::Closed || m_state == State::Running)
+ return;
+
+ if (!shouldResume) {
+ setState(State::Suspended);
+ return;
+ }
+
+ RefPtr<AudioContext> strongThis(this);
+ m_destinationNode->resume([strongThis] {
+ strongThis->setState(State::Running);
+ });
+}
+
+
</ins><span class="cx"> } // namespace WebCore
</span><span class="cx">
</span><span class="cx"> #endif // ENABLE(WEB_AUDIO)
</span></span></pre></div>
<a id="trunkSourceWebCoreModuleswebaudioAudioContexth"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/Modules/webaudio/AudioContext.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/Modules/webaudio/AudioContext.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/Modules/webaudio/AudioContext.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -33,6 +33,7 @@
</span><span class="cx"> #include "EventListener.h"
</span><span class="cx"> #include "EventTarget.h"
</span><span class="cx"> #include "MediaCanStartListener.h"
</span><ins>+#include "MediaSession.h"
</ins><span class="cx"> #include <atomic>
</span><span class="cx"> #include <wtf/HashSet.h>
</span><span class="cx"> #include <wtf/MainThread.h>
</span><span class="lines">@@ -57,6 +58,7 @@
</span><span class="cx"> class ChannelMergerNode;
</span><span class="cx"> class ChannelSplitterNode;
</span><span class="cx"> class GainNode;
</span><ins>+class GenericEventQueue;
</ins><span class="cx"> class PannerNode;
</span><span class="cx"> class AudioListener;
</span><span class="cx"> class AudioSummingJunction;
</span><span class="lines">@@ -74,7 +76,7 @@
</span><span class="cx"> // AudioContext is the cornerstone of the web audio API and all AudioNodes are created from it.
</span><span class="cx"> // For thread safety between the audio thread and the main thread, it has a rendering graph locking mechanism.
</span><span class="cx">
</span><del>-class AudioContext : public ActiveDOMObject, public ThreadSafeRefCounted<AudioContext>, public EventTargetWithInlineData, public MediaCanStartListener, public AudioProducer {
</del><ins>+class AudioContext : public ActiveDOMObject, public ThreadSafeRefCounted<AudioContext>, public EventTargetWithInlineData, public MediaCanStartListener, public AudioProducer, private MediaSessionClient {
</ins><span class="cx"> public:
</span><span class="cx"> // Create an AudioContext for rendering to the audio hardware.
</span><span class="cx"> static RefPtr<AudioContext> create(Document&, ExceptionCode&);
</span><span class="lines">@@ -104,6 +106,11 @@
</span><span class="cx">
</span><span class="cx"> AudioListener* listener() { return m_listener.get(); }
</span><span class="cx">
</span><ins>+ void suspendContext(std::function<void()>, std::function<void()>, ExceptionCode&);
+ void resumeContext(std::function<void()>, std::function<void()>, ExceptionCode&);
+ void closeContext(std::function<void()>, std::function<void()>, ExceptionCode&);
+ const AtomicString& state() const;
+
</ins><span class="cx"> // The AudioNode create methods are called on the main thread (from JavaScript).
</span><span class="cx"> PassRefPtr<AudioBufferSourceNode> createBufferSource();
</span><span class="cx"> #if ENABLE(VIDEO)
</span><span class="lines">@@ -264,9 +271,11 @@
</span><span class="cx"> void lazyInitialize();
</span><span class="cx"> void uninitialize();
</span><span class="cx">
</span><ins>+ enum class State { Suspended, Running, Interrupted, Closed };
+ void setState(State);
+
</ins><span class="cx"> // ScriptExecutionContext calls stop twice.
</span><span class="cx"> // We'd like to schedule only one stop action for them.
</span><del>- bool m_isStopScheduled;
</del><span class="cx"> static void stopDispatch(void* userData);
</span><span class="cx"> void clear();
</span><span class="cx">
</span><span class="lines">@@ -279,9 +288,6 @@
</span><span class="cx"> virtual bool isPlayingAudio() override;
</span><span class="cx"> virtual void pageMutedStateDidChange() override;
</span><span class="cx">
</span><del>- bool m_isInitialized;
- bool m_isAudioThreadFinished;
-
</del><span class="cx"> // The context itself keeps a reference to all source nodes. The source nodes, then reference all nodes they're connected to.
</span><span class="cx"> // In turn, these nodes reference all nodes they're connected to. All nodes are ultimately connected to the AudioDestinationNode.
</span><span class="cx"> // When the context dereferences a source node, it will be deactivated from the rendering graph along with all other nodes it is
</span><span class="lines">@@ -298,9 +304,25 @@
</span><span class="cx"> // Make sure to dereference them here.
</span><span class="cx"> void derefUnfinishedSourceNodes();
</span><span class="cx">
</span><del>- RefPtr<AudioDestinationNode> m_destinationNode;
- RefPtr<AudioListener> m_listener;
</del><ins>+ // MediaSessionClient
+ virtual MediaSession::MediaType mediaType() const { return MediaSession::WebAudio; }
+ virtual MediaSession::MediaType presentationType() const { return MediaSession::WebAudio; }
+ virtual bool canReceiveRemoteControlCommands() const { return false; }
+ virtual void didReceiveRemoteControlCommand(MediaSession::RemoteControlCommandType) { }
+ virtual bool overrideBackgroundPlaybackRestriction() const { return false; }
+ virtual void suspendPlayback() override;
+ virtual void mayResumePlayback(bool shouldResume) override;
</ins><span class="cx">
</span><ins>+ // EventTarget
+ virtual void refEventTarget() override { ref(); }
+ virtual void derefEventTarget() override { deref(); }
+
+ void handleDirtyAudioSummingJunctions();
+ void handleDirtyAudioNodeOutputs();
+
+ void addReaction(State, std::function<void()>);
+ void updateAutomaticPullNodes();
+
</ins><span class="cx"> // Only accessed in the audio thread.
</span><span class="cx"> Vector<AudioNode*> m_finishedNodes;
</span><span class="cx">
</span><span class="lines">@@ -317,40 +339,40 @@
</span><span class="cx">
</span><span class="cx"> // They will be scheduled for deletion (on the main thread) at the end of a render cycle (in realtime thread).
</span><span class="cx"> Vector<AudioNode*> m_nodesToDelete;
</span><del>- bool m_isDeletionScheduled;
</del><span class="cx">
</span><ins>+ bool m_isDeletionScheduled { false };
+ bool m_isStopScheduled { false };
+ bool m_isInitialized { false };
+ bool m_isAudioThreadFinished { false };
+ bool m_automaticPullNodesNeedUpdating { false };
+ bool m_isOfflineContext { false };
+
</ins><span class="cx"> // Only accessed when the graph lock is held.
</span><span class="cx"> HashSet<AudioSummingJunction*> m_dirtySummingJunctions;
</span><span class="cx"> HashSet<AudioNodeOutput*> m_dirtyAudioNodeOutputs;
</span><del>- void handleDirtyAudioSummingJunctions();
- void handleDirtyAudioNodeOutputs();
</del><span class="cx">
</span><span class="cx">     // For the sake of thread safety, we maintain a separate Vector of automatic pull nodes for rendering in m_renderingAutomaticPullNodes.
</span><span class="cx"> // It will be copied from m_automaticPullNodes by updateAutomaticPullNodes() at the very start or end of the rendering quantum.
</span><span class="cx"> HashSet<AudioNode*> m_automaticPullNodes;
</span><span class="cx"> Vector<AudioNode*> m_renderingAutomaticPullNodes;
</span><del>- // m_automaticPullNodesNeedUpdating keeps track if m_automaticPullNodes is modified.
- bool m_automaticPullNodesNeedUpdating;
- void updateAutomaticPullNodes();
</del><ins>+ // Only accessed in the audio thread.
+ Vector<AudioNode*> m_deferredFinishDerefList;
+ Vector<Vector<std::function<void()>>> m_stateReactions;
</ins><span class="cx">
</span><del>- unsigned m_connectionCount;
</del><ins>+ std::unique_ptr<MediaSession> m_mediaSession;
+ std::unique_ptr<GenericEventQueue> m_eventQueue;
</ins><span class="cx">
</span><ins>+ RefPtr<AudioBuffer> m_renderTarget;
+ RefPtr<AudioDestinationNode> m_destinationNode;
+ RefPtr<AudioListener> m_listener;
+
+ unsigned m_connectionCount { 0 };
+
</ins><span class="cx"> // Graph locking.
</span><span class="cx"> Mutex m_contextGraphMutex;
</span><del>- volatile ThreadIdentifier m_audioThread;
</del><ins>+ volatile ThreadIdentifier m_audioThread { 0 };
</ins><span class="cx"> volatile ThreadIdentifier m_graphOwnerThread; // if the lock is held then this is the thread which owns it, otherwise == UndefinedThreadIdentifier
</span><del>-
- // Only accessed in the audio thread.
- Vector<AudioNode*> m_deferredFinishDerefList;
</del><span class="cx">
</span><del>- // EventTarget
- virtual void refEventTarget() override { ref(); }
- virtual void derefEventTarget() override { deref(); }
-
- RefPtr<AudioBuffer> m_renderTarget;
-
- bool m_isOfflineContext;
-
</del><span class="cx"> AsyncAudioDecoder m_audioDecoder;
</span><span class="cx">
</span><span class="cx"> // This is considering 32 is large enough for multiple channels audio.
</span><span class="lines">@@ -358,12 +380,11 @@
</span><span class="cx"> enum { MaxNumberOfChannels = 32 };
</span><span class="cx">
</span><span class="cx"> // Number of AudioBufferSourceNodes that are active (playing).
</span><del>- std::atomic<int> m_activeSourceCount;
</del><ins>+ std::atomic<int> m_activeSourceCount { 0 };
</ins><span class="cx">
</span><del>- BehaviorRestrictions m_restrictions;
</del><ins>+ BehaviorRestrictions m_restrictions { NoRestrictions };
</ins><span class="cx">
</span><del>- enum class State { Suspended, Running, Closed };
- State m_state;
</del><ins>+ State m_state { State::Suspended };
</ins><span class="cx"> };
</span><span class="cx">
</span><span class="cx"> } // WebCore
</span></span></pre></div>
<a id="trunkSourceWebCoreModuleswebaudioAudioContextidl"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/Modules/webaudio/AudioContext.idl (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/Modules/webaudio/AudioContext.idl        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/Modules/webaudio/AudioContext.idl        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -23,6 +23,13 @@
</span><span class="cx"> * SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
</span><span class="cx"> */
</span><span class="cx">
</span><ins>+enum AudioContextState {
+ "suspended",
+ "running",
+ "interrupted",
+ "closed"
+};
+
</ins><span class="cx"> [
</span><span class="cx"> EnabledBySetting=WebAudio,
</span><span class="cx"> Conditional=WEB_AUDIO,
</span><span class="lines">@@ -43,6 +50,13 @@
</span><span class="cx"> // All panning is relative to this listener.
</span><span class="cx"> readonly attribute AudioListener listener;
</span><span class="cx">
</span><ins>+ [Custom] Promise suspend();
+ [Custom] Promise resume();
+ [Custom] Promise close();
+
+ readonly attribute AudioContextState state;
+ attribute EventHandler onstatechange;
+
</ins><span class="cx"> // Number of AudioBufferSourceNodes that are currently playing.
</span><span class="cx"> readonly attribute unsigned long activeSourceCount;
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCoreModuleswebaudioAudioDestinationNodeh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/Modules/webaudio/AudioDestinationNode.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/Modules/webaudio/AudioDestinationNode.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/Modules/webaudio/AudioDestinationNode.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -35,7 +35,7 @@
</span><span class="cx">
</span><span class="cx"> class AudioBus;
</span><span class="cx"> class AudioContext;
</span><del>-
</del><ins>+
</ins><span class="cx"> class AudioDestinationNode : public AudioNode, public AudioIOCallback {
</span><span class="cx"> public:
</span><span class="cx"> AudioDestinationNode(AudioContext*, float sampleRate);
</span><span class="lines">@@ -58,6 +58,9 @@
</span><span class="cx"> virtual void enableInput(const String& inputDeviceId) = 0;
</span><span class="cx">
</span><span class="cx"> virtual void startRendering() = 0;
</span><ins>+ virtual void resume(std::function<void()>) { }
+ virtual void suspend(std::function<void()>) { }
+ virtual void close(std::function<void()>) { }
</ins><span class="cx">
</span><span class="cx"> AudioSourceProvider* localAudioInputProvider() { return &m_localAudioInputProvider; }
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCoreModuleswebaudioDefaultAudioDestinationNodecpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -31,6 +31,7 @@
</span><span class="cx"> #include "AudioContext.h"
</span><span class="cx"> #include "ExceptionCode.h"
</span><span class="cx"> #include "Logging.h"
</span><ins>+#include "ScriptExecutionContext.h"
</ins><span class="cx"> #include <wtf/MainThread.h>
</span><span class="cx">
</span><span class="cx"> const unsigned EnabledInputChannels = 2;
</span><span class="lines">@@ -105,6 +106,29 @@
</span><span class="cx"> m_destination->start();
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+void DefaultAudioDestinationNode::resume(std::function<void()> function)
+{
+ ASSERT(isInitialized());
+ if (isInitialized())
+ m_destination->start();
+ context()->scriptExecutionContext()->postTask(function);
+}
+
+void DefaultAudioDestinationNode::suspend(std::function<void()> function)
+{
+ ASSERT(isInitialized());
+ if (isInitialized())
+ m_destination->stop();
+ context()->scriptExecutionContext()->postTask(function);
+}
+
+void DefaultAudioDestinationNode::close(std::function<void()> function)
+{
+ ASSERT(isInitialized());
+ uninitialize();
+ context()->scriptExecutionContext()->postTask(function);
+}
+
</ins><span class="cx"> unsigned long DefaultAudioDestinationNode::maxChannelCount() const
</span><span class="cx"> {
</span><span class="cx"> return AudioDestination::maxChannelCount();
</span></span></pre></div>
<a id="trunkSourceWebCoreModuleswebaudioDefaultAudioDestinationNodeh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -50,6 +50,9 @@
</span><span class="cx"> // AudioDestinationNode
</span><span class="cx"> virtual void enableInput(const String& inputDeviceId) override;
</span><span class="cx"> virtual void startRendering() override;
</span><ins>+ virtual void resume(std::function<void()>) override;
+ virtual void suspend(std::function<void()>) override;
+ virtual void close(std::function<void()>) override;
</ins><span class="cx"> virtual unsigned long maxChannelCount() const override;
</span><span class="cx"> virtual bool isPlaying() override;
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCorebindingsjsJSAudioContextCustomcpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/bindings/js/JSAudioContextCustom.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/bindings/js/JSAudioContextCustom.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/bindings/js/JSAudioContextCustom.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -33,6 +33,7 @@
</span><span class="cx"> #include "JSAudioBuffer.h"
</span><span class="cx"> #include "JSAudioContext.h"
</span><span class="cx"> #include "JSDOMBinding.h"
</span><ins>+#include "JSDOMPromise.h"
</ins><span class="cx"> #include "JSOfflineAudioContext.h"
</span><span class="cx"> #include "OfflineAudioContext.h"
</span><span class="cx"> #include <runtime/ArrayBuffer.h>
</span><span class="lines">@@ -110,6 +111,69 @@
</span><span class="cx"> return JSValue::encode(CREATE_DOM_WRAPPER(jsConstructor->globalObject(), AudioContext, audioContext.get()));
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+JSValue JSAudioContext::suspend(ExecState* exec)
+{
+ DeferredWrapper wrapper(exec, globalObject());
+ auto successCallback = [wrapper]() mutable {
+ wrapper.resolve(nullptr);
+ };
+ auto failureCallback = [wrapper]() mutable {
+ wrapper.reject(nullptr);
+ };
+
+ ExceptionCode ec = 0;
+ impl().suspendContext(WTF::move(successCallback), WTF::move(failureCallback), ec);
+
+ if (ec) {
+ setDOMException(exec, ec);
+ return jsUndefined();
+ }
+
+ return wrapper.promise();
+}
+
+JSValue JSAudioContext::resume(ExecState* exec)
+{
+ DeferredWrapper wrapper(exec, globalObject());
+ auto successCallback = [wrapper]() mutable {
+ wrapper.resolve(nullptr);
+ };
+ auto failureCallback = [wrapper]() mutable {
+ wrapper.reject(nullptr);
+ };
+
+ ExceptionCode ec = 0;
+ impl().resumeContext(WTF::move(successCallback), WTF::move(failureCallback), ec);
+
+ if (ec) {
+ setDOMException(exec, ec);
+ return jsUndefined();
+ }
+
+ return wrapper.promise();
+}
+
+JSValue JSAudioContext::close(ExecState* exec)
+{
+ DeferredWrapper wrapper(exec, globalObject());
+ auto successCallback = [wrapper]() mutable {
+ wrapper.resolve(nullptr);
+ };
+ auto failureCallback = [wrapper]() mutable {
+ wrapper.reject(nullptr);
+ };
+
+ ExceptionCode ec = 0;
+ impl().closeContext(WTF::move(successCallback), WTF::move(failureCallback), ec);
+
+ if (ec) {
+ setDOMException(exec, ec);
+ return jsUndefined();
+ }
+
+ return wrapper.promise();
+}
+
</ins><span class="cx"> } // namespace WebCore
</span><span class="cx">
</span><span class="cx"> #endif // ENABLE(WEB_AUDIO)
</span></span></pre></div>
<a id="trunkSourceWebCorebindingsjsJSCallbackDatah"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/bindings/js/JSCallbackData.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/bindings/js/JSCallbackData.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/bindings/js/JSCallbackData.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -80,9 +80,9 @@
</span><span class="cx"> class DeleteCallbackDataTask : public ScriptExecutionContext::Task {
</span><span class="cx"> public:
</span><span class="cx"> DeleteCallbackDataTask(JSCallbackData* data)
</span><del>- : ScriptExecutionContext::Task({ ScriptExecutionContext::Task::CleanupTask, [data] (ScriptExecutionContext&) {
</del><ins>+ : ScriptExecutionContext::Task(ScriptExecutionContext::Task::CleanupTask, [data] (ScriptExecutionContext&) {
</ins><span class="cx"> delete data;
</span><del>- } })
</del><ins>+ })
</ins><span class="cx"> {
</span><span class="cx"> }
</span><span class="cx"> };
</span></span></pre></div>
<a id="trunkSourceWebCorebindingsjsJSDOMPromiseh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/bindings/js/JSDOMPromise.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/bindings/js/JSDOMPromise.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/bindings/js/JSDOMPromise.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -107,6 +107,14 @@
</span><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> template<>
</span><ins>+inline void DeferredWrapper::resolve(const std::nullptr_t&)
+{
+ JSC::ExecState* exec = m_globalObject->globalExec();
+ JSC::JSLockHolder locker(exec);
+ resolve(exec, JSC::jsNull());
+}
+
+template<>
</ins><span class="cx"> inline void DeferredWrapper::reject<String>(const String& result)
</span><span class="cx"> {
</span><span class="cx"> JSC::ExecState* exec = m_globalObject->globalExec();
</span></span></pre></div>
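The hunk above adds a `std::nullptr_t` specialization so that `resolve(nullptr)` produces JS `null` rather than routing through the generic value conversion. A minimal standalone sketch of the dispatch (stand-in `Deferred` class, not the WebCore `DeferredWrapper`; the string tag merely records which overload ran):

```cpp
#include <cstddef>
#include <string>

// Stand-in wrapper: the generic member template handles ordinary
// values; the std::nullptr_t specialization intercepts nullptr.
struct Deferred {
    std::string resolvedAs;

    template <typename T> void resolve(const T&) { resolvedAs = "value"; }
};

// Full specialization must live at namespace scope, mirroring the
// pattern used in JSDOMPromise.h.
template <>
inline void Deferred::resolve<std::nullptr_t>(const std::nullptr_t&)
{
    resolvedAs = "null";  // the real code calls resolve(exec, JSC::jsNull())
}
```

With this in place, `resolve(nullptr)` deduces `T = std::nullptr_t` and selects the specialization, while other arguments still use the generic path.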
<a id="trunkSourceWebCoredomEventNamesh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/dom/EventNames.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/dom/EventNames.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/dom/EventNames.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -190,6 +190,7 @@
</span><span class="cx"> macro(stalled) \
</span><span class="cx"> macro(start) \
</span><span class="cx"> macro(started) \
</span><ins>+ macro(statechange) \
</ins><span class="cx"> macro(storage) \
</span><span class="cx"> macro(submit) \
</span><span class="cx"> macro(success) \
</span></span></pre></div>
<a id="trunkSourceWebCoredomScriptExecutionContexth"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/dom/ScriptExecutionContext.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/dom/ScriptExecutionContext.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/dom/ScriptExecutionContext.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -130,6 +130,12 @@
</span><span class="cx"> {
</span><span class="cx"> }
</span><span class="cx">
</span><ins>+ Task(std::function<void()> task)
+ : m_task([task](ScriptExecutionContext&) { task(); })
+ , m_isCleanupTask(false)
+ {
+ }
+
</ins><span class="cx"> template<typename T, typename = typename std::enable_if<std::is_convertible<T, std::function<void (ScriptExecutionContext&)>>::value>::type>
</span><span class="cx"> Task(CleanupTaskTag, T task)
</span><span class="cx"> : m_task(WTF::move(task))
</span></span></pre></div>
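The new `Task(std::function<void()>)` constructor adapts a zero-argument closure into the existing `std::function<void(ScriptExecutionContext&)>` form by capturing it and discarding the context argument. A self-contained sketch of the pattern (`Context`/`Task` here are simplified stand-ins, not the WebCore classes):

```cpp
#include <functional>
#include <utility>

struct Context {};  // stand-in for ScriptExecutionContext

class Task {
public:
    // Existing form: the task body receives the execution context.
    explicit Task(std::function<void(Context&)> task)
        : m_task(std::move(task)), m_isCleanupTask(false) {}

    // New convenience form: wrap a zero-argument closure, ignoring
    // the context parameter at the call site.
    explicit Task(std::function<void()> task)
        : m_task([task](Context&) { task(); }), m_isCleanupTask(false) {}

    void performTask(Context& context) { m_task(context); }
    bool isCleanupTask() const { return m_isCleanupTask; }

private:
    std::function<void(Context&)> m_task;
    bool m_isCleanupTask;
};
```

Overload resolution is unambiguous: a no-argument lambda is not convertible to `std::function<void(Context&)>`, so it can only select the new constructor.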
<a id="trunkSourceWebCorehtmlHTMLMediaElementcpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/html/HTMLMediaElement.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/html/HTMLMediaElement.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/html/HTMLMediaElement.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -6031,17 +6031,17 @@
</span><span class="cx"> }
</span><span class="cx"> #endif
</span><span class="cx">
</span><del>-void HTMLMediaElement::pausePlayback()
</del><ins>+void HTMLMediaElement::suspendPlayback()
</ins><span class="cx"> {
</span><span class="cx"> LOG(Media, "HTMLMediaElement::pausePlayback(%p) - paused = %s", this, boolString(paused()));
</span><span class="cx"> if (!paused())
</span><span class="cx"> pause();
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void HTMLMediaElement::resumePlayback()
</del><ins>+void HTMLMediaElement::mayResumePlayback(bool shouldResume)
</ins><span class="cx"> {
</span><span class="cx"> LOG(Media, "HTMLMediaElement::resumePlayback(%p) - paused = %s", this, boolString(paused()));
</span><del>- if (paused())
</del><ins>+ if (paused() && shouldResume)
</ins><span class="cx"> play();
</span><span class="cx"> }
</span><span class="cx">
</span></span></pre></div>
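The rename is also a semantic change: the session now calls the client back at every interruption end and passes whether playback should resume, so the client resumes only when it is paused and the flag is set. A behavioral sketch with a hypothetical minimal player (not the `HTMLMediaElement` class itself):

```cpp
// Minimal stand-in player illustrating the new callback contract.
class Player {
public:
    bool paused() const { return m_paused; }
    void pause() { m_paused = true; }
    void play() { m_paused = false; }

    // Interruption begins: unconditionally stop playback.
    void suspendPlayback()
    {
        if (!paused())
            pause();
    }

    // Interruption ends: resume only if the session decided
    // that resuming is appropriate.
    void mayResumePlayback(bool shouldResume)
    {
        if (paused() && shouldResume)
            play();
    }

private:
    bool m_paused = true;
};
```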
<a id="trunkSourceWebCorehtmlHTMLMediaElementh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/html/HTMLMediaElement.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/html/HTMLMediaElement.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/html/HTMLMediaElement.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -707,8 +707,8 @@
</span><span class="cx"> virtual MediaSession::MediaType mediaType() const override;
</span><span class="cx"> virtual MediaSession::MediaType presentationType() const override;
</span><span class="cx"> virtual MediaSession::DisplayType displayType() const override;
</span><del>- virtual void pausePlayback() override;
- virtual void resumePlayback() override;
</del><ins>+ virtual void suspendPlayback() override;
+ virtual void mayResumePlayback(bool shouldResume) override;
</ins><span class="cx"> virtual String mediaSessionTitle() const override;
</span><span class="cx"> virtual double mediaSessionDuration() const override { return duration(); }
</span><span class="cx"> virtual double mediaSessionCurrentTime() const override { return currentTime(); }
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioAudioSessioncpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/AudioSession.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/AudioSession.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/AudioSession.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -86,9 +86,10 @@
</span><span class="cx"> return 0;
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void AudioSession::setActive(bool)
</del><ins>+bool AudioSession::tryToSetActive(bool)
</ins><span class="cx"> {
</span><span class="cx"> notImplemented();
</span><ins>+ return true;
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> size_t AudioSession::preferredBufferSize() const
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioAudioSessionh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/AudioSession.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/AudioSession.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/AudioSession.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -59,7 +59,7 @@
</span><span class="cx"> float sampleRate() const;
</span><span class="cx"> size_t numberOfOutputChannels() const;
</span><span class="cx">
</span><del>- void setActive(bool);
</del><ins>+ bool tryToSetActive(bool);
</ins><span class="cx">
</span><span class="cx"> size_t preferredBufferSize() const;
</span><span class="cx"> void setPreferredBufferSize(size_t);
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioMediaSessioncpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/MediaSession.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/MediaSession.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/MediaSession.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -88,8 +88,8 @@
</span><span class="cx">
</span><span class="cx"> m_stateToRestore = state();
</span><span class="cx"> m_notifyingClient = true;
</span><del>- client().pausePlayback();
</del><span class="cx"> setState(Interrupted);
</span><ins>+ client().suspendPlayback();
</ins><span class="cx"> m_notifyingClient = false;
</span><span class="cx"> }
</span><span class="cx">
</span><span class="lines">@@ -109,28 +109,35 @@
</span><span class="cx"> m_stateToRestore = Idle;
</span><span class="cx"> setState(Paused);
</span><span class="cx">
</span><del>- if (flags & MayResumePlaying && stateToRestore == Playing) {
- LOG(Media, "MediaSession::endInterruption - resuming playback");
- client().resumePlayback();
- }
</del><ins>+ bool shouldResume = flags & MayResumePlaying && stateToRestore == Playing;
+ client().mayResumePlayback(shouldResume);
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> bool MediaSession::clientWillBeginPlayback()
</span><span class="cx"> {
</span><ins>+ if (m_notifyingClient)
+ return true;
+
+ if (!MediaSessionManager::sharedManager().sessionWillBeginPlayback(*this)) {
+ if (state() == Interrupted)
+ m_stateToRestore = Playing;
+ return false;
+ }
+
</ins><span class="cx"> setState(Playing);
</span><del>- MediaSessionManager::sharedManager().sessionWillBeginPlayback(*this);
</del><span class="cx"> updateClientDataBuffering();
</span><span class="cx"> return true;
</span><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> bool MediaSession::clientWillPausePlayback()
</span><span class="cx"> {
</span><ins>+ if (m_notifyingClient)
+ return true;
+
</ins><span class="cx"> LOG(Media, "MediaSession::clientWillPausePlayback(%p)- state = %s", this, stateName(m_state));
</span><span class="cx"> if (state() == Interrupted) {
</span><del>- if (!m_notifyingClient) {
- m_stateToRestore = Paused;
- LOG(Media, " setting stateToRestore to \"Paused\"");
- }
</del><ins>+ m_stateToRestore = Paused;
+ LOG(Media, " setting stateToRestore to \"Paused\"");
</ins><span class="cx"> return false;
</span><span class="cx"> }
</span><span class="cx">
</span><span class="lines">@@ -144,7 +151,7 @@
</span><span class="cx"> void MediaSession::pauseSession()
</span><span class="cx"> {
</span><span class="cx"> LOG(Media, "MediaSession::pauseSession(%p)", this);
</span><del>- m_client.pausePlayback();
</del><ins>+ m_client.suspendPlayback();
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> MediaSession::MediaType MediaSession::mediaType() const
</span></span></pre></div>
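The key control-flow change above: `clientWillBeginPlayback()` now asks the manager first, and when the manager refuses while the session is interrupted, the intent to play is recorded in `m_stateToRestore` so `endInterruption()` can resume later. A standalone sketch with the manager's decision injected as a callable (stand-in types, simplified from the real state machine):

```cpp
#include <functional>
#include <utility>

enum class State { Idle, Playing, Paused, Interrupted };

// Stand-in session: the manager's permission check is injected so
// the flow can be exercised in isolation.
class Session {
public:
    explicit Session(std::function<bool()> managerPermits)
        : m_managerPermits(std::move(managerPermits)) {}

    bool clientWillBeginPlayback()
    {
        if (m_notifyingClient)
            return true;  // re-entrant call during suspend/resume notification

        if (!m_managerPermits()) {
            // Remember the intent so endInterruption() can resume playback.
            if (m_state == State::Interrupted)
                m_stateToRestore = State::Playing;
            return false;
        }

        m_state = State::Playing;
        return true;
    }

    State state() const { return m_state; }
    State stateToRestore() const { return m_stateToRestore; }

    State m_state = State::Interrupted;
    State m_stateToRestore = State::Idle;
    bool m_notifyingClient = false;

private:
    std::function<bool()> m_managerPermits;
};
```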
<a id="trunkSourceWebCoreplatformaudioMediaSessionh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/MediaSession.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/MediaSession.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/MediaSession.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -149,8 +149,8 @@
</span><span class="cx"> virtual MediaSession::MediaType presentationType() const = 0;
</span><span class="cx"> virtual MediaSession::DisplayType displayType() const { return MediaSession::Normal; }
</span><span class="cx">
</span><del>- virtual void resumePlayback() = 0;
- virtual void pausePlayback() = 0;
</del><ins>+ virtual void mayResumePlayback(bool shouldResume) = 0;
+ virtual void suspendPlayback() = 0;
</ins><span class="cx">
</span><span class="cx"> virtual String mediaSessionTitle() const;
</span><span class="cx"> virtual double mediaSessionDuration() const;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioMediaSessionManagercpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/MediaSessionManager.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/MediaSessionManager.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/MediaSessionManager.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -28,6 +28,7 @@
</span><span class="cx">
</span><span class="cx"> #if ENABLE(VIDEO)
</span><span class="cx">
</span><ins>+#include "AudioSession.h"
</ins><span class="cx"> #include "Logging.h"
</span><span class="cx"> #include "NotImplemented.h"
</span><span class="cx"> #include "MediaSession.h"
</span><span class="lines">@@ -166,7 +167,7 @@
</span><span class="cx"> return m_restrictions[type];
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void MediaSessionManager::sessionWillBeginPlayback(MediaSession& session)
</del><ins>+bool MediaSessionManager::sessionWillBeginPlayback(MediaSession& session)
</ins><span class="cx"> {
</span><span class="cx"> LOG(Media, "MediaSessionManager::sessionWillBeginPlayback - %p", &session);
</span><span class="cx">
</span><span class="lines">@@ -174,8 +175,19 @@
</span><span class="cx">
</span><span class="cx"> MediaSession::MediaType sessionType = session.mediaType();
</span><span class="cx"> SessionRestrictions restrictions = m_restrictions[sessionType];
</span><ins>+ if (session.state() == MediaSession::Interrupted && restrictions & InterruptedPlaybackNotPermitted)
+ return false;
+
+#if USE(AUDIO_SESSION)
+ if (activeAudioSessionRequired() && !AudioSession::sharedSession().tryToSetActive(true))
+ return false;
+#endif
+
+ if (m_interrupted)
+ endInterruption(MediaSession::NoFlags);
+
</ins><span class="cx"> if (!restrictions & ConcurrentPlaybackNotPermitted)
</span><del>- return;
</del><ins>+ return true;
</ins><span class="cx">
</span><span class="cx"> Vector<MediaSession*> sessions = m_sessions;
</span><span class="cx"> for (auto* oneSession : sessions) {
</span><span class="lines">@@ -188,6 +200,7 @@
</span><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> updateSessionState();
</span><ins>+ return true;
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> void MediaSessionManager::sessionWillEndPlayback(MediaSession& session)
</span></span></pre></div>
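The manager's `sessionWillBeginPlayback` now gates playback in order: refuse an interrupted session under `InterruptedPlaybackNotPermitted`, refuse if the platform audio session cannot be activated, end any pending interruption, then handle concurrent playback. A condensed sketch of that gating order (function name and parameters are stand-ins; the concurrent-session pausing loop is elided):

```cpp
// Gating order sketched from the hunk above.
using Restrictions = unsigned;
enum : Restrictions {
    ConcurrentPlaybackNotPermitted  = 1 << 0,
    InterruptedPlaybackNotPermitted = 1 << 1,
};

bool sessionMayBeginPlayback(bool interrupted, Restrictions restrictions,
                             bool audioSessionActivated)
{
    // 1. An interrupted session may not start under this restriction.
    if (interrupted && (restrictions & InterruptedPlaybackNotPermitted))
        return false;

    // 2. The platform audio session must activate (tryToSetActive).
    if (!audioSessionActivated)
        return false;

    // 3. Concurrent-playback handling would pause other sessions here.
    //    Note that `!` binds tighter than `&` in C++, so the intended
    //    mask test is `!(restrictions & ConcurrentPlaybackNotPermitted)`,
    //    with parentheses.
    return true;
}
```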
<a id="trunkSourceWebCoreplatformaudioMediaSessionManagerh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/MediaSessionManager.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/MediaSessionManager.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/MediaSessionManager.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -64,6 +64,7 @@
</span><span class="cx"> AutoPreloadingNotPermitted = 1 << 3,
</span><span class="cx"> BackgroundProcessPlaybackRestricted = 1 << 4,
</span><span class="cx"> BackgroundTabPlaybackRestricted = 1 << 5,
</span><ins>+ InterruptedPlaybackNotPermitted = 1 << 6,
</ins><span class="cx"> };
</span><span class="cx"> typedef unsigned SessionRestrictions;
</span><span class="cx">
</span><span class="lines">@@ -72,7 +73,7 @@
</span><span class="cx"> WEBCORE_EXPORT SessionRestrictions restrictions(MediaSession::MediaType);
</span><span class="cx"> virtual void resetRestrictions();
</span><span class="cx">
</span><del>- virtual void sessionWillBeginPlayback(MediaSession&);
</del><ins>+ virtual bool sessionWillBeginPlayback(MediaSession&);
</ins><span class="cx"> virtual void sessionWillEndPlayback(MediaSession&);
</span><span class="cx">
</span><span class="cx"> bool sessionRestrictsInlineVideoPlayback(const MediaSession&) const;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioiosAudioDestinationIOScpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -100,7 +100,6 @@
</span><span class="cx"> : m_outputUnit(0)
</span><span class="cx"> , m_callback(callback)
</span><span class="cx"> , m_renderBus(AudioBus::create(2, kRenderBufferSize, false))
</span><del>- , m_mediaSession(MediaSession::create(*this))
</del><span class="cx"> , m_sampleRate(sampleRate)
</span><span class="cx"> , m_isPlaying(false)
</span><span class="cx"> {
</span><span class="lines">@@ -184,10 +183,6 @@
</span><span class="cx"> void AudioDestinationIOS::start()
</span><span class="cx"> {
</span><span class="cx"> LOG(Media, "AudioDestinationIOS::start");
</span><del>- if (!m_mediaSession->clientWillBeginPlayback()) {
- LOG(Media, " returning because of interruption");
- return;
- }
</del><span class="cx">
</span><span class="cx"> OSStatus result = AudioOutputUnitStart(m_outputUnit);
</span><span class="cx"> if (!result)
</span><span class="lines">@@ -197,10 +192,6 @@
</span><span class="cx"> void AudioDestinationIOS::stop()
</span><span class="cx"> {
</span><span class="cx"> LOG(Media, "AudioDestinationIOS::stop");
</span><del>- if (!m_mediaSession->clientWillPausePlayback()) {
- LOG(Media, " returning because of interruption");
- return;
- }
</del><span class="cx">
</span><span class="cx"> OSStatus result = AudioOutputUnitStop(m_outputUnit);
</span><span class="cx"> if (!result)
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioiosAudioDestinationIOSh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/ios/AudioDestinationIOS.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -32,7 +32,6 @@
</span><span class="cx">
</span><span class="cx"> #include "AudioBus.h"
</span><span class="cx"> #include "AudioDestination.h"
</span><del>-#include "MediaSession.h"
</del><span class="cx"> #include <AudioUnit/AudioUnit.h>
</span><span class="cx"> #include <wtf/RefPtr.h>
</span><span class="cx">
</span><span class="lines">@@ -40,7 +39,7 @@
</span><span class="cx">
</span><span class="cx"> // An AudioDestination using CoreAudio's default output AudioUnit
</span><span class="cx">
</span><del>-class AudioDestinationIOS final : public AudioDestination, private MediaSessionClient {
</del><ins>+class AudioDestinationIOS final : public AudioDestination {
</ins><span class="cx"> public:
</span><span class="cx"> AudioDestinationIOS(AudioIOCallback&, double sampleRate);
</span><span class="cx"> virtual ~AudioDestinationIOS();
</span><span class="lines">@@ -54,15 +53,6 @@
</span><span class="cx"> virtual bool isPlaying() override { return m_isPlaying; }
</span><span class="cx"> virtual float sampleRate() const override { return m_sampleRate; }
</span><span class="cx">
</span><del>- // MediaSessionClient
- virtual MediaSession::MediaType mediaType() const { return MediaSession::WebAudio; }
- virtual MediaSession::MediaType presentationType() const { return MediaSession::WebAudio; }
- virtual bool canReceiveRemoteControlCommands() const { return false; }
- virtual void didReceiveRemoteControlCommand(MediaSession::RemoteControlCommandType) { }
- virtual bool overrideBackgroundPlaybackRestriction() const { return false; }
- virtual void pausePlayback() override { stop(); }
- virtual void resumePlayback() override { start(); }
-
</del><span class="cx"> // DefaultOutputUnit callback
</span><span class="cx"> static OSStatus inputProc(void* userData, AudioUnitRenderActionFlags*, const AudioTimeStamp*, UInt32 busNumber, UInt32 numberOfFrames, AudioBufferList* ioData);
</span><span class="cx"> static void frameSizeChangedProc(void *inRefCon, AudioUnit inUnit, AudioUnitPropertyID inID, AudioUnitScope inScope, AudioUnitElement inElement);
</span><span class="lines">@@ -75,7 +65,6 @@
</span><span class="cx"> AudioUnit m_outputUnit;
</span><span class="cx"> AudioIOCallback& m_callback;
</span><span class="cx"> RefPtr<AudioBus> m_renderBus;
</span><del>- std::unique_ptr<MediaSession> m_mediaSession;
</del><span class="cx">
</span><span class="cx"> double m_sampleRate;
</span><span class="cx"> bool m_isPlaying;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioiosAudioSessionIOSmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -176,11 +176,11 @@
</span><span class="cx"> return [[AVAudioSession sharedInstance] outputNumberOfChannels];
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void AudioSession::setActive(bool active)
</del><ins>+bool AudioSession::tryToSetActive(bool active)
</ins><span class="cx"> {
</span><span class="cx"> NSError *error = nil;
</span><span class="cx"> [[AVAudioSession sharedInstance] setActive:active error:&error];
</span><del>- ASSERT(!error);
</del><ins>+ return !error;
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> size_t AudioSession::preferredBufferSize() const
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioiosMediaSessionManagerIOSh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -53,7 +53,7 @@
</span><span class="cx">
</span><span class="cx"> MediaSessionManageriOS();
</span><span class="cx">
</span><del>- virtual void sessionWillBeginPlayback(MediaSession&) override;
</del><ins>+ virtual bool sessionWillBeginPlayback(MediaSession&) override;
</ins><span class="cx"> virtual void sessionWillEndPlayback(MediaSession&) override;
</span><span class="cx">
</span><span class="cx"> void updateNowPlayingInfo();
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudioiosMediaSessionManagerIOSmm"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -185,10 +185,13 @@
</span><span class="cx"> [m_objcObserver stopMonitoringAirPlayRoutes];
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void MediaSessionManageriOS::sessionWillBeginPlayback(MediaSession& session)
</del><ins>+bool MediaSessionManageriOS::sessionWillBeginPlayback(MediaSession& session)
</ins><span class="cx"> {
</span><del>- MediaSessionManager::sessionWillBeginPlayback(session);
</del><ins>+ if (!MediaSessionManager::sessionWillBeginPlayback(session))
+ return false;
+
</ins><span class="cx"> updateNowPlayingInfo();
</span><ins>+ return true;
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> void MediaSessionManageriOS::sessionWillEndPlayback(MediaSession& session)
</span></span></pre></div>
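The iOS override now propagates the base class's refusal before doing its extra work, the usual pattern when a chained virtual gains a `bool` result. A tiny sketch of that pattern (stand-in classes, not the WebKit ones; `updates` plays the role of `updateNowPlayingInfo()`):

```cpp
// Pattern: an override that only does extra work when the base permits.
struct Base {
    virtual ~Base() = default;
    virtual bool willBegin() { return m_permit; }
    bool m_permit = true;
};

struct Derived : Base {
    bool willBegin() override
    {
        if (!Base::willBegin())
            return false;  // propagate refusal, skip the extra work
        ++updates;         // e.g. updateNowPlayingInfo()
        return true;
    }
    int updates = 0;
};
```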
<a id="trunkSourceWebCoreplatformaudiomacAudioDestinationMaccpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -85,7 +85,6 @@
</span><span class="cx"> , m_renderBus(AudioBus::create(2, kBufferSize, false))
</span><span class="cx"> , m_sampleRate(sampleRate)
</span><span class="cx"> , m_isPlaying(false)
</span><del>- , m_mediaSession(MediaSession::create(*this))
</del><span class="cx"> {
</span><span class="cx"> // Open and initialize DefaultOutputUnit
</span><span class="cx"> AudioComponent comp;
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiomacAudioDestinationMach"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.h (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.h        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/mac/AudioDestinationMac.h        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -31,7 +31,6 @@
</span><span class="cx">
</span><span class="cx"> #include "AudioBus.h"
</span><span class="cx"> #include "AudioDestination.h"
</span><del>-#include "MediaSession.h"
</del><span class="cx"> #include <AudioUnit/AudioUnit.h>
</span><span class="cx"> #include <wtf/RefPtr.h>
</span><span class="cx">
</span><span class="lines">@@ -39,7 +38,7 @@
</span><span class="cx">
</span><span class="cx"> // An AudioDestination using CoreAudio's default output AudioUnit
</span><span class="cx">
</span><del>-class AudioDestinationMac : public AudioDestination, public MediaSessionClient {
</del><ins>+class AudioDestinationMac : public AudioDestination {
</ins><span class="cx"> public:
</span><span class="cx"> AudioDestinationMac(AudioIOCallback&, float sampleRate);
</span><span class="cx"> virtual ~AudioDestinationMac();
</span><span class="lines">@@ -53,19 +52,9 @@
</span><span class="cx"> OSStatus render(UInt32 numberOfFrames, AudioBufferList* ioData);
</span><span class="cx"> void setIsPlaying(bool);
</span><span class="cx">
</span><del>- virtual MediaSession::MediaType mediaType() const override { return MediaSession::WebAudio; }
- virtual MediaSession::MediaType presentationType() const { return MediaSession::WebAudio; }
- virtual bool canReceiveRemoteControlCommands() const override { return false; }
- virtual void didReceiveRemoteControlCommand(MediaSession::RemoteControlCommandType) override { }
- virtual bool overrideBackgroundPlaybackRestriction() const override { return false; }
-
</del><span class="cx"> virtual void start() override;
</span><span class="cx"> virtual void stop() override;
</span><span class="cx"> virtual bool isPlaying() override { return m_isPlaying; }
</span><del>-
- virtual void pausePlayback() override { stop(); }
- virtual void resumePlayback() override { start(); }
-
</del><span class="cx"> virtual float sampleRate() const override { return m_sampleRate; }
</span><span class="cx">
</span><span class="cx"> AudioUnit m_outputUnit;
</span><span class="lines">@@ -74,10 +63,6 @@
</span><span class="cx">
</span><span class="cx"> float m_sampleRate;
</span><span class="cx"> bool m_isPlaying;
</span><del>-
-#if USE(AUDIO_SESSION)
- std::unique_ptr<MediaSession> m_mediaSession;
-#endif
</del><span class="cx"> };
</span><span class="cx">
</span><span class="cx"> } // namespace WebCore
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiomacAudioSessionMaccpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/AudioSessionMac.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/AudioSessionMac.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/mac/AudioSessionMac.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -107,9 +107,10 @@
</span><span class="cx"> return 0;
</span><span class="cx"> }
</span><span class="cx">
</span><del>-void AudioSession::setActive(bool)
</del><ins>+bool AudioSession::tryToSetActive(bool)
</ins><span class="cx"> {
</span><span class="cx"> notImplemented();
</span><ins>+ return true;
</ins><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> size_t AudioSession::preferredBufferSize() const
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiomacMediaSessionManagerMaccpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/mac/MediaSessionManagerMac.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/mac/MediaSessionManagerMac.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/platform/audio/mac/MediaSessionManagerMac.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -57,9 +57,6 @@
</span><span class="cx"> }
</span><span class="cx">
</span><span class="cx"> #if PLATFORM(IOS)
</span><del>- if (activeAudioSessionRequired())
- AudioSession::sharedSession().setActive(true);
-
</del><span class="cx"> if (!Settings::shouldManageAudioSessionCategory())
</span><span class="cx"> return;
</span><span class="cx">
</span></span></pre></div>
<a id="trunkSourceWebCoretestingInternalscpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/testing/Internals.cpp (182140 => 182141)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/testing/Internals.cpp        2015-03-30 15:23:24 UTC (rev 182140)
+++ trunk/Source/WebCore/testing/Internals.cpp        2015-03-30 16:15:00 UTC (rev 182141)
</span><span class="lines">@@ -2467,6 +2467,8 @@
</span><span class="cx"> restrictions += MediaSessionManager::BackgroundProcessPlaybackRestricted;
</span><span class="cx"> if (equalIgnoringCase(restrictionsString, "BackgroundTabPlaybackRestricted"))
</span><span class="cx"> restrictions += MediaSessionManager::BackgroundTabPlaybackRestricted;
</span><ins>+ if (equalIgnoringCase(restrictionsString, "InterruptedPlaybackNotPermitted"))
+ restrictions += MediaSessionManager::InterruptedPlaybackNotPermitted;
</ins><span class="cx">
</span><span class="cx"> MediaSessionManager::sharedManager().addRestriction(mediaType, restrictions);
</span><span class="cx"> }
</span></span></pre>
</div>
</div>
</body>
</html>