<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[277256] trunk/Source/WebKit</title>
</head>
<body>

<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/277256">277256</a></dd>
<dt>Author</dt> <dd>youenn@apple.com</dd>
<dt>Date</dt> <dd>2021-05-10 00:10:31 -0700 (Mon, 10 May 2021)</dd>
</dl>

<h3>Log Message</h3>
<pre>Use IPC::Semaphore instead of sending an IPC message for every captured audio sample
https://bugs.webkit.org/show_bug.cgi?id=225452

Reviewed by Eric Carlson.

Previously, we were sending an IPC message from the UIProcess or GPUProcess to the WebProcess for every microphone audio sample chunk.
We now use an IPC::Semaphore to signal that a new chunk is ready to be processed.

We no longer send each chunk's timestamp. Instead, we reconstruct it from the number of previously processed samples.
When the audio storage changes, we send the start time and assume continuous timing, based on sample counts, from then on.
That is why we trigger a new audio storage change whenever we need to reset or the configuration changes, which should not happen often in practice.

We process fixed-size chunks on the WebProcess side and signal them on the GPUProcess/UIProcess side.
This chunk size is sent through IPC at audio storage change time and is the maximum of 128 samples (the WebAudio render quantum) and the AudioSession preferred buffer size.
If WebAudio is in use, it should be 128 samples; otherwise, it should be 20 ms of audio data.

Covered by existing tests and manually tested.

* UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp:
(WebKit::UserMediaCaptureManagerProxy::SourceProxy::start):
(WebKit::UserMediaCaptureManagerProxy::SourceProxy::storageChanged):
* WebProcess/cocoa/RemoteCaptureSampleManager.cpp:
(WebKit::RemoteCaptureSampleManager::audioStorageChanged):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::RemoteAudio):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::~RemoteAudio):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::stopThread):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::startThread):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::setStorage):
* WebProcess/cocoa/RemoteCaptureSampleManager.h:
* WebProcess/cocoa/RemoteCaptureSampleManager.messages.in:</pre>

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkSourceWebKitChangeLog">trunk/Source/WebKit/ChangeLog</a></li>
<li><a href="#trunkSourceWebKitUIProcessCocoaUserMediaCaptureManagerProxycpp">trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp</a></li>
<li><a href="#trunkSourceWebKitWebProcesscocoaRemoteCaptureSampleManagercpp">trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp</a></li>
<li><a href="#trunkSourceWebKitWebProcesscocoaRemoteCaptureSampleManagerh">trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h</a></li>
<li><a href="#trunkSourceWebKitWebProcesscocoaRemoteCaptureSampleManagermessagesin">trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in</a></li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkSourceWebKitChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebKit/ChangeLog (277255 => 277256)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebKit/ChangeLog    2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/ChangeLog       2021-05-10 07:10:31 UTC (rev 277256)
</span><span class="lines">@@ -1,3 +1,36 @@
</span><ins>+2021-05-10  Youenn Fablet  <youenn@apple.com>
+
+        Use IPC::Semaphore instead of sending an IPC message for every captured audio sample
+        https://bugs.webkit.org/show_bug.cgi?id=225452
+
+        Reviewed by Eric Carlson.
+
+        Previously, we were sending an IPC message from the UIProcess or GPUProcess to the WebProcess for every microphone audio sample chunk.
+        We now use an IPC::Semaphore to signal that a new chunk is ready to be processed.
+
+        We no longer send each chunk's timestamp. Instead, we reconstruct it from the number of previously processed samples.
+        When the audio storage changes, we send the start time and assume continuous timing, based on sample counts, from then on.
+        That is why we trigger a new audio storage change whenever we need to reset or the configuration changes, which should not happen often in practice.
+
+        We process fixed-size chunks on the WebProcess side and signal them on the GPUProcess/UIProcess side.
+        This chunk size is sent through IPC at audio storage change time and is the maximum of 128 samples (the WebAudio render quantum) and the AudioSession preferred buffer size.
+        If WebAudio is in use, it should be 128 samples; otherwise, it should be 20 ms of audio data.
+
+        Covered by existing tests and manually tested.
+
+        * UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp:
+        (WebKit::UserMediaCaptureManagerProxy::SourceProxy::start):
+        (WebKit::UserMediaCaptureManagerProxy::SourceProxy::storageChanged):
+        * WebProcess/cocoa/RemoteCaptureSampleManager.cpp:
+        (WebKit::RemoteCaptureSampleManager::audioStorageChanged):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::RemoteAudio):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::~RemoteAudio):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::stopThread):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::startThread):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::setStorage):
+        * WebProcess/cocoa/RemoteCaptureSampleManager.h:
+        * WebProcess/cocoa/RemoteCaptureSampleManager.messages.in:
+
</ins><span class="cx"> 2021-05-09  Ryosuke Niwa  <rniwa@webkit.org>
</span><span class="cx"> 
</span><span class="cx">         IPC testing API should have the ability to send and receive shared memory
</span></span></pre></div>
<a id="trunkSourceWebKitUIProcessCocoaUserMediaCaptureManagerProxycpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp (277255 => 277256)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp     2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp        2021-05-10 07:10:31 UTC (rev 277256)
</span><span class="lines">@@ -36,6 +36,7 @@
</span><span class="cx"> #include "WebCoreArgumentCoders.h"
</span><span class="cx"> #include "WebProcessProxy.h"
</span><span class="cx"> #include <WebCore/AudioSession.h>
</span><ins>+#include <WebCore/AudioUtilities.h>
</ins><span class="cx"> #include <WebCore/CARingBuffer.h>
</span><span class="cx"> #include <WebCore/ImageRotationSessionVT.h>
</span><span class="cx"> #include <WebCore/MediaConstraints.h>
</span><span class="lines">@@ -60,7 +61,6 @@
</span><span class="cx">         : m_id(id)
</span><span class="cx">         , m_connection(WTFMove(connection))
</span><span class="cx">         , m_source(WTFMove(source))
</span><del>-        , m_ringBuffer(makeUniqueRef<SharedRingBufferStorage>(std::bind(&SourceProxy::storageChanged, this, std::placeholders::_1, std::placeholders::_2, std::placeholders::_3)))
</del><span class="cx">     {
</span><span class="cx">         m_source->addObserver(*this);
</span><span class="cx">         switch (m_source->type()) {
</span><span class="lines">@@ -77,7 +77,8 @@
</span><span class="cx"> 
</span><span class="cx">     ~SourceProxy()
</span><span class="cx">     {
</span><del>-        storage().invalidate();
</del><ins>+        if (m_ringBuffer)
+            static_cast<SharedRingBufferStorage&>(m_ringBuffer->storage()).invalidate();
</ins><span class="cx"> 
</span><span class="cx">         switch (m_source->type()) {
</span><span class="cx">         case RealtimeMediaSource::Type::Audio:
</span><span class="lines">@@ -93,7 +94,6 @@
</span><span class="cx">     }
</span><span class="cx"> 
</span><span class="cx">     RealtimeMediaSource& source() { return m_source; }
</span><del>-    SharedRingBufferStorage& storage() { return static_cast<SharedRingBufferStorage&>(m_ringBuffer.storage()); }
</del><span class="cx">     CAAudioStreamDescription& description() { return m_description; }
</span><span class="cx">     int64_t numberOfFrames() { return m_numberOfFrames; }
</span><span class="cx"> 
</span><span class="lines">@@ -108,6 +108,7 @@
</span><span class="cx"> 
</span><span class="cx">     void start()
</span><span class="cx">     {
</span><ins>+        m_shouldReset = true;
</ins><span class="cx">         m_isEnded = false;
</span><span class="cx">         m_source->start();
</span><span class="cx">     }
</span><span class="lines">@@ -145,20 +146,35 @@
</span><span class="cx"> 
</span><span class="cx">     // May get called on a background thread.
</span><span class="cx">     void audioSamplesAvailable(const MediaTime& time, const PlatformAudioData& audioData, const AudioStreamDescription& description, size_t numberOfFrames) final {
</span><del>-        DisableMallocRestrictionsForCurrentThreadScope scope;
</del><ins>+        if (m_description != description || m_shouldReset) {
+            DisableMallocRestrictionsForCurrentThreadScope scope;
</ins><span class="cx"> 
</span><del>-        if (m_description != description) {
</del><ins>+            m_shouldReset = false;
+            m_writeOffset = 0;
+            m_remainingFrameCount = 0;
+            m_startTime = time;
+            m_captureSemaphore = makeUnique<IPC::Semaphore>();
</ins><span class="cx">             ASSERT(description.platformDescription().type == PlatformDescription::CAAudioStreamBasicType);
</span><span class="cx">             m_description = *WTF::get<const AudioStreamBasicDescription*>(description.platformDescription().description);
</span><span class="cx"> 
</span><ins>+            m_frameChunkSize = std::max(WebCore::AudioUtilities::renderQuantumSize, AudioSession::sharedSession().preferredBufferSize());
+
</ins><span class="cx">             // Allocate a ring buffer large enough to contain 2 seconds of audio.
</span><span class="cx">             m_numberOfFrames = m_description.sampleRate() * 2;
</span><del>-            m_ringBuffer.allocate(m_description.streamDescription(), m_numberOfFrames);
</del><ins>+            m_ringBuffer.reset();
+            auto storage = makeUniqueRef<SharedRingBufferStorage>(std::bind(&SourceProxy::storageChanged, this, std::placeholders::_1, std::placeholders::_2, std::placeholders::_3));
+            m_ringBuffer = makeUnique<CARingBuffer>(WTFMove(storage), m_description.streamDescription(), m_numberOfFrames);
</ins><span class="cx">         }
</span><span class="cx"> 
</span><span class="cx">         ASSERT(is<WebAudioBufferList>(audioData));
</span><del>-        m_ringBuffer.store(downcast<WebAudioBufferList>(audioData).list(), numberOfFrames, time.timeValue());
-        m_connection->send(Messages::RemoteCaptureSampleManager::AudioSamplesAvailable(m_id, time, numberOfFrames), 0);
</del><ins>+        m_ringBuffer->store(downcast<WebAudioBufferList>(audioData).list(), numberOfFrames, m_writeOffset);
+        m_writeOffset += numberOfFrames;
+
+        size_t framesToSend = numberOfFrames + m_remainingFrameCount;
+        size_t signalCount = framesToSend / m_frameChunkSize;
+        m_remainingFrameCount = framesToSend - (signalCount * m_frameChunkSize);
+        for (unsigned i = 0; i < signalCount; ++i)
+            m_captureSemaphore->signal();
</ins><span class="cx">     }
</span><span class="cx"> 
</span><span class="cx">     void videoSampleAvailable(MediaSample& sample) final
</span><span class="lines">@@ -199,7 +215,6 @@
</span><span class="cx"> 
</span><span class="cx">     void storageChanged(SharedMemory* storage, const WebCore::CAAudioStreamDescription& format, size_t frameCount)
</span><span class="cx">     {
</span><del>-        DisableMallocRestrictionsForCurrentThreadScope scope;
</del><span class="cx">         SharedMemory::Handle handle;
</span><span class="cx">         if (storage)
</span><span class="cx">             storage->createHandle(handle, SharedMemory::Protection::ReadOnly);
</span><span class="lines">@@ -210,7 +225,7 @@
</span><span class="cx"> #else
</span><span class="cx">         uint64_t dataSize = 0;
</span><span class="cx"> #endif
</span><del>-        m_connection->send(Messages::RemoteCaptureSampleManager::AudioStorageChanged(m_id, SharedMemory::IPCHandle { WTFMove(handle),  dataSize }, format, frameCount), 0);
</del><ins>+        m_connection->send(Messages::RemoteCaptureSampleManager::AudioStorageChanged(m_id, SharedMemory::IPCHandle { WTFMove(handle),  dataSize }, format, frameCount, *m_captureSemaphore, m_startTime, m_frameChunkSize), 0);
</ins><span class="cx">     }
</span><span class="cx"> 
</span><span class="cx">     bool preventSourceFromStopping()
</span><span class="lines">@@ -223,12 +238,18 @@
</span><span class="cx">     WeakPtr<PlatformMediaSessionManager> m_sessionManager;
</span><span class="cx">     Ref<IPC::Connection> m_connection;
</span><span class="cx">     Ref<RealtimeMediaSource> m_source;
</span><del>-    CARingBuffer m_ringBuffer;
</del><ins>+    std::unique_ptr<CARingBuffer> m_ringBuffer;
</ins><span class="cx">     CAAudioStreamDescription m_description { };
</span><span class="cx">     int64_t m_numberOfFrames { 0 };
</span><span class="cx">     bool m_isEnded { false };
</span><span class="cx">     std::unique_ptr<ImageRotationSessionVT> m_rotationSession;
</span><span class="cx">     bool m_shouldApplyRotation { false };
</span><ins>+    std::unique_ptr<IPC::Semaphore> m_captureSemaphore;
+    int64_t m_writeOffset { 0 };
+    int64_t m_remainingFrameCount { 0 };
+    size_t m_frameChunkSize { 0 };
+    MediaTime m_startTime;
+    bool m_shouldReset { false };
</ins><span class="cx"> };
</span><span class="cx"> 
</span><span class="cx"> UserMediaCaptureManagerProxy::UserMediaCaptureManagerProxy(UniqueRef<ConnectionProxy>&& connectionProxy)
</span></span></pre></div>
<a id="trunkSourceWebKitWebProcesscocoaRemoteCaptureSampleManagercpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp (277255 => 277256)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp      2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp 2021-05-10 07:10:31 UTC (rev 277256)
</span><span class="lines">@@ -125,7 +125,7 @@
</span><span class="cx">     m_queue->dispatch(WTFMove(callback));
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RemoteCaptureSampleManager::audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier identifier, const SharedMemory::IPCHandle& ipcHandle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames)
</del><ins>+void RemoteCaptureSampleManager::audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier identifier, const SharedMemory::IPCHandle& ipcHandle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames, IPC::Semaphore&& semaphore, const MediaTime& mediaTime, size_t frameChunkSize)
</ins><span class="cx"> {
</span><span class="cx">     ASSERT(!WTF::isMainRunLoop());
</span><span class="cx"> 
</span><span class="lines">@@ -134,21 +134,9 @@
</span><span class="cx">         RELEASE_LOG_ERROR(WebRTC, "Unable to find source %llu for storageChanged", identifier.toUInt64());
</span><span class="cx">         return;
</span><span class="cx">     }
</span><del>-    iterator->value->setStorage(ipcHandle.handle, description, numberOfFrames);
</del><ins>+    iterator->value->setStorage(ipcHandle.handle, description, numberOfFrames, WTFMove(semaphore), mediaTime, frameChunkSize);
</ins><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RemoteCaptureSampleManager::audioSamplesAvailable(WebCore::RealtimeMediaSourceIdentifier identifier, MediaTime time, uint64_t numberOfFrames)
-{
-    ASSERT(!WTF::isMainRunLoop());
-
-    auto iterator = m_audioSources.find(identifier);
-    if (iterator == m_audioSources.end()) {
-        RELEASE_LOG_ERROR(WebRTC, "Unable to find source %llu for audioSamplesAvailable", identifier.toUInt64());
-        return;
-    }
-    iterator->value->audioSamplesAvailable(time, numberOfFrames);
-}
-
</del><span class="cx"> void RemoteCaptureSampleManager::videoSampleAvailable(RealtimeMediaSourceIdentifier identifier, RemoteVideoSample&& sample)
</span><span class="cx"> {
</span><span class="cx">     ASSERT(!WTF::isMainRunLoop());
</span><span class="lines">@@ -163,34 +151,68 @@
</span><span class="cx"> 
</span><span class="cx"> RemoteCaptureSampleManager::RemoteAudio::RemoteAudio(Ref<RemoteRealtimeAudioSource>&& source)
</span><span class="cx">     : m_source(WTFMove(source))
</span><del>-    , m_ringBuffer(makeUnique<CARingBuffer>())
</del><span class="cx"> {
</span><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RemoteCaptureSampleManager::RemoteAudio::setStorage(const SharedMemory::Handle& handle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames)
</del><ins>+RemoteCaptureSampleManager::RemoteAudio::~RemoteAudio()
</ins><span class="cx"> {
</span><del>-    m_description = description;
-    m_ringBuffer = makeUnique<CARingBuffer>(makeUniqueRef<ReadOnlySharedRingBufferStorage>(handle), description, numberOfFrames);
-    m_buffer = makeUnique<WebAudioBufferList>(description, numberOfFrames);
</del><ins>+    stopThread();
</ins><span class="cx"> }
</span><span class="cx"> 
</span><del>-void RemoteCaptureSampleManager::RemoteAudio::audioSamplesAvailable(MediaTime time, uint64_t numberOfFrames)
</del><ins>+void RemoteCaptureSampleManager::RemoteAudio::stopThread()
</ins><span class="cx"> {
</span><del>-    if (!m_buffer) {
-        RELEASE_LOG_ERROR(WebRTC, "buffer for audio source %llu is null", m_source->identifier().toUInt64());
</del><ins>+    if (!m_thread)
</ins><span class="cx">         return;
</span><del>-    }
</del><span class="cx"> 
</span><del>-    if (!WebAudioBufferList::isSupportedDescription(m_description, numberOfFrames)) {
-        RELEASE_LOG_ERROR(WebRTC, "Unable to support description with given number of frames for audio source %llu", m_source->identifier().toUInt64());
</del><ins>+    m_shouldStopThread = true;
+    m_semaphore.signal();
+    m_thread->waitForCompletion();
+    m_thread = nullptr;
+}
+
+void RemoteCaptureSampleManager::RemoteAudio::startThread()
+{
+    ASSERT(!m_thread);
+    m_shouldStopThread = false;
+    auto threadLoop = [this]() mutable {
+        m_readOffset = 0;
+        do {
+            // If waitFor fails, the semaphore on the other side was probably destroyed and we should just exit here and wait to launch a new thread.
+            if (!m_semaphore.waitFor(Seconds::infinity()))
+                break;
+            if (m_shouldStopThread)
+                break;
+
+            auto currentTime = m_startTime + MediaTime { m_readOffset, static_cast<uint32_t>(m_description.sampleRate()) };
+            m_ringBuffer->fetch(m_buffer->list(), m_frameChunkSize, m_readOffset);
+            m_readOffset += m_frameChunkSize;
+
+            m_source->remoteAudioSamplesAvailable(currentTime, *m_buffer, m_description, m_frameChunkSize);
+        } while (!m_shouldStopThread);
+    };
+    m_thread = Thread::create("RemoteAudioSourceProviderManager::RemoteAudio thread", WTFMove(threadLoop), ThreadType::Audio, Thread::QOS::UserInteractive);
+}
+
+void RemoteCaptureSampleManager::RemoteAudio::setStorage(const SharedMemory::Handle& handle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames, IPC::Semaphore&& semaphore, const MediaTime& mediaTime, size_t frameChunkSize)
+{
+    stopThread();
+
+    if (!numberOfFrames) {
+        m_ringBuffer = nullptr;
+        m_buffer = nullptr;
</ins><span class="cx">         return;
</span><span class="cx">     }
</span><span class="cx"> 
</span><del>-    m_buffer->setSampleCount(numberOfFrames);
</del><ins>+    m_semaphore = WTFMove(semaphore);
+    m_description = description;
+    m_startTime = mediaTime;
+    m_frameChunkSize = frameChunkSize;
</ins><span class="cx"> 
</span><del>-    m_ringBuffer->fetch(m_buffer->list(), numberOfFrames, time.timeValue());
</del><ins>+    m_ringBuffer = makeUnique<CARingBuffer>(makeUniqueRef<ReadOnlySharedRingBufferStorage>(handle), description, numberOfFrames);
+    m_buffer = makeUnique<WebAudioBufferList>(description, numberOfFrames);
+    m_buffer->setSampleCount(m_frameChunkSize);
</ins><span class="cx"> 
</span><del>-    m_source->remoteAudioSamplesAvailable(time, *m_buffer, m_description, numberOfFrames);
</del><ins>+    startThread();
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> RemoteCaptureSampleManager::RemoteVideo::RemoteVideo(Ref<RemoteRealtimeVideoSource>&& source)
</span></span></pre></div>
<a id="trunkSourceWebKitWebProcesscocoaRemoteCaptureSampleManagerh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h (277255 => 277256)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h        2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h   2021-05-10 07:10:31 UTC (rev 277256)
</span><span class="lines">@@ -28,6 +28,7 @@
</span><span class="cx"> #if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
</span><span class="cx"> 
</span><span class="cx"> #include "Connection.h"
</span><ins>+#include "IPCSemaphore.h"
</ins><span class="cx"> #include "MessageReceiver.h"
</span><span class="cx"> #include "RemoteRealtimeAudioSource.h"
</span><span class="cx"> #include "RemoteRealtimeVideoSource.h"
</span><span class="lines">@@ -65,7 +66,7 @@
</span><span class="cx">     void dispatchToThread(Function<void()>&&) final;
</span><span class="cx"> 
</span><span class="cx">     // Messages
</span><del>-    void audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier, const SharedMemory::IPCHandle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames);
</del><ins>+    void audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier, const SharedMemory::IPCHandle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames, IPC::Semaphore&&, const MediaTime&, size_t frameSampleSize);
</ins><span class="cx">     void audioSamplesAvailable(WebCore::RealtimeMediaSourceIdentifier, MediaTime, uint64_t numberOfFrames);
</span><span class="cx">     void videoSampleAvailable(WebCore::RealtimeMediaSourceIdentifier, WebCore::RemoteVideoSample&&);
</span><span class="cx"> 
</span><span class="lines">@@ -75,15 +76,25 @@
</span><span class="cx">         WTF_MAKE_FAST_ALLOCATED;
</span><span class="cx">     public:
</span><span class="cx">         explicit RemoteAudio(Ref<RemoteRealtimeAudioSource>&&);
</span><ins>+        ~RemoteAudio();
</ins><span class="cx"> 
</span><del>-        void setStorage(const SharedMemory::Handle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames);
-        void audioSamplesAvailable(MediaTime, uint64_t numberOfFrames);
</del><ins>+        void setStorage(const SharedMemory::Handle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames, IPC::Semaphore&&, const MediaTime&, size_t frameChunkSize);
</ins><span class="cx"> 
</span><span class="cx">     private:
</span><ins>+        void stopThread();
+        void startThread();
+
</ins><span class="cx">         Ref<RemoteRealtimeAudioSource> m_source;
</span><span class="cx">         WebCore::CAAudioStreamDescription m_description;
</span><ins>+        std::unique_ptr<WebCore::WebAudioBufferList> m_buffer;
</ins><span class="cx">         std::unique_ptr<WebCore::CARingBuffer> m_ringBuffer;
</span><del>-        std::unique_ptr<WebCore::WebAudioBufferList> m_buffer;
</del><ins>+        int64_t m_readOffset { 0 };
+        MediaTime m_startTime;
+        size_t m_frameChunkSize { 0 };
+
+        IPC::Semaphore m_semaphore;
+        RefPtr<Thread> m_thread;
+        std::atomic<bool> m_shouldStopThread { false };
</ins><span class="cx">     };
</span><span class="cx"> 
</span><span class="cx">     class RemoteVideo {
</span></span></pre></div>
<a id="trunkSourceWebKitWebProcesscocoaRemoteCaptureSampleManagermessagesin"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in (277255 => 277256)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in      2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in 2021-05-10 07:10:31 UTC (rev 277256)
</span><span class="lines">@@ -24,8 +24,7 @@
</span><span class="cx"> #if ENABLE(MEDIA_STREAM)
</span><span class="cx"> 
</span><span class="cx"> messages -> RemoteCaptureSampleManager NotRefCounted {
</span><del>-    AudioStorageChanged(WebCore::RealtimeMediaSourceIdentifier id, WebKit::SharedMemory::IPCHandle storageHandle, WebCore::CAAudioStreamDescription description, uint64_t numberOfFrames)
-    AudioSamplesAvailable(WebCore::RealtimeMediaSourceIdentifier id, MediaTime time, uint64_t numberOfFrames)
</del><ins>+    AudioStorageChanged(WebCore::RealtimeMediaSourceIdentifier id, WebKit::SharedMemory::IPCHandle storageHandle, WebCore::CAAudioStreamDescription description, uint64_t numberOfFrames, IPC::Semaphore captureSemaphore, MediaTime mediaTime, size_t frameChunkSize);
</ins><span class="cx">     VideoSampleAvailable(WebCore::RealtimeMediaSourceIdentifier id, WebCore::RemoteVideoSample sample)
</span><span class="cx"> }
</span><span class="cx"> 
</span></span></pre>
</div>
</div>

</body>
</html>