[webkit-changes] [WebKit/WebKit] 87395a: Distorted audio after getUserMedia when playing with AudioWorkletNode

Chris Dumez <noreply@github.com>
Tue Feb 7 09:49:43 PST 2023


  Branch: refs/heads/main
  Home:   https://github.com/WebKit/WebKit
  Commit: 87395a602807aca417e72e768bba2404f2e9ff42
      https://github.com/WebKit/WebKit/commit/87395a602807aca417e72e768bba2404f2e9ff42
  Author: Chris Dumez <cdumez@apple.com>
  Date:   2023-02-07 (Tue, 07 Feb 2023)

  Changed paths:
    M Source/WebCore/platform/audio/cocoa/MediaSessionManagerCocoa.mm
    M Source/WebKit/WebProcess/GPU/media/RemoteAudioDestinationProxy.cpp
    M Source/WebKit/WebProcess/GPU/media/RemoteAudioDestinationProxy.h

  Log Message:
  -----------
  Distorted audio after getUserMedia when playing with AudioWorkletNode
https://bugs.webkit.org/show_bug.cgi?id=251091
rdar://104870451

Reviewed by Youenn Fablet and Jer Noble.

Our WebAudio rendering logic tried to deal with buffer sizes greater than
128 by signaling the IPC semaphore multiple times, so that the producer on the
WebProcess side would produce enough 128-frame chunks to satisfy the
reader on the GPUProcess side.
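
As a rough illustration of that scheme (assumed names, not the actual WebKit
code), the consumer signals the semaphore once per 128-frame chunk it needs,
so the number of signals for a given request is ceil(frames / 128):

    #include <cstddef>

    constexpr std::size_t renderQuantumSize = 128; // WebAudio render quantum

    // Illustrative only: how many semaphore signals the old scheme needs
    // for a request of `frames` frames, at one signal per 128-frame chunk.
    std::size_t signalsNeeded(std::size_t frames)
    {
        return (frames + renderQuantumSize - 1) / renderQuantumSize; // ceil division
    }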

This logic is rarely exercised, though, since
MediaSessionManagerCocoa::updateSessionState() requests a buffer size of 128
whenever WebAudio is in use. However, 246166@main added logic in
RemoteAudioSessionProxyManager::updatePreferredBufferSizeForProcess() that
delays setting the preferred buffer size if we're currently capturing media.

In the demo case, we're capturing media, so we would end up using WebAudio with
a buffer size of 960. There were multiple issues here:
1. 960 isn't a multiple of 128, so the GPUProcess would signal the semaphore
   an inconsistent number of times for each render quantum (sometimes
   requesting 1024 frames, sometimes 896; see the arithmetic sketch after
   this list).
2. Because the demo uses an Audio Worklet, we were doing 7 to 8 dispatches
   to the Audio Worklet thread (from the Audio Thread) in order to do the
   rendering on the WebProcess side. This was unnecessarily expensive.
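
To make the numbers concrete (illustrative only): 960 / 128 = 7.5, so to keep
the hardware fed, the GPUProcess has to alternate between requesting 8 chunks
(1024 frames) and 7 chunks (896 frames):

    static_assert(960 % 128 != 0, "960 is not a whole number of 128-frame quanta");
    static_assert(8 * 128 == 1024, "rounding up: 8 signals request 1024 frames");
    static_assert(7 * 128 == 896, "rounding down: 7 signals request 896 frames");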

To address the issue, I made 2 changes:
1. MediaSessionManagerCocoa::updateSessionState() now rounds the preferred
   buffer size up to the next power of 2 (see the rounding sketch after this
   list).
2. RemoteAudioDestinationProxy now relies on IPCSemaphore::waitFor(0_s) to
   see if the consumer is requesting more than 128 frames at once. Once it has
   determined the actual number of frames the consumer wants, it calls
   renderQuantum() once with this number (see the drain-loop sketch after
   this list). As a result, when an AudioWorklet is used, we greatly reduce
   the number of dispatches to the AudioWorklet thread. In the case of the
   demo, we end up with a buffer size of 1024 and we dispatch once per
   1024-frame quantum instead of 7-8 times. We do the splitting into
   128-frame chunks on the AudioWorklet thread.
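
A minimal sketch of the rounding in change 1 (a hand-rolled helper written
for illustration; the actual change lives in
MediaSessionManagerCocoa::updateSessionState()):

    #include <cstdint>

    // Round `n` up to the next power of two, e.g. 960 -> 1024.
    std::uint64_t roundUpToPowerOfTwo(std::uint64_t n)
    {
        std::uint64_t power = 1;
        while (power < n)
            power <<= 1;
        return power;
    }

With the demo's preferred size of 960, this yields 1024, which is exactly 8
render quanta, so the semaphore is signaled a consistent number of times per
hardware callback.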
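
And a self-contained sketch of change 2's drain-and-batch idea, using
std::counting_semaphore::try_acquire() as a stand-in for
IPCSemaphore::waitFor(0_s) (the names below are assumptions, not the real
RemoteAudioDestinationProxy code):

    #include <cstddef>
    #include <cstdio>
    #include <semaphore>

    constexpr std::size_t renderQuantumSize = 128;
    std::counting_semaphore<1024> renderSemaphore { 0 }; // stand-in for the IPC semaphore

    // The blocking wait that woke the rendering thread accounts for one
    // 128-frame chunk; each still-pending signal adds another. The total
    // can then be rendered with a single renderQuantum() call, so an
    // Audio Worklet sees one dispatch instead of one per chunk.
    std::size_t framesRequestedThisWakeup()
    {
        std::size_t frames = renderQuantumSize;
        while (renderSemaphore.try_acquire()) // non-blocking, like waitFor(0_s)
            frames += renderQuantumSize;
        return frames;
    }

    int main()
    {
        // Simulate a 1024-frame request: one signal woke the thread,
        // seven more are still pending.
        for (int i = 0; i < 7; ++i)
            renderSemaphore.release();
        std::printf("render %zu frames in one dispatch\n", framesRequestedThisWakeup());
    }

The splitting into 128-frame chunks then happens on the AudioWorklet thread,
as the commit describes, while the rendering thread wakes up only once per
hardware callback.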

* Source/WebCore/platform/audio/cocoa/MediaSessionManagerCocoa.mm:
(WebCore::MediaSessionManagerCocoa::updateSessionState):
* Source/WebKit/WebProcess/GPU/media/RemoteAudioDestinationProxy.cpp:
(WebKit::RemoteAudioDestinationProxy::startRenderingThread):
(WebKit::RemoteAudioDestinationProxy::connection):
(WebKit::RemoteAudioDestinationProxy::renderQuantum):
* Source/WebKit/WebProcess/GPU/media/RemoteAudioDestinationProxy.h:

Canonical link: https://commits.webkit.org/259964@main



