<video> elements are playing concurrently. This happens even after a proper user gesture and successful play() call.

On 30 Apr 2025, at 9:57 pm, Yury Yarashevich via webkit-dev <webkit-dev@lists.webkit.org> wrote:

> Hi WebKit team,
> I’m curious about the original rationale behind the restriction that prevents concurrent playback of multiple <video> elements. Was it primarily introduced to save battery life?

I’m not aware of any such restrictions. Here is a test page that will literally let you play hundreds of video elements at the same time. It works on any Apple device: iPhone, iPad, Vision Pro, Mac.

Are you referring to audible video elements requiring a user gesture for playback to start?

> In practice, this behavior appears to have unintended side effects. There’s a reproducible issue where playback can be started with the video muted and then immediately unmuted, effectively bypassing the restriction. However, this often results in videos being randomly paused later (sometimes very frequently), leading to a “play/pause ping-pong” between Safari/WebKit and JavaScript restarting playback. This erratic behavior may actually increase battery consumption, despite appearing to work smoothly from the user’s perspective.
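For concreteness, the workaround and the resulting ping-pong being described amount to roughly the following (a hypothetical sketch; `startThenUnmute` and `keepAlive` are illustrative names, not a recommended pattern):

```javascript
// Start playback muted (generally permitted without a gesture),
// then immediately unmute. `video` is any HTMLVideoElement or a
// stand-in exposing the same interface.
async function startThenUnmute(video) {
  video.muted = true;   // muted autoplay is generally permitted
  await video.play();   // resolves once playback has started
  video.muted = false;  // unmuting right away is the bypass in question
}

// The "ping-pong" then comes from naively fighting the engine's pause:
function keepAlive(video) {
  video.addEventListener('pause', () => {
    video.play();       // engine pauses, script restarts, repeat
  });
}
```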
This behaviour isn’t erratic; it is well defined, and up to now I thought it was well understood. For an audible element to play, the user first needs to interact with the video element, such as by clicking on the video or its controls. This behaviour is even defined in the HTML5 specification, including how you can detect whether the User Agent will allow video to start autoplaying without a user gesture. Similar policies are implemented by other user agents such as Firefox and Chrome. There’s an article on MDN on how to deal with them.
Here is a more WebKit-focused article.
Here is a similar article written by the Chrome team on the same topic.
Here is the HTML5 specification related to this matter.
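In concrete terms, the detection the specification enables can be sketched like this: play() returns a promise that rejects with a "NotAllowedError" DOMException when a user gesture is required. (`attemptAutoplay` and `onBlocked` are illustrative names, not a standard API.)

```javascript
// Try to start playback; report whether the UA allowed it.
// `onBlocked` is a hypothetical callback, e.g. to show a
// tap-to-play overlay instead of retrying in a loop.
function attemptAutoplay(video, onBlocked) {
  return video.play().then(
    () => true,
    (err) => {
      if (err && err.name === 'NotAllowedError') {
        onBlocked();
        return false;  // blocked: wait for a real user gesture
      }
      throw err;       // unrelated failure (e.g. unsupported source)
    },
  );
}
```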
> Even if this workaround is eventually blocked, developers who rely on concurrent playback (e.g., outside of WebRTC contexts) will turn to more complex solutions, such as decoding video and audio using WebCodecs and/or WebAssembly, and rendering via <canvas> and AudioContext. While technically feasible, these approaches are likely to be significantly less power-efficient than simply allowing multiple <video> elements to play concurrently.
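The WebCodecs-plus-<canvas> path looks roughly like this (a hedged sketch: demuxing is not shown, the codec string is a placeholder, and `DecoderCtor` is made injectable here purely for illustration; in a browser it would simply be the global VideoDecoder):

```javascript
// Decode encoded chunks and paint each VideoFrame onto a canvas,
// bypassing <video> entirely.
function makeCanvasRenderer(canvas, DecoderCtor, codec = 'avc1.42E01E') {
  const ctx = canvas.getContext('2d');
  const decoder = new DecoderCtor({
    output(frame) {
      ctx.drawImage(frame, 0, 0, canvas.width, canvas.height);
      frame.close();   // release decoded-frame memory promptly
    },
    error(e) { console.error('decode error', e); },
  });
  decoder.configure({ codec });
  return (chunk) => decoder.decode(chunk);  // feed EncodedVideoChunks here
}
```

Audio would need the equivalent AudioDecoder-to-AudioContext plumbing on top, which is part of why this route is likely less power-efficient than a plain media element.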
> Another similarly inefficient workaround would be to synthesize a MediaStream using the VideoTrackGenerator API and AudioContext.createMediaStreamDestination().
>
> Lastly, another issue is that creating a MediaElementSource from a <video> element and routing its audio through a shared AudioContext also does not disable the playback restriction, whereas it is disabled when the <video> element itself is muted. This feels inconsistent and may point to a separate bug.
>
> Could you please clarify the motivation behind this restriction, and whether there are any plans to revisit or improve its behavior?
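The routing in question looks roughly like this (a sketch using the Web Audio API names, AudioContext.createMediaElementSource and createGain; `routeThroughContext` is an illustrative helper, and the claim about the restriction still applying is the report above, not something this code demonstrates):

```javascript
// Route a <video> element's audio through a shared AudioContext graph.
// After connection, the element's sound flows through the graph instead
// of straight to the output device.
function routeThroughContext(video, ctx) {
  const source = ctx.createMediaElementSource(video);
  const gain = ctx.createGain();
  gain.gain.value = 1.0;  // graph-level volume; the element stays unmuted
  source.connect(gain).connect(ctx.destination);  // connect() returns its target
  return gain;
}
```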
I believe I answered this question above, and through the various links provided.

Cheers
JY