<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[177058] trunk/Source/WebCore</title>
</head>
<body>

<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/177058">177058</a></dd>
<dt>Author</dt> <dd>philn@webkit.org</dd>
<dt>Date</dt> <dd>2014-12-10 08:00:45 -0800 (Wed, 10 Dec 2014)</dd>
</dl>

<h3>Log Message</h3>
<pre>[GStreamer] AudioSourceProvider support in the MediaPlayer
https://bugs.webkit.org/show_bug.cgi?id=78883

Reviewed by Gustavo Noronha Silva.

Add a GStreamer-based audio source provider for the GTK and EFL
ports. This new component gathers decoded raw audio data from the
MediaPlayer and pipes it to an AudioBus when required by the
User Agent.

* PlatformEfl.cmake: New files in the build.
* PlatformGTK.cmake: Ditto.
* platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp: Added.
(WebCore::onAppsinkNewBufferCallback): Function called when a new
buffer can be pulled from appsink.
(WebCore::onGStreamerDeinterleavePadAddedCallback): Function
called when a new source pad has been added to deinterleave.
(WebCore::onGStreamerDeinterleaveReadyCallback): Function called
when the deinterleave element completed the configuration of all
its source pads.
(WebCore::copyGStreamerBuffersToAudioChannel): Called for each
channel of the AudioBus that needs data as input.
(WebCore::AudioSourceProviderGStreamer::AudioSourceProviderGStreamer):
Initialize the buffer adapters and the mutex guarding them.
(WebCore::AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer):
Clean up the buffer adapters and the audio bin.
(WebCore::AudioSourceProviderGStreamer::configureAudioBin): Create
an audio bin that by default routes buffers only to autoaudiosink.
A new route is added if the provider has a client.
(WebCore::AudioSourceProviderGStreamer::provideInput): Transfer
data from the buffer adapters to the bus channels.
(WebCore::AudioSourceProviderGStreamer::handleAudioBuffer): Pull a
buffer from appsink and queue it to the buffer adapter.
(WebCore::AudioSourceProviderGStreamer::setClient): Complete the
construction of the audio bin by adding a new chain to the tee
element. This new chain will deinterleave the buffer stream into
planar audio channels and route them to an appsink per channel for
data extraction.
(WebCore::AudioSourceProviderGStreamer::handleNewDeinterleavePad):
Plug in a new appsink after a new source pad has been added to
deinterleave.
(WebCore::AudioSourceProviderGStreamer::deinterleavePadsConfigured):
Configure the client Node format (number of channels and sample
rate) once the provider knows how many audio channels are managed
by the pipeline.
(WebCore::onGStreamerDeinterleavePadRemovedCallback):
(WebCore::AudioSourceProviderGStreamer::handleRemovedDeinterleavePad):
Remove the queue and appsink elements downstream of the given
deinterleave source pad.
(WebCore::onAppsinkFlushCallback):
(WebCore::AudioSourceProviderGStreamer::clearAdapters): Clear the
buffer adapters. This is especially needed before the whole
pipeline goes to NULL and later on prerolls again.
* platform/audio/gstreamer/AudioSourceProviderGStreamer.h: Added.
(WebCore::AudioSourceProviderGStreamer::create): Use this to
create the provider and get an OwnPtr of it.
(WebCore::AudioSourceProviderGStreamer::client): Provider client getter.
(WebCore::AudioSourceProviderGStreamer::getAudioBin): Audio bin
getter, used by the media player to configure its
playbin::audio-sink property.
* platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
(WebCore::MediaPlayerPrivateGStreamer::MediaPlayerPrivateGStreamer):
Manage the provider life cycle and reset the audio provider before
the pipeline goes to NULL.
(WebCore::MediaPlayerPrivateGStreamer::~MediaPlayerPrivateGStreamer): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::handlePluginInstallerResult): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::cancelLoad): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::didEnd): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::createAudioSink): Configure
the audio source provider if needed.
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
(WebCore::MediaPlayerPrivateGStreamer::audioSourceProvider):
Provider getter, used by MediaPlayer and MediaElement.</pre>
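
<h3>Usage Sketch</h3>
<p>A minimal usage sketch follows. It is not part of this changeset; the identifiers &quot;client&quot; and &quot;bus&quot; and the 128-frame quantum are assumptions for illustration.</p>
<pre>// Sketch only, not WebKit code: exercising the API added in this revision.
// &quot;client&quot; and &quot;bus&quot; are assumed to exist; in WebKit the client role is
// filled by the WebAudio media element source node.
OwnPtr&lt;AudioSourceProviderGStreamer&gt; provider = AudioSourceProviderGStreamer::create();

// The media player hands the provider a bin to populate; nullptr means no
// scaletempo element precedes the tee.
GstElement* audioSinkBin = gst_bin_new(&quot;audio-sink&quot;);
provider-&gt;configureAudioBin(audioSinkBin, nullptr);

// Registering a client mutes direct playback through autoaudiosink and
// builds the tee ! deinterleave ! queue ! appsink extraction branch.
provider-&gt;setClient(client); // client: an AudioSourceProviderClient*

// Each render quantum, WebAudio pulls planar data into an AudioBus;
// provideInput() zeroes the bus when no data has been queued yet.
const size_t framesToProcess = 128; // typical WebAudio quantum (assumption)
provider-&gt;provideInput(bus, framesToProcess); // bus: a stereo AudioBus*</pre>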

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCorePlatformEflcmake">trunk/Source/WebCore/PlatformEfl.cmake</a></li>
<li><a href="#trunkSourceWebCorePlatformGTKcmake">trunk/Source/WebCore/PlatformGTK.cmake</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsgstreamerMediaPlayerPrivateGStreamercpp">trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformgraphicsgstreamerMediaPlayerPrivateGStreamerh">trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h</a></li>
</ul>

<h3>Added Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreplatformaudiogstreamerAudioSourceProviderGStreamercpp">trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiogstreamerAudioSourceProviderGStreamerh">trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h</a></li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (177057 => 177058)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/ChangeLog        2014-12-10 16:00:45 UTC (rev 177058)
</span><span class="lines">@@ -1,3 +1,76 @@
</span><ins>+2014-12-08  Philippe Normand  &lt;pnormand@igalia.com&gt;
+
+        [GStreamer] AudioSourceProvider support in the MediaPlayer
+        https://bugs.webkit.org/show_bug.cgi?id=78883
+
+        Reviewed by Gustavo Noronha Silva.
+
+        Add a GStreamer-based audio source provider for the GTK and EFL
+        ports. This new component gathers decoded raw audio data from the
+        MediaPlayer and pipes it to an AudioBus when required by the
+        User Agent.
+
+        * PlatformEfl.cmake: New files in the build.
+        * PlatformGTK.cmake: Ditto.
+        * platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp: Added.
+        (WebCore::onAppsinkNewBufferCallback): Function called when a new
+        buffer can be pulled from appsink.
+        (WebCore::onGStreamerDeinterleavePadAddedCallback): Function
+        called when a new source pad has been added to deinterleave.
+        (WebCore::onGStreamerDeinterleaveReadyCallback): Function called
+        when the deinterleave element completed the configuration of all
+        its source pads.
+        (WebCore::copyGStreamerBuffersToAudioChannel): Called for each
+        channel of the AudioBus that needs data as input.
+        (WebCore::AudioSourceProviderGStreamer::AudioSourceProviderGStreamer):
+        Initialize the buffer adapters and the mutex guarding them.
+        (WebCore::AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer):
+        Clean up the buffer adapters and the audio bin.
+        (WebCore::AudioSourceProviderGStreamer::configureAudioBin): Create
+        an audio bin that by default routes buffers only to autoaudiosink.
+        A new route is added if the provider has a client.
+        (WebCore::AudioSourceProviderGStreamer::provideInput): Transfer
+        data from the buffer adapters to the bus channels.
+        (WebCore::AudioSourceProviderGStreamer::handleAudioBuffer): Pull a
+        buffer from appsink and queue it to the buffer adapter.
+        (WebCore::AudioSourceProviderGStreamer::setClient): Complete the
+        construction of the audio bin by adding a new chain to the tee
+        element. This new chain will deinterleave the buffer stream into
+        planar audio channels and route them to an appsink per channel for
+        data extraction.
+        (WebCore::AudioSourceProviderGStreamer::handleNewDeinterleavePad):
+        Plug in a new appsink after a new source pad has been added
+        to deinterleave.
+        (WebCore::AudioSourceProviderGStreamer::deinterleavePadsConfigured):
+        Configure the client Node format (number of channels and sample
+        rate) once the provider knows how many audio channels are managed
+        by the pipeline.
+        (WebCore::onGStreamerDeinterleavePadRemovedCallback):
+        (WebCore::AudioSourceProviderGStreamer::handleRemovedDeinterleavePad):
+        Remove the queue and appsink elements downstream of the given
+        deinterleave source pad.
+        (WebCore::onAppsinkFlushCallback):
+        (WebCore::AudioSourceProviderGStreamer::clearAdapters): Clear the
+        buffer adapters. This is especially needed before the whole
+        pipeline goes to NULL and later on prerolls again.
+        * platform/audio/gstreamer/AudioSourceProviderGStreamer.h: Added.
+        (WebCore::AudioSourceProviderGStreamer::create): Use this to
+        create the provider and get an OwnPtr of it.
+        (WebCore::AudioSourceProviderGStreamer::client): Provider client getter.
+        (WebCore::AudioSourceProviderGStreamer::getAudioBin): Audio bin
+        getter, used by the media player to configure its
+        playbin::audio-sink property.
+        * platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
+        (WebCore::MediaPlayerPrivateGStreamer::MediaPlayerPrivateGStreamer):
+        Manage the provider life cycle and reset the audio provider before
+        the pipeline goes to NULL.
+        (WebCore::MediaPlayerPrivateGStreamer::~MediaPlayerPrivateGStreamer): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::handlePluginInstallerResult): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::cancelLoad): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::didEnd): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::createAudioSink): Configure
+        the audio source provider if needed.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
+        (WebCore::MediaPlayerPrivateGStreamer::audioSourceProvider):
+        Provider getter, used by MediaPlayer and MediaElement.
+
</ins><span class="cx"> 2014-12-09  Myles C. Maxfield  &lt;mmaxfield@apple.com&gt;
</span><span class="cx"> 
</span><span class="cx">         Scrolling to anchor tags does nothing in vertical-rl writing mode
</span></span></pre></div>
<a id="trunkSourceWebCorePlatformEflcmake"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/PlatformEfl.cmake (177057 => 177058)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/PlatformEfl.cmake        2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/PlatformEfl.cmake        2014-12-10 16:00:45 UTC (rev 177058)
</span><span class="lines">@@ -72,6 +72,7 @@
</span><span class="cx"> 
</span><span class="cx">     platform/audio/gstreamer/AudioDestinationGStreamer.cpp
</span><span class="cx">     platform/audio/gstreamer/AudioFileReaderGStreamer.cpp
</span><ins>+    platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
</ins><span class="cx">     platform/audio/gstreamer/FFTFrameGStreamer.cpp
</span><span class="cx">     platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp
</span><span class="cx"> 
</span></span></pre></div>
<a id="trunkSourceWebCorePlatformGTKcmake"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/PlatformGTK.cmake (177057 => 177058)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/PlatformGTK.cmake        2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/PlatformGTK.cmake        2014-12-10 16:00:45 UTC (rev 177058)
</span><span class="lines">@@ -56,6 +56,7 @@
</span><span class="cx"> 
</span><span class="cx">     platform/audio/gstreamer/AudioDestinationGStreamer.cpp
</span><span class="cx">     platform/audio/gstreamer/AudioFileReaderGStreamer.cpp
</span><ins>+    platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
</ins><span class="cx">     platform/audio/gstreamer/FFTFrameGStreamer.cpp
</span><span class="cx">     platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp
</span><span class="cx"> 
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiogstreamerAudioSourceProviderGStreamercpp"></a>
<div class="addfile"><h4>Added: trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp (0 => 177058)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp                                (rev 0)
+++ trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp        2014-12-10 16:00:45 UTC (rev 177058)
</span><span class="lines">@@ -0,0 +1,349 @@
</span><ins>+/*
+ *  Copyright (C) 2014 Igalia S.L
+ *
+ *  This library is free software; you can redistribute it and/or
+ *  modify it under the terms of the GNU Lesser General Public
+ *  License as published by the Free Software Foundation; either
+ *  version 2 of the License, or (at your option) any later version.
+ *
+ *  This library is distributed in the hope that it will be useful,
+ *  but WITHOUT ANY WARRANTY; without even the implied warranty of
+ *  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ *  Lesser General Public License for more details.
+ *
+ *  You should have received a copy of the GNU Lesser General Public
+ *  License along with this library; if not, write to the Free Software
+ *  Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301  USA
+ */
+
+#include &quot;config.h&quot;
+#include &quot;AudioSourceProviderGStreamer.h&quot;
+
+#if ENABLE(WEB_AUDIO) &amp;&amp; ENABLE(VIDEO) &amp;&amp; USE(GSTREAMER)
+
+#include &quot;AudioBus.h&quot;
+#include &quot;AudioSourceProviderClient.h&quot;
+#include &lt;gst/app/gstappsink.h&gt;
+#include &lt;gst/audio/audio.h&gt;
+#include &lt;gst/base/gstadapter.h&gt;
+#include &lt;wtf/gobject/GMutexLocker.h&gt;
+
+
+namespace WebCore {
+
+// For now the provider supports only stereo files at a fixed sample
+// rate.
+static const int gNumberOfChannels = 2;
+static const float gSampleBitRate = 44100;
+
+static GstFlowReturn onAppsinkNewBufferCallback(GstAppSink* sink, gpointer userData)
+{
+    return static_cast&lt;AudioSourceProviderGStreamer*&gt;(userData)-&gt;handleAudioBuffer(sink);
+}
+
+static void onGStreamerDeinterleavePadAddedCallback(GstElement*, GstPad* pad, AudioSourceProviderGStreamer* provider)
+{
+    provider-&gt;handleNewDeinterleavePad(pad);
+}
+
+static void onGStreamerDeinterleaveReadyCallback(GstElement*, AudioSourceProviderGStreamer* provider)
+{
+    provider-&gt;deinterleavePadsConfigured();
+}
+
+static void onGStreamerDeinterleavePadRemovedCallback(GstElement*, GstPad* pad, AudioSourceProviderGStreamer* provider)
+{
+    provider-&gt;handleRemovedDeinterleavePad(pad);
+}
+
+static GstPadProbeReturn onAppsinkFlushCallback(GstPad*, GstPadProbeInfo* info, gpointer userData)
+{
+    if (GST_PAD_PROBE_INFO_TYPE(info) &amp; (GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM | GST_PAD_PROBE_TYPE_EVENT_FLUSH)) {
+        GstEvent* event = GST_PAD_PROBE_INFO_EVENT(info);
+        if (GST_EVENT_TYPE(event) == GST_EVENT_FLUSH_STOP) {
+            AudioSourceProviderGStreamer* provider = reinterpret_cast&lt;AudioSourceProviderGStreamer*&gt;(userData);
+            provider-&gt;clearAdapters();
+        }
+    }
+    return GST_PAD_PROBE_OK;
+}
+
+static void copyGStreamerBuffersToAudioChannel(GstAdapter* adapter, AudioBus* bus, int channelNumber, size_t framesToProcess)
+{
+    if (!gst_adapter_available(adapter)) {
+        bus-&gt;zero();
+        return;
+    }
+
+    size_t bytes = framesToProcess * sizeof(float);
+    if (gst_adapter_available(adapter) &gt;= bytes) {
+        gst_adapter_copy(adapter, bus-&gt;channel(channelNumber)-&gt;mutableData(), 0, bytes);
+        gst_adapter_flush(adapter, bytes);
+    }
+}
+
+AudioSourceProviderGStreamer::AudioSourceProviderGStreamer()
+    : m_client(0)
+    , m_deinterleaveSourcePads(0)
+    , m_deinterleavePadAddedHandlerId(0)
+    , m_deinterleaveNoMorePadsHandlerId(0)
+    , m_deinterleavePadRemovedHandlerId(0)
+{
+    g_mutex_init(&amp;m_adapterMutex);
+    m_frontLeftAdapter = gst_adapter_new();
+    m_frontRightAdapter = gst_adapter_new();
+}
+
+AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer()
+{
+    GRefPtr&lt;GstElement&gt; deinterleave = adoptGRef(gst_bin_get_by_name(GST_BIN(m_audioSinkBin.get()), &quot;deinterleave&quot;));
+    if (deinterleave) {
+        g_signal_handler_disconnect(deinterleave.get(), m_deinterleavePadAddedHandlerId);
+        g_signal_handler_disconnect(deinterleave.get(), m_deinterleaveNoMorePadsHandlerId);
+        g_signal_handler_disconnect(deinterleave.get(), m_deinterleavePadRemovedHandlerId);
+    }
+
+    g_object_unref(m_frontLeftAdapter);
+    g_object_unref(m_frontRightAdapter);
+    g_mutex_clear(&amp;m_adapterMutex);
+}
+
+void AudioSourceProviderGStreamer::configureAudioBin(GstElement* audioBin, GstElement* teePredecessor)
+{
+    m_audioSinkBin = audioBin;
+
+    GstElement* audioTee = gst_element_factory_make(&quot;tee&quot;, &quot;audioTee&quot;);
+    GstElement* audioQueue = gst_element_factory_make(&quot;queue&quot;, 0);
+    GstElement* audioConvert = gst_element_factory_make(&quot;audioconvert&quot;, 0);
+    GstElement* audioConvert2 = gst_element_factory_make(&quot;audioconvert&quot;, 0);
+    GstElement* audioResample = gst_element_factory_make(&quot;audioresample&quot;, 0);
+    GstElement* audioResample2 = gst_element_factory_make(&quot;audioresample&quot;, 0);
+    GstElement* volumeElement = gst_element_factory_make(&quot;volume&quot;, &quot;volume&quot;);
+    GstElement* audioSink = gst_element_factory_make(&quot;autoaudiosink&quot;, 0);
+
+    gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), audioTee, audioQueue, audioConvert, audioResample, volumeElement, audioConvert2, audioResample2, audioSink, nullptr);
+
+    // In cases where the audio-sink needs elements before tee (such
+    // as scaletempo) they need to be linked to tee which in this case
+    // doesn't need a ghost pad. It is assumed that the teePredecessor
+    // chain already configured a ghost pad.
+    if (teePredecessor)
+        gst_element_link_pads_full(teePredecessor, &quot;src&quot;, audioTee, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    else {
+        // Add a ghostpad to the bin so it can proxy to tee.
+        GRefPtr&lt;GstPad&gt; audioTeeSinkPad = adoptGRef(gst_element_get_static_pad(audioTee, &quot;sink&quot;));
+        gst_element_add_pad(m_audioSinkBin.get(), gst_ghost_pad_new(&quot;sink&quot;, audioTeeSinkPad.get()));
+    }
+
+    // Link a new src pad from tee to queue ! audioconvert !
+    // audioresample ! volume ! audioconvert ! audioresample !
+    // autoaudiosink. The audioresample and audioconvert are needed to
+    // ensure the audio sink receives buffers in the correct format.
+    gst_element_link_pads_full(audioTee, &quot;src_%u&quot;, audioQueue, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioQueue, &quot;src&quot;, audioConvert, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioConvert, &quot;src&quot;, audioResample, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioResample, &quot;src&quot;, volumeElement, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(volumeElement, &quot;src&quot;, audioConvert2, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioConvert2, &quot;src&quot;, audioResample2, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioResample2, &quot;src&quot;, audioSink, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+}
+
+void AudioSourceProviderGStreamer::provideInput(AudioBus* bus, size_t framesToProcess)
+{
+    GMutexLocker&lt;GMutex&gt; lock(m_adapterMutex);
+    copyGStreamerBuffersToAudioChannel(m_frontLeftAdapter, bus, 0, framesToProcess);
+    copyGStreamerBuffersToAudioChannel(m_frontRightAdapter, bus, 1, framesToProcess);
+}
+
+GstFlowReturn AudioSourceProviderGStreamer::handleAudioBuffer(GstAppSink* sink)
+{
+    if (!m_client)
+        return GST_FLOW_OK;
+
+    // Pull a buffer from appsink and store it in the appropriate
+    // buffer adapter for the audio channel it represents.
+    GRefPtr&lt;GstSample&gt; sample = adoptGRef(gst_app_sink_pull_sample(sink));
+    if (!sample)
+        return gst_app_sink_is_eos(sink) ? GST_FLOW_EOS : GST_FLOW_ERROR;
+
+    GstBuffer* buffer = gst_sample_get_buffer(sample.get());
+    if (!buffer)
+        return GST_FLOW_ERROR;
+
+    GstCaps* caps = gst_sample_get_caps(sample.get());
+    if (!caps)
+        return GST_FLOW_ERROR;
+
+    GstAudioInfo info;
+    gst_audio_info_from_caps(&amp;info, caps);
+
+    GMutexLocker&lt;GMutex&gt; lock(m_adapterMutex);
+
+    // Check the first audio channel. The buffer is supposed to store
+    // data of a single channel anyway.
+    switch (GST_AUDIO_INFO_POSITION(&amp;info, 0)) {
+    case GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT:
+    case GST_AUDIO_CHANNEL_POSITION_MONO:
+        gst_adapter_push(m_frontLeftAdapter, gst_buffer_ref(buffer));
+        break;
+    case GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT:
+        gst_adapter_push(m_frontRightAdapter, gst_buffer_ref(buffer));
+        break;
+    default:
+        break;
+    }
+
+    return GST_FLOW_OK;
+}
+
+void AudioSourceProviderGStreamer::setClient(AudioSourceProviderClient* client)
+{
+    ASSERT(client);
+    m_client = client;
+
+    // The volume element is used to mute audio playback towards the
+    // autoaudiosink. This is needed to avoid double playback of audio
+    // from our audio sink and from the WebAudio AudioDestination node
+    // supposedly configured already by application side.
+    GRefPtr&lt;GstElement&gt; volumeElement = adoptGRef(gst_bin_get_by_name(GST_BIN(m_audioSinkBin.get()), &quot;volume&quot;));
+    g_object_set(volumeElement.get(), &quot;mute&quot;, TRUE, nullptr);
+
+    // The audioconvert and audioresample elements are needed to
+    // ensure deinterleave and the sinks downstream receive buffers in
+    // the format specified by the capsfilter.
+    GstElement* audioQueue = gst_element_factory_make(&quot;queue&quot;, 0);
+    GstElement* audioConvert  = gst_element_factory_make(&quot;audioconvert&quot;, 0);
+    GstElement* audioResample = gst_element_factory_make(&quot;audioresample&quot;, 0);
+    GstElement* capsFilter = gst_element_factory_make(&quot;capsfilter&quot;, 0);
+    GstElement* deInterleave = gst_element_factory_make(&quot;deinterleave&quot;, &quot;deinterleave&quot;);
+
+    g_object_set(deInterleave, &quot;keep-positions&quot;, TRUE, nullptr);
+    m_deinterleavePadAddedHandlerId = g_signal_connect(deInterleave, &quot;pad-added&quot;, G_CALLBACK(onGStreamerDeinterleavePadAddedCallback), this);
+    m_deinterleaveNoMorePadsHandlerId = g_signal_connect(deInterleave, &quot;no-more-pads&quot;, G_CALLBACK(onGStreamerDeinterleaveReadyCallback), this);
+    m_deinterleavePadRemovedHandlerId = g_signal_connect(deInterleave, &quot;pad-removed&quot;, G_CALLBACK(onGStreamerDeinterleavePadRemovedCallback), this);
+
+    GstCaps* caps = gst_caps_new_simple(&quot;audio/x-raw&quot;, &quot;rate&quot;, G_TYPE_INT, static_cast&lt;int&gt;(gSampleBitRate),
+        &quot;channels&quot;, G_TYPE_INT, gNumberOfChannels,
+        &quot;format&quot;, G_TYPE_STRING, GST_AUDIO_NE(F32),
+        &quot;layout&quot;, G_TYPE_STRING, &quot;interleaved&quot;, nullptr);
+
+    g_object_set(capsFilter, &quot;caps&quot;, caps, nullptr);
+    gst_caps_unref(caps);
+
+    gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), audioQueue, audioConvert, audioResample, capsFilter, deInterleave, nullptr);
+
+    GRefPtr&lt;GstElement&gt; audioTee = adoptGRef(gst_bin_get_by_name(GST_BIN(m_audioSinkBin.get()), &quot;audioTee&quot;));
+
+    // Link a new src pad from tee to queue ! audioconvert !
+    // audioresample ! capsfilter ! deinterleave. Later
+    // on each deinterleaved planar audio channel will be routed to an
+    // appsink for data extraction and processing.
+    gst_element_link_pads_full(audioTee.get(), &quot;src_%u&quot;, audioQueue, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioQueue, &quot;src&quot;, audioConvert, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioConvert, &quot;src&quot;, audioResample, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioResample, &quot;src&quot;, capsFilter, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(capsFilter, &quot;src&quot;, deInterleave, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+
+    gst_element_sync_state_with_parent(audioQueue);
+    gst_element_sync_state_with_parent(audioConvert);
+    gst_element_sync_state_with_parent(audioResample);
+    gst_element_sync_state_with_parent(capsFilter);
+    gst_element_sync_state_with_parent(deInterleave);
+}
+
+void AudioSourceProviderGStreamer::handleNewDeinterleavePad(GstPad* pad)
+{
+    m_deinterleaveSourcePads++;
+
+    if (m_deinterleaveSourcePads &gt; 2) {
+        g_warning(&quot;The AudioSourceProvider supports only mono and stereo audio. Silencing this new channel.&quot;);
+        GstElement* queue = gst_element_factory_make(&quot;queue&quot;, 0);
+        GstElement* sink = gst_element_factory_make(&quot;fakesink&quot;, 0);
+        g_object_set(sink, &quot;async&quot;, FALSE, nullptr);
+        gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), queue, sink, nullptr);
+
+        GRefPtr&lt;GstPad&gt; sinkPad = adoptGRef(gst_element_get_static_pad(queue, &quot;sink&quot;));
+        gst_pad_link_full(pad, sinkPad.get(), GST_PAD_LINK_CHECK_NOTHING);
+
+        GQuark quark = g_quark_from_static_string(&quot;peer&quot;);
+        g_object_set_qdata(G_OBJECT(pad), quark, sinkPad.get());
+        gst_element_link_pads_full(queue, &quot;src&quot;, sink, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+        gst_element_sync_state_with_parent(queue);
+        gst_element_sync_state_with_parent(sink);
+        return;
+    }
+
+    // A new pad for a planar channel was added in deinterleave. Plug
+    // in an appsink so we can pull the data from each
+    // channel. Pipeline looks like:
+    // ... deinterleave ! queue ! appsink.
+    GstElement* queue = gst_element_factory_make(&quot;queue&quot;, 0);
+    GstElement* sink = gst_element_factory_make(&quot;appsink&quot;, 0);
+
+    GstAppSinkCallbacks callbacks;
+    callbacks.eos = 0;
+    callbacks.new_preroll = 0;
+    callbacks.new_sample = onAppsinkNewBufferCallback;
+    gst_app_sink_set_callbacks(GST_APP_SINK(sink), &amp;callbacks, this, 0);
+
+    g_object_set(sink, &quot;async&quot;, FALSE, nullptr);
+
+    GRefPtr&lt;GstCaps&gt; caps = adoptGRef(gst_caps_new_simple(&quot;audio/x-raw&quot;, &quot;rate&quot;, G_TYPE_INT, static_cast&lt;int&gt;(gSampleBitRate),
+        &quot;channels&quot;, G_TYPE_INT, 1,
+        &quot;format&quot;, G_TYPE_STRING, GST_AUDIO_NE(F32),
+        &quot;layout&quot;, G_TYPE_STRING, &quot;interleaved&quot;, nullptr));
+
+    gst_app_sink_set_caps(GST_APP_SINK(sink), caps.get());
+
+    gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), queue, sink, nullptr);
+
+    GRefPtr&lt;GstPad&gt; sinkPad = adoptGRef(gst_element_get_static_pad(queue, &quot;sink&quot;));
+    gst_pad_link_full(pad, sinkPad.get(), GST_PAD_LINK_CHECK_NOTHING);
+
+    GQuark quark = g_quark_from_static_string(&quot;peer&quot;);
+    g_object_set_qdata(G_OBJECT(pad), quark, sinkPad.get());
+
+    gst_element_link_pads_full(queue, &quot;src&quot;, sink, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
+
+    sinkPad = adoptGRef(gst_element_get_static_pad(sink, &quot;sink&quot;));
+    gst_pad_add_probe(sinkPad.get(), GST_PAD_PROBE_TYPE_EVENT_FLUSH, onAppsinkFlushCallback, this, nullptr);
+
+    gst_element_sync_state_with_parent(queue);
+    gst_element_sync_state_with_parent(sink);
+}
+
+void AudioSourceProviderGStreamer::handleRemovedDeinterleavePad(GstPad* pad)
+{
+    m_deinterleaveSourcePads--;
+
+    // Remove the queue ! appsink chain downstream of deinterleave.
+    GQuark quark = g_quark_from_static_string(&quot;peer&quot;);
+    GstPad* sinkPad = reinterpret_cast&lt;GstPad*&gt;(g_object_get_qdata(G_OBJECT(pad), quark));
+    GRefPtr&lt;GstElement&gt; queue = adoptGRef(gst_pad_get_parent_element(sinkPad));
+    GRefPtr&lt;GstPad&gt; queueSrcPad = adoptGRef(gst_element_get_static_pad(queue.get(), &quot;src&quot;));
+    GRefPtr&lt;GstPad&gt; appsinkSinkPad = adoptGRef(gst_pad_get_peer(queueSrcPad.get()));
+    GRefPtr&lt;GstElement&gt; sink = adoptGRef(gst_pad_get_parent_element(appsinkSinkPad.get()));
+    gst_element_set_state(sink.get(), GST_STATE_NULL);
+    gst_element_set_state(queue.get(), GST_STATE_NULL);
+    gst_element_unlink(queue.get(), sink.get());
+    gst_bin_remove_many(GST_BIN(m_audioSinkBin.get()), queue.get(), sink.get(), nullptr);
+}
+
+void AudioSourceProviderGStreamer::deinterleavePadsConfigured()
+{
+    ASSERT(m_client);
+    ASSERT(m_deinterleaveSourcePads == gNumberOfChannels);
+
+    m_client-&gt;setFormat(m_deinterleaveSourcePads, gSampleBitRate);
+}
+
+void AudioSourceProviderGStreamer::clearAdapters()
+{
+    GMutexLocker&lt;GMutex&gt; lock(m_adapterMutex);
+    gst_adapter_clear(m_frontLeftAdapter);
+    gst_adapter_clear(m_frontRightAdapter);
+}
+
+} // WebCore
+
+#endif // ENABLE(WEB_AUDIO) &amp;&amp; ENABLE(VIDEO) &amp;&amp; USE(GSTREAMER)
</ins></span></pre></div>
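<p>For context, the topology built above can be reproduced outside WebKit. The standalone sketch below (illustrative only; every name in it is an assumption, and it requires gst-plugins-base and gst-plugins-good) splits a stereo test signal with the same deinterleave ! queue ! appsink pattern as handleNewDeinterleavePad().</p>
<pre>/* Standalone sketch mirroring the deinterleave topology; not WebKit code. */
#include &lt;gst/gst.h&gt;

static void onPadAdded(GstElement* deinterleave, GstPad* pad, gpointer userData)
{
    /* One queue ! appsink branch per planar channel; a real consumer would
       install GstAppSinkCallbacks here, as the WebKit patch does. */
    GstBin* bin = GST_BIN(userData);
    GstElement* queue = gst_element_factory_make(&quot;queue&quot;, NULL);
    GstElement* sink = gst_element_factory_make(&quot;appsink&quot;, NULL);
    gst_bin_add_many(bin, queue, sink, NULL);

    GstPad* sinkPad = gst_element_get_static_pad(queue, &quot;sink&quot;);
    gst_pad_link(pad, sinkPad);
    gst_object_unref(sinkPad);
    gst_element_link(queue, sink);
    gst_element_sync_state_with_parent(queue);
    gst_element_sync_state_with_parent(sink);
}

int main(int argc, char** argv)
{
    gst_init(&amp;argc, &amp;argv);

    GstElement* pipeline = gst_pipeline_new(NULL);
    GstElement* src = gst_element_factory_make(&quot;audiotestsrc&quot;, NULL);
    GstElement* convert = gst_element_factory_make(&quot;audioconvert&quot;, NULL);
    GstElement* capsFilter = gst_element_factory_make(&quot;capsfilter&quot;, NULL);
    GstElement* deinterleave = gst_element_factory_make(&quot;deinterleave&quot;, NULL);

    /* Force stereo so deinterleave exposes exactly two source pads. */
    GstCaps* caps = gst_caps_new_simple(&quot;audio/x-raw&quot;, &quot;channels&quot;, G_TYPE_INT, 2, NULL);
    g_object_set(capsFilter, &quot;caps&quot;, caps, NULL);
    gst_caps_unref(caps);

    gst_bin_add_many(GST_BIN(pipeline), src, convert, capsFilter, deinterleave, NULL);
    gst_element_link_many(src, convert, capsFilter, deinterleave, NULL);
    g_signal_connect(deinterleave, &quot;pad-added&quot;, G_CALLBACK(onPadAdded), pipeline);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));
    return 0;
}</pre>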
<a id="trunkSourceWebCoreplatformaudiogstreamerAudioSourceProviderGStreamerh"></a>
<div class="addfile"><h4>Added: trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h (0 => 177058)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h                                (rev 0)
+++ trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h        2014-12-10 16:00:45 UTC (rev 177058)
</span><span class="lines">@@ -0,0 +1,72 @@
</span><ins>+/*
+ *  Copyright (C) 2014 Igalia S.L
+ *
+ *  This library is free software; you can redistribute it and/or
+ *  modify it under the terms of the GNU Lesser General Public
+ *  License as published by the Free Software Foundation; either
+ *  version 2 of the License, or (at your option) any later version.
+ *
+ *  This library is distributed in the hope that it will be useful,
+ *  but WITHOUT ANY WARRANTY; without even the implied warranty of
+ *  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ *  Lesser General Public License for more details.
+ *
+ *  You should have received a copy of the GNU Lesser General Public
+ *  License along with this library; if not, write to the Free Software
+ *  Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301  USA
+ */
+
+#ifndef AudioSourceProviderGStreamer_h
+#define AudioSourceProviderGStreamer_h
+
+#if ENABLE(WEB_AUDIO) &amp;&amp; ENABLE(VIDEO) &amp;&amp; USE(GSTREAMER)
+
+#include &quot;AudioSourceProvider.h&quot;
+#include &quot;GRefPtrGStreamer.h&quot;
+#include &lt;gst/gst.h&gt;
+#include &lt;wtf/Forward.h&gt;
+#include &lt;wtf/Noncopyable.h&gt;
+#include &lt;wtf/PassOwnPtr.h&gt;
+
+typedef struct _GstAdapter GstAdapter;
+typedef struct _GstAppSink GstAppSink;
+
+namespace WebCore {
+
+class AudioSourceProviderGStreamer : public AudioSourceProvider {
+    WTF_MAKE_NONCOPYABLE(AudioSourceProviderGStreamer);
+public:
+    static PassOwnPtr&lt;AudioSourceProviderGStreamer&gt; create() { return adoptPtr(new AudioSourceProviderGStreamer()); }
+    AudioSourceProviderGStreamer();
+    ~AudioSourceProviderGStreamer();
+
+    void configureAudioBin(GstElement* audioBin, GstElement* teePredecessor);
+
+    void provideInput(AudioBus*, size_t framesToProcess);
+    void setClient(AudioSourceProviderClient*);
+    const AudioSourceProviderClient* client() const { return m_client; }
+
+    void handleNewDeinterleavePad(GstPad*);
+    void deinterleavePadsConfigured();
+    void handleRemovedDeinterleavePad(GstPad*);
+
+    GstFlowReturn handleAudioBuffer(GstAppSink*);
+    GstElement* getAudioBin() const { return m_audioSinkBin.get(); }
+    void clearAdapters();
+
+private:
+    GRefPtr&lt;GstElement&gt; m_audioSinkBin;
+    AudioSourceProviderClient* m_client;
+    int m_deinterleaveSourcePads;
+    GstAdapter* m_frontLeftAdapter;
+    GstAdapter* m_frontRightAdapter;
+    unsigned long m_deinterleavePadAddedHandlerId;
+    unsigned long m_deinterleaveNoMorePadsHandlerId;
+    unsigned long m_deinterleavePadRemovedHandlerId;
+    GMutex m_adapterMutex;
+};
+
+}
+#endif // ENABLE(WEB_AUDIO) &amp;&amp; ENABLE(VIDEO) &amp;&amp; USE(GSTREAMER)
+
+#endif // AudioSourceProviderGStreamer_h
</ins></span></pre></div>
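<p>The client() getter above returns the registered AudioSourceProviderClient; in WebKit that role is played by the WebAudio media element source node. The hypothetical minimal client below (illustration only, not part of this changeset) shows the setFormat() contract driven by deinterleavePadsConfigured().</p>
<pre>// Hypothetical client sketch; SketchClient is not a WebKit class.
class SketchClient : public WebCore::AudioSourceProviderClient {
public:
    SketchClient() : m_numberOfChannels(0), m_sampleRate(0) { }

    // Invoked from deinterleavePadsConfigured() once the provider knows the
    // channel count (2) and sample rate (44100) it will deliver.
    virtual void setFormat(size_t numberOfChannels, float sampleRate) override
    {
        m_numberOfChannels = numberOfChannels;
        m_sampleRate = sampleRate;
    }

private:
    size_t m_numberOfChannels;
    float m_sampleRate;
};</pre>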
<a id="trunkSourceWebCoreplatformgraphicsgstreamerMediaPlayerPrivateGStreamercpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp (177057 => 177058)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp        2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp        2014-12-10 16:00:45 UTC (rev 177058)
</span><span class="lines">@@ -64,6 +64,10 @@
</span><span class="cx"> #include &quot;WebKitMediaSourceGStreamer.h&quot;
</span><span class="cx"> #endif
</span><span class="cx"> 
</span><ins>+#if ENABLE(WEB_AUDIO)
+#include &quot;AudioSourceProviderGStreamer.h&quot;
+#endif
+
</ins><span class="cx"> // Max interval in seconds to stay in the READY state on manual
</span><span class="cx"> // state change requests.
</span><span class="cx"> static const unsigned gReadyStateTimerInterval = 60;
</span><span class="lines">@@ -212,6 +216,9 @@
</span><span class="cx">     , m_hasAudio(false)
</span><span class="cx">     , m_totalBytes(0)
</span><span class="cx">     , m_preservesPitch(false)
</span><ins>+#if ENABLE(WEB_AUDIO)
+    , m_audioSourceProvider(AudioSourceProviderGStreamer::create())
+#endif
</ins><span class="cx">     , m_requestedState(GST_STATE_VOID_PENDING)
</span><span class="cx">     , m_missingPlugins(false)
</span><span class="cx"> {
</span><span class="lines">@@ -265,6 +272,10 @@
</span><span class="cx">         GRefPtr&lt;GstPad&gt; videoSinkPad = adoptGRef(gst_element_get_static_pad(m_webkitVideoSink.get(), &quot;sink&quot;));
</span><span class="cx">         g_signal_handlers_disconnect_by_func(videoSinkPad.get(), reinterpret_cast&lt;gpointer&gt;(mediaPlayerPrivateVideoSinkCapsChangedCallback), this);
</span><span class="cx">     }
</span><ins>+
+#if ENABLE(WEB_AUDIO)
+    m_audioSourceProvider.release();
+#endif
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> void MediaPlayerPrivateGStreamer::load(const String&amp; urlString)
</span><span class="lines">@@ -1845,35 +1856,56 @@
</span><span class="cx">     m_autoAudioSink = gst_element_factory_make(&quot;autoaudiosink&quot;, 0);
</span><span class="cx">     g_signal_connect(m_autoAudioSink.get(), &quot;child-added&quot;, G_CALLBACK(setAudioStreamPropertiesCallback), this);
</span><span class="cx"> 
</span><del>-    // Construct audio sink only if pitch preserving is enabled.
-    if (!m_preservesPitch)
-        return m_autoAudioSink.get();
</del><ins>+    GstElement* audioSinkBin;
</ins><span class="cx"> 
</span><del>-    // On 1.4.2 and newer we use the audio-filter property instead.
-    if (webkitGstCheckVersion(1, 4, 2))
</del><ins>+    if (webkitGstCheckVersion(1, 4, 2)) {
+#if ENABLE(WEB_AUDIO)
+        audioSinkBin = gst_bin_new(&quot;audio-sink&quot;);
+        m_audioSourceProvider-&gt;configureAudioBin(audioSinkBin, nullptr);
+        return audioSinkBin;
+#else
</ins><span class="cx">         return m_autoAudioSink.get();
</span><del>-
-    GstElement* scale = gst_element_factory_make(&quot;scaletempo&quot;, 0);
-    if (!scale) {
-        GST_WARNING(&quot;Failed to create scaletempo&quot;);
-        return m_autoAudioSink.get();
</del><ins>+#endif
</ins><span class="cx">     }
</span><span class="cx"> 
</span><del>-    GstElement* audioSinkBin = gst_bin_new(&quot;audio-sink&quot;);
-    GstElement* convert = gst_element_factory_make(&quot;audioconvert&quot;, 0);
-    GstElement* resample = gst_element_factory_make(&quot;audioresample&quot;, 0);
</del><ins>+    // Construct audio sink only if pitch preserving is enabled.
+    // With GStreamer 1.4.2 or newer, the audio-filter playbin property is used instead.
+    if (m_preservesPitch) {
+        GstElement* scale = gst_element_factory_make(&quot;scaletempo&quot;, nullptr);
+        if (!scale) {
+            GST_WARNING(&quot;Failed to create scaletempo&quot;);
+            return m_autoAudioSink.get();
+        }
</ins><span class="cx"> 
</span><del>-    gst_bin_add_many(GST_BIN(audioSinkBin), scale, convert, resample, m_autoAudioSink.get(), NULL);
</del><ins>+        audioSinkBin = gst_bin_new(&quot;audio-sink&quot;);
+        gst_bin_add(GST_BIN(audioSinkBin), scale);
+        GRefPtr&lt;GstPad&gt; pad = adoptGRef(gst_element_get_static_pad(scale, &quot;sink&quot;));
+        gst_element_add_pad(audioSinkBin, gst_ghost_pad_new(&quot;sink&quot;, pad.get()));
</ins><span class="cx"> 
</span><del>-    if (!gst_element_link_many(scale, convert, resample, m_autoAudioSink.get(), NULL)) {
-        GST_WARNING(&quot;Failed to link audio sink elements&quot;);
-        gst_object_unref(audioSinkBin);
-        return m_autoAudioSink.get();
</del><ins>+#if ENABLE(WEB_AUDIO)
+        m_audioSourceProvider-&gt;configureAudioBin(audioSinkBin, scale);
+#else
+        GstElement* convert = gst_element_factory_make(&quot;audioconvert&quot;, nullptr);
+        GstElement* resample = gst_element_factory_make(&quot;audioresample&quot;, nullptr);
+
+        gst_bin_add_many(GST_BIN(audioSinkBin), convert, resample, m_autoAudioSink.get(), nullptr);
+
+        if (!gst_element_link_many(scale, convert, resample, m_autoAudioSink.get(), nullptr)) {
+            GST_WARNING(&quot;Failed to link audio sink elements&quot;);
+            gst_object_unref(audioSinkBin);
+            return m_autoAudioSink.get();
+        }
+#endif
+        return audioSinkBin;
</ins><span class="cx">     }
</span><span class="cx"> 
</span><del>-    GRefPtr&lt;GstPad&gt; pad = adoptGRef(gst_element_get_static_pad(scale, &quot;sink&quot;));
-    gst_element_add_pad(audioSinkBin, gst_ghost_pad_new(&quot;sink&quot;, pad.get()));
</del><ins>+#if ENABLE(WEB_AUDIO)
+    audioSinkBin = gst_bin_new(&quot;audio-sink&quot;);
+    m_audioSourceProvider-&gt;configureAudioBin(audioSinkBin, nullptr);
</ins><span class="cx">     return audioSinkBin;
</span><ins>+#endif
+    ASSERT_NOT_REACHED();
+    return 0;
</ins><span class="cx"> }
</span><span class="cx"> 
</span><span class="cx"> GstElement* MediaPlayerPrivateGStreamer::audioSink() const
</span></span></pre></div>
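<p>Elsewhere in the player (in code not shown by this diff) the bin returned by createAudioSink() becomes playbin's audio-sink, per the getAudioBin() description in the log. The sketch below is an assumption of that wiring; m_playBin is a hypothetical member name.</p>
<pre>// Sketch only; &quot;m_playBin&quot; and this call site are assumptions, not lines
// from this changeset.
GstElement* audioSink = createAudioSink(); // the bin containing the tee
g_object_set(m_playBin.get(), &quot;audio-sink&quot;, audioSink, nullptr);
// playbin now feeds decoded audio into the bin's ghost sink pad; the tee
// splits it between autoaudiosink playback and, once a WebAudio client is
// set, the deinterleave ! appsink extraction branch.</pre>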
<a id="trunkSourceWebCoreplatformgraphicsgstreamerMediaPlayerPrivateGStreamerh"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h (177057 => 177058)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h        2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h        2014-12-10 16:00:45 UTC (rev 177058)
</span><span class="lines">@@ -50,6 +50,11 @@
</span><span class="cx"> 
</span><span class="cx"> namespace WebCore {
</span><span class="cx"> 
</span><ins>+#if ENABLE(WEB_AUDIO)
+class AudioSourceProvider;
+class AudioSourceProviderGStreamer;
+#endif
+
</ins><span class="cx"> class AudioTrackPrivateGStreamer;
</span><span class="cx"> class InbandMetadataTextTrackPrivateGStreamer;
</span><span class="cx"> class InbandTextTrackPrivateGStreamer;
</span><span class="lines">@@ -125,6 +130,10 @@
</span><span class="cx"> 
</span><span class="cx">     bool changePipelineState(GstState);
</span><span class="cx"> 
</span><ins>+#if ENABLE(WEB_AUDIO)
+    AudioSourceProvider* audioSourceProvider() { return reinterpret_cast&lt;AudioSourceProvider*&gt;(m_audioSourceProvider.get()); }
+#endif
+
</ins><span class="cx"> private:
</span><span class="cx">     MediaPlayerPrivateGStreamer(MediaPlayer*);
</span><span class="cx"> 
</span><span class="lines">@@ -211,6 +220,9 @@
</span><span class="cx">     mutable unsigned long long m_totalBytes;
</span><span class="cx">     URL m_url;
</span><span class="cx">     bool m_preservesPitch;
</span><ins>+#if ENABLE(WEB_AUDIO)
+    OwnPtr&lt;AudioSourceProviderGStreamer&gt; m_audioSourceProvider;
+#endif
</ins><span class="cx">     GstState m_requestedState;
</span><span class="cx">     GRefPtr&lt;GstElement&gt; m_autoAudioSink;
</span><span class="cx">     bool m_missingPlugins;
</span></span></pre>
</div>
</div>

</body>
</html>