<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[177085] trunk/Source/WebCore</title>
</head>
<body>

<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/177085">177085</a></dd>
<dt>Author</dt> <dd>commit-queue@webkit.org</dd>
<dt>Date</dt> <dd>2014-12-10 11:43:00 -0800 (Wed, 10 Dec 2014)</dd>
</dl>

<h3>Log Message</h3>
<pre>[GStreamer] Use appsrcs instead of unconnected queues
https://bugs.webkit.org/show_bug.cgi?id=139490

Patch by Sebastian Dröge &lt;sebastian@centricular.com&gt; on 2014-12-10
Reviewed by Philippe Normand.

* platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp:
(webkit_web_audio_src_init):
(webKitWebAudioSrcConstructed):
(webKitWebAudioSrcFinalize):
(webKitWebAudioSrcSetProperty):
(webKitWebAudioSrcLoop):
(webKitWebAudioSrcChangeState):
Previously we chained buffers directly into unconnected queues,
which confused some code inside GStreamer and caused harmless
warnings. Now we use appsrc elements instead, which also lets us
remove quite a bit of code.</pre>
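<p>For context, the core of the change: each planar audio channel now feeds the interleave element through an appsrc, and buffers are handed over with gst_app_src_push_buffer() instead of being chained directly into a queue's sink pad. Because appsrc emits the stream-start, caps and segment events downstream itself, the manual event handling could be dropped. A minimal sketch of the pattern, using hypothetical helper names and simplified setup rather than the exact WebKit code, might look like this:</p>

<pre>
#include &lt;gst/gst.h&gt;
#include &lt;gst/app/app.h&gt;
#include &lt;gst/audio/audio.h&gt;

// Hypothetical helper: create one appsrc per planar channel and link it to an
// interleave request sink pad (sink_%u).
static GstElement* createChannelSource(GstBin* bin, GstElement* interleave,
    guint channelIndex, gint sampleRate, guint bufferSize)
{
    gchar* name = g_strdup_printf("webaudioSrc%u", channelIndex);
    GstElement* appsrc = gst_element_factory_make("appsrc", name);
    g_free(name);

    // Mono float caps for this channel; the real patch also sets the channel position.
    GstAudioInfo info;
    gst_audio_info_set_format(&amp;info, GST_AUDIO_FORMAT_F32, sampleRate, 1, nullptr);
    GstCaps* caps = gst_audio_info_to_caps(&amp;info);

    // Keep latency low: block the pushing thread once two buffers' worth of data
    // is queued, operate in time format, and advertise the channel caps.
    g_object_set(appsrc, "max-bytes", static_cast&lt;guint64&gt;(2 * bufferSize),
        "block", TRUE, "format", GST_FORMAT_TIME, "caps", caps, nullptr);
    gst_caps_unref(caps);

    gst_bin_add(bin, appsrc);
    gst_element_link_pads_full(appsrc, "src", interleave, "sink_%u", GST_PAD_LINK_CHECK_NOTHING);
    return appsrc;
}

// In the render loop each channel buffer is pushed into its appsrc, which takes
// ownership of the buffer and sends stream-start/caps/segment downstream itself.
static gboolean pushChannelBuffer(GstElement* appsrc, GstBuffer* buffer)
{
    return gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer) == GST_FLOW_OK;
}
</pre>

<p>In the actual patch the appsrc elements are kept in a GSList (priv-&gt;sources) and a failing push pauses the element's streaming task, as the diff below shows.</p>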

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkSourceWebCoreChangeLog">trunk/Source/WebCore/ChangeLog</a></li>
<li><a href="#trunkSourceWebCoreplatformaudiogstreamerWebKitWebAudioSourceGStreamercpp">trunk/Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp</a></li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkSourceWebCoreChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/ChangeLog (177084 => 177085)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/ChangeLog        2014-12-10 19:41:37 UTC (rev 177084)
+++ trunk/Source/WebCore/ChangeLog        2014-12-10 19:43:00 UTC (rev 177085)
</span><span class="lines">@@ -1,3 +1,22 @@
</span><ins>+2014-12-10  Sebastian Dröge  &lt;sebastian@centricular.com&gt;
+
+        [GStreamer] Use appsrcs instead of unconnected queues
+        https://bugs.webkit.org/show_bug.cgi?id=139490
+
+        Reviewed by Philippe Normand.
+
+        * platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp:
+        (webkit_web_audio_src_init):
+        (webKitWebAudioSrcConstructed):
+        (webKitWebAudioSrcFinalize):
+        (webKitWebAudioSrcSetProperty):
+        (webKitWebAudioSrcLoop):
+        (webKitWebAudioSrcChangeState):
+        Previously we chained buffers directly into unconnected queues,
+        which confused some code inside GStreamer and caused harmless
+        warnings. Now we use appsrc elements instead, which also lets us
+        remove quite a bit of code.
+
</ins><span class="cx"> 2014-12-10  Enrica Casucci  &lt;enrica@apple.com&gt;
</span><span class="cx"> 
</span><span class="cx">         Fix iOS builders for 8.0
</span></span></pre></div>
<a id="trunkSourceWebCoreplatformaudiogstreamerWebKitWebAudioSourceGStreamercpp"></a>
<div class="modfile"><h4>Modified: trunk/Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp (177084 => 177085)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp        2014-12-10 19:41:37 UTC (rev 177084)
+++ trunk/Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp        2014-12-10 19:43:00 UTC (rev 177085)
</span><span class="lines">@@ -27,6 +27,7 @@
</span><span class="cx"> #include &quot;AudioIOCallback.h&quot;
</span><span class="cx"> #include &quot;GRefPtrGStreamer.h&quot;
</span><span class="cx"> #include &quot;GStreamerUtilities.h&quot;
</span><ins>+#include &lt;gst/app/app.h&gt;
</ins><span class="cx"> #include &lt;gst/audio/audio.h&gt;
</span><span class="cx"> #include &lt;gst/pbutils/pbutils.h&gt;
</span><span class="cx"> #include &lt;wtf/gobject/GUniquePtr.h&gt;
</span><span class="lines">@@ -52,17 +53,16 @@
</span><span class="cx">     AudioBus* bus;
</span><span class="cx">     AudioIOCallback* provider;
</span><span class="cx">     guint framesToPull;
</span><ins>+    guint bufferSize;
</ins><span class="cx"> 
</span><span class="cx">     GRefPtr&lt;GstElement&gt; interleave;
</span><span class="cx"> 
</span><span class="cx">     GRefPtr&lt;GstTask&gt; task;
</span><span class="cx">     GRecMutex mutex;
</span><span class="cx"> 
</span><del>-    GSList* pads; // List of queue sink pads. One queue for each planar audio channel.
</del><ins>+    GSList* sources; // List of appsrc. One appsrc for each planar audio channel.
</ins><span class="cx">     GstPad* sourcePad; // src pad of the element, interleaved wav data is pushed to it.
</span><span class="cx"> 
</span><del>-    bool newStreamEventPending;
-    GstSegment segment;
</del><span class="cx">     guint64 numberOfSamples;
</span><span class="cx"> 
</span><span class="cx">     GstBufferPool* pool;
</span><span class="lines">@@ -192,9 +192,6 @@
</span><span class="cx">     priv-&gt;provider = 0;
</span><span class="cx">     priv-&gt;bus = 0;
</span><span class="cx"> 
</span><del>-    priv-&gt;newStreamEventPending = true;
-    gst_segment_init(&amp;priv-&gt;segment, GST_FORMAT_TIME);
-
</del><span class="cx">     g_rec_mutex_init(&amp;priv-&gt;mutex);
</span><span class="cx">     priv-&gt;task = gst_task_new(reinterpret_cast&lt;GstTaskFunction&gt;(webKitWebAudioSrcLoop), src, 0);
</span><span class="cx"> 
</span><span class="lines">@@ -210,7 +207,7 @@
</span><span class="cx">     ASSERT(priv-&gt;provider);
</span><span class="cx">     ASSERT(priv-&gt;sampleRate);
</span><span class="cx"> 
</span><del>-    priv-&gt;interleave = gst_element_factory_make(&quot;interleave&quot;, 0);
</del><ins>+    priv-&gt;interleave = gst_element_factory_make(&quot;interleave&quot;, nullptr);
</ins><span class="cx"> 
</span><span class="cx">     if (!priv-&gt;interleave) {
</span><span class="cx">         GST_ERROR_OBJECT(src, &quot;Failed to create interleave&quot;);
</span><span class="lines">@@ -220,34 +217,29 @@
</span><span class="cx">     gst_bin_add(GST_BIN(src), priv-&gt;interleave.get());
</span><span class="cx"> 
</span><span class="cx">     // For each channel of the bus create a new upstream branch for interleave, like:
</span><del>-    // queue ! capsfilter. which is plugged to a new interleave request sinkpad.
</del><ins>+    // appsrc ! . which is plugged to a new interleave request sinkpad.
</ins><span class="cx">     for (unsigned channelIndex = 0; channelIndex &lt; priv-&gt;bus-&gt;numberOfChannels(); channelIndex++) {
</span><del>-        GUniquePtr&lt;gchar&gt; queueName(g_strdup_printf(&quot;webaudioQueue%u&quot;, channelIndex));
-        GstElement* queue = gst_element_factory_make(&quot;queue&quot;, queueName.get());
-        GstElement* capsfilter = gst_element_factory_make(&quot;capsfilter&quot;, 0);
-
</del><ins>+        GUniquePtr&lt;gchar&gt; appsrcName(g_strdup_printf(&quot;webaudioSrc%u&quot;, channelIndex));
+        GstElement* appsrc = gst_element_factory_make(&quot;appsrc&quot;, appsrcName.get());
</ins><span class="cx">         GRefPtr&lt;GstCaps&gt; monoCaps = adoptGRef(getGStreamerMonoAudioCaps(priv-&gt;sampleRate));
</span><span class="cx"> 
</span><span class="cx">         GstAudioInfo info;
</span><span class="cx">         gst_audio_info_from_caps(&amp;info, monoCaps.get());
</span><span class="cx">         GST_AUDIO_INFO_POSITION(&amp;info, 0) = webKitWebAudioGStreamerChannelPosition(channelIndex);
</span><span class="cx">         GRefPtr&lt;GstCaps&gt; caps = adoptGRef(gst_audio_info_to_caps(&amp;info));
</span><del>-        g_object_set(capsfilter, &quot;caps&quot;, caps.get(), NULL);
</del><span class="cx"> 
</span><del>-        // Configure the queue for minimal latency.
-        g_object_set(queue, &quot;max-size-buffers&quot;, static_cast&lt;guint&gt;(1), NULL);
</del><ins>+        // Configure the appsrc for minimal latency.
+        g_object_set(appsrc, &quot;max-bytes&quot;, 2 * priv-&gt;bufferSize, &quot;block&quot;, TRUE,
+            &quot;format&quot;, GST_FORMAT_TIME, &quot;caps&quot;, caps.get(), nullptr);
</ins><span class="cx"> 
</span><del>-        GstPad* pad = gst_element_get_static_pad(queue, &quot;sink&quot;);
-        priv-&gt;pads = g_slist_prepend(priv-&gt;pads, pad);
</del><ins>+        priv-&gt;sources = g_slist_prepend(priv-&gt;sources, gst_object_ref(appsrc));
</ins><span class="cx"> 
</span><del>-        gst_bin_add_many(GST_BIN(src), queue, capsfilter, NULL);
-        gst_element_link_pads_full(queue, &quot;src&quot;, capsfilter, &quot;sink&quot;, GST_PAD_LINK_CHECK_NOTHING);
-        gst_element_link_pads_full(capsfilter, &quot;src&quot;, priv-&gt;interleave.get(), &quot;sink_%u&quot;, GST_PAD_LINK_CHECK_NOTHING);
-
</del><ins>+        gst_bin_add(GST_BIN(src), appsrc);
+        gst_element_link_pads_full(appsrc, &quot;src&quot;, priv-&gt;interleave.get(), &quot;sink_%u&quot;, GST_PAD_LINK_CHECK_NOTHING);
</ins><span class="cx">     }
</span><del>-    priv-&gt;pads = g_slist_reverse(priv-&gt;pads);
</del><ins>+    priv-&gt;sources = g_slist_reverse(priv-&gt;sources);
</ins><span class="cx"> 
</span><del>-    // wavenc's src pad is the only visible pad of our element.
</del><ins>+    // interleave's src pad is the only visible pad of our element.
</ins><span class="cx">     GRefPtr&lt;GstPad&gt; targetPad = adoptGRef(gst_element_get_static_pad(priv-&gt;interleave.get(), &quot;src&quot;));
</span><span class="cx">     gst_ghost_pad_set_target(GST_GHOST_PAD(priv-&gt;sourcePad), targetPad.get());
</span><span class="cx"> }
</span><span class="lines">@@ -259,7 +251,7 @@
</span><span class="cx"> 
</span><span class="cx">     g_rec_mutex_clear(&amp;priv-&gt;mutex);
</span><span class="cx"> 
</span><del>-    g_slist_free_full(priv-&gt;pads, reinterpret_cast&lt;GDestroyNotify&gt;(gst_object_unref));
</del><ins>+    g_slist_free_full(priv-&gt;sources, reinterpret_cast&lt;GDestroyNotify&gt;(gst_object_unref));
</ins><span class="cx"> 
</span><span class="cx">     priv-&gt;~WebKitWebAudioSourcePrivate();
</span><span class="cx">     GST_CALL_PARENT(G_OBJECT_CLASS, finalize, ((GObject* )(src)));
</span><span class="lines">@@ -282,6 +274,7 @@
</span><span class="cx">         break;
</span><span class="cx">     case PROP_FRAMES:
</span><span class="cx">         priv-&gt;framesToPull = g_value_get_uint(value);
</span><ins>+        priv-&gt;bufferSize = sizeof(float) * priv-&gt;framesToPull;
</ins><span class="cx">         break;
</span><span class="cx">     default:
</span><span class="cx">         G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propertyId, pspec);
</span><span class="lines">@@ -331,7 +324,7 @@
</span><span class="cx"> 
</span><span class="cx">     GSList* channelBufferList = 0;
</span><span class="cx">     register int i;
</span><del>-    for (i = g_slist_length(priv-&gt;pads) - 1; i &gt;= 0; i--) {
</del><ins>+    for (i = g_slist_length(priv-&gt;sources) - 1; i &gt;= 0; i--) {
</ins><span class="cx">         AudioSrcBuffer* buffer = g_new(AudioSrcBuffer, 1);
</span><span class="cx">         GstBuffer* channelBuffer;
</span><span class="cx"> 
</span><span class="lines">@@ -363,17 +356,11 @@
</span><span class="cx">     // FIXME: Add support for local/live audio input.
</span><span class="cx">     priv-&gt;provider-&gt;render(0, priv-&gt;bus, priv-&gt;framesToPull);
</span><span class="cx"> 
</span><del>-    GSList* padsIt = priv-&gt;pads;
</del><ins>+    GSList* sourcesIt = priv-&gt;sources;
</ins><span class="cx">     GSList* buffersIt = channelBufferList;
</span><span class="cx"> 
</span><del>-#if GST_CHECK_VERSION(1, 2, 0)
-    guint groupId = 0;
-    if (priv-&gt;newStreamEventPending)
-        groupId = gst_util_group_id_next();
-#endif
-
-    for (i = 0; padsIt &amp;&amp; buffersIt; padsIt = g_slist_next(padsIt), buffersIt = g_slist_next(buffersIt), ++i) {
-        GstPad* pad = static_cast&lt;GstPad*&gt;(padsIt-&gt;data);
</del><ins>+    for (i = 0; sourcesIt &amp;&amp; buffersIt; sourcesIt = g_slist_next(sourcesIt), buffersIt = g_slist_next(buffersIt), ++i) {
+        GstElement* appsrc = static_cast&lt;GstElement*&gt;(sourcesIt-&gt;data);
</ins><span class="cx">         AudioSrcBuffer* buffer = static_cast&lt;AudioSrcBuffer*&gt;(buffersIt-&gt;data);
</span><span class="cx">         GstBuffer* channelBuffer = buffer-&gt;buffer;
</span><span class="cx"> 
</span><span class="lines">@@ -381,37 +368,13 @@
</span><span class="cx">         gst_buffer_unmap(channelBuffer, &amp;buffer-&gt;info);
</span><span class="cx">         g_free(buffer);
</span><span class="cx"> 
</span><del>-        // Send stream-start, segment and caps events downstream, along with the first buffer.
-        if (priv-&gt;newStreamEventPending) {
-            GRefPtr&lt;GstElement&gt; queue = adoptGRef(gst_pad_get_parent_element(pad));
-            GRefPtr&lt;GstPad&gt; sinkPad = adoptGRef(gst_element_get_static_pad(queue.get(), &quot;sink&quot;));
-            GUniquePtr&lt;gchar&gt; queueName(gst_element_get_name(queue.get()));
-            GUniquePtr&lt;gchar&gt; streamId(g_strdup_printf(&quot;webaudio/%s&quot;, queueName.get()));
-            GstEvent* streamStartEvent = gst_event_new_stream_start(streamId.get());
-#if GST_CHECK_VERSION(1, 2, 0)
-            gst_event_set_group_id(streamStartEvent, groupId);
-#endif
-            gst_pad_send_event(sinkPad.get(), streamStartEvent);
-
-            GRefPtr&lt;GstCaps&gt; monoCaps = adoptGRef(getGStreamerMonoAudioCaps(priv-&gt;sampleRate));
-            GstAudioInfo info;
-            gst_audio_info_from_caps(&amp;info, monoCaps.get());
-            GST_AUDIO_INFO_POSITION(&amp;info, 0) = webKitWebAudioGStreamerChannelPosition(i);
-            GRefPtr&lt;GstCaps&gt; capsWithChannelPosition = adoptGRef(gst_audio_info_to_caps(&amp;info));
-            gst_pad_send_event(sinkPad.get(), gst_event_new_caps(capsWithChannelPosition.get()));
-
-            gst_pad_send_event(sinkPad.get(), gst_event_new_segment(&amp;priv-&gt;segment));
-        }
-
-        GstFlowReturn ret = gst_pad_chain(pad, channelBuffer);
</del><ins>+        GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), channelBuffer);
</ins><span class="cx">         if (ret != GST_FLOW_OK) {
</span><del>-            GST_ELEMENT_ERROR(src, CORE, PAD, (&quot;Internal WebAudioSrc error&quot;), (&quot;Failed to push buffer on %s:%s flow: %s&quot;, GST_DEBUG_PAD_NAME(pad), gst_flow_get_name(ret)));
</del><ins>+            GST_ELEMENT_ERROR(src, CORE, PAD, (&quot;Internal WebAudioSrc error&quot;), (&quot;Failed to push buffer on %s flow: %s&quot;, GST_OBJECT_NAME(appsrc), gst_flow_get_name(ret)));
</ins><span class="cx">             gst_task_pause(src-&gt;priv-&gt;task.get());
</span><span class="cx">         }
</span><span class="cx">     }
</span><span class="cx"> 
</span><del>-    priv-&gt;newStreamEventPending = false;
-
</del><span class="cx">     g_slist_free(channelBufferList);
</span><span class="cx"> }
</span><span class="cx"> 
</span><span class="lines">@@ -444,7 +407,7 @@
</span><span class="cx">         GST_DEBUG_OBJECT(src, &quot;READY-&gt;PAUSED&quot;);
</span><span class="cx">         src-&gt;priv-&gt;pool = gst_buffer_pool_new();
</span><span class="cx">         GstStructure* config = gst_buffer_pool_get_config(src-&gt;priv-&gt;pool);
</span><del>-        gst_buffer_pool_config_set_params(config, nullptr, src-&gt;priv-&gt;framesToPull * sizeof(float), 0, 0);
</del><ins>+        gst_buffer_pool_config_set_params(config, nullptr, src-&gt;priv-&gt;bufferSize, 0, 0);
</ins><span class="cx">         gst_buffer_pool_set_config(src-&gt;priv-&gt;pool, config);
</span><span class="cx">         if (!gst_buffer_pool_set_active(src-&gt;priv-&gt;pool, TRUE))
</span><span class="cx">             returnValue = GST_STATE_CHANGE_FAILURE;
</span><span class="lines">@@ -453,7 +416,6 @@
</span><span class="cx">         break;
</span><span class="cx">     }
</span><span class="cx">     case GST_STATE_CHANGE_PAUSED_TO_READY:
</span><del>-        src-&gt;priv-&gt;newStreamEventPending = true;
</del><span class="cx">         GST_DEBUG_OBJECT(src, &quot;PAUSED-&gt;READY&quot;);
</span><span class="cx">         if (!gst_task_join(src-&gt;priv-&gt;task.get()))
</span><span class="cx">             returnValue = GST_STATE_CHANGE_FAILURE;
</span></span></pre>
</div>
</div>

</body>
</html>