[Webkit-unassigned] [Bug 115354] Disable HTTP request "Accept-Encoding:" header field on gstreamer source element to avoid receiving the wrong size when retrieving data

bugzilla-daemon at webkit.org
Mon Apr 29 09:03:10 PDT 2013


https://bugs.webkit.org/show_bug.cgi?id=115354





--- Comment #11 from Sergio Villar Senin <svillar at igalia.com>  2013-04-29 09:01:30 PST ---
(In reply to comment #9)
> (In reply to comment #8)
> > (In reply to comment #7)
> > > (In reply to comment #6)
> > > > Could this be the cause of bug 90732 ?
> > > 
> > > hmm, could be, but most likely you need the patch from bug 115353. What happens here is that libsoup downloads the compressed data and decodes it with SoupContentDecoder, but still reports the size from the Content-Length header (the compressed size), so the size of the uncompressed data differs from the reported one.
> > 
> > I don't think this is the right fix. By doing that you're potentially increasing the HTTP transfers by a lot, since the server won't compress the data. The problem is that you cannot assume you know the actual size of the data after receiving the HTTP headers. As you said, that is not true for compressed data, but it is not true either for "Transfer-Encoding: chunked" or for responses with no length framing at all (terminated when the connection closes).
> > 
> > So my advice would be to try to reorganize the GStreamer code so that it does not depend on the size of the data, though I'm not sure whether that's possible.
> 
> hmm, the WebKit source element uses the content length to compute the duration of the stream. If the size is reported as 0, the stream is considered a live stream, so seeking fails, among other small "issues".

What I mean is: for non-live streams, you cannot know the size of the stream for sure until all the data has been received, except in very specific cases (uncompressed data with a Content-Length header).
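
To make that concrete, here is a rough sketch against the libsoup headers API (not code from the patch; the helper name is made up) of the only case in which the headers really tell you the final body size:

    /* Rough illustration only: the advertised size matches the bytes the
     * content decoder will hand us only when the response is Content-Length
     * framed and carries no Content-Encoding. */
    #include <libsoup/soup.h>

    static goffset knownStreamSize(SoupMessageHeaders* responseHeaders)
    {
        SoupEncoding encoding = soup_message_headers_get_encoding(responseHeaders);
        const char* contentEncoding = soup_message_headers_get_one(responseHeaders, "Content-Encoding");

        if (encoding == SOUP_ENCODING_CONTENT_LENGTH
            && (!contentEncoding || !g_ascii_strcasecmp(contentEncoding, "identity")))
            return soup_message_headers_get_content_length(responseHeaders);

        /* Chunked, EOF-terminated or compressed: the size is only known
         * once the transfer finishes. */
        return -1;
    }

For everything else, the only honest answer before the download completes is "unknown".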

> I am not sure what would happen if using "Transfer-Encoding: chunked" (not even sure if that is set by libsoup when requesting). Right now IIRC ResourceHandleClient::didReceiveResponse() is called only once and we get the size from response.expectedContentLength() and set the duration based on it. ResourceHandleClient::didReceiveData() is called on every data chunk downloaded though, but we don't use the size reported there as we want to know the duration in advance, not during playback (when the data is actually downloaded).
> 
> In other words, it could be that "Transfer-Encoding: chunked" would lead to some issue, but I am not sure; we tested these patches in a number of scenarios using DASH/MSS/HLS adaptive and other non-adaptive streams and didn't find any problems.

Maybe that's because media is normally streamed with plain Content-Length framing (no idea here, maybe philn knows better). My point is that disabling compression is a bad idea IMO, not only because of the increase in the amount of transferred data but also because the problem remains for chunked or EOF-terminated transfers. Then again, maybe multimedia streams are not usually transferred that way, and disabling compression would not significantly increase the transfer size (since multimedia data is normally already compressed); I honestly don't know, I'm just giving my opinion from the network stack point of view.
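
For reference, this is roughly the pattern being described on the WebKit side (a simplified sketch, not the actual WebKitWebSrc code; the class layout and member names are invented for illustration):

    // Hypothetical client class, sketched for illustration only; it assumes
    // WebCore's ResourceHandleClient / ResourceResponse interfaces.
    class StreamingClient : public ResourceHandleClient {
    public:
        void didReceiveResponse(ResourceHandle*, const ResourceResponse&) override;
    private:
        bool m_isLive { false };
        long long m_size { 0 };
    };

    void StreamingClient::didReceiveResponse(ResourceHandle*, const ResourceResponse& response)
    {
        // The duration is derived from the advertised content length once,
        // up front, when the headers arrive.
        long long length = response.expectedContentLength();

        if (length <= 0) {
            // No usable size in the headers: treat the stream as live, i.e.
            // no duration and no seeking.
            m_isLive = true;
            return;
        }

        // If Accept-Encoding was sent and the server compressed the body,
        // this is the compressed size while libsoup delivers decoded bytes,
        // so a duration computed from it will be wrong.
        m_size = length;
    }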

I'm pretty sure all multimedia libraries have to deal with these issues; how do they normally handle this case?
