[webkit-dev] Question regarding priorities of subresource content retrieval
Tony Gentilcore
tonyg at chromium.org
Tue Feb 8 08:48:53 PST 2011
Your test case isn't really about prioritization. The HTML5 spec
defines very specifically when parsing must stop. The two main cases
are:
1. Waiting for an external script to download
2. Waiting for an external stylesheet to download when any script
block is reached
In these cases, the parser does not continue parsing the document to
discover new subresources to download. However, as an optimization,
the PreloadScanner speculatively scans the source (which it is not
allowed to parse yet) for any subresources which should probably be
downloaded. This way when parsing does continue the resources are
already available or at least have a head start. So if we aren't able
to scan ahead and at least discover these resources, prioritization is
moot.
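To make the two stop-cases concrete, here is a minimal sketch (the
file names are placeholders):

  <!DOCTYPE html>
  <html>
  <head>
    <!-- case 2: once a script block is reached, parsing also waits
         for this stylesheet to finish downloading -->
    <link rel="stylesheet" href="styles.css">
    <!-- case 1: the parser must stop here until app.js has downloaded -->
    <script src="app.js"></script>
  </head>
  <body>
    <!-- the parser has not reached this yet, but the PreloadScanner
         can speculatively discover it and start the download early -->
    <img src="hero.png">
  </body>
  </html>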
Now, assume we have discovered all subresources on the page and could
prioritize them altogether. I'm still not sure I'd buy your argument
about resources from another domain being less important. Many sites
use CDNs on different domains to download resources. Also, many sites
include their JS libraries from common locations. In either of those
cases, another domain could be holding the critical blocking resource.
Perhaps it is worth experimenting with the heuristic you suggest, but
I certainly don't think we can just assert that it holds.
On Tue, Feb 8, 2011 at 2:23 AM, Silvio Ventres <silvio.ventres at gmail.com> wrote:
> This argument - "the web developer is to blame for choosing a slow
> ad/tracking/etc. server" - is incorrect.
> Web developers in general do not have any control over the ad provider
> or, frankly, any other type of external functionality provider.
> Google Analytics is a good case in point: you would not want most of
> the world's web pages to suddenly hang if something happens inside
> Google.
>
> The web browser should clearly prioritize developer-controllable
> resources over ones that are beyond the web developer's control.
> Also, as an application run by the user and not by the developer, the
> browser should arguably prioritize actual content over pseudo-content
> whose purpose is functionality that is not visible to the actual
> user, such as ad/tracker scripts. Based on current trends, this
> actual content is more likely to be important when it is sourced from
> the domain/subdomain of the webpage itself.
>
> A domain check is a reasonable approximation that fits both purposes.
>
> --
> silvio
>
>
> On Tue, Feb 8, 2011 at 5:13 AM, Jerry Seeger <vikingjs at mac.com> wrote:
>> I'm reasonably sure that javascript in the head must be loaded synchronously, as it might affect the rest of the load. This is why tools like YSlow advise web designers to defer javascript loads that are not needed for rendering until after the rest of the page loads.
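>>
>> A minimal sketch of that structure (file names here are just placeholders):
>>
>>   <head>
>>     <!-- CSS needed for the first render stays in the head -->
>>     <link rel="stylesheet" href="site.css">
>>   </head>
>>   <body>
>>     ...visible page content...
>>     <!-- scripts not needed for rendering load after the content -->
>>     <script src="ads.js"></script>
>>     <script src="analytics.js"></script>
>>   </body>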
>>
>> Blocking on loading the CSS is less clear-cut, as in some cases it could mean several seconds of an ugly page. I don't know if it's right or wrong, but a lot of pages out there rely on the CSS being loaded before the page starts to render, to avoid broken layout and the appearance of items meant to be hidden during the seconds it takes the CSS to load.
>>
>> In general, while things could certainly be improved, it's up to the owner of the page not to rely on a slow ad server, or to build the page so the ads load after the primary content.
>>
>> Jerry Seeger
>>
>>
>> On Feb 7, 2011, at 5:47 PM, Silvio Ventres wrote:
>>
>>> IE/Opera delay for only 4 seconds, the same as Mobile Safari.
>>> The reason appears to be the URL of the script/css.
>>> If the URL is the same twice, Chrome/Firefox serialize the requests,
>>> while IE/Opera/Mobile Safari launch both requests simultaneously.
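>>>
>>> For instance, a reduced test along these lines (the host is just a
>>> placeholder for a slow server) shows the difference:
>>>
>>>   <script src="http://slow.example.com/delay.js"></script>
>>>   <script src="http://slow.example.com/delay.js"></script>
>>>   <!-- same URL twice: Chrome/Firefox issue the second request only
>>>        after the first completes; IE/Opera/Mobile Safari launch
>>>        both at once -->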
>>>
>>> Of course, requesting simultaneously doesn't fix anything, as you can
>>> see by trying a link-stuffed version at
>>> http://solid.eqoppa.com/testlag2.html
>>>
>>> This one has 45 css and 38 javascript links. It hangs all browsers nicely.
>>> The main point here is that this might be acceptable if the
>>> resources were coming from the webpage's own domain.
>>> But these links are coming from a completely different place.
>>>
>>> This is exactly what makes browsing pages with any third-party
>>> analytics, tracking or ad add-ons so slow and frustrating.
>>> Fixing the priorities of subresource downloads should make the
>>> experience considerably more interactive and fun.
>>>
>>> --
>>> silvio
>>
>>