[webkit-dev] Question regarding priorities of subresource content retrieval

Silvio Ventres silvio.ventres at gmail.com
Tue Feb 8 09:24:45 PST 2011


Do you have any examples of externally sourced scripts or CSS where
the developer cares about reasonably optimizing the web page?
The main use case for such external scripts today is ads and
statistics gatherers for analytics. That, arguably, is not critical
content that the user is interested in.

If your argument is indeed "the Web developer should have control",
then, when you have no choice but to include external scripts (ads,
for example), you would probably hate for those to hurt the latency of
your website. If you look at the http://muddledramblings.com/ website,
for example, you can clearly see that most scripts there are
domain-internal.
Do you deem your user experience more or less important than Google
Analytics' capability? If Google Analytics hangs for, say, 4 seconds,
would you like the user to wait, or to start reading while it loads?
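As a concrete illustration: a developer who does not want an external
tracker to stall rendering can already opt in to asynchronous loading
with the HTML5 "async" attribute. A sketch (the ga.js URL here just
stands in for any third-party script):

```html
<!-- Sketch: an external script marked async may be fetched and
     executed whenever it arrives, so a hung analytics server no
     longer blocks parsing of the rest of the page. -->
<script async src="http://www.google-analytics.com/ga.js"></script>

<!-- The plain form, by contrast, blocks the parser until the
     external server responds (or times out). -->
<script src="http://www.google-analytics.com/ga.js"></script>
```

Of course, this only helps on pages whose authors adopt it.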

A change to the HTML standard might be a good idea, though the problem
here is that there are already millions of pages on the 'net, and
their developers won't suddenly start changing them.

This heuristic would let users view 90% of the current Web more
interactively.
Keep in mind that at least 38% of all statistics are pulled out of
thin air :), but, really, please show at least two pages on which this
heuristic would NOT work.
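To make the heuristic concrete, here is a minimal sketch (the function
name is mine; subdomains of the page's own domain count as
first-party, per the above):

```javascript
// Sketch of the proposed priority heuristic: a subresource is
// "third-party" (lower priority) when its host is neither the page's
// host nor a subdomain of it.
function isThirdParty(pageUrl, resourceUrl) {
  var pageHost = new URL(pageUrl).hostname;
  // Resolve relative subresource URLs against the page itself.
  var resHost = new URL(resourceUrl, pageUrl).hostname;
  return resHost !== pageHost && !resHost.endsWith('.' + pageHost);
}
```

A browser could then fetch everything for which this returns false
first, and queue the rest.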

--
 silvio

On Tue, Feb 8, 2011 at 6:52 PM, Jerry Seeger <vikingjs at mac.com> wrote:
> My argument is less "it's the Web developer's fault" than it is "the Web developer should have control." I am hardly a sophisticated Web developer, but I have javascript from a different domain that must be loaded first, and I have Google Analytics, which I should load after the rest of the page (though to be honest I'm not sure I do after my redesign... hm). While I would love it if there were standardized rules for which scripts would be loaded synchronously and which wouldn't, I would hate it if one browser required me to move my scripts to a different domain.
>
> Having said all that, I hate it when I have to wait for a resource outside of my control, so I'd love to see a solution to this. If there were a more reliable way than simple domain checking to prioritize content, that would be fantastic. I think ideally this is something for the standards board - perhaps an extension of the script and link tags to specify a priority, or something like that.
>
> Jerry
>
>
> On Feb 8, 2011, at 2:23 AM, Silvio Ventres wrote:
>
>> This argument - "the web developer is to blame for choosing a slow
>> ad/tracking/etc. server" - is incorrect.
>> Web developers in general do not have any control over the ad provider
>> or, frankly, any other type of external functionality provider.
>> Google Analytics is a good case in point: you would not want most
>> of the world's web pages to suddenly hang if something happens inside
>> Google.
>>
>> The web browser should clearly prioritize developer-controllable
>> resources over ones that are beyond the web developer's control.
>> Also, as an application run by the user and not by the developer, the
>> browser should arguably prioritize actual content over
>> pseudo-content whose purpose is functionality that is not visible to
>> the actual user, such as ad/tracker scripts. Based on current trends,
>> this actual content is more likely to be important when it is sourced
>> from the domain/subdomain of the webpage itself.
>>
>> A domain check is a reasonable approximation that fits both purposes.
>>
>> --
>> silvio
>>
>>
>> On Tue, Feb 8, 2011 at 5:13 AM, Jerry Seeger <vikingjs at mac.com> wrote:
>>> I'm reasonably sure that javascript in the header must be loaded synchronously, as it might affect the rest of the load. This is why tools like YSlow advise Web designers to move javascript loads that are not needed for rendering to after the rest of the page loads.
>>>
>>> Blocking on loading the css is less clear-cut, as in some cases it could mean several seconds of ugly page. I don't know if it's right or wrong, but a lot of pages out there rely on the CSS being loaded before the page starts to render to avoid terrible layout and the appearance of items meant to be hidden for the seconds it takes the css to load.
>>>
>>> In general, while things could certainly be improved, it's up to the owner of the page not to rely on a slow ad server, or to build the page so the ads load after the primary content.
>>>
>>> Jerry Seeger
>>>
>>>
>>> On Feb 7, 2011, at 5:47 PM, Silvio Ventres wrote:
>>>
>>>> IE/Opera delay only for 4 seconds, same as Mobile Safari.
>>>> The reason looks to be the URL of the script/css.
>>>> If the same URL appears twice, Chrome/Firefox serialize the requests,
>>>> while IE/Opera/Mobile Safari launch both requests simultaneously.
>>>>
>>>> Of course, requesting simultaneously doesn't fix anything, as you can
>>>> see by trying a link-stuffed version at
>>>> http://solid.eqoppa.com/testlag2.html
>>>>
>>>> This one has 45 CSS and 38 javascript links. It hangs all browsers nicely.
>>>> The main point here is that this might be acceptable if the links were
>>>> coming from the webpage's own domain, but they are coming from a
>>>> completely different place.
>>>>
>>>> This is exactly what makes browsing pages with any third-party
>>>> analytics, tracking, or ad add-ons so slow and frustrating.
>>>> Fixing the priorities in subresource download should make the
>>>> experience considerably more interactive and fun.
>>>>
>>>> --
>>>> silvio
>>>
>>>
>
>
