[webkit-dev] Question regarding priorities of subresource content retrieval

Silvio Ventres silvio.ventres at gmail.com
Tue Feb 8 14:50:57 PST 2011

Let's take each argument apart one by one:

1. If the plugin, W3 Total Cache for WordPress, by itself moves the
script loads after the main content, the heuristic will cause no
visible change.
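
For illustration only - this is a rough sketch, not the plugin's
actual code, and the script URL is made up - a page can postpone an
external script until after the main content has rendered:

    // Inject an external script only once the page has loaded,
    // so its fetch never blocks the first paint.
    function loadScriptLate(src: string): void {
      const inject = () => {
        const s = document.createElement("script");
        s.async = true;   // don't block parsing when it executes
        s.src = src;      // the fetch starts here, after first paint
        document.body.appendChild(s);
      };
      if (document.readyState === "complete") inject();
      else window.addEventListener("load", inject);
    }

    loadScriptLate("http://stats.example.com/tracker.js");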

1a. In the plugin's installation guide, the developers advise using
either a CDN or a "static.*" subdomain.
It is not obvious why a web developer would host scripts on a server
_external_ to the main domain, except for a CDN or a source of
ads/tracking scripts.

1b. Quoting the plugin developers:
"An additional 500ms latency reduces traffic by 20%."
They recommend showing the content to the user as soon as possible.

2. What is the impact on CDN scripts?
Best case - the CDN is not slow - no impact.
Worst case - the CDN is slow - the user is shown an unscripted and
possibly unstyled webpage while waiting for the CDN script to load.
Compare to the no-heuristic case: the user is shown a blank page,
making the site seem to "hang" - highly frustrating.
Beneficial side effect: if the CDN dies, the user is shown the
content anyway, instead of stalling forever.

2a. If a CDN whitelist exists (imported from YSlow, for example), the
impact is even smaller; see the sketch below.
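
Here is a minimal sketch of the heuristic itself - all names are
hypothetical, and a real implementation would need a proper
public-suffix-aware domain comparison instead of the naive one below:

    // Decide fetch priority for a subresource: same-site and
    // whitelisted-CDN resources load first, everything else waits.
    const cdnWhitelist = ["ajax.googleapis.com"]; // e.g. imported from YSlow

    type Priority = "high" | "low";

    function subresourcePriority(pageUrl: string, resourceUrl: string): Priority {
      const page = new URL(pageUrl);
      const res = new URL(resourceUrl);

      // Same host or a sibling subdomain such as static.example.com
      // counts as developer-controlled content.
      const base = (host: string) => host.split(".").slice(-2).join(".");
      if (base(res.hostname) === base(page.hostname)) return "high";

      // Known CDNs keep high priority even though they are external.
      if (cdnWhitelist.includes(res.hostname)) return "high";

      // Unknown third parties (ads, trackers) are deprioritized.
      return "low";
    }

    // subresourcePriority("http://example.com/", "http://static.example.com/a.js") -> "high"
    // subresourcePriority("http://example.com/", "http://ads.tracker.net/a.js")    -> "low"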

3. What is the impact on ads/tracking scripts?
Best case - the external source is fast - no impact.
Worst case - the external source is slow - the user is not made to
wait on these sources, and proceeds to consume the content.
Compare to the no-heuristic case: the user waits a second for the
content, sees a blank page, and either closes it or moves to another
tab.
Beneficial side effect: if the external source dies, there is no
visible impact for the user. Ad sources die/stall _very often_.

4. Loading scripts out of the order in which they appear in the page
source makes rendering the page harder.
The complexity might rise, but the main impact will be forcing several
more repaints.
But then, during the resource-loading wait, there is all the CPU time
needed to paint an intermediate render.
A user will generally prefer to trade 0.5 seconds of repaint time in
exchange for being able to interact with the page immediately.
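
One way to put numbers on that trade, assuming a browser that already
exposes the Navigation Timing API (the property names below come from
that API; not every browser has it yet):

    // Compare time-to-DOM-ready (a rough proxy for first usable
    // paint) against total load time, once both timestamps are final.
    window.addEventListener("load", () => {
      setTimeout(() => {
        const t = performance.timing;
        console.log("DOM ready:", t.domContentLoadedEventStart - t.navigationStart, "ms");
        console.log("fully loaded:", t.loadEventEnd - t.navigationStart, "ms");
      }, 0);
    });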

5. As a developer, there is no impact on you.
Developers are mostly lazy - working around that for the benefit of
the viewer has more merit than trying to teach web developers to
optimize every page by hand.

The heuristic is automatic and will cover 80-90% of the market. There
should not be any impact on the total page loading time for the
remaining 10-20%. Most probably, they will also benefit from reduced
initial-paint latency, just less than the rest.

Why would you want to know what is inside the script?
The script, whether loaded initially or at the end, has a chance of
forcing a repaint. By reordering scripts you risk wasting CPU cycles
on rendering an incomplete page that will not be usable afterwards.
But by losing a few CPU cycles on that, you gain an immediate content
view that increases interactivity immensely. You trade increased total
render time for decreased initial-paint latency. And, for most pages,
the increase might actually be zero.


On Tue, Feb 8, 2011 at 8:40 PM, Jerry Seeger <vikingjs at mac.com> wrote:
> I'm still fiddling with the scripts on muddledramblings.com after a
> redesign, but I intend to move static resources to a cookieless domain to
> improve performance. This is a pretty common tactic - sort of a poor man's
> CDN. The key is that I can decide to do this. (Yes, I could rearrange my
> site and use www.muddledramblings.com and cookieless.muddledramblings.com,
> but you're making me do things a different way to support one Web browser.)
> (On a side note, muddledramblings.com's biggest performance problem right
> now is the host. Don't use iPage. </rant>)
> Keep in mind that scripts not executing when expected can totally break a
> site, not just make it less pleasant. A script that generates
> content must be executed in a predictable fashion no matter where it came
> from. Long ago I had a moon phase widget that generated content, and raised
> hell on browsers that did not block correctly when the script loaded. (I
> once had a widget with a script that generated a script. The results were...
> inconsistent.) These days all browsers block correctly and the Web is a
> better place for it.
> I can't see telling Web designers, "If your script uses document.write, it
> must come from the same domain or a known whitelist." (And let's hope the
> latency of the whitelist server is really low.) I can't see telling Joe
> Blogger why the visitor counter in his sidebar now writes the number at the
> bottom of the page.
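> For instance - a hypothetical but typical counter - the entire script
> may be nothing more than:
>
>     // visitor-counter.js: only works if executed at its <script> tag
>     document.write("<b>You are visitor #1234</b>");
>
> Defer that and the number lands wherever parsing happens to be when
> the script finally runs - or it wipes the whole document if it runs
> after loading has finished.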
> The WordPress plugin W3 Total Cache includes features to automate moving
> static content (including scripts) to a separate, cookieless domain. A lot
> of people use the plugin, but I can't speak to how many use the pseudo-cdn
> feature. My guess is not that many, but the ones who do will expect their
> scripts to execute where encountered, before the rest of the page loads, as
> mandated by the standards.
> The Web designer can already cause javascripts to load after the rest of the
> page (the above plugin automates this as well). Were I to run ads, you can
> bet that those scripts would not be loaded in the header (well, if I weren't
> lazy you could bet it). If I'm not already loading Google Analytics late,
> it's because I haven't finished getting my script strategy finalized.
> While I would certainly like to see an automated mechanism for setting
> external resource priority, allowing me to continue in my lazy ways and not
> pay a performance price (and make the Web more responsive in general,
> since most of us are lazy), a simple domain check is not adequate when
> it comes to
> scripts. I wish I could think of an automated way to augment using the
> domain, but all my ideas require knowing what's in the script ahead of time
> (scripts that only define event handlers, for instance).
> Jerry Seeger
> On Feb 8, 2011, at 9:24 AM, Silvio Ventres wrote:
> Do you have any example of scripts or css that are externally sourced,
> and where the developer cares to reasonably optimize the web page?
> The main use case of such external scripts currently is ads and
> statistics gatherers for analysis. This, arguably, is not critical
> content that the user is interested in.
> If your argument is indeed "Web developer should have control", then,
> when you have no choice but to include external scripts (ads, for
> example), you would probably hate for those to hurt the latency of
> your website.
> If you are talking about the http://muddledramblings.com/ website, for
> example, you can clearly see that most scripts there are
> domain-internal.
> Do you deem your user experience more or less important than Google
> Analytics capability? If Google Analytics hangs, for example, for 4
> seconds, would you like the user to wait, or start reading while it
> loads?
> A change to HTML standard might be a good idea, though the problem
> here is that there are millions of pages on the 'net already, and the
> developers won't suddenly start changing them.
> This heuristic will allow the users to view 90% of the current Web
> more interactively.
> Keep in mind that at least 38% of all statistics are taken out of thin
> air :), but, really, please, show at least two pages which this
> heuristic will NOT work on.
> --
> silvio
> On Tue, Feb 8, 2011 at 6:52 PM, Jerry Seeger <vikingjs at mac.com> wrote:
> My argument is less "it's the Web developer's fault" than it is "the Web
> developer should have control." I am hardly a sophisticated Web developer
> but I have javascript from a different domain that must be loaded first and
> I have Google Analytics, which I should load after the rest of the page
> (though to be honest I'm not sure I do after my redesign... hm). While I
> would love it if there were standardized rules for which scripts would be
> loaded synchronously and which wouldn't, I would hate it if one browser
> required me to move my scripts to a different domain.
> Having said all that, I hate it when I have to wait for a resource
> outside of my control, so I'd love to see a solution to this. If there were
> a more reliable way than simple domain checking to prioritize content, that
> would be fantastic. I think ideally this is something for the standards
> board - perhaps an extension of the script and link tags to specify a
> priority, or something like that.
> Jerry
> On Feb 8, 2011, at 2:23 AM, Silvio Ventres wrote:
> This argument - "web developer is to blame for choosing a slow
> ad/tracking/etc server" - is incorrect.
> Web developers in general do not have any control over the ad provider
> or, frankly, any other type of external functionality provider.
> Google Analytics being a good case in point: you would not want most
> of the world's web pages to suddenly hang if something happens inside
> Google.
> The web browser should clearly prioritize developer-controllable
> resources over ones that are beyond web developer's control.
> Also, as an application run by the user and not by the developer, the
> browser should arguably prioritize actual content over pseudo-content
> whose purpose is functionality that is not visible to the actual user,
> such as ad/tracker scripts. This actual content is more likely to be
> important when sourced from the domain/subdomain of the webpage
> itself, based on current trends.
> Domain check is a reasonable approximation that fits both purposes.
> --
> silvio
> On Tue, Feb 8, 2011 at 5:13 AM, Jerry Seeger <vikingjs at mac.com> wrote:
> I'm reasonably sure that javascript in the header must be loaded
> synchronously, as it might affect the rest of the load. This is why tools
> like YSlow advise Web designers to delay javascript loads that are not
> needed for rendering until after the rest of the page loads.
> Blocking on loading the css is less clear-cut, as in some cases it could
> mean several seconds of ugly page. I don't know if it's right or wrong, but
> a lot of pages out there rely on the CSS being loaded before the page starts
> to render to avoid terrible layout and the appearance of items meant to be
> hidden for the seconds it takes the css to load.
> In general, while things could certainly be improved, it's up to the owner
> of the page to not rely on a slow ad server, or build the page so the ads
> load after the primary content.
> Jerry Seeger
> On Feb 7, 2011, at 5:47 PM, Silvio Ventres wrote:
> IE/Opera delay only for 4 seconds, the same as Mobile Safari.
> The reason looks to be the URL for the script/css.
> If the same URL appears twice, Chrome/Firefox serialize the requests,
> while IE/Opera/Mobile Safari launch both requests simultaneously.
> Of course, requesting simultaneously doesn't fix anything, as you can
> see by trying a link-stuffed version at
> http://solid.eqoppa.com/testlag2.html
> This one has 45 css and 38 javascript links. It hangs all browsers nicely.
> The main point here is that it might be acceptable if it's coming from
> the webpage domain itself.
> But the links are coming from a completely different place.
> This is exactly what makes browsing pages with any third-party
> analytics, tracking or ad addons so slow and frustrating.
> Fixing priorities in subresource download should make the experience
> considerably more interactive and fun.
> --
> silvio
