[webkit-dev] Making browsers faster: Resource Packages
pkasting at google.com
Tue Nov 17 18:01:14 PST 2009
On Tue, Nov 17, 2009 at 5:36 PM, Alexander Limi <limi at mozilla.com> wrote:
> On Tue, Nov 17, 2009 at 2:44 PM, Peter Kasting <pkasting at google.com> wrote:
>> Reduced parallelism is a big concern of mine. Lots of sites make heavy
>> use of resource sharding across many hostnames to take advantage of multiple
>> connections, which this defeats.
> If you package up everything in a single zip file, yes. Realistically, if
> you have a lot of resources, you'd want to spread them out over several
> files to increase parallelism. Also, there are usually resources that are
> page-specific (e.g. belong to the article being rendered). As with
> everything, there are possibilities to use this the wrong way, and packaging
> up everything in one zip file will definitely affect parallelism. Don't do
> that.
But at this point it's not clear what the site author should do. *Any*
packaging reduces parallelism somewhat. How much should you reduce it?
Best practices vary dramatically depending on the details of the user's
connection. I realize some of these issues already exist when sites try to
determine how to shard their resource servers, but if you want to split your
resources among several packages, you can already put all the images in one
file, all the scripts in another, etc., today, and this proposal doesn't buy
much over that. (Plus it has costs; see below.)
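To make the tradeoff above concrete, here is a small sketch (not part of the proposal; the resource list and package count are hypothetical) of the kind of decision a site author would face: spreading resources across several packages so that no single archive serializes the whole download.

```python
def partition_by_size(resources, n_packages):
    """Greedily spread (name, size) pairs across n packages so the
    archives are roughly balanced -- a sketch of the kind of manual
    tuning the proposal would push onto site authors."""
    buckets = [{"files": [], "size": 0} for _ in range(n_packages)]
    # Place largest resources first, each into the currently smallest bucket.
    for name, size in sorted(resources, key=lambda r: -r[1]):
        smallest = min(buckets, key=lambda b: b["size"])
        smallest["files"].append(name)
        smallest["size"] += size
    return [b["files"] for b in buckets]

# Hypothetical resource list: (filename, byte size).
resources = [("logo.png", 4000), ("app.js", 90000), ("style.css", 20000),
             ("icons.png", 15000), ("vendor.js", 120000)]
packages = partition_by_size(resources, n_packages=2)
```

Note that the "right" value of `n_packages` depends on connection details the author cannot know in advance, which is exactly the problem.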
>> You note that SPDY has to be implemented by both UAs and web servers, but
>> conversely this proposal needs to be implemented by UAs and _authors_. I
>> would rather burden the guys writing Apache than the guys making webpages,
>> and I think if a technique is extremely useful, it's easier to get support
>> into Apache than into, say, 50% of the webpages out there.
> There's no damage if you *don't* do this as a web author. If you care
> enough to do CSS spriting and CSS/JS combining, this gives you a more
> maintainable, easier, faster solution.
Neither proposal does harm when people don't implement it. What I am saying
is that there is much more of a burden in getting this to happen on a
per-site-author basis than on a per-web-server-codebase basis. And with a
technique that needs expertise not to backfire, I'm definitely interested in
not forcing each site author to make individual decisions about how to use
it.
>> I think mitigating this is why there are optional manifests. I agree that
>> if there's no manifest, this is really, really painful. I think manifests
>> should be made mandatory.
> The manifests *are* mandatory. Without a manifest, it won't do anything
> (ie. proceed to load the resources as usual), since that would block page
> loads, which is not an option.
Your doc explicitly says manifests are optional: "To give the browser the
ability to know up front what files are in the zip file without reading the
entire file first, we support an *optional* manifest file that can contain
this information." (emphasis mine)
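For illustration, a manifest-first package could be built as follows. This is only a sketch: the manifest filename, its one-path-per-line format, and the requirement that it be the first zip entry are assumptions for the example, not quotations from the spec.

```python
import io
import zipfile

def build_resource_package(files):
    """Build an in-memory zip whose first entry is a manifest listing
    the contents, so a streaming consumer can learn what is inside
    before reading the whole archive. (Layout is an assumption.)"""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        # Manifest first: one resource path per line.
        manifest = "\n".join(name for name, _ in files) + "\n"
        zf.writestr("manifest.txt", manifest)
        for name, data in files:
            zf.writestr(name, data)
    return buf.getvalue()

pkg = build_resource_package([("css/site.css", b"body{margin:0}"),
                              ("js/app.js", b"console.log('hi')")])
```

Even with the manifest placed first, the browser still has to fetch at least the front of the archive before it knows how to treat other references on the page, which is the overhead discussed below.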
As I noted, even with a manifest, you're introducing extra overhead before
the browser knows how to handle other referenced resources, although it is
only the overhead of contacting the web server and obtaining the manifest,
rather than the overhead of obtaining the entire bundle. James Robinson
does a good job in his latest message of covering some of the issues here.
In the end, the initial proposal comes across a bit like "just bundle
everything up in one archive!", but as you note, doing that will _harm_ page
load speeds in many cases. The actual usage of this feature needs to be
carefully considered by site authors, and ends up providing capabilities
very similar to spriting and combining script files, except with the
additional problem that the browser has to obtain manifests before it knows
how to process any resources referenced in the document. This just doesn't
feel like a very good tradeoff to me.
I agree with everyone who would like to see numbers. Of course, good
measurements here are extremely hard (as we've found while working on SPDY),
so I suspect providing meaningful, reliable ones that cover all the relevant
cases might take quite a bit of doing. I will be interested to see what
Steve Souders has come up with, and especially what sort of conditions lead
to the numbers he has.