[webkit-dev] Making browsers faster: Resource Packages

Alexander Limi limi at mozilla.com
Tue Nov 17 17:36:34 PST 2009


(Adding some of the people who were involved with Resource Packages
earlier to this thread, so they can help me out — I'm just a lowly UI
designer, so some of these questions have to be answered by people who
know how browsers work. I'm just the messenger. Hope you don't mind,
guys, and remember that webkit-dev requires you to sign up before you
can post.)

On Tue, Nov 17, 2009 at 2:44 PM, Peter Kasting <pkasting at google.com> wrote:

> I have read the whole document, but I read it quickly, so please do point
> out places where I've overlooked an obvious response.
>

This is what everyone does, so no worries; happy to clarify. 95% of the
"this is why this won't work" statements are actually answered by the
article in some way. But I guess I shouldn't be surprised. :)


> Reduced parallelism is a big concern of mine.  Lots of sites make heavy use
> of resource sharding across many hostnames to take advantage of multiple
> connections, which this defeats.
>

If you package up everything in a single zip file, yes. Realistically,
if you have a lot of resources, you'd want to spread them out over
several packages to increase parallelism. Also, there are usually
resources that are page-specific (e.g. ones that belong to the article
being rendered). As with everything, there are ways to use this
incorrectly, and packaging everything up in one zip file will
definitely hurt parallelism. Don't do that.
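
To make that concrete, here is a minimal sketch (in Python, with
hypothetical file and package names) of the kind of build step I have
in mind: one package for the site-wide core and one for page-specific
resources, so the browser can fetch them over parallel connections:

    import zipfile

    # Hypothetical split: site-wide resources in one package,
    # article-specific ones in another, so they download in parallel.
    PACKAGES = {
        'core-resources.zip':    ['site.css', 'site.js', 'logo.png'],
        'article-resources.zip': ['article.css', 'gallery.js'],
    }

    for archive, files in PACKAGES.items():
        with zipfile.ZipFile(archive, 'w', zipfile.ZIP_DEFLATED) as zf:
            # Write the manifest as the very first entry so a browser
            # can act on it before the rest of the archive arrives.
            zf.writestr('manifest.txt', '\n'.join(files) + '\n')
            for name in files:
                zf.write(name)  # assumes these files exist on disk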

> I am concerned about the instruction to prefer the packaged resources to any
> separate resources.  This seems to increase the maintenance burden since you
> can never incrementally override the contents of a package, but always have
> to repackage.
>

This is something we could look at, of course. There are easy ways to
invalidate a stale zip using ETags and the other standard HTTP cache
validators.
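
As a rough sketch (in Python; the helper functions are made up, and in
practice your web server would handle this), you can derive a strong
ETag from the package bytes, so any repackaging changes the validator
and caches revalidate on their own:

    import hashlib

    def etag_for(package_bytes):
        # Strong validator derived from the package contents: any
        # repackaging changes the hash, so stale copies get refetched.
        return '"%s"' % hashlib.sha1(package_bytes).hexdigest()

    def respond(request_headers, package_bytes):
        tag = etag_for(package_bytes)
        if request_headers.get('If-None-Match') == tag:
            return 304, {'ETag': tag}, b''  # cached package still valid
        return 200, {'ETag': tag}, package_bytes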


> If an author has resources only used on some pages, then he can either make
> multiple packages (more maintenance burden and exacerbates problem above),
> or include everything in one package (may result in downloading excessive
> resources for pages where clients don't need them).
>

I don't think it's unreasonable to expect most big sites to have a standard
core of resources they use everywhere. It's important not to try to put
*everything* in resource packages, just the stuff that should be present
everywhere (and the specialized thumbnail search result case I mentioned).


> You note that SPDY has to be implemented by both UAs and web servers, but
> conversely this proposal needs to be implemented by UAs and _authors_.  I
> would rather burden the guys writing Apache than the guys making webpages,
> and I think if a technique is extremely useful, it's easier to get support
> into Apache than into, say, 50% of the webpages out there.
>

There's no harm if you *don't* do this as a web author. If you care
enough to do CSS spriting and CSS/JS combining, this gives you a more
maintainable, easier, faster solution.

On Tue, Nov 17, 2009 at 3:00 PM, James Robinson <jamesr at google.com> wrote:
>
> It seems like a browser will have to essentially stop rendering until
>  it has finished downloading the entire .zip and examined it.


No. That's why the manifest is there: it can be read early on, so the
browser doesn't have to block.
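
Since a zip file's local file headers come before each entry's data, a
manifest stored as the first entry can be parsed out of the first few
kilobytes of the download. A rough illustration in Python (the helper
is hypothetical, not part of any spec):

    import struct
    import zlib

    def read_first_entry(prefix):
        """Parse the first local file header from a streamed zip
        prefix; enough to recover a manifest stored as the first
        entry, before the rest of the archive has downloaded.
        Assumes the sizes are present in the header (general-purpose
        bit 3 unset)."""
        (sig, version, flags, method, mtime, mdate,
         crc, csize, usize, nlen, xlen) = struct.unpack(
            '<4s5H3I2H', prefix[:30])
        assert sig == b'PK\x03\x04', 'not a zip local file header'
        name = prefix[30:30 + nlen].decode('utf-8')
        start = 30 + nlen + xlen
        data = prefix[start:start + csize]
        if method == 8:                        # deflate
            data = zlib.decompress(data, -15)  # raw deflate stream
        return name, data  # e.g. ('manifest.txt', b'site.css\n...')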

I see a lot of "I don't think this will work" or "I don't think this
will be any faster" here. I guess I should get someone to help me
create some reasonable benchmarks and show what the difference would
be. Maybe Steve Souders or someone else who is better at this stuff
than I am can help out with some data.

On Tue, Nov 17, 2009 at 3:02 PM, Peter Kasting <pkasting at google.com> wrote:

> I think mitigating this is why there are optional manifests.  I agree that
> if there's no manifest, this is really, really painful.  I think manifests
> should be made mandatory.
>

The manifests *are* mandatory. Without a manifest, the package won't do
anything (i.e. the browser proceeds to load the resources as usual),
since anything else would block page loads, which is not an option.


On Tue, Nov 17, 2009 at 3:12 PM, Simon Fraser <simon.fraser at apple.com> wrote:

> If you require a manifest, why not pick an archive format where there's a
> TOC which is guaranteed to be at the head of the file, which the browser can
> parse without having to wait for the entire file to download?
>

If there are other formats that can a) be streamed and unpacked in a
partial state, and b) are common enough that people will actually be
able to use them, let me know.

The tar format is sequential, and (I think) puts each file's header
directly before its data, but doesn't do compression. If you add gzip
to that, you can't partially unpack, which would block page downloads.
You could of course argue that using plain tar (without gzip) could
work, and I think we're open to supporting that, if those assumptions
are correct — I haven't looked at the details for that yet.
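
For what it's worth, Python's standard library can already consume a
tar archive strictly sequentially, which is the property we would be
relying on. A quick sketch, with a hypothetical package name:

    import tarfile

    # Stream a tar package entry by entry; mode 'r|' reads strictly
    # sequentially, so each resource is usable as soon as its bytes
    # have arrived, with no seeking to the end of the archive.
    with open('site-resources.tar', 'rb') as stream:
        with tarfile.open(fileobj=stream, mode='r|') as tar:
            for member in tar:
                if member.isfile():
                    data = tar.extractfile(member).read()
                    print(member.name, len(data))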

-- 
Alexander Limi · Firefox User Experience · http://limi.net