[webkit-dev] Adding archive.org-based page loading time performance tests

Darin Fisher darin at chromium.org
Sun Apr 29 23:15:17 PDT 2012

On Sun, Apr 29, 2012 at 3:44 PM, Ryosuke Niwa <rniwa at webkit.org> wrote:

> On Fri, Apr 27, 2012 at 1:49 AM, Nat Duca <nduca at chromium.org> wrote:
>> I'm concerned about how well this would work for graphics performance
>> tests. Consider:
>> http://web.archive.org/web/20110111083848/http://techcrunch.com/
>> http://web.archive.org/web/20110222032916/http://www.nytimes.com/
>> http://web.archive.org/web/20110429194113/http://www.thewildernessdowntown.com/
>> What do we do in cases where archive.org has bad/incomplete archives?
> There's no fix for that. If archive.org doesn't work, then we need to pull
> data directly from the website. We can do that. The infrastructure I'm
> developing is agnostic as to whether we use archive.org or not. However,
> pulling data directly from websites will make the test suite behave
> differently depending on when you run it, so the test suite can't be
> opened up that way.
Does it matter if the page contents are bad/incomplete?  It seems like all
that matters is that they are consistent from pull to pull and somewhat
representative of pages we'd care to optimize.  Is the concern that those
URLs are missing too much content to be useful?

Note: The page cyclers used by Chromium all have data sets that are
bad/incomplete.  This was intentional.  For example, if a subresource was
not available for whatever reason, then the request to fetch it was
neutered (e.g., all "http" substrings were replaced with "httpdisabled").
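The neutering step described above could be sketched roughly as follows. This is a hypothetical illustration (the function name and parameters are mine, not from the Chromium page cyclers): every "http" substring in the recorded page data is rewritten to "httpdisabled", producing an unresolvable scheme so the browser never issues the fetch.

```python
def neuter_requests(page_data: str) -> str:
    """Sketch of neutering fetches in a recorded page data set.

    Replacing every "http" substring with "httpdisabled" turns URLs like
    "http://example.com/missing.js" into "httpdisabled://example.com/missing.js",
    an unknown scheme the browser cannot load, so the request is dropped
    instead of failing nondeterministically over the network.
    """
    return page_data.replace("http", "httpdisabled")


# Example: a subresource reference in an archived page is disabled in place.
html = '<script src="http://example.com/missing.js"></script>'
print(neuter_requests(html))
```

Note that a blanket substring replacement also rewrites "https" and any literal "http" text in the page, which is acceptable here since the goal is only pull-to-pull consistency, not fidelity.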

