<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[159803] trunk</title>
<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
</head>
<body>

<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/159803">159803</a></dd>
<dt>Author</dt> <dd>rniwa@webkit.org</dd>
<dt>Date</dt> <dd>2013-11-26 20:36:18 -0800 (Tue, 26 Nov 2013)</dd>
</dl>

<h3>Log Message</h3>
<pre>Remove replay performance tests as they are not actively maintained
https://bugs.webkit.org/show_bug.cgi?id=124764

Reviewed by Andreas Kling.

PerformanceTests:

Removed the replay performance tests. We can add them back when the time comes.

* Replay/Chinese/chinaz.com.replay: Removed.
* Replay/Chinese/www.163.com.replay: Removed.
* Replay/Chinese/www.alipay.com.replay: Removed.
* Replay/Chinese/www.baidu.com.replay: Removed.
* Replay/Chinese/www.csdn.net.replay: Removed.
* Replay/Chinese/www.douban.com.replay: Removed.
* Replay/Chinese/www.hao123.com.replay: Removed.
* Replay/Chinese/www.xinhuanet.com.replay: Removed.
* Replay/Chinese/www.xunlei.com.replay: Removed.
* Replay/Chinese/www.youku.com.replay: Removed.
* Replay/English/beatonna.livejournal.com.replay: Removed.
* Replay/English/cakewrecks.blogspot.com.replay: Removed.
* Replay/English/chemistry.about.com.replay: Removed.
* Replay/English/digg.com.replay: Removed.
* Replay/English/en.wikipedia.org-rorschach_test.replay: Removed.
* Replay/English/icanhascheezburger.com.replay: Removed.
* Replay/English/imgur.com-gallery.replay: Removed.
* Replay/English/online.wsj.com.replay: Removed.
* Replay/English/stockoverflow.com-best-comment.replay: Removed.
* Replay/English/www.alibaba.com.replay: Removed.
* Replay/English/www.amazon.com-kindle.replay: Removed.
* Replay/English/www.apple.com.replay: Removed.
* Replay/English/www.cnet.com.replay: Removed.
* Replay/English/www.dailymotion.com.replay: Removed.
* Replay/English/www.ehow.com-prevent-fire.replay: Removed.
* Replay/English/www.filestube.com-amy-adams.replay: Removed.
* Replay/English/www.foxnews.replay: Removed.
* Replay/English/www.huffingtonpost.com.replay: Removed.
* Replay/English/www.imdb.com-twilight.replay: Removed.
* Replay/English/www.mozilla.com-all-order.replay: Removed.
* Replay/English/www.php.net.replay: Removed.
* Replay/English/www.reddit.com.replay: Removed.
* Replay/English/www.telegraph.co.uk.replay: Removed.
* Replay/English/www.w3.org-htmlcss.replay: Removed.
* Replay/English/www.w3schools.com-html.replay: Removed.
* Replay/English/www.youtube.com-music.replay: Removed.
* Replay/French/www.orange.fr.replay: Removed.
* Replay/Italian/www.repubblica.it.replay: Removed.
* Replay/Japanese/2ch.net-newsplus.replay: Removed.
* Replay/Japanese/entameblog.seesaa.net.replay: Removed.
* Replay/Japanese/ja.wikipedia.org.replay: Removed.
* Replay/Japanese/www.hatena.ne.jp.replay: Removed.
* Replay/Japanese/www.livedoor.com.replay: Removed.
* Replay/Japanese/www.nicovideo.jp.replay: Removed.
* Replay/Japanese/www.rakuten.co.jp.replay: Removed.
* Replay/Japanese/www.yahoo.co.jp.replay: Removed.
* Replay/Korean/www.naver.com.replay: Removed.
* Replay/Persian/blogfa.com.replay: Removed.
* Replay/Polish/www.wp.pl.replay: Removed.
* Replay/Portuguese/www.uol.com.br.replay: Removed.
* Replay/Russian/lenta.ru.replay: Removed.
* Replay/Russian/vkontakte.ru-help.replay: Removed.
* Replay/Russian/www.ixbt.com.replay: Removed.
* Replay/Russian/www.kp.ru.replay: Removed.
* Replay/Russian/www.liveinternet.ru.replay: Removed.
* Replay/Russian/www.pravda.ru.replay: Removed.
* Replay/Russian/www.rambler.ru.replay: Removed.
* Replay/Russian/www.ucoz.ru.replay: Removed.
* Replay/Russian/www.yandex.ru.replay: Removed.
* Replay/Spanish/www.taringa.net.replay: Removed.
* Replay/Swedish/www.flashback.se.replay: Removed.
* Replay/Swedish/www.tradera.com.replay: Removed.
* Replay/www.google.com.replay: Removed.
* Replay/www.techcrunch.com.replay: Removed.
* Replay/www.youtube.com.replay: Removed.

Tools:

Removed the feature.

* Scripts/webkitpy/performance_tests/perftest.py:
(SingleProcessPerfTest.__init__):
(PerfTestFactory):
* Scripts/webkitpy/performance_tests/perftest_unittest.py:
* Scripts/webkitpy/performance_tests/perftestsrunner.py:
(PerfTestsRunner._parse_args):
(PerfTestsRunner._collect_tests):
* Scripts/webkitpy/performance_tests/perftestsrunner_unittest.py:
(MainTest.test_collect_tests_with_ignored_skipped_list):
(MainTest.test_default_args):
* Scripts/webkitpy/thirdparty/__init__.py:
(AutoinstallImportHook.find_module):
(AutoinstallImportHook._install_unittest2):
* Scripts/webkitpy/thirdparty/__init___unittest.py:
(ThirdpartyTest.test_imports):</pre>

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkPerformanceTestsChangeLog">trunk/PerformanceTests/ChangeLog</a></li>
<li><a href="#trunkToolsChangeLog">trunk/Tools/ChangeLog</a></li>
<li><a href="#trunkToolsScriptswebkitpyperformance_testsperftestpy">trunk/Tools/Scripts/webkitpy/performance_tests/perftest.py</a></li>
<li><a href="#trunkToolsScriptswebkitpyperformance_testsperftest_unittestpy">trunk/Tools/Scripts/webkitpy/performance_tests/perftest_unittest.py</a></li>
<li><a href="#trunkToolsScriptswebkitpyperformance_testsperftestsrunnerpy">trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner.py</a></li>
<li><a href="#trunkToolsScriptswebkitpyperformance_testsperftestsrunner_unittestpy">trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner_unittest.py</a></li>
<li><a href="#trunkToolsScriptswebkitpythirdparty__init__py">trunk/Tools/Scripts/webkitpy/thirdparty/__init__.py</a></li>
<li><a href="#trunkToolsScriptswebkitpythirdparty__init___unittestpy">trunk/Tools/Scripts/webkitpy/thirdparty/__init___unittest.py</a></li>
</ul>

<h3>Removed Paths</h3>
<ul>
<li>trunk/PerformanceTests/Replay/</li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkPerformanceTestsChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/PerformanceTests/ChangeLog (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/PerformanceTests/ChangeLog        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/PerformanceTests/ChangeLog        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -1,3 +1,78 @@
</span><ins>+2013-11-26  Ryosuke Niwa  &lt;rniwa@webkit.org&gt;
+
+        Remove replay performance tests as it's not actively maintained
+        https://bugs.webkit.org/show_bug.cgi?id=124764
+
+        Reviewed by Andreas Kling.
+
+        Removed the replay performance tests. We can add them back when time comes.
+
+        * Replay/Chinese/chinaz.com.replay: Removed.
+        * Replay/Chinese/www.163.com.replay: Removed.
+        * Replay/Chinese/www.alipay.com.replay: Removed.
+        * Replay/Chinese/www.baidu.com.replay: Removed.
+        * Replay/Chinese/www.csdn.net.replay: Removed.
+        * Replay/Chinese/www.douban.com.replay: Removed.
+        * Replay/Chinese/www.hao123.com.replay: Removed.
+        * Replay/Chinese/www.xinhuanet.com.replay: Removed.
+        * Replay/Chinese/www.xunlei.com.replay: Removed.
+        * Replay/Chinese/www.youku.com.replay: Removed.
+        * Replay/English/beatonna.livejournal.com.replay: Removed.
+        * Replay/English/cakewrecks.blogspot.com.replay: Removed.
+        * Replay/English/chemistry.about.com.replay: Removed.
+        * Replay/English/digg.com.replay: Removed.
+        * Replay/English/en.wikipedia.org-rorschach_test.replay: Removed.
+        * Replay/English/icanhascheezburger.com.replay: Removed.
+        * Replay/English/imgur.com-gallery.replay: Removed.
+        * Replay/English/online.wsj.com.replay: Removed.
+        * Replay/English/stockoverflow.com-best-comment.replay: Removed.
+        * Replay/English/www.alibaba.com.replay: Removed.
+        * Replay/English/www.amazon.com-kindle.replay: Removed.
+        * Replay/English/www.apple.com.replay: Removed.
+        * Replay/English/www.cnet.com.replay: Removed.
+        * Replay/English/www.dailymotion.com.replay: Removed.
+        * Replay/English/www.ehow.com-prevent-fire.replay: Removed.
+        * Replay/English/www.filestube.com-amy-adams.replay: Removed.
+        * Replay/English/www.foxnews.replay: Removed.
+        * Replay/English/www.huffingtonpost.com.replay: Removed.
+        * Replay/English/www.imdb.com-twilight.replay: Removed.
+        * Replay/English/www.mozilla.com-all-order.replay: Removed.
+        * Replay/English/www.php.net.replay: Removed.
+        * Replay/English/www.reddit.com.replay: Removed.
+        * Replay/English/www.telegraph.co.uk.replay: Removed.
+        * Replay/English/www.w3.org-htmlcss.replay: Removed.
+        * Replay/English/www.w3schools.com-html.replay: Removed.
+        * Replay/English/www.youtube.com-music.replay: Removed.
+        * Replay/French/www.orange.fr.replay: Removed.
+        * Replay/Italian/www.repubblica.it.replay: Removed.
+        * Replay/Japanese/2ch.net-newsplus.replay: Removed.
+        * Replay/Japanese/entameblog.seesaa.net.replay: Removed.
+        * Replay/Japanese/ja.wikipedia.org.replay: Removed.
+        * Replay/Japanese/www.hatena.ne.jp.replay: Removed.
+        * Replay/Japanese/www.livedoor.com.replay: Removed.
+        * Replay/Japanese/www.nicovideo.jp.replay: Removed.
+        * Replay/Japanese/www.rakuten.co.jp.replay: Removed.
+        * Replay/Japanese/www.yahoo.co.jp.replay: Removed.
+        * Replay/Korean/www.naver.com.replay: Removed.
+        * Replay/Persian/blogfa.com.replay: Removed.
+        * Replay/Polish/www.wp.pl.replay: Removed.
+        * Replay/Portuguese/www.uol.com.br.replay: Removed.
+        * Replay/Russian/lenta.ru.replay: Removed.
+        * Replay/Russian/vkontakte.ru-help.replay: Removed.
+        * Replay/Russian/www.ixbt.com.replay: Removed.
+        * Replay/Russian/www.kp.ru.replay: Removed.
+        * Replay/Russian/www.liveinternet.ru.replay: Removed.
+        * Replay/Russian/www.pravda.ru.replay: Removed.
+        * Replay/Russian/www.rambler.ru.replay: Removed.
+        * Replay/Russian/www.ucoz.ru.replay: Removed.
+        * Replay/Russian/www.yandex.ru.replay: Removed.
+        * Replay/Spanish/www.taringa.net.replay: Removed.
+        * Replay/Swedish/www.flashback.se.replay: Removed.
+        * Replay/Swedish/www.tradera.com.replay: Removed.
+        * Replay/www.google.com.replay: Removed.
+        * Replay/www.techcrunch.com.replay: Removed.
+        * Replay/www.youtube.com.replay: Removed.
+
</ins><span class="cx"> 2013-11-22  Ryosuke Niwa  &lt;rniwa@webkit.org&gt;
</span><span class="cx"> 
</span><span class="cx">         Layout Test editing/deleting/password-delete-performance.html is failing
</span></span></pre></div>
<a id="trunkToolsChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Tools/ChangeLog (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/ChangeLog        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/Tools/ChangeLog        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -1,3 +1,28 @@
</span><ins>+2013-11-26  Ryosuke Niwa  &lt;rniwa@webkit.org&gt;
+
+        Remove replay performance tests as it's not actively maintained
+        https://bugs.webkit.org/show_bug.cgi?id=124764
+
+        Reviewed by Andreas Kling.
+
+        Removed the feature.
+
+        * Scripts/webkitpy/performance_tests/perftest.py:
+        (SingleProcessPerfTest.__init__):
+        (PerfTestFactory):
+        * Scripts/webkitpy/performance_tests/perftest_unittest.py:
+        * Scripts/webkitpy/performance_tests/perftestsrunner.py:
+        (PerfTestsRunner._parse_args):
+        (PerfTestsRunner._collect_tests):
+        * Scripts/webkitpy/performance_tests/perftestsrunner_unittest.py:
+        (MainTest.test_collect_tests_with_ignored_skipped_list):
+        (MainTest.test_default_args):
+        * Scripts/webkitpy/thirdparty/__init__.py:
+        (AutoinstallImportHook.find_module):
+        (AutoinstallImportHook._install_unittest2):
+        * Scripts/webkitpy/thirdparty/__init___unittest.py:
+        (ThirdpartyTest.test_imports):
+
</ins><span class="cx"> 2013-11-26  Filip Pizlo  &lt;fpizlo@apple.com&gt;
</span><span class="cx"> 
</span><span class="cx">         Enable aggressive DFG validation in testing
</span></span></pre></div>
<a id="trunkToolsScriptswebkitpyperformance_testsperftestpy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/performance_tests/perftest.py (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/performance_tests/perftest.py        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/Tools/Scripts/webkitpy/performance_tests/perftest.py        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -40,11 +40,6 @@
</span><span class="cx"> import sys
</span><span class="cx"> import time
</span><span class="cx"> 
</span><del>-# Import for auto-install
-if sys.platform not in ('cygwin', 'win32'):
-    # FIXME: webpagereplay doesn't work on win32. See https://bugs.webkit.org/show_bug.cgi?id=88279.
-    import webkitpy.thirdparty.autoinstalled.webpagereplay.replay
-
</del><span class="cx"> from webkitpy.layout_tests.controllers.test_result_writer import TestResultWriter
</span><span class="cx"> from webkitpy.port.driver import DriverInput
</span><span class="cx"> from webkitpy.port.driver import DriverOutput
</span><span class="lines">@@ -256,161 +251,10 @@
</span><span class="cx">         super(SingleProcessPerfTest, self).__init__(port, test_name, test_path, test_runner_count)
</span><span class="cx"> 
</span><span class="cx"> 
</span><del>-class ReplayServer(object):
-    def __init__(self, archive, record):
-        self._process = None
-
-        # FIXME: Should error if local proxy isn't set to forward requests to localhost:8080 and localhost:8443
-
-        replay_path = webkitpy.thirdparty.autoinstalled.webpagereplay.replay.__file__
-        args = ['python', replay_path, '--no-dns_forwarding', '--port', '8080', '--ssl_port', '8443', '--use_closest_match', '--log_level', 'warning']
-        if record:
-            args.append('--record')
-        args.append(archive)
-
-        self._process = subprocess.Popen(args)
-
-    def wait_until_ready(self):
-        for i in range(0, 3):
-            try:
-                connection = socket.create_connection(('localhost', '8080'), timeout=1)
-                connection.close()
-                return True
-            except socket.error:
-                time.sleep(1)
-                continue
-        return False
-
-    def stop(self):
-        if self._process:
-            self._process.send_signal(signal.SIGINT)
-            self._process.wait()
-        self._process = None
-
-    def __del__(self):
-        self.stop()
-
-
-class ReplayPerfTest(PerfTest):
-    _FORCE_GC_FILE = 'resources/force-gc.html'
-
-    def __init__(self, port, test_name, test_path, test_runner_count=DEFAULT_TEST_RUNNER_COUNT):
-        super(ReplayPerfTest, self).__init__(port, test_name, test_path, test_runner_count)
-        self.force_gc_test = self._port.host.filesystem.join(self._port.perf_tests_dir(), self._FORCE_GC_FILE)
-
-    def _start_replay_server(self, archive, record):
-        try:
-            return ReplayServer(archive, record)
-        except OSError as error:
-            if error.errno == errno.ENOENT:
-                _log.error(&quot;Replay tests require web-page-replay.&quot;)
-            else:
-                raise error
-
-    def prepare(self, time_out_ms):
-        filesystem = self._port.host.filesystem
-        path_without_ext = filesystem.splitext(self.test_path())[0]
-
-        self._archive_path = filesystem.join(path_without_ext + '.wpr')
-        self._expected_image_path = filesystem.join(path_without_ext + '-expected.png')
-        self._url = filesystem.read_text_file(self.test_path()).split('\n')[0]
-
-        if filesystem.isfile(self._archive_path) and filesystem.isfile(self._expected_image_path):
-            _log.info(&quot;Replay ready for %s&quot; % self._archive_path)
-            return True
-
-        _log.info(&quot;Preparing replay for %s&quot; % self.test_name())
-
-        driver = self._port.create_driver(worker_number=0, no_timeout=True)
-        try:
-            output = self.run_single(driver, self._archive_path, time_out_ms, record=True)
-        finally:
-            driver.stop()
-
-        if not output or not filesystem.isfile(self._archive_path):
-            _log.error(&quot;Failed to prepare a replay for %s&quot; % self.test_name())
-            return False
-
-        _log.info(&quot;Prepared replay for %s&quot; % self.test_name())
-
-        return True
-
-    def _run_with_driver(self, driver, time_out_ms):
-        times = []
-        malloc = []
-        js_heap = []
-
-        for i in range(0, 6):
-            output = self.run_single(driver, self.test_path(), time_out_ms)
-            if not output or self.run_failed(output):
-                return False
-            if i == 0:
-                continue
-
-            times.append(output.test_time * 1000)
-
-            if not output.measurements:
-                continue
-
-            for metric, result in output.measurements.items():
-                assert metric == 'Malloc' or metric == 'JSHeap'
-                if metric == 'Malloc':
-                    malloc.append(result)
-                else:
-                    js_heap.append(result)
-
-        if times:
-            self._ensure_metrics('Time').append_group(times)
-        if malloc:
-            self._ensure_metrics('Malloc').append_group(malloc)
-        if js_heap:
-            self._ensure_metrics('JSHeap').append_group(js_heap)
-
-        return True
-
-    def run_single(self, driver, url, time_out_ms, record=False):
-        server = self._start_replay_server(self._archive_path, record)
-        if not server:
-            _log.error(&quot;Web page replay didn't start.&quot;)
-            return None
-
-        try:
-            _log.debug(&quot;Waiting for Web page replay to start.&quot;)
-            if not server.wait_until_ready():
-                _log.error(&quot;Web page replay didn't start.&quot;)
-                return None
-
-            _log.debug(&quot;Web page replay started. Loading the page.&quot;)
-            # Force GC to prevent pageload noise. See https://bugs.webkit.org/show_bug.cgi?id=98203
-            super(ReplayPerfTest, self).run_single(driver, self.force_gc_test, time_out_ms, False)
-            output = super(ReplayPerfTest, self).run_single(driver, self._url, time_out_ms, should_run_pixel_test=True)
-            if self.run_failed(output):
-                return None
-
-            if not output.image:
-                _log.error(&quot;Loading the page did not generate image results&quot;)
-                _log.error(output.text)
-                return None
-
-            filesystem = self._port.host.filesystem
-            dirname = filesystem.dirname(self._archive_path)
-            filename = filesystem.split(self._archive_path)[1]
-            writer = TestResultWriter(filesystem, self._port, dirname, filename)
-            if record:
-                writer.write_image_files(actual_image=None, expected_image=output.image)
-            else:
-                writer.write_image_files(actual_image=output.image, expected_image=None)
-
-            return output
-        finally:
-            server.stop()
-
-
</del><span class="cx"> class PerfTestFactory(object):
</span><span class="cx"> 
</span><span class="cx">     _pattern_map = [
</span><span class="cx">         (re.compile(r'^Dromaeo/'), SingleProcessPerfTest),
</span><del>-        (re.compile(r'(.+)\.replay$'), ReplayPerfTest),
</del><span class="cx">     ]
</span><span class="cx"> 
</span><span class="cx">     @classmethod
</span></span></pre></div>
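<p>The hunk above removes the ReplayPerfTest entry from PerfTestFactory._pattern_map, leaving only the Dromaeo pattern. The sketch below illustrates how such a pattern map drives test-class selection; the simplified create_perf_test signature and the sample paths are illustrative, not webkitpy's exact API.</p>
<pre>
import re

class PerfTest(object):
    def __init__(self, port, test_name, test_path):
        self.test_name = test_name
        self.test_path = test_path

class SingleProcessPerfTest(PerfTest):
    pass

class PerfTestFactory(object):
    # With the replay entry gone, only the Dromaeo pattern remains;
    # '*.replay' paths no longer map to a dedicated test class.
    _pattern_map = [
        (re.compile(r'^Dromaeo/'), SingleProcessPerfTest),
    ]

    @classmethod
    def create_perf_test(cls, port, test_name, path):
        for pattern, test_class in cls._pattern_map:
            if pattern.match(test_name):
                return test_class(port, test_name, path)
        return PerfTest(port, test_name, path)

# Dromaeo tests get the single-process runner; everything else falls back to PerfTest.
print(type(PerfTestFactory.create_perf_test(None, 'Dromaeo/dom-query.html', '/tests/Dromaeo/dom-query.html')).__name__)
print(type(PerfTestFactory.create_perf_test(None, 'Parser/tiny-innerHTML.html', '/tests/Parser/tiny-innerHTML.html')).__name__)
</pre>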
<a id="trunkToolsScriptswebkitpyperformance_testsperftest_unittestpy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/performance_tests/perftest_unittest.py (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/performance_tests/perftest_unittest.py        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/Tools/Scripts/webkitpy/performance_tests/perftest_unittest.py        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -39,7 +39,6 @@
</span><span class="cx"> from webkitpy.performance_tests.perftest import PerfTest
</span><span class="cx"> from webkitpy.performance_tests.perftest import PerfTestMetric
</span><span class="cx"> from webkitpy.performance_tests.perftest import PerfTestFactory
</span><del>-from webkitpy.performance_tests.perftest import ReplayPerfTest
</del><span class="cx"> from webkitpy.performance_tests.perftest import SingleProcessPerfTest
</span><span class="cx"> 
</span><span class="cx"> 
</span><span class="lines">@@ -197,241 +196,6 @@
</span><span class="cx">         self.assertEqual(called[0], 1)
</span><span class="cx"> 
</span><span class="cx"> 
</span><del>-class TestReplayPerfTest(unittest.TestCase):
-    class ReplayTestPort(MockPort):
-        def __init__(self, custom_run_test=None):
-
-            class ReplayTestDriver(TestDriver):
-                def run_test(self, text_input, stop_when_done):
-                    return custom_run_test(text_input, stop_when_done) if custom_run_test else None
-
-            self._custom_driver_class = ReplayTestDriver
-            super(self.__class__, self).__init__()
-
-        def _driver_class(self):
-            return self._custom_driver_class
-
-    class MockReplayServer(object):
-        def __init__(self, wait_until_ready=True):
-            self.wait_until_ready = lambda: wait_until_ready
-
-        def stop(self):
-            pass
-
-    def _add_file(self, port, dirname, filename, content=True):
-        port.host.filesystem.maybe_make_directory(dirname)
-        port.host.filesystem.write_binary_file(port.host.filesystem.join(dirname, filename), content)
-
-    def _setup_test(self, run_test=None):
-        test_port = self.ReplayTestPort(run_test)
-        self._add_file(test_port, '/path/some-dir', 'some-test.replay', 'http://some-test/')
-        test = ReplayPerfTest(test_port, 'some-test.replay', '/path/some-dir/some-test.replay')
-        test._start_replay_server = lambda archive, record: self.__class__.MockReplayServer()
-        return test, test_port
-
-    def test_run_single(self):
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-
-        loaded_pages = []
-
-        def run_test(test_input, stop_when_done):
-            if test_input.test_name == test.force_gc_test:
-                loaded_pages.append(test_input)
-                return
-            if test_input.test_name != &quot;about:blank&quot;:
-                self.assertEqual(test_input.test_name, 'http://some-test/')
-            loaded_pages.append(test_input)
-            self._add_file(port, '/path/some-dir', 'some-test.wpr', 'wpr content')
-            return DriverOutput('actual text', 'actual image', 'actual checksum',
-                audio=None, crash=False, timeout=False, error=False, test_time=12345)
-
-        test, port = self._setup_test(run_test)
-        test._archive_path = '/path/some-dir/some-test.wpr'
-        test._url = 'http://some-test/'
-
-        try:
-            driver = port.create_driver(worker_number=1, no_timeout=True)
-            output = test.run_single(driver, '/path/some-dir/some-test.replay', time_out_ms=100)
-            self.assertTrue(output)
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-
-        self.assertEqual(len(loaded_pages), 2)
-        self.assertEqual(loaded_pages[0].test_name, test.force_gc_test)
-        self.assertEqual(loaded_pages[1].test_name, 'http://some-test/')
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, '')
-        self.assertEqual(port.host.filesystem.read_binary_file('/path/some-dir/some-test-actual.png'), 'actual image')
-        self.assertEqual(output.test_time, 12345)
-
-    def test_run_single_fails_without_webpagereplay(self):
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-
-        test, port = self._setup_test()
-        test._start_replay_server = lambda archive, record: None
-        test._archive_path = '/path/some-dir.wpr'
-        test._url = 'http://some-test/'
-
-        try:
-            driver = port.create_driver(worker_number=1, no_timeout=True)
-            self.assertEqual(test.run_single(driver, '/path/some-dir/some-test.replay', time_out_ms=100), None)
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, &quot;Web page replay didn't start.\n&quot;)
-
-    def test_run_with_driver_accumulates_results(self):
-        port = MockPort()
-        test, port = self._setup_test()
-        counter = [0]
-
-        def mock_run_signle(drive, path, timeout):
-            counter[0] += 1
-            return DriverOutput('some output', image=None, image_hash=None, audio=None, test_time=counter[0], measurements={})
-
-        test.run_single = mock_run_signle
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-        try:
-            driver = port.create_driver(worker_number=1, no_timeout=True)
-            self.assertTrue(test._run_with_driver(driver, None))
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, '')
-
-        self.assertEqual(test._metrics.keys(), ['Time'])
-        self.assertEqual(test._metrics['Time'].flattened_iteration_values(), [float(i * 1000) for i in range(2, 7)])
-
-    def test_run_with_driver_accumulates_memory_results(self):
-        port = MockPort()
-        test, port = self._setup_test()
-        counter = [0]
-
-        def mock_run_signle(drive, path, timeout):
-            counter[0] += 1
-            return DriverOutput('some output', image=None, image_hash=None, audio=None, test_time=counter[0], measurements={'Malloc': 10, 'JSHeap': 5})
-
-        test.run_single = mock_run_signle
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-        try:
-            driver = port.create_driver(worker_number=1, no_timeout=True)
-            self.assertTrue(test._run_with_driver(driver, None))
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, '')
-
-        metrics = test._metrics
-        self.assertEqual(sorted(metrics.keys()), ['JSHeap', 'Malloc', 'Time'])
-        self.assertEqual(metrics['Time'].flattened_iteration_values(), [float(i * 1000) for i in range(2, 7)])
-        self.assertEqual(metrics['Malloc'].flattened_iteration_values(), [float(10)] * 5)
-        self.assertEqual(metrics['JSHeap'].flattened_iteration_values(), [float(5)] * 5)
-
-    def test_prepare_fails_when_wait_until_ready_fails(self):
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-
-        test, port = self._setup_test()
-        test._start_replay_server = lambda archive, record: self.__class__.MockReplayServer(wait_until_ready=False)
-        test._archive_path = '/path/some-dir.wpr'
-        test._url = 'http://some-test/'
-
-        try:
-            driver = port.create_driver(worker_number=1, no_timeout=True)
-            self.assertEqual(test.run_single(driver, '/path/some-dir/some-test.replay', time_out_ms=100), None)
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, &quot;Web page replay didn't start.\n&quot;)
-
-    def test_run_single_fails_when_output_has_error(self):
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-
-        loaded_pages = []
-
-        def run_test(test_input, stop_when_done):
-            loaded_pages.append(test_input)
-            self._add_file(port, '/path/some-dir', 'some-test.wpr', 'wpr content')
-            return DriverOutput('actual text', 'actual image', 'actual checksum',
-                audio=None, crash=False, timeout=False, error='some error')
-
-        test, port = self._setup_test(run_test)
-        test._archive_path = '/path/some-dir.wpr'
-        test._url = 'http://some-test/'
-
-        try:
-            driver = port.create_driver(worker_number=1, no_timeout=True)
-            self.assertEqual(test.run_single(driver, '/path/some-dir/some-test.replay', time_out_ms=100), None)
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-
-        self.assertEqual(len(loaded_pages), 2)
-        self.assertEqual(loaded_pages[0].test_name, test.force_gc_test)
-        self.assertEqual(loaded_pages[1].test_name, 'http://some-test/')
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, 'error: some-test.replay\nsome error\n')
-
-    def test_prepare(self):
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-
-        def run_test(test_input, stop_when_done):
-            self._add_file(port, '/path/some-dir', 'some-test.wpr', 'wpr content')
-            return DriverOutput('actual text', 'actual image', 'actual checksum',
-                audio=None, crash=False, timeout=False, error=False)
-
-        test, port = self._setup_test(run_test)
-
-        try:
-            self.assertTrue(test.prepare(time_out_ms=100))
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, 'Preparing replay for some-test.replay\nPrepared replay for some-test.replay\n')
-        self.assertEqual(port.host.filesystem.read_binary_file('/path/some-dir/some-test-expected.png'), 'actual image')
-
-    def test_prepare_calls_run_single(self):
-        output_capture = OutputCapture()
-        output_capture.capture_output()
-        called = [False]
-
-        def run_single(driver, url, time_out_ms, record):
-            self.assertTrue(record)
-            self.assertEqual(url, '/path/some-dir/some-test.wpr')
-            called[0] = True
-            return False
-
-        test, port = self._setup_test()
-        test.run_single = run_single
-
-        try:
-            self.assertFalse(test.prepare(time_out_ms=100))
-        finally:
-            actual_stdout, actual_stderr, actual_logs = output_capture.restore_output()
-        self.assertTrue(called[0])
-        self.assertEqual(test._archive_path, '/path/some-dir/some-test.wpr')
-        self.assertEqual(test._url, 'http://some-test/')
-        self.assertEqual(actual_stdout, '')
-        self.assertEqual(actual_stderr, '')
-        self.assertEqual(actual_logs, &quot;Preparing replay for some-test.replay\nFailed to prepare a replay for some-test.replay\n&quot;)
-
-
</del><span class="cx"> class TestPerfTestFactory(unittest.TestCase):
</span><span class="cx">     def test_regular_test(self):
</span><span class="cx">         test = PerfTestFactory.create_perf_test(MockPort(), 'some-dir/some-test', '/path/some-dir/some-test')
</span></span></pre></div>
<a id="trunkToolsScriptswebkitpyperformance_testsperftestsrunnerpy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner.py (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner.py        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner.py        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -112,8 +112,6 @@
</span><span class="cx">                 help=&quot;Upload the generated JSON file to the specified server when --output-json-path is present.&quot;),
</span><span class="cx">             optparse.make_option(&quot;--webkit-test-runner&quot;, &quot;-2&quot;, action=&quot;store_true&quot;,
</span><span class="cx">                 help=&quot;Use WebKitTestRunner rather than DumpRenderTree.&quot;),
</span><del>-            optparse.make_option(&quot;--replay&quot;, dest=&quot;replay&quot;, action=&quot;store_true&quot;, default=False,
-                help=&quot;Run replay tests.&quot;),
</del><span class="cx">             optparse.make_option(&quot;--force&quot;, dest=&quot;use_skipped_list&quot;, action=&quot;store_false&quot;, default=True,
</span><span class="cx">                 help=&quot;Run all tests, including the ones in the Skipped list.&quot;),
</span><span class="cx">             optparse.make_option(&quot;--profile&quot;, action=&quot;store_true&quot;,
</span><span class="lines">@@ -134,8 +132,6 @@
</span><span class="cx"> 
</span><span class="cx">     def _collect_tests(self):
</span><span class="cx">         test_extensions = ['.html', '.svg']
</span><del>-        if self._options.replay:
-            test_extensions.append('.replay')
</del><span class="cx"> 
</span><span class="cx">         def _is_test_file(filesystem, dirname, filename):
</span><span class="cx">             return filesystem.splitext(filename)[1] in test_extensions
</span></span></pre></div>
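<p>With the --replay option removed, _collect_tests recognizes only the .html and .svg extensions. The sketch below shows that extension filter in isolation; it uses os.path instead of webkitpy's filesystem abstraction, and the candidate paths are made up.</p>
<pre>
import os

# '.replay' is no longer appended to this list, so replay archives are skipped.
TEST_EXTENSIONS = ['.html', '.svg']

def is_test_file(filename):
    return os.path.splitext(filename)[1] in TEST_EXTENSIONS

candidates = [
    'Parser/html5-full-render.html',
    'SVG/SvgCubics.svg',
    'Replay/www.webkit.org.replay',
]
print([name for name in candidates if is_test_file(name)])
# ['Parser/html5-full-render.html', 'SVG/SvgCubics.svg']
</pre>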
<a id="trunkToolsScriptswebkitpyperformance_testsperftestsrunner_unittestpy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner_unittest.py (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner_unittest.py        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/Tools/Scripts/webkitpy/performance_tests/perftestsrunner_unittest.py        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -111,18 +111,6 @@
</span><span class="cx">         port.skipped_perf_tests = lambda: ['inspector/unsupported_test1.html', 'unsupported']
</span><span class="cx">         self.assertItemsEqual(self._collect_tests_and_sort_test_name(runner), ['inspector/test1.html', 'inspector/test2.html', 'inspector/unsupported_test1.html', 'unsupported/unsupported_test2.html'])
</span><span class="cx"> 
</span><del>-    def test_collect_tests_should_ignore_replay_tests_by_default(self):
-        runner, port = self.create_runner()
-        self._add_file(runner, 'Replay', 'www.webkit.org.replay')
-        self.assertItemsEqual(runner._collect_tests(), [])
-
-    def test_collect_tests_with_replay_tests(self):
-        runner, port = self.create_runner(args=['--replay'])
-        self._add_file(runner, 'Replay', 'www.webkit.org.replay')
-        tests = runner._collect_tests()
-        self.assertEqual(len(tests), 1)
-        self.assertEqual(tests[0].__class__.__name__, 'ReplayPerfTest')
-
</del><span class="cx">     def test_default_args(self):
</span><span class="cx">         runner, port = self.create_runner()
</span><span class="cx">         options, args = PerfTestsRunner._parse_args([])
</span><span class="lines">@@ -130,7 +118,6 @@
</span><span class="cx">         self.assertEqual(options.time_out_ms, 600 * 1000)
</span><span class="cx">         self.assertTrue(options.generate_results)
</span><span class="cx">         self.assertTrue(options.show_results)
</span><del>-        self.assertFalse(options.replay)
</del><span class="cx">         self.assertTrue(options.use_skipped_list)
</span><span class="cx">         self.assertEqual(options.repeat, 1)
</span><span class="cx">         self.assertEqual(options.test_runner_count, DEFAULT_TEST_RUNNER_COUNT)
</span></span></pre></div>
<a id="trunkToolsScriptswebkitpythirdparty__init__py"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/thirdparty/__init__.py (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/thirdparty/__init__.py        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/Tools/Scripts/webkitpy/thirdparty/__init__.py        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -91,8 +91,6 @@
</span><span class="cx">             self._install_irc()
</span><span class="cx">         elif '.buildbot' in fullname:
</span><span class="cx">             self._install_buildbot()
</span><del>-        elif '.webpagereplay' in fullname:
-            self._install_webpagereplay()
</del><span class="cx"> 
</span><span class="cx">     def _install_mechanize(self):
</span><span class="cx">         return self._install(&quot;http://pypi.python.org/packages/source/m/mechanize/mechanize-0.2.5.tar.gz&quot;,
</span><span class="lines">@@ -161,17 +159,6 @@
</span><span class="cx">         self._ensure_autoinstalled_dir_is_in_sys_path()
</span><span class="cx">         return self._install(url=&quot;http://pypi.python.org/packages/source/u/unittest2/unittest2-0.5.1.tar.gz#md5=a0af5cac92bbbfa0c3b0e99571390e0f&quot;, url_subpath=&quot;unittest2-0.5.1/unittest2&quot;)
</span><span class="cx"> 
</span><del>-    def _install_webpagereplay(self):
-        did_install_something = False
-        if not self._fs.exists(self._fs.join(_AUTOINSTALLED_DIR, &quot;webpagereplay&quot;)):
-            did_install_something = self._install(&quot;http://web-page-replay.googlecode.com/files/webpagereplay-1.1.2.tar.gz&quot;, &quot;webpagereplay-1.1.2&quot;)
-            self._fs.move(self._fs.join(_AUTOINSTALLED_DIR, &quot;webpagereplay-1.1.2&quot;), self._fs.join(_AUTOINSTALLED_DIR, &quot;webpagereplay&quot;))
-
-        module_init_path = self._fs.join(_AUTOINSTALLED_DIR, &quot;webpagereplay&quot;, &quot;__init__.py&quot;)
-        if not self._fs.exists(module_init_path):
-            self._fs.write_text_file(module_init_path, &quot;&quot;)
-        return did_install_something
-
</del><span class="cx">     def _install(self, url, url_subpath=None, target_name=None):
</span><span class="cx">         installer = AutoInstaller(target_dir=_AUTOINSTALLED_DIR)
</span><span class="cx">         return installer.install(url=url, url_subpath=url_subpath, target_name=target_name)
</span></span></pre></div>
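<p>The hunk above deletes the webpagereplay branch from AutoinstallImportHook.find_module, so importing webkitpy.thirdparty.autoinstalled.webpagereplay no longer triggers a download. The sketch below shows the general shape of this kind of lazy-install import hook under the legacy PEP 302 find_module protocol that webkitpy used at the time; the _install stub and package names are placeholders, not the real AutoInstaller.</p>
<pre>
import sys

class AutoinstallImportHook(object):
    """Fetches a third-party package the first time something tries to import it."""

    def find_module(self, fullname, path=None):
        # Legacy PEP 302 meta-path protocol (Python 2 era): returning None tells
        # the import machinery to keep looking, so the hook only uses the lookup
        # as a trigger to install the package beforehand.
        if '.mechanize' in fullname:
            self._install('mechanize')
        elif '.buildbot' in fullname:
            self._install('buildbot')
        return None

    def _install(self, package):
        # Placeholder for webkitpy's AutoInstaller download-and-unpack step.
        print('would auto-install %s' % package)

sys.meta_path.append(AutoinstallImportHook())
</pre>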
<a id="trunkToolsScriptswebkitpythirdparty__init___unittestpy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/thirdparty/__init___unittest.py (159802 => 159803)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/thirdparty/__init___unittest.py        2013-11-27 04:32:51 UTC (rev 159802)
+++ trunk/Tools/Scripts/webkitpy/thirdparty/__init___unittest.py        2013-11-27 04:36:18 UTC (rev 159803)
</span><span class="lines">@@ -65,5 +65,4 @@
</span><span class="cx">         import webkitpy.thirdparty.autoinstalled.irc.irclib
</span><span class="cx">         import webkitpy.thirdparty.autoinstalled.mechanize
</span><span class="cx">         import webkitpy.thirdparty.autoinstalled.pylint
</span><del>-        import webkitpy.thirdparty.autoinstalled.webpagereplay
</del><span class="cx">         import webkitpy.thirdparty.autoinstalled.pep8
</span></span></pre>
</div>
</div>

</body>
</html>