<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[161171] trunk/Tools</title>
</head>
<body>
<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; }
#msg dl a { font-weight: bold}
#msg dl a:link { color:#fc3; }
#msg dl a:active { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/161171">161171</a></dd>
<dt>Author</dt> <dd>ap@apple.com</dd>
<dt>Date</dt> <dd>2013-12-30 22:32:59 -0800 (Mon, 30 Dec 2013)</dd>
</dl>
<h3>Log Message</h3>
<pre>full_results.json should distinguish unexpected failures from expected ones
https://bugs.webkit.org/show_bug.cgi?id=126300

Reviewed by Timothy Hatcher.

* Scripts/webkitpy/layout_tests/models/test_run_results.py:
(summarize_results): Add "report" element to JSON, which tells the consumer how
this result was counted for summary.

* Scripts/webkitpy/layout_tests/run_webkit_tests_integrationtest.py:
Updated results to include the new element.

* Scripts/webkitpy/layout_tests/views/buildbot_results.py:
(print_unexpected_results): Added a comment pointing to another place that
summarizes results, and should stay in sync.</pre>
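<p>The new <code>"report"</code> key tags each unexpected result in <code>full_results.json</code> with how it was counted for the summary: <code>REGRESSION</code>, <code>MISSING</code>, or <code>FLAKY</code>. A minimal sketch of how a consumer might tally these, assuming the nested <code>"tests"</code> tree shape asserted in the integration tests below (the <code>tally_reports</code> helper and the sample payload are illustrative, not part of this patch):</p>

```python
import json
from collections import Counter

def tally_reports(tests):
    """Recursively walk the nested 'tests' tree from full_results.json
    and count how each unexpected result was reported."""
    counts = Counter()
    for value in tests.values():
        if "actual" in value:
            # Leaf node: a single test's result dictionary. Only results
            # counted toward the summary carry a "report" key.
            if "report" in value:
                counts[value["report"]] += 1
        else:
            # Interior node: a directory of tests; recurse into it.
            counts.update(tally_reports(value))
    return counts

# Sample payload shaped like the tokens checked in the integration tests.
payload = json.loads("""
{
  "tests": {
    "failures": {
      "unexpected": {
        "text-image-checksum.html":
          {"report": "REGRESSION", "expected": "PASS", "actual": "IMAGE+TEXT"},
        "missing_text.html":
          {"report": "MISSING", "expected": "PASS", "actual": "MISSING"}
      }
    }
  }
}
""")

print(tally_reports(payload["tests"]))
```

<p>Results that matched their expectations carry no <code>"report"</code> key, so a consumer can distinguish expected failures from the regressions, missing results, and flaky tests that <code>summarize_results()</code> counts.</p>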
<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkToolsChangeLog">trunk/Tools/ChangeLog</a></li>
<li><a href="#trunkToolsScriptswebkitpylayout_testsmodelstest_run_resultspy">trunk/Tools/Scripts/webkitpy/layout_tests/models/test_run_results.py</a></li>
<li><a href="#trunkToolsScriptswebkitpylayout_testsrun_webkit_tests_integrationtestpy">trunk/Tools/Scripts/webkitpy/layout_tests/run_webkit_tests_integrationtest.py</a></li>
<li><a href="#trunkToolsScriptswebkitpylayout_testsviewsbuildbot_resultspy">trunk/Tools/Scripts/webkitpy/layout_tests/views/buildbot_results.py</a></li>
</ul>
</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkToolsChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/Tools/ChangeLog (161170 => 161171)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/ChangeLog        2013-12-31 05:41:48 UTC (rev 161170)
+++ trunk/Tools/ChangeLog        2013-12-31 06:32:59 UTC (rev 161171)
</span><span class="lines">@@ -1,3 +1,21 @@
</span><ins>+2013-12-30  Alexey Proskuryakov  &lt;ap@apple.com&gt;
+
+ full_results.json should distinguish unexpected failures from expected ones
+ https://bugs.webkit.org/show_bug.cgi?id=126300
+
+ Reviewed by Timothy Hatcher.
+
+ * Scripts/webkitpy/layout_tests/models/test_run_results.py:
+ (summarize_results): Add "report" element to JSON, which tells the consumer how
+ this result was counted for summary.
+
+ * Scripts/webkitpy/layout_tests/run_webkit_tests_integrationtest.py:
+ Updated results to include the new element.
+
+ * Scripts/webkitpy/layout_tests/views/buildbot_results.py:
+ (print_unexpected_results): Added a comment pointing to another place that
+ summarizes results, and should stay in sync.
+
</ins><span class="cx"> 2013-12-30 Ryuan Choi <ryuan.choi@samsung.com>
</span><span class="cx">
</span><span class="cx"> Replace remaning CoreIPC namespace to IPC
</span></span></pre></div>
<a id="trunkToolsScriptswebkitpylayout_testsmodelstest_run_resultspy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/layout_tests/models/test_run_results.py (161170 => 161171)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/layout_tests/models/test_run_results.py        2013-12-31 05:41:48 UTC (rev 161170)
+++ trunk/Tools/Scripts/webkitpy/layout_tests/models/test_run_results.py        2013-12-31 06:32:59 UTC (rev 161171)
</span><span class="lines">@@ -117,7 +117,7 @@
</span><span class="cx">
</span><span class="cx"> return test_dict
</span><span class="cx">
</span><del>-
</del><ins>+# These results must match ones in print_unexpected_results() in views/buildbot_results.py.
</ins><span class="cx"> def summarize_results(port_obj, expectations, initial_results, retry_results, enabled_pixel_tests_in_retry, include_passes=False, include_time_and_modifiers=False):
</span><span class="cx"> """Returns a dictionary containing a summary of the test runs, with the following fields:
</span><span class="cx"> 'version': a version indicator
</span><span class="lines">@@ -179,25 +179,32 @@
</span><span class="cx"> elif result_type == test_expectations.CRASH:
</span><span class="cx"> if test_name in initial_results.unexpected_results_by_name:
</span><span class="cx"> num_regressions += 1
</span><ins>+ test_dict['report'] = 'REGRESSION'
</ins><span class="cx"> elif result_type == test_expectations.MISSING:
</span><span class="cx"> if test_name in initial_results.unexpected_results_by_name:
</span><span class="cx"> num_missing += 1
</span><ins>+ test_dict['report'] = 'MISSING'
</ins><span class="cx"> elif test_name in initial_results.unexpected_results_by_name:
</span><span class="cx"> if retry_results and test_name not in retry_results.unexpected_results_by_name:
</span><span class="cx"> actual.extend(expectations.model().get_expectations_string(test_name).split(" "))
</span><span class="cx"> num_flaky += 1
</span><ins>+ test_dict['report'] = 'FLAKY'
</ins><span class="cx"> elif retry_results:
</span><span class="cx"> retry_result_type = retry_results.unexpected_results_by_name[test_name].type
</span><span class="cx"> if result_type != retry_result_type:
</span><span class="cx"> if enabled_pixel_tests_in_retry and result_type == test_expectations.TEXT and retry_result_type == test_expectations.IMAGE_PLUS_TEXT:
</span><span class="cx"> num_regressions += 1
</span><ins>+ test_dict['report'] = 'REGRESSION'
</ins><span class="cx"> else:
</span><span class="cx"> num_flaky += 1
</span><ins>+ test_dict['report'] = 'FLAKY'
</ins><span class="cx"> actual.append(keywords[retry_result_type])
</span><span class="cx"> else:
</span><span class="cx"> num_regressions += 1
</span><ins>+ test_dict['report'] = 'REGRESSION'
</ins><span class="cx"> else:
</span><span class="cx"> num_regressions += 1
</span><ins>+ test_dict['report'] = 'REGRESSION'
</ins><span class="cx">
</span><span class="cx"> test_dict['expected'] = expected
</span><span class="cx"> test_dict['actual'] = " ".join(actual)
</span></span></pre></div>
<a id="trunkToolsScriptswebkitpylayout_testsrun_webkit_tests_integrationtestpy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/layout_tests/run_webkit_tests_integrationtest.py (161170 => 161171)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/layout_tests/run_webkit_tests_integrationtest.py        2013-12-31 05:41:48 UTC (rev 161170)
+++ trunk/Tools/Scripts/webkitpy/layout_tests/run_webkit_tests_integrationtest.py        2013-12-31 06:32:59 UTC (rev 161171)
</span><span class="lines">@@ -496,7 +496,7 @@
</span><span class="cx"> tests_included=True, host=host)
</span><span class="cx"> file_list = host.filesystem.written_files.keys()
</span><span class="cx"> self.assertEqual(details.exit_code, 1)
</span><del>- expected_token = '"unexpected":{"text-image-checksum.html":{"expected":"PASS","actual":"IMAGE+TEXT","image_diff_percent":1},"missing_text.html":{"expected":"PASS","is_missing_text":true,"actual":"MISSING"}'
</del><ins>+ expected_token = '"unexpected":{"text-image-checksum.html":{"report":"REGRESSION","expected":"PASS","actual":"IMAGE+TEXT","image_diff_percent":1},"missing_text.html":{"report":"MISSING","expected":"PASS","is_missing_text":true,"actual":"MISSING"}'
</ins><span class="cx"> json_string = host.filesystem.read_text_file('/tmp/layout-test-results/full_results.json')
</span><span class="cx"> self.assertTrue(json_string.find(expected_token) != -1)
</span><span class="cx"> self.assertTrue(json_string.find('"num_regressions":1') != -1)
</span><span class="lines">@@ -513,7 +513,7 @@
</span><span class="cx"> details, err, _ = logging_run(extra_args=args, host=host, tests_included=True)
</span><span class="cx">
</span><span class="cx"> self.assertEqual(details.exit_code, 1)
</span><del>- expected_token = '"unexpected":{"pixeldir":{"image_in_pixeldir.html":{"expected":"PASS","actual":"IMAGE"'
</del><ins>+ expected_token = '"unexpected":{"pixeldir":{"image_in_pixeldir.html":{"report":"REGRESSION","expected":"PASS","actual":"IMAGE"'
</ins><span class="cx"> json_string = host.filesystem.read_text_file('/tmp/layout-test-results/full_results.json')
</span><span class="cx"> self.assertTrue(json_string.find(expected_token) != -1)
</span><span class="cx">
</span><span class="lines">@@ -537,7 +537,7 @@
</span><span class="cx"> def test_crash_with_stderr(self):
</span><span class="cx"> host = MockHost()
</span><span class="cx"> _, regular_output, _ = logging_run(['failures/unexpected/crash-with-stderr.html'], tests_included=True, host=host)
</span><del>- self.assertTrue(host.filesystem.read_text_file('/tmp/layout-test-results/full_results.json').find('{"crash-with-stderr.html":{"expected":"PASS","actual":"CRASH","has_stderr":true}}') != -1)
</del><ins>+ self.assertTrue(host.filesystem.read_text_file('/tmp/layout-test-results/full_results.json').find('{"crash-with-stderr.html":{"report":"REGRESSION","expected":"PASS","actual":"CRASH","has_stderr":true}}') != -1)
</ins><span class="cx">
</span><span class="cx"> def test_no_image_failure_with_image_diff(self):
</span><span class="cx"> host = MockHost()
</span><span class="lines">@@ -667,7 +667,7 @@
</span><span class="cx"> json_string = host.filesystem.read_text_file('/tmp/layout-test-results/full_results.json')
</span><span class="cx"> json = parse_full_results(json_string)
</span><span class="cx"> self.assertEqual(json["tests"]["failures"]["unexpected"]["text-image-checksum.html"],
</span><del>- {"expected": "PASS", "actual": "TEXT IMAGE+TEXT", "image_diff_percent": 1})
</del><ins>+ {"expected": "PASS", "actual": "TEXT IMAGE+TEXT", "image_diff_percent": 1, "report": "REGRESSION"})
</ins><span class="cx"> self.assertFalse(json["pixel_tests_enabled"])
</span><span class="cx"> self.assertEqual(details.enabled_pixel_tests_in_retry, True)
</span><span class="cx">
</span><span class="lines">@@ -749,7 +749,7 @@
</span><span class="cx"> host = MockHost()
</span><span class="cx"> _, err, _ = logging_run(['--no-show-results', 'reftests/foo/'], tests_included=True, host=host)
</span><span class="cx"> json_string = host.filesystem.read_text_file('/tmp/layout-test-results/full_results.json')
</span><del>- self.assertTrue(json_string.find('"unlistedtest.html":{"expected":"PASS","is_missing_text":true,"actual":"MISSING","is_missing_image":true}') != -1)
</del><ins>+ self.assertTrue(json_string.find('"unlistedtest.html":{"report":"MISSING","expected":"PASS","is_missing_text":true,"actual":"MISSING","is_missing_image":true}') != -1)
</ins><span class="cx"> self.assertTrue(json_string.find('"num_regressions":4') != -1)
</span><span class="cx"> self.assertTrue(json_string.find('"num_flaky":0') != -1)
</span><span class="cx"> self.assertTrue(json_string.find('"num_missing":1') != -1)
</span><span class="lines">@@ -850,11 +850,11 @@
</span><span class="cx"> self.assertTrue("multiple-mismatch-success.html" not in json["tests"]["reftests"]["foo"])
</span><span class="cx"> self.assertTrue("multiple-both-success.html" not in json["tests"]["reftests"]["foo"])
</span><span class="cx"> self.assertEqual(json["tests"]["reftests"]["foo"]["multiple-match-failure.html"],
</span><del>- {"expected": "PASS", "actual": "IMAGE", "reftest_type": ["=="], "image_diff_percent": 1})
</del><ins>+ {"expected": "PASS", "actual": "IMAGE", "reftest_type": ["=="], "image_diff_percent": 1, "report": "REGRESSION"})
</ins><span class="cx"> self.assertEqual(json["tests"]["reftests"]["foo"]["multiple-mismatch-failure.html"],
</span><del>- {"expected": "PASS", "actual": "IMAGE", "reftest_type": ["!="]})
</del><ins>+ {"expected": "PASS", "actual": "IMAGE", "reftest_type": ["!="], "report": "REGRESSION"})
</ins><span class="cx"> self.assertEqual(json["tests"]["reftests"]["foo"]["multiple-both-failure.html"],
</span><del>- {"expected": "PASS", "actual": "IMAGE", "reftest_type": ["==", "!="]})
</del><ins>+ {"expected": "PASS", "actual": "IMAGE", "reftest_type": ["==", "!="], "report": "REGRESSION"})
</ins><span class="cx">
</span><span class="cx">
</span><span class="cx"> class RebaselineTest(unittest.TestCase, StreamTestingMixin):
</span></span></pre></div>
<a id="trunkToolsScriptswebkitpylayout_testsviewsbuildbot_resultspy"></a>
<div class="modfile"><h4>Modified: trunk/Tools/Scripts/webkitpy/layout_tests/views/buildbot_results.py (161170 => 161171)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/Tools/Scripts/webkitpy/layout_tests/views/buildbot_results.py        2013-12-31 05:41:48 UTC (rev 161170)
+++ trunk/Tools/Scripts/webkitpy/layout_tests/views/buildbot_results.py        2013-12-31 06:32:59 UTC (rev 161171)
</span><span class="lines">@@ -88,6 +88,7 @@
</span><span class="cx"> pct = len(results) * 100.0 / not_passing
</span><span class="cx"> self._print(" %5d %-24s (%4.1f%%)" % (len(results), desc, pct))
</span><span class="cx">
</span><ins>+ # These results must match ones in summarize_results() in models/test_run_results.py.
</ins><span class="cx"> def print_unexpected_results(self, summarized_results, enabled_pixel_tests_in_retry=False):
</span><span class="cx"> passes = {}
</span><span class="cx"> flaky = {}
</span></span></pre>
</div>
</div>
</body>
</html>