<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="content-type" content="text/html; charset=utf-8" />
<title>[211333] trunk/LayoutTests/imported/w3c</title>
</head>
<body>

<style type="text/css"><!--
#msg dl.meta { border: 1px #006 solid; background: #369; padding: 6px; color: #fff; }
#msg dl.meta dt { float: left; width: 6em; font-weight: bold; }
#msg dt:after { content:':';}
#msg dl, #msg dt, #msg ul, #msg li, #header, #footer, #logmsg { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt;  }
#msg dl a { font-weight: bold}
#msg dl a:link    { color:#fc3; }
#msg dl a:active  { color:#ff0; }
#msg dl a:visited { color:#cc6; }
h3 { font-family: verdana,arial,helvetica,sans-serif; font-size: 10pt; font-weight: bold; }
#msg pre { overflow: auto; background: #ffc; border: 1px #fa0 solid; padding: 6px; }
#logmsg { background: #ffc; border: 1px #fa0 solid; padding: 1em 1em 0 1em; }
#logmsg p, #logmsg pre, #logmsg blockquote { margin: 0 0 1em 0; }
#logmsg p, #logmsg li, #logmsg dt, #logmsg dd { line-height: 14pt; }
#logmsg h1, #logmsg h2, #logmsg h3, #logmsg h4, #logmsg h5, #logmsg h6 { margin: .5em 0; }
#logmsg h1:first-child, #logmsg h2:first-child, #logmsg h3:first-child, #logmsg h4:first-child, #logmsg h5:first-child, #logmsg h6:first-child { margin-top: 0; }
#logmsg ul, #logmsg ol { padding: 0; list-style-position: inside; margin: 0 0 0 1em; }
#logmsg ul { text-indent: -1em; padding-left: 1em; }
#logmsg ol { text-indent: -1.5em; padding-left: 1.5em; }
#logmsg > ul, #logmsg > ol { margin: 0 0 1em 0; }
#logmsg pre { background: #eee; padding: 1em; }
#logmsg blockquote { border: 1px solid #fa0; border-left-width: 10px; padding: 1em 1em 0 1em; background: white;}
#logmsg dl { margin: 0; }
#logmsg dt { font-weight: bold; }
#logmsg dd { margin: 0; padding: 0 0 0.5em 0; }
#logmsg dd:before { content:'\00bb';}
#logmsg table { border-spacing: 0px; border-collapse: collapse; border-top: 4px solid #fa0; border-bottom: 1px solid #fa0; background: #fff; }
#logmsg table th { text-align: left; font-weight: normal; padding: 0.2em 0.5em; border-top: 1px dotted #fa0; }
#logmsg table td { text-align: right; border-top: 1px dotted #fa0; padding: 0.2em 0.5em; }
#logmsg table thead th { text-align: center; border-bottom: 1px solid #fa0; }
#logmsg table th.Corner { text-align: left; }
#logmsg hr { border: none 0; border-top: 2px dashed #fa0; height: 1px; }
#header, #footer { color: #fff; background: #636; border: 1px #300 solid; padding: 6px; }
#patch { width: 100%; }
#patch h4 {font-family: verdana,arial,helvetica,sans-serif;font-size:10pt;padding:8px;background:#369;color:#fff;margin:0;}
#patch .propset h4, #patch .binary h4 {margin:0;}
#patch pre {padding:0;line-height:1.2em;margin:0;}
#patch .diff {width:100%;background:#eee;padding: 0 0 10px 0;overflow:auto;}
#patch .propset .diff, #patch .binary .diff  {padding:10px 0;}
#patch span {display:block;padding:0 10px;}
#patch .modfile, #patch .addfile, #patch .delfile, #patch .propset, #patch .binary, #patch .copfile {border:1px solid #ccc;margin:10px 0;}
#patch ins {background:#dfd;text-decoration:none;display:block;padding:0 10px;}
#patch del {background:#fdd;text-decoration:none;display:block;padding:0 10px;}
#patch .lines, .info {color:#888;background:#fff;}
--></style>
<div id="msg">
<dl class="meta">
<dt>Revision</dt> <dd><a href="http://trac.webkit.org/projects/webkit/changeset/211333">211333</a></dd>
<dt>Author</dt> <dd>joepeck@webkit.org</dd>
<dt>Date</dt> <dd>2017-01-28 01:26:27 -0800 (Sat, 28 Jan 2017)</dd>
</dl>

<h3>Log Message</h3>
<pre>Import web-platform-tests/user-timing
https://bugs.webkit.org/show_bug.cgi?id=167542
&lt;rdar://problem/22746307&gt;

Rubber-stamped by Ryosuke Niwa.

The only failures are ones where User Timing Level 2 differs
from User Timing Level 1.

* resources/ImportExpectations:
* web-platform-tests/user-timing/OWNERS: Added.
* web-platform-tests/user-timing/idlharness-expected.txt: Added.
* web-platform-tests/user-timing/idlharness.html: Added.
* web-platform-tests/user-timing/resources/w3c-import.log: Added.
* web-platform-tests/user-timing/resources/webperftestharness.js: Added.
* web-platform-tests/user-timing/resources/webperftestharnessextension.js: Added.
* web-platform-tests/user-timing/test_user_timing_clear_marks-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_clear_marks.html: Added.
* web-platform-tests/user-timing/test_user_timing_clear_measures-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_clear_measures.html: Added.
* web-platform-tests/user-timing/test_user_timing_entry_type-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_entry_type.html: Added.
* web-platform-tests/user-timing/test_user_timing_exists-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_exists.html: Added.
* web-platform-tests/user-timing/test_user_timing_mark-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_mark.html: Added.
* web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.html: Added.
* web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js: Added.
* web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter.html: Added.
* web-platform-tests/user-timing/test_user_timing_mark_exceptions-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_mark_exceptions.html: Added.
* web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute.html: Added.
* web-platform-tests/user-timing/test_user_timing_measure-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_measure.html: Added.
* web-platform-tests/user-timing/test_user_timing_measure_exceptions-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_measure_exceptions.html: Added.
* web-platform-tests/user-timing/test_user_timing_measure_navigation_timing-expected.txt: Added.
* web-platform-tests/user-timing/test_user_timing_measure_navigation_timing.html: Added.
* web-platform-tests/user-timing/w3c-import.log: Added.</pre>
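<p>For context, the imported tests exercise the User Timing methods the specification defines on window.performance. Below is a minimal, illustrative sketch of that API surface; the mark and measure names are arbitrary and not taken from the tests.</p>
<pre>
// Record two marks, create a measure spanning them, then query and clear the timeline.
performance.mark("start-work");
// ... work being timed ...
performance.mark("end-work");
performance.measure("work-duration", "start-work", "end-work");

var marks = performance.getEntriesByType("mark");       // PerformanceMark entries
var measures = performance.getEntriesByType("measure"); // PerformanceMeasure entries

performance.clearMarks("start-work"); // remove one named mark
performance.clearMeasures();          // remove all measures
</pre>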

<h3>Modified Paths</h3>
<ul>
<li><a href="#trunkLayoutTestsimportedw3cChangeLog">trunk/LayoutTests/imported/w3c/ChangeLog</a></li>
<li><a href="#trunkLayoutTestsimportedw3cresourcesImportExpectations">trunk/LayoutTests/imported/w3c/resources/ImportExpectations</a></li>
</ul>

<h3>Added Paths</h3>
<ul>
<li>trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/</li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingOWNERS">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/OWNERS</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingidlharnessexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingidlharnesshtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness.html</a></li>
<li>trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/</li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingresourcesw3cimportlog">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/w3c-import.log</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingresourceswebperftestharnessjs">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharness.js</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingresourceswebperftestharnessextensionjs">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharnessextension.js</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_marksexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_markshtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_measuresexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_measureshtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_entry_typeexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_entry_typehtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_existsexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_existshtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_markexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_markhtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributesexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributeshtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributesjs">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_without_parameterexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_without_parameterhtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_exceptionsexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_exceptionshtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_with_name_of_navigation_timing_optional_attributeexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_with_name_of_navigation_timing_optional_attributehtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measureexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measurehtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_exceptionsexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_exceptionshtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_navigation_timingexpectedtxt">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing-expected.txt</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_navigation_timinghtml">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing.html</a></li>
<li><a href="#trunkLayoutTestsimportedw3cwebplatformtestsusertimingw3cimportlog">trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/w3c-import.log</a></li>
</ul>

</div>
<div id="patch">
<h3>Diff</h3>
<a id="trunkLayoutTestsimportedw3cChangeLog"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/imported/w3c/ChangeLog (211332 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/ChangeLog        2017-01-28 09:26:14 UTC (rev 211332)
+++ trunk/LayoutTests/imported/w3c/ChangeLog        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -1,3 +1,48 @@
</span><ins>+2017-01-27  Joseph Pecoraro  &lt;pecoraro@apple.com&gt;
+
+        Import web-platform-tests/user-timing
+        https://bugs.webkit.org/show_bug.cgi?id=167542
+        &lt;rdar://problem/22746307&gt;
+
+        Rubber-stamped by Ryosuke Niwa.
+
+        The only failures are ones where User Timing Level 2 differs
+        from User Timing Level 1.
+
+        * resources/ImportExpectations:
+        * web-platform-tests/user-timing/OWNERS: Added.
+        * web-platform-tests/user-timing/idlharness-expected.txt: Added.
+        * web-platform-tests/user-timing/idlharness.html: Added.
+        * web-platform-tests/user-timing/resources/w3c-import.log: Added.
+        * web-platform-tests/user-timing/resources/webperftestharness.js: Added.
+        * web-platform-tests/user-timing/resources/webperftestharnessextension.js: Added.
+        * web-platform-tests/user-timing/test_user_timing_clear_marks-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_clear_marks.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_clear_measures-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_clear_measures.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_entry_type-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_entry_type.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_exists-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_exists.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_exceptions-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_exceptions.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_measure-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_measure.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_measure_exceptions-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_measure_exceptions.html: Added.
+        * web-platform-tests/user-timing/test_user_timing_measure_navigation_timing-expected.txt: Added.
+        * web-platform-tests/user-timing/test_user_timing_measure_navigation_timing.html: Added.
+        * web-platform-tests/user-timing/w3c-import.log: Added.
+
</ins><span class="cx"> 2017-01-21  Chris Dumez  &lt;cdumez@apple.com&gt;
</span><span class="cx"> 
</span><span class="cx">         innerText should replace existing text node
</span></span></pre></div>
<a id="trunkLayoutTestsimportedw3cresourcesImportExpectations"></a>
<div class="modfile"><h4>Modified: trunk/LayoutTests/imported/w3c/resources/ImportExpectations (211332 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/resources/ImportExpectations        2017-01-28 09:26:14 UTC (rev 211332)
+++ trunk/LayoutTests/imported/w3c/resources/ImportExpectations        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -284,7 +284,7 @@
</span><span class="cx"> web-platform-tests/uievents [ Skip ]
</span><span class="cx"> web-platform-tests/upgrade-insecure-requests [ Skip ]
</span><span class="cx"> #web-platform-tests/url [ Pass ]
</span><del>-web-platform-tests/user-timing [ Skip ]
</del><ins>+#web-platform-tests/user-timing [ Pass ]
</ins><span class="cx"> web-platform-tests/vibration [ Skip ]
</span><span class="cx"> web-platform-tests/wai-aria [ Skip ]
</span><span class="cx"> web-platform-tests/web-animations [ Skip ]
</span></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingOWNERS"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/OWNERS (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/OWNERS                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/OWNERS        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,3 @@
</span><ins>+@plehegar
+@igrigorik
+@toddreifsteck
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingidlharnessexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,42 @@
</span><ins>+User Timing IDL tests
+
+
+PASS Performance interface: operation mark(DOMString) 
+PASS Performance interface: operation clearMarks(DOMString) 
+PASS Performance interface: operation measure(DOMString,DOMString,DOMString) 
+PASS Performance interface: operation clearMeasures(DOMString) 
+PASS Performance must be primary interface of window.performance 
+PASS Stringification of window.performance 
+PASS Performance interface: window.performance must inherit property &quot;mark&quot; with the proper type (0) 
+PASS Performance interface: calling mark(DOMString) on window.performance with too few arguments must throw TypeError 
+PASS Performance interface: window.performance must inherit property &quot;clearMarks&quot; with the proper type (1) 
+PASS Performance interface: calling clearMarks(DOMString) on window.performance with too few arguments must throw TypeError 
+PASS Performance interface: window.performance must inherit property &quot;measure&quot; with the proper type (2) 
+PASS Performance interface: calling measure(DOMString,DOMString,DOMString) on window.performance with too few arguments must throw TypeError 
+PASS Performance interface: window.performance must inherit property &quot;clearMeasures&quot; with the proper type (3) 
+PASS Performance interface: calling clearMeasures(DOMString) on window.performance with too few arguments must throw TypeError 
+PASS PerformanceMark interface: existence and properties of interface object 
+PASS PerformanceMark interface object length 
+PASS PerformanceMark interface object name 
+PASS PerformanceMark interface: existence and properties of interface prototype object 
+PASS PerformanceMark interface: existence and properties of interface prototype object's &quot;constructor&quot; property 
+PASS PerformanceMeasure interface: existence and properties of interface object 
+PASS PerformanceMeasure interface object length 
+PASS PerformanceMeasure interface object name 
+PASS PerformanceMeasure interface: existence and properties of interface prototype object 
+PASS PerformanceMeasure interface: existence and properties of interface prototype object's &quot;constructor&quot; property 
+partial interface Performance {
+    void mark(DOMString markName);
+    void clearMarks(optional  DOMString markName);
+
+    void measure(DOMString measureName, optional DOMString startMark, optional DOMString endMark);
+    void clearMeasures(optional DOMString measureName);
+};
+
+interface PerformanceMark : PerformanceEntry {
+};
+
+interface PerformanceMeasure : PerformanceEntry {
+};
+
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingidlharnesshtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,59 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+&lt;head&gt;
+&lt;meta charset=&quot;utf-8&quot;&gt;
+&lt;title&gt;User Timing IDL tests&lt;/title&gt;
+&lt;link rel=&quot;author&quot; title=&quot;W3C&quot; href=&quot;http://www.w3.org/&quot; /&gt;
+&lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#extensions-performance-interface&quot;/&gt;
+&lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#performancemark&quot;/&gt;
+&lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#performancemeasure&quot;/&gt;
+&lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+&lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+&lt;script src=&quot;/resources/WebIDLParser.js&quot;&gt;&lt;/script&gt;
+&lt;script src=&quot;/resources/idlharness.js&quot;&gt;&lt;/script&gt;
+&lt;/head&gt;
+&lt;body&gt;
+&lt;h1&gt;User Timing IDL tests&lt;/h1&gt;
+&lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+
+&lt;pre id='untested_idl' style='display:none'&gt;
+interface Performance {
+};
+
+interface PerformanceEntry {
+};
+&lt;/pre&gt;
+
+&lt;pre id='idl'&gt;
+partial interface Performance {
+    void mark(DOMString markName);
+    void clearMarks(optional  DOMString markName);
+
+    void measure(DOMString measureName, optional DOMString startMark, optional DOMString endMark);
+    void clearMeasures(optional DOMString measureName);
+};
+
+interface PerformanceMark : PerformanceEntry {
+};
+
+interface PerformanceMeasure : PerformanceEntry {
+};
+
+&lt;/pre&gt;
+
+&lt;script&gt;
+
+(function() {
+  var idl_array = new IdlArray();
+
+  idl_array.add_untested_idls(document.getElementById(&quot;untested_idl&quot;).textContent);
+  idl_array.add_idls(document.getElementById(&quot;idl&quot;).textContent);
+
+  idl_array.add_objects({Performance: [&quot;window.performance&quot;]});
+
+  idl_array.test();
+})();
+
+&lt;/script&gt;
+&lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingresourcesw3cimportlog"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/w3c-import.log (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/w3c-import.log                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/w3c-import.log        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,19 @@
</span><ins>+The tests in this directory were imported from the W3C repository.
+Do NOT modify these tests directly in WebKit.
+Instead, create a pull request on the W3C CSS or WPT github:
+        https://github.com/w3c/csswg-test
+        https://github.com/w3c/web-platform-tests
+
+Then run Tools/Scripts/import-w3c-tests in WebKit to reimport.
+
+Do NOT modify or remove this file.
+
+------------------------------------------------------------------------
+Properties requiring vendor prefixes:
+None
+Property values requiring vendor prefixes:
+None
+------------------------------------------------------------------------
+List of files:
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharness.js
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharnessextension.js
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingresourceswebperftestharnessjs"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharness.js (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharness.js                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharness.js        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,155 @@
</span><ins>+/*
+Distributed under both the W3C Test Suite License [1] and the W3C
+3-clause BSD License [2]. To contribute to a W3C Test Suite, see the
+policies and contribution forms [3].
+
+[1] http://www.w3.org/Consortium/Legal/2008/04-testsuite-license
+[2] http://www.w3.org/Consortium/Legal/2008/03-bsd-license
+[3] http://www.w3.org/2004/10/27-testcases
+ */
+
+//
+// Helper Functions for NavigationTiming W3C tests
+//
+
+var performanceNamespace = self.performance;
+var timingAttributes = [
+    'connectEnd',
+    'connectStart',
+    'domComplete',
+    'domContentLoadedEventEnd',
+    'domContentLoadedEventStart',
+    'domInteractive',
+    'domLoading',
+    'domainLookupEnd',
+    'domainLookupStart',
+    'fetchStart',
+    'loadEventEnd',
+    'loadEventStart',
+    'navigationStart',
+    'redirectEnd',
+    'redirectStart',
+    'requestStart',
+    'responseEnd',
+    'responseStart',
+    'unloadEventEnd',
+    'unloadEventStart'
+];
+
+var namespace_check = false;
+
+//
+// All test() functions in the WebPerf test suite should use wp_test() instead.
+//
+// wp_test() validates the window.performance namespace exists prior to running tests and
+// immediately shows a single failure if it does not.
+//
+
+function wp_test(func, msg, properties)
+{
+    // only run the namespace check once
+    if (!namespace_check)
+    {
+        namespace_check = true;
+
+        if (performanceNamespace === undefined || performanceNamespace == null)
+        {
+            // show a single error that window.performance is undefined
+            test(function() { assert_true(performanceNamespace !== undefined &amp;&amp; performanceNamespace != null, &quot;window.performance is defined and not null&quot;); }, &quot;window.performance is defined and not null.&quot;, {author:&quot;W3C http://www.w3.org/&quot;,help:&quot;http://www.w3.org/TR/navigation-timing/#sec-window.performance-attribute&quot;,assert:&quot;The window.performance attribute provides a hosting area for performance related attributes. &quot;});
+        }
+    }
+
+    test(func, msg, properties);
+}
+
+function test_namespace(child_name, skip_root)
+{
+    if (skip_root === undefined) {
+        var msg = 'window.performance is defined';
+        wp_test(function () { assert_true(performanceNamespace !== undefined, msg); }, msg,{author:&quot;W3C http://www.w3.org/&quot;,help:&quot;http://www.w3.org/TR/navigation-timing/#sec-window.performance-attribute&quot;,assert:&quot;The window.performance attribute provides a hosting area for performance related attributes. &quot;});
+    }
+
+    if (child_name !== undefined) {
+        var msg2 = 'window.performance.' + child_name + ' is defined';
+        wp_test(function() { assert_true(performanceNamespace[child_name] !== undefined, msg2); }, msg2,{author:&quot;W3C http://www.w3.org/&quot;,help:&quot;http://www.w3.org/TR/navigation-timing/#sec-window.performance-attribute&quot;,assert:&quot;The window.performance attribute provides a hosting area for performance related attributes. &quot;});
+    }
+}
+
+function test_attribute_exists(parent_name, attribute_name, properties)
+{
+    var msg = 'window.performance.' + parent_name + '.' + attribute_name + ' is defined.';
+    wp_test(function() { assert_true(performanceNamespace[parent_name][attribute_name] !== undefined, msg); }, msg, properties);
+}
+
+function test_enum(parent_name, enum_name, value, properties)
+{
+    var msg = 'window.performance.' + parent_name + '.' + enum_name + ' is defined.';
+    wp_test(function() { assert_true(performanceNamespace[parent_name][enum_name] !== undefined, msg); }, msg, properties);
+
+    msg = 'window.performance.' + parent_name + '.' + enum_name + ' = ' + value;
+    wp_test(function() { assert_equals(performanceNamespace[parent_name][enum_name], value, msg); }, msg, properties);
+}
+
+function test_timing_order(attribute_name, greater_than_attribute, properties)
+{
+    // ensure it's not 0 first
+    var msg = &quot;window.performance.timing.&quot; + attribute_name + &quot; &gt; 0&quot;;
+    wp_test(function() { assert_true(performanceNamespace.timing[attribute_name] &gt; 0, msg); }, msg, properties);
+
+    // ensure it's in the right order
+    msg = &quot;window.performance.timing.&quot; + attribute_name + &quot; &gt;= window.performance.timing.&quot; + greater_than_attribute;
+    wp_test(function() { assert_true(performanceNamespace.timing[attribute_name] &gt;= performanceNamespace.timing[greater_than_attribute], msg); }, msg, properties);
+}
+
+function test_timing_greater_than(attribute_name, greater_than, properties)
+{
+    var msg = &quot;window.performance.timing.&quot; + attribute_name + &quot; &gt; &quot; + greater_than;
+    test_greater_than(performanceNamespace.timing[attribute_name], greater_than, msg, properties);
+}
+
+function test_timing_equals(attribute_name, equals, msg, properties)
+{
+    var test_msg = msg || &quot;window.performance.timing.&quot; + attribute_name + &quot; == &quot; + equals;
+    test_equals(performanceNamespace.timing[attribute_name], equals, test_msg, properties);
+}
+
+//
+// Non-test related helper functions
+//
+
+function sleep_milliseconds(n)
+{
+    var start = new Date().getTime();
+    while (true) {
+        if ((new Date().getTime() - start) &gt;= n) break;
+    }
+}
+
+//
+// Common helper functions
+//
+
+function test_true(value, msg, properties)
+{
+    wp_test(function () { assert_true(value, msg); }, msg, properties);
+}
+
+function test_equals(value, equals, msg, properties)
+{
+    wp_test(function () { assert_equals(value, equals, msg); }, msg, properties);
+}
+
+function test_greater_than(value, greater_than, msg, properties)
+{
+    wp_test(function () { assert_true(value &gt; greater_than, msg); }, msg, properties);
+}
+
+function test_greater_or_equals(value, greater_than, msg, properties)
+{
+    wp_test(function () { assert_true(value &gt;= greater_than, msg); }, msg, properties);
+}
+
+function test_not_equals(value, notequals, msg, properties)
+{
+    wp_test(function() { assert_true(value !== notequals, msg); }, msg, properties);
+}
</ins></span></pre></div>
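<p>A brief usage sketch of the harness above, mirroring how the imported test pages drive it; the mark name and message strings here are illustrative only.</p>
<pre>
// Included after /resources/testharness.js and /resources/testharnessreport.js.
setup({explicit_done: true});

// Emits a single test asserting that window.performance is defined.
test_namespace();

// wp_test() wraps testharness.js test() and performs the namespace check once.
wp_test(function() {
    performance.mark("example-mark");
    assert_equals(performance.getEntriesByName("example-mark").length, 1,
                  "one entry named example-mark");
}, "performance.mark() records an entry");

done();
</pre>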
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingresourceswebperftestharnessextensionjs"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharnessextension.js (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharnessextension.js                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/resources/webperftestharnessextension.js        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,199 @@
</span><ins>+/*
+Distributed under both the W3C Test Suite License [1] and the W3C
+3-clause BSD License [2]. To contribute to a W3C Test Suite, see the
+policies and contribution forms [3].
+
+[1] http://www.w3.org/Consortium/Legal/2008/04-testsuite-license
+[2] http://www.w3.org/Consortium/Legal/2008/03-bsd-license
+[3] http://www.w3.org/2004/10/27-testcases
+ */
+
+var mark_names = [
+    '',
+    '1',
+    'abc',
+];
+
+var measures = [
+    [''],
+    ['2', 1],
+    ['aaa', 'navigationStart', ''],
+];
+
+function test_method_exists(method, method_name, properties)
+{
+    var msg;
+    if (typeof method === 'function')
+        msg = 'performance.' + method.name + ' is supported!';
+    else
+        msg = 'performance.' + method_name + ' is supported!';
+    wp_test(function() { assert_true(typeof method === 'function', msg); }, msg, properties);
+}
+
+function test_method_throw_exception(func_str, exception, msg)
+{
+    var exception_name = typeof exception === &quot;object&quot; ? exception.name : exception;
+    var msg = 'Invocation of ' + func_str + ' should throw ' + exception_name  + ' Exception.';
+    wp_test(function() { assert_throws(exception, function() {eval(func_str)}, msg); }, msg);
+}
+
+function test_noless_than(value, greater_than, msg, properties)
+{
+    wp_test(function () { assert_true(value &gt;= greater_than, msg); }, msg, properties);
+}
+
+function test_fail(msg, properties)
+{
+    wp_test(function() { assert_unreached(); }, msg, properties);
+}
+
+function test_resource_entries(entries, expected_entries)
+{
+    // This is slightly convoluted so that we can sort the output.
+    var actual_entries = {};
+    var origin = window.location.protocol + &quot;//&quot; + window.location.host;
+
+    for (var i = 0; i &lt; entries.length; ++i) {
+        var entry = entries[i];
+        var found = false;
+        for (var expected_entry in expected_entries) {
+            if (entry.name == origin + expected_entry) {
+                found = true;
+                if (expected_entry in actual_entries) {
+                    test_fail(expected_entry + ' is not expected to have duplicate entries');
+                }
+                actual_entries[expected_entry] = entry;
+                break;
+            }
+        }
+        if (!found) {
+            test_fail(entries[i].name + ' is not expected to be in the Resource Timing buffer');
+        }
+    }
+
+    sorted_urls = [];
+    for (var i in actual_entries) {
+        sorted_urls.push(i);
+    }
+    sorted_urls.sort();
+    for (var i in sorted_urls) {
+        var url = sorted_urls[i];
+        test_equals(actual_entries[url].initiatorType,
+                    expected_entries[url],
+                    origin + url + ' is expected to have initiatorType ' + expected_entries[url]);
+    }
+    for (var j in expected_entries) {
+        if (!(j in actual_entries)) {
+            test_fail(origin + j + ' is expected to be in the Resource Timing buffer');
+        }
+    }
+}
+function performance_entrylist_checker(type)
+{
+    var entryType = type;
+
+    function entry_check(entry, expectedNames)
+    {
+        var msg = 'Entry \&quot;' + entry.name + '\&quot; should be one that we have set.';
+        wp_test(function() { assert_in_array(entry.name, expectedNames, msg); }, msg);
+        test_equals(entry.entryType, entryType, 'entryType should be \&quot;' + entryType + '\&quot;.');
+        if (type === &quot;measure&quot;) {
+            test_true(isFinite(entry.startTime), 'startTime should be a number.');
+            test_true(isFinite(entry.duration), 'duration should be a number.');
+        } else if (type === &quot;mark&quot;) {
+            test_greater_than(entry.startTime, 0, 'startTime should greater than 0.');
+            test_equals(entry.duration, 0, 'duration of mark should be 0.');
+        }
+    }
+
+    function entrylist_order_check(entryList)
+    {
+        var inOrder = true;
+        for (var i = 0; i &lt; entryList.length - 1; ++i)
+        {
+            if (entryList[i + 1].startTime &lt; entryList[i].startTime) {
+                inOrder = false;
+                break;
+            }
+        }
+        return inOrder;
+    }
+
+    function entrylist_check(entryList, expectedLength, expectedNames)
+    {
+        test_equals(entryList.length, expectedLength, 'There should be ' + expectedLength + ' entries.');
+        test_true(entrylist_order_check(entryList), 'Entries in entrylist should be in order.');
+        for (var i = 0; i &lt; entryList.length; ++i)
+        {
+            entry_check(entryList[i], expectedNames);
+        }
+    }
+
+    return{&quot;entrylist_check&quot;:entrylist_check};
+}
+
+function PerformanceContext(context)
+{
+    this.performanceContext = context;
+}
+
+PerformanceContext.prototype =
+{
+
+    initialMeasures: function(item, index, array)
+    {
+        this.performanceContext.measure.apply(this.performanceContext, item);
+    },
+
+    mark: function()
+    {
+        this.performanceContext.mark.apply(this.performanceContext, arguments);
+    },
+
+    measure: function()
+    {
+        this.performanceContext.measure.apply(this.performanceContext, arguments);
+    },
+
+    clearMarks: function()
+    {
+        this.performanceContext.clearMarks.apply(this.performanceContext, arguments);
+    },
+
+    clearMeasures: function()
+    {
+        this.performanceContext.clearMeasures.apply(this.performanceContext, arguments);
+
+    },
+
+    getEntries: function()
+    {
+        return this.performanceContext.getEntries.apply(this.performanceContext, arguments);
+    },
+
+    getEntriesByType: function()
+    {
+        return this.performanceContext.getEntriesByType.apply(this.performanceContext, arguments);
+    },
+
+    getEntriesByName: function()
+    {
+        return this.performanceContext.getEntriesByName.apply(this.performanceContext, arguments);
+    },
+
+    setResourceTimingBufferSize: function()
+    {
+        return this.performanceContext.setResourceTimingBufferSize.apply(this.performanceContext, arguments);
+    },
+
+    registerResourceTimingBufferFullCallback: function(func)
+    {
+        this.performanceContext.onresourcetimingbufferfull = func;
+    },
+
+    clearResourceTimings: function()
+    {
+        this.performanceContext.clearResourceTimings.apply(this.performanceContext, arguments);
+    }
+
+};
</ins></span></pre></div>
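<p>A short sketch of how the extension helpers above fit together; the entry names are illustrative, while the imported tests use the mark_names and measures arrays defined at the top of the file.</p>
<pre>
// PerformanceContext simply delegates each call to the wrapped performance object.
var context = new PerformanceContext(window.performance);

context.mark("mark1");
context.mark("mark2");

// entrylist_check(list, expectedLength, expectedNames) verifies the length,
// the ordering by startTime, and that every entry has entryType "mark".
var checker = performance_entrylist_checker("mark");
checker.entrylist_check(context.getEntriesByType("mark"), 2, ["mark1", "mark2"]);

context.clearMarks();
</pre>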
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_marksexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,16 @@
</span><ins>+Description
+
+This test validates that the performance.clearMarks() method is working properly. This test creates the following marks to test this method:
+
+&quot;mark1&quot;
+&quot;mark2&quot;
+After creating each mark, performance.clearMarks() is called three times. First, it is provided with a name of &quot;markUndefined&quot;, a non-existent mark, which shouldn't change the state of the Performance Timeline. Next, it is provided with a name of &quot;mark2&quot;, after which, this mark should no longer be present in the Performance Timeline. Finally, performance.clearMarks() is called without any name provided. After this call, no marks should be present in the Performance Timeline. The state of the Performance Timeline is tested with the performance.getEntriesByType() and performance.getEntries() methods.
+
+PASS window.performance is defined 
+PASS Two marks have been created for this test. 
+PASS After a call to window.performance.clearMarks(&quot;markUndefined&quot;), where &quot;markUndefined&quot; is a non-existent mark, window.performance.getEntriesByName(&quot;mark1&quot;) returns an object containing the &quot;mark1&quot; mark. 
+PASS After a call to window.performance.clearMarks(&quot;markUndefined&quot;), where &quot;markUndefined&quot; is a non-existent mark, window.performance.getEntriesByName(&quot;mark2&quot;) returns an object containing the &quot;mark2&quot; mark. 
+PASS After a call to window.performance.clearMarks(&quot;mark1&quot;), window.performance.getEntriesByName(&quot;mark1&quot;) returns an empty object. 
+PASS After a call to window.performance.clearMarks(&quot;mark1&quot;), window.performance.getEntriesByName(&quot;mark2&quot;) returns an object containing the &quot;mark2&quot; mark. 
+PASS After a call to window.performance.clearMarks(), window.performance.getEntriesByType(&quot;mark&quot;) returns an empty object. 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_markshtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,134 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing clearMarks() method is working properly&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#dom-performance-clearmarks&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;script type=&quot;text/javascript&quot;&gt;
+        // test marks
+        var markName1 = &quot;mark1&quot;;
+        var markName2 = &quot;mark2&quot;;
+        var markName3 = &quot;markUndefined&quot;;
+        var markTestDelay = 200;
+        var entries;
+        var pass;
+
+        setup({explicit_done: true});
+
+        test_namespace();
+
+        function onload_test()
+        {
+            // test for existence of the User Timing and Performance Timeline interfaces
+            if (window.performance.mark == undefined ||
+                window.performance.clearMarks == undefined ||
+                window.performance.measure == undefined ||
+                window.performance.clearMeasures == undefined ||
+                window.performance.getEntriesByName == undefined ||
+                window.performance.getEntriesByType == undefined ||
+                window.performance.getEntries == undefined)
+            {
+                test_true(false,
+                          &quot;The User Timing and Performance Timeline interfaces, which are required for this test, &quot; +
+                          &quot;are defined.&quot;);
+
+                done();
+            }
+            else
+            {
+                // create a mark using the test delay; the mark's value should be equivalent to the loadEventStart
+                // navigation timing attribute plus the test delay
+                setTimeout(mark_test_cb, markTestDelay);
+            }
+        }
+
+        function mark_test_cb()
+        {
+            // create the test marks; only create &quot;mark1&quot; and &quot;mark2&quot;, &quot;markUndefined&quot; is a non-existent mark
+            window.performance.mark(markName1);
+            window.performance.mark(markName2);
+
+            // test that two marks have been created
+            entries = window.performance.getEntriesByType(&quot;mark&quot;);
+            test_equals(entries.length, 2, &quot;Two marks have been created for this test.&quot;);
+
+            // clear non-existent mark
+            window.performance.clearMarks(markName3);
+
+            // test that &quot;mark1&quot; still exists
+            entries = window.performance.getEntriesByName(markName1);
+            test_true(entries[0].name == markName1,
+                      &quot;After a call to window.performance.clearMarks(\&quot;&quot; + markName3 + &quot;\&quot;), where \&quot;&quot; + markName3 +
+                      &quot;\&quot; is a non-existent mark, window.performance.getEntriesByName(\&quot;&quot; + markName1 + &quot;\&quot;) &quot; +
+                      &quot;returns an object containing the \&quot;&quot; + markName1 + &quot;\&quot; mark.&quot;);
+
+            // test that &quot;mark2&quot; still exists
+            entries = window.performance.getEntriesByName(markName2);
+            test_true(entries[0].name == markName2,
+                      &quot;After a call to window.performance.clearMarks(\&quot;&quot; + markName3 + &quot;\&quot;), where \&quot;&quot; + markName3 +
+                      &quot;\&quot; is a non-existent mark, window.performance.getEntriesByName(\&quot;&quot; + markName2 + &quot;\&quot;) &quot; +
+                      &quot;returns an object containing the \&quot;&quot; + markName2 + &quot;\&quot; mark.&quot;);
+
+            // clear existent mark
+            window.performance.clearMarks(markName1);
+
+            // test that &quot;mark1&quot; was cleared
+            entries = window.performance.getEntriesByName(markName1);
+            pass = true;
+            for (var i in entries)
+            {
+                pass = false;
+            }
+            test_true(pass,
+                      &quot;After a call to window.performance.clearMarks(\&quot;&quot; + markName1 + &quot;\&quot;), &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + markName1 + &quot;\&quot;) returns an empty object.&quot;);
+
+            // test that &quot;mark2&quot; still exists
+            entries = window.performance.getEntriesByName(markName2);
+            test_true(entries[0].name == markName2,
+                      &quot;After a call to window.performance.clearMarks(\&quot;&quot; + markName1 + &quot;\&quot;), &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + markName2 + &quot;\&quot;) returns an object containing the &quot; +
+                      &quot;\&quot;&quot; + markName2 + &quot;\&quot; mark.&quot;);
+
+            // clear all marks
+            window.performance.clearMarks();
+
+            // test that all marks were cleared
+            entries = window.performance.getEntriesByType(&quot;mark&quot;);
+            pass = true;
+            for (var i in entries)
+            {
+                pass = false;
+            }
+            test_true(pass,
+                      &quot;After a call to window.performance.clearMarks(), &quot; +
+                      &quot;window.performance.getEntriesByType(\&quot;mark\&quot;) returns an empty object.&quot;);
+
+            done();
+        }
+    &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that the performance.clearMarks() method is working properly. This test creates the
+           following marks to test this method:
+            &lt;ul&gt;
+                &lt;li&gt;&quot;mark1&quot;&lt;/li&gt;
+                &lt;li&gt;&quot;mark2&quot;&lt;/li&gt;
+            &lt;/ul&gt;
+           After creating each mark, performance.clearMarks() is called three times. First, it is provided with a name
+           of &quot;markUndefined&quot;, a non-existent mark, which shouldn't change the state of the Performance Timeline. Next,
+           it is provided with a name of &quot;mark2&quot;, after which, this mark should no longer be present in the Performance
+           Timeline. Finally, performance.clearMarks() is called without any name provided. After this call, no marks
+           should be present in the Performance Timeline. The state of the Performance Timeline is tested with the
+           performance.getEntriesByType() and performance.getEntries() methods.
+        &lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
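<p>Condensed, the behaviour the clearMarks() test above verifies (a hedged summary, not additional test code):</p>
<pre>
performance.mark("mark1");
performance.mark("mark2");

performance.clearMarks("markUndefined"); // unknown name: timeline unchanged
performance.clearMarks("mark1");         // removes only "mark1"
performance.clearMarks();                // no argument: removes all remaining marks

// After the last call, performance.getEntriesByType("mark") returns an empty list.
</pre>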
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_measuresexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,16 @@
</span><ins>+Description
+
+This test validates that the performance.clearMeasures() method is working properly. This test creates the following measures to test this method:
+
+&quot;measure1&quot;
+&quot;measure2&quot;
+After creating each measure, performance.clearMeasures() is called three times. First, it is provided with a name of &quot;measureUndefined&quot;, a non-existent measure, which shouldn't change the state of the Performance Timeline. Next, it is provided with a name of &quot;measure2&quot;, after which, this measure should no longer be present in the Performance Timeline. Finally, performance.clearMeasures() is called without any name provided. After this call, no measures should be present in the Performance Timeline. The state of the Performance Timeline is tested with the performance.getEntriesByType() and performance.getEntries() methods.
+
+PASS window.performance is defined 
+PASS Two measures have been created for this test. 
+PASS After a call to window.performance.clearMeasures(&quot;measureUndefined&quot;), where &quot;measureUndefined&quot; is a non-existent measure, window.performance.getEntriesByName(&quot;measure1&quot;) returns an object containing the &quot;measure1&quot; measure. 
+PASS After a call to window.performance.clearMeasures(&quot;measureUndefined&quot;), where &quot;measureUndefined&quot; is a non-existent measure, window.performance.getEntriesByName(&quot;measure2&quot;) returns an object containing the &quot;measure2&quot; measure. 
+PASS After a call to window.performance.clearMeasures(&quot;measure1&quot;), window.performance.getEntriesByName(&quot;measure1&quot;) returns an empty object. 
+PASS After a call to window.performance.clearMeasures(&quot;measure1&quot;), window.performance.getEntriesByName(&quot;measure2&quot;) returns an object containing the &quot;measure2&quot; measure. 
+PASS After a call to window.performance.clearMeasures(), window.performance.getEntriesByType(&quot;measure&quot;) returns an empty object. 
+
</ins></span></pre></div>
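<p>The clearMeasures() test (expectations above, test page below) follows the same pattern as clearMarks(); a hedged sketch of the measure creation it relies on, where the endMark is omitted so each measure ends at the time of the call:</p>
<pre>
// The startMark may name a Navigation Timing attribute; with no endMark the
// measure spans from that mark to the current time.
performance.measure("measure1", "navigationStart");
performance.measure("measure2", "responseEnd");

performance.clearMeasures("measure1"); // removes only "measure1"
performance.clearMeasures();           // removes all measures
</pre>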
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_clear_measureshtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,136 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing clearMeasures() method is working properly&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#dom-performance-clearmeasures&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;script type=&quot;text/javascript&quot;&gt;
+        // test measures
+        var measureName1 = &quot;measure1&quot;;
+        var measureName2 = &quot;measure2&quot;;
+        var measureName3 = &quot;measureUndefined&quot;;
+        var measureTestDelay = 200;
+        var measureEntryNames;
+        var entries;
+
+        setup({explicit_done: true});
+
+        test_namespace();
+
+        function onload_test()
+        {
+            // test for existence of the User Timing and Performance Timeline interfaces
+            if (window.performance.mark == undefined ||
+                window.performance.clearMarks == undefined ||
+                window.performance.measure == undefined ||
+                window.performance.clearMeasures == undefined ||
+                window.performance.getEntriesByName == undefined ||
+                window.performance.getEntriesByType == undefined ||
+                window.performance.getEntries == undefined)
+            {
+                test_true(false,
+                          &quot;The User Timing and Performance Timeline interfaces, which are required for this test, &quot; +
+                          &quot;are defined.&quot;);
+
+                done();
+            }
+            else
+            {
+                // create measures using the test delay
+                setTimeout(measure_test_cb, measureTestDelay);
+            }
+        }
+
+        function measure_test_cb()
+        {
+            // create the test measures; only &quot;measure1&quot; and &quot;measure2&quot; are created, while &quot;measureUndefined&quot; is a
+            // non-existent measure; give &quot;measure1&quot; a startMark of &quot;navigationStart&quot; and &quot;measure2&quot; a startMark of
+            // &quot;responseEnd&quot; so that &quot;measure1&quot; always comes first in a PerformanceEntryList returned from a
+            // Performance Timeline accessor
+            window.performance.measure(measureName1, &quot;navigationStart&quot;);
+            window.performance.measure(measureName2, &quot;responseEnd&quot;);
+
+            // test that two measures have been created
+            entries = window.performance.getEntriesByType(&quot;measure&quot;);
+            test_equals(entries.length, 2, &quot;Two measures have been created for this test.&quot;);
+
+            // clear non-existent measure
+            window.performance.clearMeasures(measureName3);
+
+            // test that &quot;measure1&quot; still exists
+            entries = window.performance.getEntriesByName(measureName1);
+            test_true(entries[0].name == measureName1,
+                      &quot;After a call to window.performance.clearMeasures(\&quot;&quot; + measureName3 + &quot;\&quot;), where \&quot;&quot; + measureName3 +
+                      &quot;\&quot; is a non-existent measure, window.performance.getEntriesByName(\&quot;&quot; + measureName1 + &quot;\&quot;) &quot; +
+                      &quot;returns an object containing the \&quot;&quot; + measureName1 + &quot;\&quot; measure.&quot;);
+
+            // test that &quot;measure2&quot; still exists
+            entries = window.performance.getEntriesByName(measureName2);
+            test_true(entries[0].name == measureName2,
+                      &quot;After a call to window.performance.clearMeasures(\&quot;&quot; + measureName3 + &quot;\&quot;), where \&quot;&quot; + measureName3 +
+                      &quot;\&quot; is a non-existent measure, window.performance.getEntriesByName(\&quot;&quot; + measureName2 + &quot;\&quot;) &quot; +
+                      &quot;returns an object containing the \&quot;&quot; + measureName2 + &quot;\&quot; measure.&quot;);
+
+            // clear existent measure
+            window.performance.clearMeasures(measureName1);
+
+            // test that &quot;measure1&quot; was cleared
+            entries = window.performance.getEntriesByName(measureName1);
+            var pass = true;
+            for (var i in entries)
+            {
+                pass = false;
+            }
+            test_true(pass,
+                      &quot;After a call to window.performance.clearMeasures(\&quot;&quot; + measureName1 + &quot;\&quot;), &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + measureName1 + &quot;\&quot;) returns an empty object.&quot;);
+
+            // test that &quot;measure2&quot; still exists
+            entries = window.performance.getEntriesByName(measureName2);
+            test_true(entries[0].name == measureName2,
+                      &quot;After a call to window.performance.clearMeasures(\&quot;&quot; + measureName1 + &quot;\&quot;), &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + measureName2 + &quot;\&quot;) returns an object containing the &quot; +
+                      &quot;\&quot;&quot; + measureName2 + &quot;\&quot; measure.&quot;);
+
+            // clear all measures
+            window.performance.clearMeasures();
+
+            // test that all measures were cleared
+            entries = window.performance.getEntriesByType(&quot;measure&quot;);
+            pass = true;
+            for (var i in entries)
+            {
+                pass = false;
+            }
+            test_true(pass,
+                      &quot;After a call to window.performance.clearMeasures(), &quot; +
+                      &quot;window.performance.getEntriesByType(\&quot;measure\&quot;) returns an empty object.&quot;);
+
+            done();
+        }
+    &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that the performance.clearMeasures() method is working properly. This test creates the
+           following measures to test this method:
+            &lt;ul&gt;
+                &lt;li&gt;&quot;measure1&quot;&lt;/li&gt;
+                &lt;li&gt;&quot;measure2&quot;&lt;/li&gt;
+            &lt;/ul&gt;
+           After creating these measures, performance.clearMeasures() is called three times. First, it is provided with a
+           name of &quot;measureUndefined&quot;, a non-existent measure, which shouldn't change the state of the Performance
+           Timeline. Next, it is provided with a name of &quot;measure1&quot;, after which this measure should no longer be
+           present in the Performance Timeline. Finally, performance.clearMeasures() is called without any name
+           provided. After this call, no measures should be present in the Performance Timeline. The state of the
+           Performance Timeline is tested with the performance.getEntriesByName() and performance.getEntriesByType() methods.
+        &lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
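<p>A minimal usage sketch (not part of the imported test above) of the clearMeasures() behavior being verified; the measure names are illustrative only:</p>
<pre>
// Create two measures, then clear them selectively.
performance.measure('measureA');                              // navigationStart to now
performance.measure('measureB', 'responseEnd');               // responseEnd to now

performance.clearMeasures('doesNotExist');                    // unknown names are ignored
console.log(performance.getEntriesByType('measure').length);  // 2

performance.clearMeasures('measureA');                        // removes only 'measureA'
console.log(performance.getEntriesByType('measure').length);  // 1

performance.clearMeasures();                                  // no argument: removes every measure
console.log(performance.getEntriesByType('measure').length);  // 0
</pre>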
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_entry_typeexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,8 @@
</span><ins>+Description
+
+This test validates the user timing entry types, PerformanceMark and PerformanceMeasure.
+
+
+PASS Class name of mark entry should be PerformanceMark. 
+PASS Class name of measure entry should be PerformanceMeasure. 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_entry_typehtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,29 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;utf-8&quot; /&gt;
+        &lt;title&gt;user timing entry type&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Intel&quot; href=&quot;http://www.intel.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#extensions-performance-interface&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharnessextension.js&quot;&gt;&lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates the user timing entry types, PerformanceMark and PerformanceMeasure.&lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+        &lt;script&gt;
+        var context = new PerformanceContext(window.performance);
+        context.mark('mark');
+        context.measure('measure');
+        var mark_entry = context.getEntriesByName('mark')[0];
+        var measure_entry = context.getEntriesByName('measure')[0];
+
+        test_equals(Object.prototype.toString.call(mark_entry), '[object PerformanceMark]', 'Class name of mark entry should be PerformanceMark.');
+        test_equals(Object.prototype.toString.call(measure_entry), '[object PerformanceMeasure]', 'Class name of measure entry should be PerformanceMeasure.');
+        &lt;/script&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
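<p>For reference, a small standalone sketch of the class-name check the test above performs; the names 'm' and 'd' are illustrative only:</p>
<pre>
performance.mark('m');
performance.measure('d');

var markEntry = performance.getEntriesByName('m')[0];
var measureEntry = performance.getEntriesByName('d')[0];

// User Timing entries are PerformanceMark / PerformanceMeasure instances.
console.log(Object.prototype.toString.call(markEntry));     // [object PerformanceMark]
console.log(Object.prototype.toString.call(measureEntry));  // [object PerformanceMeasure]
console.log(markEntry.entryType, measureEntry.entryType);   // mark measure
</pre>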
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_existsexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,11 @@
</span><ins>+Description
+
+This test validates that all of the methods used to interact with the User Timing API are defined.
+
+
+PASS window.performance is defined 
+PASS window.performance.mark is defined. 
+PASS window.performance.clearMarks is defined. 
+PASS window.performance.measure is defined. 
+PASS window.performance.clearMeasures is defined. 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_existshtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,28 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing exists&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;/head&gt;
+    &lt;body&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that all of the methods used to interact with the User Timing API are defined.&lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+
+        &lt;script&gt;
+        test_namespace();
+
+        test_true(window.performance.mark !== undefined, &quot;window.performance.mark is defined.&quot;);
+        test_true(window.performance.clearMarks !== undefined, &quot;window.performance.clearMarks is defined.&quot;);
+        test_true(window.performance.measure !== undefined, &quot;window.performance.measure is defined.&quot;);
+        test_true(window.performance.clearMeasures !== undefined, &quot;window.performance.clearMeasures is defined.&quot;);
+        &lt;/script&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
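<p>A sketch of the feature detection these tests build on before calling into User Timing; supportsUserTiming is a hypothetical helper name, not part of the harness:</p>
<pre>
function supportsUserTiming() {
    var p = window.performance;
    return !!p &amp;&amp;
        typeof p.mark === 'function' &amp;&amp;
        typeof p.clearMarks === 'function' &amp;&amp;
        typeof p.measure === 'function' &amp;&amp;
        typeof p.clearMeasures === 'function';
}

if (supportsUserTiming())
    performance.mark('feature-check');
</pre>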
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_markexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,30 @@
</span><ins>+Description
+
+This test validates that the performance.mark() method is working properly. This test creates the following marks to test this method:
+
+&quot;mark1&quot;: created using a normal mark() call
+&quot;mark1&quot;: duplicate of the first mark, used to confirm names can be re-used
+After creating each mark, the existence of these marks is validated by calling performance.getEntriesByName() (both with and without the entryType parameter provided), performance.getEntriesByType(), and performance.getEntries()
+
+PASS window.performance is defined 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[0].name == &quot;mark1&quot; 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[0].startTime is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[0].entryType == &quot;mark&quot; 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[0].duration == 0 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[1].name == &quot;mark1&quot; 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[1].startTime is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[1].entryType == &quot;mark&quot; 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;)[1].duration == 0 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;, &quot;mark&quot;) returns an object containing the &quot;mark1&quot; mark in the correct order 
+PASS window.performance.getEntriesByName(&quot;mark1&quot;, &quot;mark&quot;) returns an object containing the duplicate &quot;mark1&quot; mark in the correct order 
+PASS The &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;, &quot;mark&quot;) matches the &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;) 
+PASS The duplicate &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;, &quot;mark&quot;) matches the duplicate &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;) 
+PASS window.performance.getEntries() returns an object containing the original &quot;mark1&quot; mark in the correct order 
+PASS window.performance.getEntries() returns an object containing the duplicate &quot;mark1&quot; mark in the correct order 
+PASS The &quot;mark1&quot; mark returned by window.performance.getEntries() matches the &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;) 
+PASS The &quot;mark1&quot; mark returned by window.performance.getEntries() matches the duplicate &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;) 
+PASS window.performance.getEntriesByType(&quot;mark&quot;) returns an object containing the original &quot;mark1&quot; mark in the correct order 
+PASS window.performance.getEntriesByType(&quot;mark&quot;) returns an object containing the duplicate &quot;mark1&quot; mark in the correct order 
+PASS The &quot;mark1&quot; mark returned by window.performance.getEntriesByType(&quot;mark&quot;) matches the &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;) 
+PASS The &quot;mark1&quot; mark returned by window.performance.getEntriesByType(&quot;mark&quot;) matches the duplicate &quot;mark1&quot; mark returned by window.performance.getEntriesByName(&quot;mark1&quot;) 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_markhtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,228 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing mark() method is working properly&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#dom-performance-mark&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;script type=&quot;text/javascript&quot;&gt;
+        // test data
+        var markTestDelay = 200;
+        var testThreshold = 20;
+        var marks;
+
+        var TEST_MARKS =
+        [
+            {
+                name:                   &quot;mark1&quot;,
+                expectedStartTime:      undefined,
+                entryMatch:             undefined
+            },
+            {
+                name:                   &quot;mark1&quot;,
+                expectedStartTime:      undefined,
+                entryMatch:             undefined
+            }
+        ];
+
+        setup({explicit_done: true});
+
+        test_namespace();
+
+        function onload_test()
+        {
+            // test for existence of the User Timing and Performance Timeline interfaces
+            if (window.performance.mark == undefined ||
+                window.performance.clearMarks == undefined ||
+                window.performance.measure == undefined ||
+                window.performance.clearMeasures == undefined ||
+                window.performance.getEntriesByName == undefined ||
+                window.performance.getEntriesByType == undefined ||
+                window.performance.getEntries == undefined)
+            {
+                test_true(false,
+                          &quot;The User Timing and Performance Timeline interfaces, which are required for this test, &quot; +
+                          &quot;are defined.&quot;);
+
+                done();
+            }
+            else
+            {
+                // create first mark
+                window.performance.mark(TEST_MARKS[0].name);
+
+                // record the time that this mark is created; this should correspond to the mark's startTime
+                TEST_MARKS[0].expectedStartTime = (new Date()) - window.performance.timing.navigationStart;
+
+                // create the duplicate mark using the test delay; the duplicate mark's value should be equivalent to
+                // the loadEventStart navigation timing attribute plus the test delay
+                setTimeout(mark_test_cb, markTestDelay);
+            }
+        }
+
+        function mark_test_cb()
+        {
+            var getByNameScenarios = new Array();
+
+            // create second, duplicate mark
+            window.performance.mark(TEST_MARKS[1].name);
+
+            // record the time that this mark is created; this should correspond to the mark's startTime
+            TEST_MARKS[1].expectedStartTime = (new Date()) - window.performance.timing.navigationStart;
+
+            // test the test marks are returned by getEntriesByName
+            var entries = window.performance.getEntriesByName(TEST_MARKS[0].name);
+            test_mark(entries[0],
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot;)[0]&quot;,
+                      TEST_MARKS[0].name,
+                      TEST_MARKS[0].expectedStartTime);
+            TEST_MARKS[0].entryMatch = entries[0];
+
+            test_mark(entries[1],
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot;)[1]&quot;,
+                      TEST_MARKS[1].name,
+                      TEST_MARKS[1].expectedStartTime);
+            TEST_MARKS[1].entryMatch = entries[1];
+
+            // test the test marks are returned by getEntriesByName with the entryType parameter provided
+            entries = window.performance.getEntriesByName(TEST_MARKS[0].name, &quot;mark&quot;);
+            test_equals(entries[0].name, TEST_MARKS[0].name,
+                        &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot;, \&quot;mark\&quot;) returns an &quot; +
+                        &quot;object containing the \&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot; mark in the correct order&quot;);
+
+            test_equals(entries[1].name, TEST_MARKS[1].name,
+                        &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot;, \&quot;mark\&quot;) returns an &quot; +
+                        &quot;object containing the duplicate \&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot; mark in the correct order&quot;);
+
+            test_true(match_entries(entries[0], TEST_MARKS[0].entryMatch),
+                      &quot;The \&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot;, \&quot;mark\&quot;) matches the &quot; +
+                      &quot;\&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot;)&quot;);
+
+            test_true(match_entries(entries[1], TEST_MARKS[1].entryMatch),
+                      &quot;The duplicate \&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot;, \&quot;mark\&quot;) matches the &quot; +
+                      &quot;duplicate \&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot;)&quot;);
+
+            // test the test marks are returned by getEntries
+            entries = get_test_entries(window.performance.getEntries(), &quot;mark&quot;);
+
+            test_equals(entries[0].name, TEST_MARKS[0].name,
+                        &quot;window.performance.getEntries() returns an object containing the original \&quot;&quot; +
+                        TEST_MARKS[0].name + &quot;\&quot; mark in the correct order&quot;);
+
+            test_equals(entries[1].name, TEST_MARKS[1].name,
+                        &quot;window.performance.getEntries() returns an object containing the duplicate \&quot;&quot; +
+                        TEST_MARKS[1].name + &quot;\&quot; mark in the correct order&quot;);
+
+            test_true(match_entries(entries[0], TEST_MARKS[0].entryMatch),
+                      &quot;The \&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntries() matches the \&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot; mark returned &quot; +
+                      &quot;by window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot;)&quot;);
+
+            test_true(match_entries(entries[1], TEST_MARKS[1].entryMatch),
+                      &quot;The \&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntries() matches the duplicate \&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot; mark &quot; +
+                      &quot;returned by window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot;)&quot;);
+
+            // test the test marks are returned by getEntriesByType
+            entries = window.performance.getEntriesByType(&quot;mark&quot;);
+
+            test_equals(entries[0].name, TEST_MARKS[0].name,
+                        &quot;window.performance.getEntriesByType(\&quot;mark\&quot;) returns an object containing the original \&quot;&quot; +
+                        TEST_MARKS[0].name + &quot;\&quot; mark in the correct order&quot;);
+
+            test_equals(entries[1].name, TEST_MARKS[1].name,
+                        &quot;window.performance.getEntriesByType(\&quot;mark\&quot;) returns an object containing the duplicate \&quot;&quot; +
+                        TEST_MARKS[1].name + &quot;\&quot; mark in the correct order&quot;);
+
+            test_true(match_entries(entries[0], TEST_MARKS[0].entryMatch),
+                      &quot;The \&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntriesByType(\&quot;mark\&quot;) matches the \&quot;&quot; + TEST_MARKS[0].name +
+                      &quot;\&quot; mark returned by window.performance.getEntriesByName(\&quot;&quot; + TEST_MARKS[0].name + &quot;\&quot;)&quot;);
+
+            test_true(match_entries(entries[1], TEST_MARKS[1].entryMatch),
+                      &quot;The \&quot;&quot; + TEST_MARKS[1].name + &quot;\&quot; mark returned by &quot; +
+                      &quot;window.performance.getEntriesByType(\&quot;mark\&quot;) matches the duplicate \&quot;&quot; +
+                      TEST_MARKS[1].name + &quot;\&quot; mark returned by window.performance.getEntriesByName(\&quot;&quot; +
+                      TEST_MARKS[1].name + &quot;\&quot;)&quot;);
+
+            done();
+        }
+
+        function match_entries(entry1, entry2)
+        {
+            var pass = true;
+
+            // match name
+            pass = pass &amp;&amp; (entry1.name == entry2.name);
+
+            // match startTime
+            pass = pass &amp;&amp; (entry1.startTime == entry2.startTime);
+
+            // match entryType
+            pass = pass &amp;&amp; (entry1.entryType == entry2.entryType);
+
+            // match duration
+            pass = pass &amp;&amp; (entry1.duration == entry2.duration);
+
+            return pass;
+        }
+
+        function test_mark(markEntry, markEntryCommand, expectedName, expectedStartTime)
+        {
+            // test name
+            test_equals(markEntry.name, expectedName, markEntryCommand + &quot;.name == \&quot;&quot; + expectedName + &quot;\&quot;&quot;);
+
+            // test startTime, allow for an acceptable threshold in the difference between the startTime and the
+            // expected value for the startTime (loadEventStart + markTestDelay)
+            test_true(Math.abs(markEntry.startTime - expectedStartTime) &lt;= testThreshold,
+                      markEntryCommand + &quot;.startTime is approximately correct (up to &quot; + testThreshold +
+                      &quot;ms difference allowed)&quot;);
+
+            // verify entryType
+            test_equals(markEntry.entryType, &quot;mark&quot;, markEntryCommand + &quot;.entryType == \&quot;mark\&quot;&quot;);
+
+            // verify duration
+            test_equals(markEntry.duration, 0, markEntryCommand + &quot;.duration == 0&quot;);
+        }
+
+        function get_test_entries(entryList, entryType)
+        {
+            var testEntries = new Array();
+
+            // filter entryList
+            for (var i in entryList)
+            {
+                if (entryList[i].entryType == entryType)
+                {
+                    testEntries.push(entryList[i]);
+                }
+            }
+
+            return testEntries;
+        }
+    &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that the performance.mark() method is working properly. This test creates the
+           following marks to test this method:
+            &lt;ul&gt;
+                &lt;li&gt;&quot;mark1&quot;: created using a normal mark() call&lt;/li&gt;
+                &lt;li&gt;&quot;mark1&quot;: duplicate of the first mark, used to confirm names can be re-used&lt;/li&gt;
+            &lt;/ul&gt;
+           After creating each mark, the existence of these marks is validated by calling
+           performance.getEntriesByName() (both with and without the entryType parameter provided),
+           performance.getEntriesByType(), and performance.getEntries()
+        &lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
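<p>A minimal sketch (separate from the imported test) of the duplicate-name behavior exercised above: mark() may be called repeatedly with the same name, and each call appends another zero-duration PerformanceMark in creation order. The name 'checkpoint' is illustrative only:</p>
<pre>
performance.mark('checkpoint');
performance.mark('checkpoint');

var entries = performance.getEntriesByName('checkpoint', 'mark');
console.log(entries.length);                                // 2
console.log(entries[0].startTime &lt;= entries[1].startTime);  // true
console.log(entries[0].duration, entries[1].duration);      // 0 0
</pre>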
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributesexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,9 @@
</span><ins>+Description
+
+This test validates exception scenarios of invoking mark() and measure() with timing attributes as values.
+
+
+PASS window.performance is defined 
+FAIL performance.mark and performance.measure should throw if used with timing attribute values assert_throws: function &quot;function () { window.performance.measure(timingAttributes...&quot; did not throw
+FAIL performance.mark and performance.measure should not throw if used with timing attribute values in workers Can't find variable: performance
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributeshtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,32 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+  &lt;head&gt;
+    &lt;meta charset=&quot;utf-8&quot; /&gt;
+    &lt;title&gt;exception test of performance.mark and performance.measure&lt;/title&gt;
+    &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#extensions-performance-interface&quot;/&gt;
+    &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+    &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+    &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+  &lt;/head&gt;
+  &lt;body&gt;
+    &lt;script&gt;
+    setup({explicit_done: true});
+    test_namespace();
+
+    test(function() {
+      for (var i in timingAttributes) {
+        assert_throws(&quot;SyntaxError&quot;, function() { window.performance.mark(timingAttributes[i]); });
+        assert_throws(&quot;SyntaxError&quot;, function() { window.performance.measure(timingAttributes[i]); });
+      }
+    }, &quot;performance.mark and performance.measure should throw if used with timing attribute values&quot;);
+
+    fetch_tests_from_worker(new Worker(&quot;test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js&quot;));
+
+    done();
+
+    &lt;/script&gt;
+    &lt;h1&gt;Description&lt;/h1&gt;
+    &lt;p&gt;This test validates exception scenarios of invoking mark() and measure() with timing attributes as values.&lt;/p&gt;
+    &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+  &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributesjs"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,14 @@
</span><ins>+importScripts(&quot;/resources/testharness.js&quot;);
+importScripts(&quot;resources/webperftestharness.js&quot;);
+
+test(function() {
+  for (var i in timingAttributes) {
+    performance.mark(timingAttributes[i]);
+    performance.clearMarks(timingAttributes[i]);
+
+    performance.measure(timingAttributes[i]);
+    performance.clearMeasures(timingAttributes[i]);
+  }
+}, &quot;performance.mark and performance.measure should not throw if used with timing attribute values in workers&quot;);
+
+done();
</ins></span></pre></div>
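<p>The expectation encoded by the two files above (not necessarily this engine's behavior at this revision; note the FAIL lines in the expected output) is that PerformanceTiming attribute names are reserved on the main thread but usable inside workers. A rough illustration, with 'worker.js' as a hypothetical file name:</p>
<pre>
// Window context: a reserved name should be rejected with a SyntaxError.
try {
    performance.mark('responseEnd');
} catch (e) {
    console.log(e.name);  // SyntaxError
}

// worker.js (dedicated worker context): the same name should be accepted.
performance.mark('responseEnd');
performance.clearMarks('responseEnd');
</pre>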
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_without_parameterexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,11 @@
</span><ins>+Description
+
+This test validates exception scenarios of invoking mark() and measure() without a parameter.
+
+
+PASS window.performance is defined 
+PASS window.performance.mark() threw an exception when invoked without a parameter. 
+PASS window.performance.mark() threw a TYPE_ERR exception when invoked without a parameter. 
+PASS window.performance.measure() threw an exception when invoked without a parameter. 
+PASS window.performance.measure() threw a TYPE_ERR exception when invoked without a parameter. 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_and_measure_exception_when_invoke_without_parameterhtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,67 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;utf-8&quot; /&gt;
+        &lt;title&gt;exception test of performance.mark and performance.measure&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Intel&quot; href=&quot;http://www.intel.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#extensions-performance-interface&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script&gt;
+        setup({explicit_done: true});
+        test_namespace();
+
+        function onload_test() {
+            if (window.performance !== undefined &amp;&amp; window.performance.mark !== undefined)
+            {
+                try
+                {
+                    window.performance.mark();
+                    test_true(false, &quot;window.performance.mark() threw an exception when invoked without a parameter.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true, &quot;window.performance.mark() threw an exception when invoked without a parameter.&quot;);
+
+                    test_equals(e.name,
+                                &quot;TypeError&quot;,
+                                &quot;window.performance.mark() threw a TYPE_ERR exception when invoked without a parameter.&quot;);
+                }
+            }
+            else
+            {
+                test_true(false, &quot;window.performance.mark() interface is not supported!&quot;);
+            }
+
+            if (window.performance !== undefined &amp;&amp; window.performance.measure !== undefined)
+            {
+                try
+                {
+                    window.performance.measure();
+                    test_true(false, &quot;window.performance.measure() threw an exception when invoked without a parameter.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true, &quot;window.performance.measure() threw an exception when invoked without a parameter.&quot;);
+
+                    test_equals(e.name,
+                                &quot;TypeError&quot;,
+                                &quot;window.performance.measure() threw a TYPE_ERR exception when invoked without a parameter.&quot;);
+                }
+            }
+            else
+            {
+                test_true(false, &quot;window.performance.measure() interface is not supported!&quot;);
+            }
+
+            done();
+        }
+        &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates exception scenarios of invoking mark() and measure() without a parameter.&lt;/p&gt;
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
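<p>A short sketch of the behavior asserted above: the name argument of mark() and measure() is required, so omitting it should fail with a TypeError:</p>
<pre>
try {
    performance.mark();
} catch (e) {
    console.log(e instanceof TypeError);  // true
}

try {
    performance.measure();
} catch (e) {
    console.log(e.name);  // TypeError
}
</pre>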
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_exceptionsexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,47 @@
</span><ins>+Description
+
+This test validates that the performance.mark() method throws a SYNTAX_ERR exception whenever a navigation timing attribute is provided for the name parameter.
+
+
+PASS window.performance is defined 
+PASS window.performance.mark(&quot;connectEnd&quot;) threw an exception. 
+PASS window.performance.mark(&quot;connectEnd&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;connectStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;connectStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;domComplete&quot;) threw an exception. 
+PASS window.performance.mark(&quot;domComplete&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;domContentLoadedEventEnd&quot;) threw an exception. 
+PASS window.performance.mark(&quot;domContentLoadedEventEnd&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;domContentLoadedEventStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;domContentLoadedEventStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;domInteractive&quot;) threw an exception. 
+PASS window.performance.mark(&quot;domInteractive&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;domLoading&quot;) threw an exception. 
+PASS window.performance.mark(&quot;domLoading&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;domainLookupEnd&quot;) threw an exception. 
+PASS window.performance.mark(&quot;domainLookupEnd&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;domainLookupStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;domainLookupStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;fetchStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;fetchStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;loadEventEnd&quot;) threw an exception. 
+PASS window.performance.mark(&quot;loadEventEnd&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;loadEventStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;loadEventStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;navigationStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;navigationStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;redirectEnd&quot;) threw an exception. 
+PASS window.performance.mark(&quot;redirectEnd&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;redirectStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;redirectStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;requestStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;requestStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;responseEnd&quot;) threw an exception. 
+PASS window.performance.mark(&quot;responseEnd&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;responseStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;responseStart&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;unloadEventEnd&quot;) threw an exception. 
+PASS window.performance.mark(&quot;unloadEventEnd&quot;) threw a SYNTAX_ERR exception. 
+PASS window.performance.mark(&quot;unloadEventStart&quot;) threw an exception. 
+PASS window.performance.mark(&quot;unloadEventStart&quot;) threw a SYNTAX_ERR exception. 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_exceptionshtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,105 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing mark() method is throwing the proper exceptions&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#dom-performance-mark&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;script type=&quot;text/javascript&quot;&gt;
+        // navigation timing attributes
+        var timingAttributes = [
+            'connectEnd',
+            'connectStart',
+            'domComplete',
+            'domContentLoadedEventEnd',
+            'domContentLoadedEventStart',
+            'domInteractive',
+            'domLoading',
+            'domainLookupEnd',
+            'domainLookupStart',
+            'fetchStart',
+            'loadEventEnd',
+            'loadEventStart',
+            'navigationStart',
+            'redirectEnd',
+            'redirectStart',
+            'requestStart',
+            'responseEnd',
+            'responseStart',
+            'unloadEventEnd',
+            'unloadEventStart'
+        ];
+
+        // test data
+        var markExceptionThrown = false;
+
+        setup({explicit_done: true});
+
+        test_namespace();
+
+        function onload_test()
+        {
+            // test for existence of the User Timing and Performance Timeline interfaces
+            if (window.performance.mark == undefined ||
+                window.performance.clearMarks == undefined ||
+                window.performance.measure == undefined ||
+                window.performance.clearMeasures == undefined ||
+                window.performance.getEntriesByName == undefined ||
+                window.performance.getEntriesByType == undefined ||
+                window.performance.getEntries == undefined)
+            {
+                test_true(false,
+                          &quot;The User Timing and Performance Timeline interfaces, which are required for this test, &quot; +
+                          &quot;are defined.&quot;);
+
+                done();
+            }
+            else
+            {
+                test_mark_exceptions();
+            }
+        }
+
+        function test_mark_exceptions()
+        {
+            // loop through mark scenarios
+            for (var i in timingAttributes)
+            {
+                try
+                {
+                    // create the mark
+                    window.performance.mark(timingAttributes[i]);
+
+                    test_true(false,
+                              &quot;window.performance.mark(\&quot;&quot; + timingAttributes[i] + &quot;\&quot;) threw an exception.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true,
+                              &quot;window.performance.mark(\&quot;&quot; + timingAttributes[i] + &quot;\&quot;) threw an exception.&quot;);
+
+                    // confirm that a SYNTAX_ERR exception is thrown and not any other exception
+                    test_equals(e.code,
+                                e.SYNTAX_ERR,
+                                &quot;window.performance.mark(\&quot;&quot; + timingAttributes[i] + &quot;\&quot;) threw a SYNTAX_ERR &quot; +
+                                &quot;exception.&quot;);
+                }
+            }
+
+            done();
+        }
+    &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that the performance.mark() method throws a SYNTAX_ERR exception whenever a navigation
+           timing attribute is provided for the name parameter.
+        &lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
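<p>A minimal sketch of the reserved-name rule the test above iterates over: passing any PerformanceTiming attribute name to mark() should produce a SyntaxError DOMException; 'fetchStart' stands in for the full list:</p>
<pre>
try {
    performance.mark('fetchStart');
} catch (e) {
    console.log(e.name);                   // SyntaxError
    console.log(e.code === e.SYNTAX_ERR);  // true (legacy DOMException code 12)
}
</pre>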
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_with_name_of_navigation_timing_optional_attributeexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,9 @@
</span><ins>+Description
+
+This test validates exception scenarios of invoking performance.mark() with a parameter of &quot;secureConnectionStart&quot;.
+
+
+PASS window.performance is defined 
+PASS window.performance.mark(&quot;secureConnectionStart&quot;) threw an exception when secureConnectionStart attribute of Navigation Timing is supported. 
+PASS window.performance.mark(&quot;secureConnectionStart&quot;) threw a SYNTAX_ERR when secureConnectionStart attribute of Navigation Timing is supported. 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_mark_with_name_of_navigation_timing_optional_attributehtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,55 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;utf-8&quot; /&gt;
+        &lt;title&gt;exception test of performance.mark&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Intel&quot; href=&quot;http://www.intel.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#extensions-performance-interface&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharnessextension.js&quot;&gt;&lt;/script&gt;
+        &lt;script&gt;
+        setup({explicit_done: true});
+        test_namespace();
+
+        function onload_test() {
+            if (window.performance !== undefined &amp;&amp; window.performance.timing !== undefined &amp;&amp; window.performance.timing.secureConnectionStart !== undefined)
+            {
+                var context = new PerformanceContext(window.performance);
+                var optionalAttribute = &quot;secureConnectionStart&quot;;
+                try
+                {
+                    context.mark(optionalAttribute);
+                    test_true(false,
+                              &quot;window.performance.mark(\&quot;&quot; + optionalAttribute + &quot;\&quot;) threw an exception when &quot; +
+                              optionalAttribute + &quot; attribute of Navigation Timing is supported.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true,
+                              &quot;window.performance.mark(\&quot;&quot; + optionalAttribute + &quot;\&quot;) threw an exception when &quot; +
+                              optionalAttribute + &quot; attribute of Navigation Timing is supported.&quot;);
+
+                    // confirm that a SYNTAX_ERR exception is thrown and not any other exception
+                    test_equals(e.code,
+                                e.SYNTAX_ERR,
+                                &quot;window.performance.mark(\&quot;&quot; + optionalAttribute + &quot;\&quot;) threw a SYNTAX_ERR when &quot; +
+                                optionalAttribute + &quot; attribute of Navigation Timing is supported.&quot;);
+                }
+            }
+            else
+            {
+                test_true(true,
+                          &quot;This test is ignored when secureConnectionStart attribute of Navigation Timing is not supported.&quot;);
+            }
+            done();
+        }
+        &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates exception scenarios of invoking performance.mark() with a parameter of &quot;secureConnectionStart&quot;.&lt;/p&gt;
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measureexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,42 @@
</span><ins>+Description
+
+This test validates that the performance.measure() method is working properly. This test creates the following measures to test this method:
+
+&quot;measure_no_start_no_end&quot;: created using a measure() call without a startMark or endMark provided
+&quot;measure_start_no_end&quot;: created using a measure() call with only the startMark provided
+&quot;measure_start_end&quot;: created using a measure() call with both a startMark and endMark provided
+&quot;measure_no_start_no_end&quot;: duplicate of the first measure, used to confirm names can be re-used
+After creating each measure, the existence of these measures is validated by calling performance.getEntriesByName() (both with and without the entryType parameter provided), performance.getEntriesByType(), and performance.getEntries()
+
+PASS window.performance is defined 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[0].name == &quot;measure_no_start_no_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[0].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[0].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[0].duration is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;measure_start_no_end&quot;)[0].name == &quot;measure_start_no_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_start_no_end&quot;)[0].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_start_no_end&quot;)[0].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_start_no_end&quot;)[0].duration is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;measure_start_end&quot;)[0].name == &quot;measure_start_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_start_end&quot;)[0].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_start_end&quot;)[0].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_start_end&quot;)[0].duration is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[1].name == &quot;measure_no_start_no_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[1].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[1].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[1].duration is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;, &quot;measure&quot;)[0] returns an object containing the &quot;measure_no_start_no_end&quot; measure in the correct order, and its value matches the &quot;measure_no_start_no_end&quot; measure returned by window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;) 
+PASS window.performance.getEntriesByName(&quot;measure_start_no_end&quot;, &quot;measure&quot;)[0] returns an object containing the &quot;measure_start_no_end&quot; measure in the correct order, and its value matches the &quot;measure_start_no_end&quot; measure returned by window.performance.getEntriesByName(&quot;measure_start_no_end&quot;) 
+PASS window.performance.getEntriesByName(&quot;measure_start_end&quot;, &quot;measure&quot;)[0] returns an object containing the &quot;measure_start_end&quot; measure in the correct order, and its value matches the &quot;measure_start_end&quot; measure returned by window.performance.getEntriesByName(&quot;measure_start_end&quot;) 
+PASS window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;, &quot;measure&quot;)[1] returns an object containing the &quot;measure_no_start_no_end&quot; measure in the correct order, and its value matches the &quot;measure_no_start_no_end&quot; measure returned by window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;) 
+PASS window.performance.getEntries() returns an object containing the &quot;measure_no_start_no_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[0]. 
+PASS window.performance.getEntries() returns an object containing the &quot;measure_start_no_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_start_no_end&quot;)[0]. 
+PASS window.performance.getEntries() returns an object containing the &quot;measure_start_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_start_end&quot;)[0]. 
+PASS window.performance.getEntries() returns an object containing the &quot;measure_no_start_no_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[1]. 
+PASS window.performance.getEntries() returns an object containing all test measures in order. 
+PASS window.performance.getEntriesByType(&quot;measure&quot;) returns an object containing the &quot;measure_no_start_no_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[0]. 
+PASS window.performance.getEntriesByType(&quot;measure&quot;) returns an object containing the &quot;measure_start_no_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_start_no_end&quot;)[0]. 
+PASS window.performance.getEntriesByType(&quot;measure&quot;) returns an object containing the &quot;measure_start_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_start_end&quot;)[0]. 
+PASS window.performance.getEntriesByType(&quot;measure&quot;) returns an object containing the &quot;measure_no_start_no_end&quot; measure, and its value matches the measure returned by window.performance.getEntriesByName(&quot;measure_no_start_no_end&quot;)[1]. 
+PASS window.performance.getEntriesByType(&quot;measure&quot;) returns an object containing all test measures in order. 
+
</ins></span></pre></div>
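<p>The expectations above follow from how measure() resolves missing arguments: with no startMark the measure starts at time 0 (navigationStart), and with no endMark it ends at the current time. A minimal sketch of the three call forms, using placeholder mark and measure names that are not part of the test itself:</p>
<pre>
performance.mark('A');                 // ... some work happens here ...
performance.mark('B');

performance.measure('m1');             // startTime 0, duration is roughly the current time
performance.measure('m2', 'A');        // startTime = mark A, duration is roughly now minus A
performance.measure('m3', 'A', 'B');   // startTime = mark A, duration = B minus A

performance.getEntriesByName('m3')[0].entryType; // 'measure'
</pre>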
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measurehtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,334 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing measure() method is working properly&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#dom-performance-measure&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;script type=&quot;text/javascript&quot;&gt;
+        // test data
+        var startMarkName = &quot;mark_start&quot;;
+        var startMarkValue;
+        var endMarkName = &quot;mark_end&quot;;
+        var endMarkValue;
+        var measures;
+        var testThreshold = 20;
+
+        // test measures
+        var measureTestDelay = 200;
+        var TEST_MEASURES =
+        [
+            {
+                name:                   &quot;measure_no_start_no_end&quot;,
+                startMark:              undefined,
+                endMark:                undefined,
+                startTime:              undefined,
+                duration:               undefined,
+                entryType:              &quot;measure&quot;,
+                entryMatch:             undefined,
+                order:                  undefined,
+                found:                  false
+            },
+            {
+                name:                   &quot;measure_start_no_end&quot;,
+                startMark:              &quot;mark_start&quot;,
+                endMark:                undefined,
+                startTime:              undefined,
+                duration:               undefined,
+                entryType:              &quot;measure&quot;,
+                entryMatch:             undefined,
+                order:                  undefined,
+                found:                  false
+            },
+            {
+                name:                   &quot;measure_start_end&quot;,
+                startMark:              &quot;mark_start&quot;,
+                endMark:                &quot;mark_end&quot;,
+                startTime:              undefined,
+                duration:               undefined,
+                entryType:              &quot;measure&quot;,
+                entryMatch:             undefined,
+                order:                  undefined,
+                found:                  false
+            },
+            {
+                name:                   &quot;measure_no_start_no_end&quot;,
+                startMark:              undefined,
+                endMark:                undefined,
+                startTime:              undefined,
+                duration:               undefined,
+                entryType:              &quot;measure&quot;,
+                entryMatch:             undefined,
+                order:                  undefined,
+                found:                  false
+            }
+        ];
+
+        setup({explicit_done: true});
+
+        test_namespace();
+
+        function onload_test()
+        {
+            // test for existence of User Timing and Performance Timeline interfaces
+            if (window.performance.mark == undefined ||
+                window.performance.clearMarks == undefined ||
+                window.performance.measure == undefined ||
+                window.performance.clearMeasures == undefined ||
+                window.performance.getEntriesByName == undefined ||
+                window.performance.getEntriesByType == undefined ||
+                window.performance.getEntries == undefined)
+            {
+                test_true(false,
+                          &quot;The User Timing and Performance Timeline interfaces, which are required for this test, &quot; +
+                          &quot;are defined.&quot;);
+
+                done();
+            }
+            else
+            {
+                // create the start mark for the test measures
+                window.performance.mark(startMarkName);
+
+                // get the start mark's value
+                startMarkValue = window.performance.getEntriesByName(startMarkName)[0].startTime;
+
+                // create the test end mark using the test delay; this will allow for a significant difference between
+                // the mark values that should be represented in the duration of measures using these marks
+                setTimeout(measure_test_cb, measureTestDelay);
+            }
+        }
+
+        function measure_test_cb()
+        {
+            // create the end mark for the test measures
+            window.performance.mark(endMarkName);
+
+            // get the end mark's value
+            endMarkValue = window.performance.getEntriesByName(endMarkName)[0].startTime;
+
+            // loop through all measure scenarios and create the corresponding measures
+            for (var i in TEST_MEASURES)
+            {
+                var scenario = TEST_MEASURES[i];
+
+                if (scenario.startMark == undefined &amp;&amp; scenario.endMark == undefined)
+                {
+                    // both startMark and endMark are undefined, don't provide either parameters
+                    window.performance.measure(scenario.name);
+
+                    // when startMark isn't provided to the measure() call, a DOMHighResTimeStamp corresponding
+                    // to the navigationStart attribute with a timebase of the same attribute is used; this is
+                    // equivalent to 0
+                    scenario.startTime = 0;
+
+                    // when endMark isn't provided to the measure() call, a DOMHighResTimeStamp corresponding to
+                    // the current time with a timebase of the navigationStart attribute is used
+                    scenario.duration = (new Date()) - window.performance.timing.navigationStart;
+                }
+                else if (scenario.startMark != undefined &amp;&amp; scenario.endMark == undefined)
+                {
+                    // only startMark is defined, provide startMark and don't provide endMark
+                    window.performance.measure(scenario.name, scenario.startMark);
+
+                    // when startMark is provided to the measure() call, the value of the mark whose name is
+                    // provided is used for the startMark
+                    scenario.startTime = startMarkValue;
+
+                    // when endMark isn't provided to the measure() call, a DOMHighResTimeStamp corresponding to
+                    // the current time with a timebase of the navigationStart attribute is used
+                    scenario.duration = ((new Date()) - window.performance.timing.navigationStart) -
+                                                startMarkValue;
+                }
+                else if (scenario.startMark != undefined &amp;&amp; scenario.endMark != undefined)
+                {
+                    // both startMark and endMark are defined, provide both parameters
+                    window.performance.measure(scenario.name, scenario.startMark, scenario.endMark);
+
+                    // when startMark is provided to the measure() call, the value of the mark whose name is
+                    // provided is used for the startMark
+                    scenario.startTime = startMarkValue;
+
+                    // when endMark is provided to the measure() call, the value of the mark whose name is
+                    // provided is used for the endMark
+                    scenario.duration = endMarkValue - startMarkValue;
+                }
+            }
+
+            // test that expected measures are returned by getEntriesByName
+            for (var i in TEST_MEASURES)
+            {
+                entries = window.performance.getEntriesByName(TEST_MEASURES[i].name);
+                // for all test measures, the test will be validate the test measure against the first entry returned
+                // by getEntriesByName(), except for the last measure, where since it is a duplicate measure, the test
+                // will validate it against the second entry returned by getEntriesByName()
+                test_measure(entries[(i == 3 ? 1 : 0)],
+                            &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MEASURES[i].name + &quot;\&quot;)[&quot; +
+                            (i == 3 ? 1 : 0) + &quot;]&quot;,
+                            TEST_MEASURES[i].name,
+                            TEST_MEASURES[i].startTime,
+                            TEST_MEASURES[i].duration);
+                TEST_MEASURES[i].entryMatch = entries[(i == 3 ? 1 : 0)];
+            }
+
+            // test that expected measures are returned by getEntriesByName with the entryType parameter provided
+            for (var i in TEST_MEASURES)
+            {
+                entries = window.performance.getEntriesByName(TEST_MEASURES[i].name, &quot;measure&quot;);
+
+                test_true(match_entries(entries[(i == 3 ? 1 : 0)], TEST_MEASURES[i].entryMatch),
+                          &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MEASURES[i].name + &quot;\&quot;, \&quot;measure\&quot;)[&quot; +
+                          (i == 3 ? 1 : 0) + &quot;] returns an object containing the \&quot;&quot; + TEST_MEASURES[i].name +
+                          &quot;\&quot; measure in the correct order, and its value matches the \&quot;&quot; + TEST_MEASURES[i].name +
+                          &quot;\&quot; measure returned by window.performance.getEntriesByName(\&quot;&quot; + TEST_MEASURES[i].name +
+                          &quot;\&quot;)&quot;);
+            }
+
+            // test that expected measures are returned by getEntries
+            entries = get_test_entries(window.performance.getEntries(), &quot;measure&quot;);
+
+            test_measure_list(entries, &quot;window.performance.getEntries()&quot;, TEST_MEASURES);
+
+            // test that expected measures are returned by getEntriesByType
+            entries = window.performance.getEntriesByType(&quot;measure&quot;);
+
+            test_measure_list(entries, &quot;window.performance.getEntriesByType(\&quot;measure\&quot;)&quot;, TEST_MEASURES);
+
+            done();
+        }
+
+        function match_entries(entry1, entry2, threshold)
+        {
+            if (threshold == undefined)
+            {
+                threshold = 0;
+            }
+
+            var pass = true;
+
+            // match name
+            pass = pass &amp;&amp; (entry1.name == entry2.name);
+
+            // match startTime
+            pass = pass &amp;&amp; (Math.abs(entry1.startTime - entry2.startTime) &lt;= testThreshold);
+
+            // match entryType
+            pass = pass &amp;&amp; (entry1.entryType == entry2.entryType);
+
+            // match duration
+            pass = pass &amp;&amp; (Math.abs(entry1.duration - entry2.duration) &lt;= testThreshold);
+
+            return pass;
+        }
+
+        function test_measure(measureEntry, measureEntryCommand, expectedName, expectedStartTime, expectedDuration)
+        {
+            // test name
+            test_true(measureEntry.name == expectedName, measureEntryCommand + &quot;.name == \&quot;&quot; + expectedName + &quot;\&quot;&quot;);
+
+            // test startTime; since a measure's startTime is always equal to a mark's value or the value of a
+            // navigation timing attribute, the actual startTime should match the expected value exactly
+            test_true(Math.abs(measureEntry.startTime - expectedStartTime) == 0,
+                      measureEntryCommand + &quot;.startTime is correct&quot;);
+
+            // test entryType
+            test_true(measureEntry.entryType == &quot;measure&quot;, measureEntryCommand + &quot;.entryType == \&quot;measure\&quot;&quot;);
+
+            // test duration, allow for an acceptable threshold in the difference between the actual duration and the
+            // expected value for the duration
+            test_true(Math.abs(measureEntry.duration - expectedDuration) &lt;= testThreshold, measureEntryCommand +
+                      &quot;.duration is approximately correct (up to &quot; + testThreshold + &quot;ms difference allowed)&quot;);
+        }
+
+        function test_measure_list(measureEntryList, measureEntryListCommand, measureScenarios)
+        {
+            // give all entries a &quot;found&quot; property that can be set to ensure it isn't tested twice
+            for (var i in measureEntryList)
+            {
+                measureEntryList[i].found = false;
+            }
+
+            for (var i in measureScenarios)
+            {
+                measureScenarios[i].found = false;
+
+                for (var j in measureEntryList)
+                {
+                    if (match_entries(measureEntryList[j], measureScenarios[i]) &amp;&amp; !measureEntryList[j].found)
+                    {
+                        test_true(match_entries(measureEntryList[j], measureScenarios[i].entryMatch),
+                                  measureEntryListCommand + &quot; returns an object containing the \&quot;&quot; +
+                                  measureScenarios[i].name + &quot;\&quot; measure, and its value matches the measure &quot; +
+                                  &quot;returned by window.performance.getEntriesByName(\&quot;&quot; + measureScenarios[i].name +
+                                  &quot;\&quot;)[&quot; + (i == 3 ? 1 : 0) + &quot;].&quot;);
+
+                        measureEntryList[j].found = true;
+                        measureScenarios[i].found = true;
+                        break;
+                    }
+                }
+
+                if (!measureScenarios[i].found)
+                {
+                    test_true(false,
+                              measureEntryListCommand + &quot; returns an object containing the \&quot;&quot; +
+                              measureScenarios[i].name + &quot;\&quot; measure.&quot;);
+                }
+            }
+
+            // verify order of output of getEntriesByType
+            var startTimeCurr = 0;
+            var pass = true;
+            for (var i in measureEntryList)
+            {
+                if (measureEntryList[i].startTime &lt; startTimeCurr)
+                {
+                    pass = false;
+                }
+                startTimeCurr = measureEntryList[i].startTime;
+            }
+            test_true(pass,
+                      measureEntryListCommand + &quot; returns an object containing all test &quot; +
+                      &quot;measures in order.&quot;);
+        }
+
+        function get_test_entries(entryList, entryType)
+        {
+            var testEntries = new Array();
+
+            // filter entryList
+            for (var i in entryList)
+            {
+                if (entryList[i].entryType == entryType)
+                {
+                    testEntries.push(entryList[i]);
+                }
+            }
+
+            return testEntries;
+        }
+    &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that the performance.measure() method is working properly. This test creates the
+           following measures to test this method:
+            &lt;ul&gt;
+                &lt;li&gt;&quot;measure_no_start_no_end&quot;: created using a measure() call without a startMark or endMark
+                    provided&lt;/li&gt;
+                &lt;li&gt;&quot;measure_start_no_end&quot;: created using a measure() call with only the startMark provided&lt;/li&gt;
+                &lt;li&gt;&quot;measure_start_end&quot;: created using a measure() call with both a startMark and endMark provided&lt;/li&gt;
+                &lt;li&gt;&quot;measure_no_start_no_end&quot;: duplicate of the first measure, used to confirm names can be re-used&lt;/li&gt;
+            &lt;/ul&gt;
+           After creating each measure, the existence of these measures is validated by calling
+           performance.getEntriesByName() (both with and without the entryType parameter provided),
+           performance.getEntriesByType(), and performance.getEntries()
+        &lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
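<p>Two details the test above relies on, shown as a minimal sketch (the names here are placeholders, not taken from the test): measure names may be reused, and getEntriesByName() accepts an optional entryType filter.</p>
<pre>
performance.measure('dup');
performance.measure('dup');

// Both entries are returned; the test indexes [0] and [1] for the duplicate name.
var byName = performance.getEntriesByName('dup');                   // length 2
var byNameAndType = performance.getEntriesByName('dup', 'measure'); // same two entries
</pre>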
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_exceptionsexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,23 @@
</span><ins>+Description
+
+This test validates that the performance.measure() method throws a SYNTAX_ERR exception whenever a non-existent mark is provided as the startMark or endMark, and the method also throws an INVALID_ACCESS_ERR whenever a navigation timing attribute with a value of zero is provided as the startMark or endMark.
+
+
+PASS window.performance is defined 
+PASS window.performance.measure(&quot;measure&quot;, &quot;mark&quot;), where &quot;mark&quot; is a non-existent mark, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;mark&quot;), where &quot;mark&quot; is a non-existent mark, threw a SYNTAX_ERR exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;mark&quot;, &quot;responseEnd&quot;), where &quot;mark&quot; is a non-existent mark, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;mark&quot;, &quot;responseEnd&quot;), where &quot;mark&quot; is a non-existent mark, threw a SYNTAX_ERR exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;navigationStart&quot;, &quot;mark&quot;), where &quot;mark&quot; is a non-existent mark, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;navigationStart&quot;, &quot;mark&quot;), where &quot;mark&quot; is a non-existent mark, threw a SYNTAX_ERR exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;mark&quot;, &quot;mark&quot;), where &quot;mark&quot; is a non-existent mark, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;mark&quot;, &quot;mark&quot;), where &quot;mark&quot; is a non-existent mark, threw a SYNTAX_ERR exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;unloadEventStart&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;unloadEventStart&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an INVALID_ACCESS_ERR exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;unloadEventStart&quot;, &quot;responseEnd&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;unloadEventStart&quot;, &quot;responseEnd&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an INVALID_ACCESS_ERR exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;navigationStart&quot;, &quot;unloadEventStart&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;navigationStart&quot;, &quot;unloadEventStart&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an INVALID_ACCESS_ERR exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;unloadEventStart&quot;, &quot;unloadEventStart&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an exception. 
+PASS window.performance.measure(&quot;measure&quot;, &quot;unloadEventStart&quot;, &quot;unloadEventStart&quot;), where &quot;unloadEventStart&quot; is a navigation timing attribute with a value of 0, threw an INVALID_ACCESS_ERR exception. 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_exceptionshtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,282 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing measure() method is throwing the proper exceptions&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#dom-performance-measure&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;script type=&quot;text/javascript&quot;&gt;
+        // navigation timing attributes
+        var timingAttributes = [
+            'connectEnd',
+            'connectStart',
+            'domComplete',
+            'domContentLoadedEventEnd',
+            'domContentLoadedEventStart',
+            'domInteractive',
+            'domLoading',
+            'domainLookupEnd',
+            'domainLookupStart',
+            'fetchStart',
+            'loadEventEnd',
+            'loadEventStart',
+            'navigationStart',
+            'redirectEnd',
+            'redirectStart',
+            'requestStart',
+            'responseEnd',
+            'responseStart',
+            'unloadEventEnd',
+            'unloadEventStart'
+        ];
+
+        // test data
+        var zeroedNavTimingAtt = undefined;
+
+        setup({explicit_done: true});
+
+        test_namespace();
+
+        function onload_test()
+        {
+            // test for existence of User Timing and Performance Timeline interfaces
+            if (window.performance.mark == undefined ||
+                window.performance.clearMarks == undefined ||
+                window.performance.measure == undefined ||
+                window.performance.clearMeasures == undefined ||
+                window.performance.getEntriesByName == undefined ||
+                window.performance.getEntriesByType == undefined ||
+                window.performance.getEntries == undefined)
+            {
+                test_true(false,
+                          &quot;The User Timing and Performance Timeline interfaces, which are required for this test, &quot; +
+                          &quot;are defined.&quot;);
+
+                done();
+            }
+            else
+            {
+                test_measure_exceptions();
+            }
+        }
+
+        function test_measure_exceptions()
+        {
+            // test scenarios for the SYNTAX_ERR exception
+            try
+            {
+                // create the measure
+                window.performance.measure(&quot;measure&quot;, &quot;mark&quot;);
+
+                test_true(false,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is a non-existent mark, &quot; +
+                          &quot;threw an exception.&quot;);
+            }
+            catch(e)
+            {
+                test_true(true,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is a non-existent mark, &quot; +
+                          &quot;threw an exception.&quot;);
+
+                test_equals(e.code,
+                            e.SYNTAX_ERR,
+                            &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is a non-existent &quot; +
+                            &quot;mark, threw a SYNTAX_ERR exception.&quot;);
+            }
+
+            try
+            {
+                // create the measure
+                window.performance.measure(&quot;measure&quot;, &quot;mark&quot;, &quot;responseEnd&quot;);
+
+                test_true(false,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;, \&quot;responseEnd\&quot;), where \&quot;mark\&quot; is a &quot; +
+                          &quot;non-existent mark, threw an exception.&quot;);
+            }
+            catch(e)
+            {
+                test_true(true,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;, \&quot;responseEnd\&quot;), where \&quot;mark\&quot; is a &quot; +
+                          &quot;non-existent mark, threw an exception.&quot;);
+
+                test_equals(e.code,
+                            e.SYNTAX_ERR,
+                            &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;, \&quot;responseEnd\&quot;), where \&quot;mark\&quot; is a &quot; +
+                            &quot;non-existent mark, threw a SYNTAX_ERR exception.&quot;);
+            }
+
+            try
+            {
+                // create the measure
+                window.performance.measure(&quot;measure&quot;, &quot;navigationStart&quot;, &quot;mark&quot;);
+
+                test_true(false,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;navigationStart\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is &quot; +
+                          &quot;a non-existent mark, threw an exception.&quot;);
+            }
+            catch(e)
+            {
+                test_true(true,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;navigationStart\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is &quot; +
+                          &quot;a non-existent mark, threw an exception.&quot;);
+
+                test_equals(e.code,
+                            e.SYNTAX_ERR,
+                            &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;navigationStart\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; &quot; +
+                            &quot;is a non-existent mark, threw a SYNTAX_ERR exception.&quot;);
+            }
+
+            try
+            {
+                // create the measure
+                window.performance.measure(&quot;measure&quot;, &quot;mark&quot;, &quot;mark&quot;);
+
+                test_true(false,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is a &quot; +
+                          &quot;non-existent mark, threw an exception.&quot;);
+            }
+            catch(e)
+            {
+                test_true(true,
+                          &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is a &quot; +
+                          &quot;non-existent mark, threw an exception.&quot;);
+
+                test_equals(e.code,
+                            e.SYNTAX_ERR,
+                            &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;mark\&quot;, \&quot;mark\&quot;), where \&quot;mark\&quot; is a &quot; +
+                            &quot;non-existent mark, threw a SYNTAX_ERR exception.&quot;);
+            }
+
+
+            // for testing the INVALID_ACCESS_ERR exception, find a navigation timing attribute with a value of zero
+            for (var i in timingAttributes)
+            {
+                if (window.performance.timing[timingAttributes[i]] == 0)
+                {
+                    zeroedNavTimingAtt = timingAttributes[i];
+                }
+            }
+
+            if (zeroedNavTimingAtt == undefined)
+            {
+                test_true(false,
+                          &quot;A navigation timing attribute with a value of 0 was not found to test for the &quot; +
+                          &quot;INVALID_ACCESS_ERR exception thrown by window.performance.measure().&quot;);
+            }
+            else
+            {
+                try
+                {
+                    // create the measure
+                    window.performance.measure(&quot;measure&quot;, zeroedNavTimingAtt);
+
+                    test_true(false,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;), where \&quot;&quot; +
+                              zeroedNavTimingAtt + &quot;\&quot; is a navigation timing attribute with a value of 0, threw an &quot; +
+                              &quot;exception.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;), where \&quot;&quot; +
+                              zeroedNavTimingAtt + &quot;\&quot; is a navigation timing attribute with a value of 0, threw an &quot; +
+                              &quot;exception.&quot;);
+
+                    test_equals(e.code,
+                                e.INVALID_ACCESS_ERR,
+                                &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;), where \&quot;&quot; +
+                                zeroedNavTimingAtt + &quot;\&quot; is a navigation timing attribute with a value of 0, threw &quot; +
+                                &quot;an INVALID_ACCESS_ERR exception.&quot;);
+                }
+
+                try
+                {
+                    // create the measure
+                    window.performance.measure(&quot;measure&quot;, zeroedNavTimingAtt, &quot;responseEnd&quot;);
+
+                    test_true(false,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;, &quot; +
+                              &quot;\&quot;responseEnd\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing &quot; +
+                              &quot;attribute with a value of 0, threw an exception.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;, &quot; +
+                              &quot;\&quot;responseEnd\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing &quot; +
+                              &quot;attribute with a value of 0, threw an exception.&quot;);
+
+                    test_equals(e.code,
+                                e.INVALID_ACCESS_ERR,
+                                &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;, &quot; +
+                                &quot;\&quot;responseEnd\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing &quot; +
+                                &quot;attribute with a value of 0, threw an INVALID_ACCESS_ERR exception.&quot;);
+                }
+
+                try
+                {
+                    // create the measure
+                    window.performance.measure(&quot;measure&quot;, &quot;navigationStart&quot;, zeroedNavTimingAtt);
+
+                    test_true(false,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;navigationStart\&quot;, \&quot;&quot; + zeroedNavTimingAtt +
+                              &quot;\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing attribute with a &quot; +
+                              &quot;value of 0, threw an exception.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;navigationStart\&quot;, \&quot;&quot; + zeroedNavTimingAtt +
+                              &quot;\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing attribute with a &quot; +
+                              &quot;value of 0, threw an exception.&quot;);
+
+                    test_equals(e.code,
+                                e.INVALID_ACCESS_ERR,
+                                &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;navigationStart\&quot;, \&quot;&quot; + zeroedNavTimingAtt +
+                                &quot;\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing attribute with a &quot; +
+                                &quot;value of 0, threw an INVALID_ACCESS_ERR exception.&quot;);
+                }
+
+                try
+                {
+                    // create the measure
+                    window.performance.measure(&quot;measure&quot;, zeroedNavTimingAtt, zeroedNavTimingAtt);
+
+                    test_true(false,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;, \&quot;&quot; +
+                              zeroedNavTimingAtt + &quot;\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing &quot; +
+                              &quot;attribute with a value of 0, threw an exception.&quot;);
+                }
+                catch(e)
+                {
+                    test_true(true,
+                              &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;, \&quot;&quot; +
+                              zeroedNavTimingAtt + &quot;\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation timing &quot; +
+                              &quot;attribute with a value of 0, threw an exception.&quot;);
+
+                    test_equals(e.code,
+                                e.INVALID_ACCESS_ERR,
+                                &quot;window.performance.measure(\&quot;measure\&quot;, \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot;, \&quot;&quot; +
+                                zeroedNavTimingAtt + &quot;\&quot;), where \&quot;&quot; + zeroedNavTimingAtt + &quot;\&quot; is a navigation &quot; +
+                                &quot;timing attribute with a value of 0, threw an INVALID_ACCESS_ERR exception.&quot;);
+                }
+            }
+
+            done();
+        }
+    &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that the performance.measure() method throws a SYNTAX_ERR exception whenever a
+           non-existent mark is provided as the startMark or endMark, and the method also throws an INVALID_ACCESS_ERR
+           whenever a navigation timing attribute with a value of zero is provided as the startMark or endMark.
+        &lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
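<p>A minimal sketch of the two failure modes the test above covers, assuming a fresh navigation where unloadEventStart is 0 (as it typically is when no prior document was unloaded):</p>
<pre>
try {
    performance.measure('m', 'no_such_mark');      // startMark names no existing mark
} catch (e) {
    console.log(e.name); // 'SyntaxError' (code SYNTAX_ERR)
}

try {
    performance.measure('m', 'unloadEventStart');  // Navigation Timing attribute equal to 0
} catch (e) {
    console.log(e.name); // 'InvalidAccessError' (code INVALID_ACCESS_ERR)
}
</pre>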
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_navigation_timingexpectedtxt"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing-expected.txt (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing-expected.txt                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing-expected.txt        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,28 @@
</span><ins>+Description
+
+This test validates that the performance.measure() method is working properly when navigation timing attributes are used in place of mark names. This test creates the following measures to test this method:
+
+&quot;measure_nav_start_no_end&quot;: created using a measure() call with a navigation timing attribute provided as the startMark and nothing provided as the endMark
+&quot;measure_nav_start_mark_end&quot;: created using a measure() call with a navigation timing attribute provided as the startMark and a mark name provided as the endMark
+&quot;measure_mark_start_nav_end&quot;: created using a measure() call with a mark name provided as the startMark and a navigation timing attribute provided as the endMark
+&quot;measure_nav_start_nav_end&quot;: created using a measure() call with a navigation timing attribute provided as both the startMark and endMark
+After creating each measure, the existence of these measures is validated by calling performance.getEntriesByName() with each measure name
+
+PASS window.performance is defined 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_no_end&quot;)[0].name == &quot;measure_nav_start_no_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_no_end&quot;)[0].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_no_end&quot;)[0].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_no_end&quot;)[0].duration is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_mark_end&quot;)[0].name == &quot;measure_nav_start_mark_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_mark_end&quot;)[0].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_mark_end&quot;)[0].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_mark_end&quot;)[0].duration is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;measure_mark_start_nav_end&quot;)[0].name == &quot;measure_mark_start_nav_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_mark_start_nav_end&quot;)[0].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_mark_start_nav_end&quot;)[0].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_mark_start_nav_end&quot;)[0].duration is approximately correct (up to 20ms difference allowed) 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_nav_end&quot;)[0].name == &quot;measure_nav_start_nav_end&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_nav_end&quot;)[0].startTime is correct 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_nav_end&quot;)[0].entryType == &quot;measure&quot; 
+PASS window.performance.getEntriesByName(&quot;measure_nav_start_nav_end&quot;)[0].duration is approximately correct (up to 20ms difference allowed) 
+
</ins></span></pre></div>
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingtest_user_timing_measure_navigation_timinghtml"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing.html (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing.html                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing.html        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,233 @@
</span><ins>+&lt;!DOCTYPE html&gt;
+&lt;html&gt;
+    &lt;head&gt;
+        &lt;meta charset=&quot;UTF-8&quot; /&gt;
+        &lt;title&gt;window.performance User Timing measure() method is working properly with navigation timing
+               attributes&lt;/title&gt;
+        &lt;link rel=&quot;author&quot; title=&quot;Microsoft&quot; href=&quot;http://www.microsoft.com/&quot; /&gt;
+        &lt;link rel=&quot;help&quot; href=&quot;http://www.w3.org/TR/user-timing/#dom-performance-measure&quot;/&gt;
+        &lt;script src=&quot;/resources/testharness.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;/resources/testharnessreport.js&quot;&gt;&lt;/script&gt;
+        &lt;script src=&quot;resources/webperftestharness.js&quot;&gt;&lt;/script&gt;
+
+    &lt;script type=&quot;text/javascript&quot;&gt;
+        // navigation timing attributes
+        var timingAttributes = [
+            'connectEnd',
+            'connectStart',
+            'domComplete',
+            'domContentLoadedEventEnd',
+            'domContentLoadedEventStart',
+            'domInteractive',
+            'domLoading',
+            'domainLookupEnd',
+            'domainLookupStart',
+            'fetchStart',
+            'loadEventEnd',
+            'loadEventStart',
+            'navigationStart',
+            'redirectEnd',
+            'redirectStart',
+            'requestStart',
+            'responseEnd',
+            'responseStart',
+            'unloadEventEnd',
+            'unloadEventStart'
+        ];
+
+        // test data
+        var startMarkName = &quot;mark_start&quot;;
+        var startMarkValue;
+        var endMarkName = &quot;mark_end&quot;;
+        var endMarkValue;
+        var measures;
+        var testThreshold = 20;
+
+        // test measures
+        var measureTestDelay = 200;
+        var TEST_MEASURES =
+        [
+            {
+                name:                   &quot;measure_nav_start_no_end&quot;,
+                startMark:              &quot;navigationStart&quot;,
+                endMark:                undefined,
+                exceptionTestMessage:   &quot;window.performance.measure(\&quot;measure_nav_start_no_end\&quot;, &quot; +
+                                        &quot;\&quot;navigationStart\&quot;) ran without throwing any exceptions.&quot;,
+                expectedStartTime:      undefined,
+                expectedDuration:       undefined,
+                entryMatch:             undefined
+            },
+            {
+                name:                   &quot;measure_nav_start_mark_end&quot;,
+                startMark:              &quot;navigationStart&quot;,
+                endMark:                &quot;mark_end&quot;,
+                exceptionTestMessage:   &quot;window.performance.measure(\&quot;measure_nav_start_end\&quot;, \&quot;navigationStart\&quot;, &quot; +
+                                        &quot;\&quot;mark_end\&quot;) ran without throwing any exceptions.&quot;,
+                expectedStartTime:      undefined,
+                expectedDuration:       undefined,
+                entryMatch:             undefined
+            },
+            {
+                name:                   &quot;measure_mark_start_nav_end&quot;,
+                startMark:              &quot;mark_start&quot;,
+                endMark:                &quot;responseEnd&quot;,
+                exceptionTestMessage:   &quot;window.performance.measure(\&quot;measure_start_nav_end\&quot;, \&quot;mark_start\&quot;, &quot; +
+                                        &quot;\&quot;responseEnd\&quot;) ran without throwing any exceptions.&quot;,
+                expectedStartTime:      undefined,
+                expectedDuration:       undefined,
+                entryMatch:             undefined
+            },
+            {
+                name:                   &quot;measure_nav_start_nav_end&quot;,
+                startMark:              &quot;navigationStart&quot;,
+                endMark:                &quot;responseEnd&quot;,
+                exceptionTestMessage:   &quot;window.performance.measure(\&quot;measure_nav_start_nav_end\&quot;, &quot; +
+                                        &quot;\&quot;navigationStart\&quot;, \&quot;responseEnd\&quot;) ran without throwing any exceptions.&quot;,
+                expectedStartTime:      undefined,
+                expectedDuration:       undefined,
+                entryMatch:             undefined
+            }
+        ];
+
+        setup({explicit_done: true});
+
+        test_namespace();
+
+        function onload_test()
+        {
+            // test for existence of User Timing and Performance Timeline interfaces
+            if (window.performance.mark == undefined ||
+                window.performance.clearMarks == undefined ||
+                window.performance.measure == undefined ||
+                window.performance.clearMeasures == undefined ||
+                window.performance.getEntriesByName == undefined ||
+                window.performance.getEntriesByType == undefined ||
+                window.performance.getEntries == undefined)
+            {
+                test_true(false,
+                          &quot;The User Timing and Performance Timeline interfaces, which are required for this test, &quot; +
+                          &quot;are defined.&quot;);
+
+                done();
+            }
+            else
+            {
+                // create the start mark for the test measures
+                window.performance.mark(startMarkName);
+
+                // get the start mark's value
+                startMarkValue = window.performance.getEntriesByName(startMarkName)[0].startTime;
+
+                // create the test end mark using the test delay; this will allow for a significant difference between
+                // the mark values that should be represented in the duration of measures using these marks
+                setTimeout(measure_test_cb, measureTestDelay);
+            }
+        }
+
+        function measure_test_cb()
+        {
+            // create the end mark for the test measures
+            window.performance.mark(endMarkName);
+
+            // get the end mark's value
+            endMarkValue = window.performance.getEntriesByName(endMarkName)[0].startTime;
+
+            // loop through measure scenarios
+            for (var i in TEST_MEASURES)
+            {
+                var scenario = TEST_MEASURES[i];
+
+                if (scenario.startMark != undefined &amp;&amp; scenario.endMark == undefined)
+                {
+                    // only startMark is defined, provide startMark and don't provide endMark
+                    window.performance.measure(scenario.name, scenario.startMark);
+
+                    // when startMark is provided to the measure() call, the value of the mark or navigation
+                    // timing attribute whose name is provided is used for the startMark
+                    scenario.expectedStartTime = (timingAttributes.indexOf(scenario.startMark) != -1 ?
+                                                  window.performance.timing[scenario.startMark] -
+                                                  window.performance.timing.navigationStart :
+                                                  startMarkValue);
+
+                    // when endMark isn't provided to the measure() call, a DOMHighResTimeStamp corresponding to
+                    // the current time with a timebase of the navigationStart attribute is used
+                    scenario.expectedDuration = ((new Date()) - window.performance.timing.navigationStart) -
+                                                scenario.expectedStartTime;
+                }
+                else if (scenario.startMark != undefined &amp;&amp; scenario.endMark != undefined)
+                {
+                    // both startMark and endMark are defined, provide both parameters
+                    window.performance.measure(scenario.name, scenario.startMark, scenario.endMark);
+
+                    // when startMark is provided to the measure() call, the value of the mark or navigation
+                    // timing attribute whose name is provided is used for the startMark
+                    scenario.expectedStartTime = (timingAttributes.indexOf(scenario.startMark) != -1 ?
+                                                  window.performance.timing[scenario.startMark] -
+                                                  window.performance.timing.navigationStart :
+                                                  startMarkValue);
+
+                    // when endMark is provided to the measure() call, the value of the mark or navigation
+                    // timing attribute whose name is provided is used for the endMark
+                    scenario.expectedDuration = (timingAttributes.indexOf(scenario.endMark) != -1 ?
+                                                 window.performance.timing[scenario.endMark] -
+                                                 window.performance.timing.navigationStart :
+                                                 endMarkValue) - scenario.expectedStartTime;
+                }
+            }
+
+            // test the test measures are returned by getEntriesByName
+            for (var i in TEST_MEASURES)
+            {
+                entries = window.performance.getEntriesByName(TEST_MEASURES[i].name);
+                test_measure(entries[0],
+                            &quot;window.performance.getEntriesByName(\&quot;&quot; + TEST_MEASURES[i].name + &quot;\&quot;)[0]&quot;,
+                            TEST_MEASURES[i].name,
+                            TEST_MEASURES[i].expectedStartTime,
+                            TEST_MEASURES[i].expectedDuration);
+                TEST_MEASURES[i].entryMatch = entries[0];
+            }
+
+            done();
+        }
+
+        function test_measure(measureEntry, measureEntryCommand, expectedName, expectedStartTime, expectedDuration)
+        {
+            // test name
+            test_true(measureEntry.name == expectedName, measureEntryCommand + &quot;.name == \&quot;&quot; + expectedName + &quot;\&quot;&quot;);
+
+            // test startTime; since a measure's startTime is always equal to a mark's value or the value of a
+            // navigation timing attribute, the actual startTime should match the expected value exactly
+            test_true(Math.abs(measureEntry.startTime - expectedStartTime) == 0,
+                      measureEntryCommand + &quot;.startTime is correct&quot;);
+
+            // test entryType
+            test_true(measureEntry.entryType == &quot;measure&quot;, measureEntryCommand + &quot;.entryType == \&quot;measure\&quot;&quot;);
+
+            // test duration, allow for an acceptable threshold in the difference between the actual duration and the
+            // expected value for the duration
+            test_true(Math.abs(measureEntry.duration - expectedDuration) &lt;= testThreshold, measureEntryCommand +
+                      &quot;.duration is approximately correct (up to &quot; + testThreshold + &quot;ms difference allowed)&quot;);
+        }
+    &lt;/script&gt;
+    &lt;/head&gt;
+    &lt;body onload=&quot;onload_test();&quot;&gt;
+        &lt;h1&gt;Description&lt;/h1&gt;
+        &lt;p&gt;This test validates that the performance.measure() method is working properly when navigation timing
+           attributes are used in place of mark names. This test creates the following measures to test this method:
+            &lt;ul&gt;
+                &lt;li&gt;&quot;measure_nav_start_no_end&quot;: created using a measure() call with a navigation timing attribute
+                    provided as the startMark and nothing provided as the endMark&lt;/li&gt;
+                &lt;li&gt;&quot;measure_nav_start_mark_end&quot;: created using a measure() call with a navigation timing attribute
+                    provided as the startMark and a mark name provided as the endMark&lt;/li&gt;
+                &lt;li&gt;&quot;measure_mark_start_nav_end&quot;: created using a measure() call with a mark name provided as the
+                    startMark and a navigation timing attribute provided as the endMark&lt;/li&gt;
+                &lt;li&gt;&quot;measure_nav_start_nav_end&quot;: created using a measure() call with a navigation timing attribute
+                    provided as both the startMark and endMark&lt;/li&gt;
+            &lt;/ul&gt;
+           After creating each measure, the existence of these measures is validated by calling
+           performance.getEntriesByName() with each measure name.
+        &lt;/p&gt;
+
+        &lt;div id=&quot;log&quot;&gt;&lt;/div&gt;
+    &lt;/body&gt;
+&lt;/html&gt;
</ins></span></pre></div>
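<p>For reference (not part of the imported test above), a minimal sketch of the performance.measure() pattern this test exercises: a navigation timing attribute name may be passed in place of a mark name for either boundary, and the resulting entry's startTime and duration are reported relative to navigationStart. The mark, measure, and attribute names below are illustrative, not taken from the test.</p>
<pre>
// Sketch only: measure from a navigation timing attribute to a user-defined mark.
window.performance.mark("example_mark");
window.performance.measure("example_nav_to_mark", "responseEnd", "example_mark");

// Sketch only: measure between two navigation timing attributes.
window.performance.measure("example_nav_to_nav", "navigationStart", "responseEnd");

var entry = window.performance.getEntriesByName("example_nav_to_nav")[0];
// entry.startTime is 0 (navigationStart measured relative to itself) and
// entry.duration is timing.responseEnd - timing.navigationStart.
</pre>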
<a id="trunkLayoutTestsimportedw3cwebplatformtestsusertimingw3cimportlog"></a>
<div class="addfile"><h4>Added: trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/w3c-import.log (0 => 211333)</h4>
<pre class="diff"><span>
<span class="info">--- trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/w3c-import.log                                (rev 0)
+++ trunk/LayoutTests/imported/w3c/web-platform-tests/user-timing/w3c-import.log        2017-01-28 09:26:27 UTC (rev 211333)
</span><span class="lines">@@ -0,0 +1,32 @@
</span><ins>+The tests in this directory were imported from the W3C repository.
+Do NOT modify these tests directly in WebKit.
+Instead, create a pull request on the W3C CSS or WPT github:
+        https://github.com/w3c/csswg-test
+        https://github.com/w3c/web-platform-tests
+
+Then run the Tools/Scripts/import-w3c-tests script in WebKit to reimport.
+
+Do NOT modify or remove this file.
+
+------------------------------------------------------------------------
+Properties requiring vendor prefixes:
+None
+Property values requiring vendor prefixes:
+None
+------------------------------------------------------------------------
+List of files:
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/OWNERS
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/idlharness.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_marks.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_clear_measures.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_entry_type.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_exists.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_with_timing_attributes.js
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_and_measure_exception_when_invoke_without_parameter.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_exceptions.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_mark_with_name_of_navigation_timing_optional_attribute.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_exceptions.html
+/LayoutTests/imported/w3c/web-platform-tests/user-timing/test_user_timing_measure_navigation_timing.html
</ins></span></pre>
</div>
</div>

</body>
</html>