[webkit-dev] PSA: higher precision event timestamp landing soon - port verification needed

Karen Shaeffer shaeffer at neuralscape.com
Mon Jun 10 20:32:21 PDT 2013


On Mon, Jun 10, 2013 at 08:25:57PM -0400, Rick Byers wrote:
> On Mon, Jun 10, 2013 at 5:19 PM, Benjamin Poulain <benjamin at webkit.org>wrote:
> 
> > On Mon, Jun 10, 2013 at 7:37 AM, Rick Byers <rbyers at chromium.org> wrote:
> >
> >> There's been discussion / patches in the past for exposing system time as
> >> a separate timestamp on the Event object (as IE does).  See
> >> https://lists.webkit.org/pipermail/webkit-dev/2012-October/022574.html,
> >> https://bugs.webkit.org/show_bug.cgi?id=94987 and
> >> http://lists.w3.org/Archives/Public/public-web-perf/2012Oct/0046.html.
> >>
> >> In particular, the use of UNIX-epoch timestamps means such measurements
> >> will never be completely accurate (due to NTP skew, leap seconds, etc.).
> >>  But just updating the timestamp everyone uses to be more accurate (even if
> >> not perfect) seems like a clear win.
> >>
> >> Do you think both approaches should be pursued, or is updating the
> >> existing timestamp to be as accurate as possible within the epoch semantics
> >> good enough?
> >>
> >
> >  Kind of different goals in one timestamp. :)
> >
> > For input events, the accurate time delta covers many use cases. High
> > precision time would be nice but it is not really a must have.
> >
> 
> Right, but isn't NTP skew a problem (at least in theory) even for accurate
> time deltas when using an epoch-based timestamp?  At least I believe that's
> part of the push back flackr@ got when he tried to plumb PlatformEvent
> timestamps into the DOM event objects a few months back.
> 
> 
> > For other kind of events, a high precision timestamp like you suggest
> > could make sense.
> >
> 
> > Benjamin
--- end quoted text ---

Hi,
On commodity digital hardware, a high-resolution system clock gets its time
reference from NTP. Even a good NTP implementation and configuration on
commodity technology is accurate only to the millisecond range at best. So while
a high-resolution system clock provides finer precision, it is no more accurate
than the time reference NTP provides.
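To make the precision/accuracy distinction concrete, here is a minimal sketch
(assuming Python on a POSIX-like system; the clock names and behavior are as
documented for the standard `time` module, not anything specific to WebKit):

```python
import time

# time.time_ns() reads the NTP-disciplined wall clock: it reports nanosecond
# *precision*, but its *accuracy* is bounded by NTP sync quality, typically
# milliseconds at best on commodity hardware and networks.
wall_a = time.time_ns()

# time.monotonic_ns() reads a clock that NTP cannot step backwards, so it is
# the safer basis for measuring a duration between two local events.
mono_a = time.monotonic_ns()

# ... some work happens here ...

wall_b = time.time_ns()
mono_b = time.monotonic_ns()

# The monotonic delta can never be negative. The wall-clock delta *can* be,
# if NTP steps the clock between the two reads.
print(mono_b - mono_a >= 0)  # always True
```

The point of the sketch: the extra digits the wall clock reports are precision,
not accuracy, and a duration computed from two wall-clock reads inherits every
NTP adjustment that lands between them.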

If you want more accuracy, the way to get it is to improve the
characteristics of your NTP source and of the network between your endpoint
device and that source. Only then would it be valid to claim better accuracy
from a high-precision clock.

I suspect folks want the higher precision because, in theory, it can provide more
accurate time differences around a local time point. In practice, though, there are
numerous uncertainties that degrade one's confidence in the accuracy of a
duration measured with a high-resolution clock. Just off the top of my head: what
happens when the time is adjusted to maintain sync with NTP? If you take a
high-precision time duration across that event... What about the jitter in the
system call that acquires the high-resolution time points? If you time a system
call on Linux, you will see it has its own jitter. While that jitter likely
wouldn't impact the accuracy of a millisecond-precision reading, it could impact
the accuracy of a microsecond- or even nanosecond-resolution reading from a
high-resolution clock. What about the frequency jitter inherent in all commodity
digital timekeeping? What about the problems of constructing a high-resolution
time duration from readings taken on different cores of a multicore platform?
I am certain there are other issues as well. I'm out of time. Very busy today.
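The system-call jitter mentioned above is easy to observe directly. A rough
sketch (assuming CPython on Linux; the absolute numbers are illustrative and
will vary with platform and load):

```python
import statistics
import time

# Back-to-back reads of the monotonic clock differ by roughly the overhead of
# the clock-read call itself; the spread of those differences is the jitter
# that the call contributes to any duration you measure with it.
samples = []
for _ in range(10_000):
    a = time.monotonic_ns()
    b = time.monotonic_ns()
    samples.append(b - a)

overhead_ns = statistics.median(samples)  # typical per-read cost
spread_ns = statistics.pstdev(samples)    # jitter around that cost

# On typical commodity hardware the overhead is tens of nanoseconds, with
# occasional large outliers from preemption and cache misses. That is
# negligible against millisecond precision, but it is a real error term
# against microsecond or nanosecond precision.
```

This is the practical version of the argument: the noise floor of the
measurement mechanism itself caps how much of the clock's precision you can
honestly call accuracy.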

In summary, if you want to work with high resolution clocks on commodity digital hardware,
then you can talk about precision, but be very careful with the notion of accuracy.

enjoy,
Karen
-- 
Karen Shaeffer
Neuralscape, Mountain View, CA 94040

