[webkit-dev] Manipulate / gesture events

Hiitola Kari (Nokia-D/Tampere) kari.hiitola at nokia.com
Wed Oct 15 09:35:12 PDT 2008


Hi,

My name is Kari Hiitola, and I work on the same project at Nokia as Jonni
Rainisto, who emailed about it about a month ago. As he said, we are working
towards enabling a much more compelling web experience on small-screen
touch/gesture devices. We are eager to move the technology Apple introduced
to WebKit forward for wider acceptance. We have worked with the software and
hardware stacks used at Nokia and tried to come up with general solutions
that build on top of existing implementations but are also general enough
for the whole industry to use, and that could be moved forward in W3C
standardisation. For the record, everything we do is or will be open source;
the only limiting factors are some legal hurdles imposed on us externally.

So, as we have already done some test implementations, I would like to ask
for your comments on some change proposals we have, so that we can make a
general enough implementation in WebKit. If there is agreement, we will be
happy to contribute the final implementation to WebKit.

I'll start by elaborating more on the gesture events. We looked at the
iPhone's GestureEvent
<http://www.opensource.apple.com/darwinsource/iPhone2.1/WebCore-351.9/dom/GestureEvent.idl>,
but we'd like to see a somewhat more generic event, more suitable for
becoming a standard. I'd like to hear your feedback on this proposal, and
whether there are parties interested in participating in standardization of
the events.

1. Name the event "manipulate"
A different name would help keep the Apple proprietary gesture event in
devices alongside the potentially standards-track version. The gesture event
really is all about manipulating an object on the web page, and different
device types might offer different methods for doing that. For example,
zooming could be point & roll instead of pinch, and rotation could be
point & joystick (just random picks, not related to any real existing or
upcoming product). You wouldn't call these methods gestures, but they are
still manipulation.

2. Enable panning with the same event as zoom and rotate
Take for example a map that you can pan, zoom and rotate with a single event
handler, or the classic multi-touch photo organizer demo (where pan is
roughly equivalent to drag). The JS code is simplified a lot when you don't
receive the pan (or raw touch) events separately from the zoom/rotate
events. The events are used together most of the time, and you can naturally
just ignore the parameters you don't need. In its simplest form you'd just
need to actually use the X/Y coordinates that already exist in the iPhone's
GestureEvent, but there is an additional need that would change the logic a
bit:

3. Allow starting the "pan" manipulate event with one finger
iPhone's GestureEvent requires two fingers to be down to trigger the event,
so panning would not be possible with one finger. Admittedly, starting the
pan with one finger raises a question: what is the correct coordinate in the
scenario where you first start panning with finger 1, then zoom/rotate with
two fingers, release finger 1, and continue panning with finger 2?

3.1 First option: Use relative coordinates for the pan, with the start of
the gesture as 0,0. Page/client/screen X/Y are then at the position of the
only finger, or at the midpoint of two fingers if both are present. This
results in "jumpy", non-continuous page/client/screen coordinates when
fingers enter and exit the screen, but there is no "lying" about the
coordinates, and the pan coordinates remain continuous. Modeling all fingers
separately in the manipulation event wouldn't be a good idea either, because
of the added complexity; if you want raw touch events, you should use raw
touch events.
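As a rough sketch of option 3.1 (all names here are made up for
illustration; nothing below is a shipping API), the implementation would
accumulate the movement of the finger midpoint into the pan offsets, and
re-base the midpoint whenever a finger goes down or up, so the pan stays
continuous while client coordinates stay honest:

```javascript
// Midpoint of one or two fingers; with one finger it is the finger itself.
function centerOf(touches) {
  const x = touches.reduce((sum, t) => sum + t.x, 0) / touches.length;
  const y = touches.reduce((sum, t) => sum + t.y, 0) / touches.length;
  return { x: x, y: y };
}

function makePanTracker(initialTouches) {
  let prevCenter = centerOf(initialTouches); // gesture start is pan 0,0
  let panX = 0, panY = 0;
  return {
    // Call on finger movement: accumulate the midpoint's delta into the pan.
    move(touches) {
      const c = centerOf(touches);
      panX += c.x - prevCenter.x;
      panY += c.y - prevCenter.y;
      prevCenter = c;
      return { clientX: c.x, clientY: c.y,   // "honest" but jumpy
               panOffsetX: panX, panOffsetY: panY };
    },
    // Call when a finger is added or lifted: the midpoint jumps, so reset
    // the reference point without letting the jump leak into the pan.
    rebase(touches) {
      prevCenter = centerOf(touches);
    },
  };
}
```

Note how clientX jumps from the finger position to the two-finger midpoint
when a second finger goes down, while panOffsetX keeps counting smoothly.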

3.2 Second option: Use absolute coordinates for the page/client/screen X/Y
so that they are continuous. This requires a bit of "lying" about the
coordinates in the case of multi-touch. Instead of reporting the actual
position of one of the fingers, you apply the movement delta to the original
coordinates of the first finger. When you put the second finger down, the
coordinates remain constant at the place of the first finger. When both
fingers move, the delta movement of their center point is applied to the
original coordinate. If the first finger is lifted, the coordinates are not
in the place of the remaining finger, and can even be outside the screen,
but those events can be filtered out if found necessary.
With this trick the coordinates are not necessarily the center of rotation
or the center of scaling, which is also the case on device types different
from the iPhone. Thus, the rotation and scale center should be added as
attributes. I don't see a need for separate rotation and scale centers; a
combined centerX/Y in client coordinates should suffice.
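Option 3.2 can be sketched the same way (again, every name below is
hypothetical): the reported coordinate starts at the first finger's position
and only ever moves by the midpoint's delta, so it stays continuous even
though it stops coinciding with any real finger:

```javascript
// Midpoint of one or two fingers; with one finger it is the finger itself.
function centerOf(touches) {
  const x = touches.reduce((sum, t) => sum + t.x, 0) / touches.length;
  const y = touches.reduce((sum, t) => sum + t.y, 0) / touches.length;
  return { x: x, y: y };
}

function makeAbsoluteTracker(firstTouch) {
  // The "lied" coordinate: starts at the first finger, then accumulates
  // only the midpoint's movement delta.
  let reported = { x: firstTouch.x, y: firstTouch.y };
  let prevCenter = { x: firstTouch.x, y: firstTouch.y };
  return {
    move(touches) {
      const c = centerOf(touches);
      reported = { x: reported.x + (c.x - prevCenter.x),
                   y: reported.y + (c.y - prevCenter.y) };
      prevCenter = c;
      return { clientX: reported.x, clientY: reported.y };
    },
    // A finger went down or up: the midpoint jumps, but the reported
    // coordinate must not, so only the reference midpoint is reset.
    rebase(touches) {
      prevCenter = centerOf(touches);
    },
  };
}
```

After lifting the first finger, the reported coordinate is not at the
remaining finger, exactly as described above, and can drift off-screen.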

3.3 A combination: Add a new panX/Y attribute, which would be the same as
X/Y from option 3.2, while the page/client/screen X/Y would be "honest" and
"jumpy" as in 3.1.

Here's the IDL reflecting proposal 3.1, which would be our favorite;
panOffsetX/Y is in fact the only change needed in addition to the name
change:

module events {
    interface ManipulateEvent : UIEvent {
        void initManipulateEvent(in AtomicString type,
                                 in boolean canBubble,
                                 in boolean cancelable,
                                 in DOMWindow view,
                                 in long detail,
                                 in long screenX,
                                 in long screenY,
                                 in long clientX,
                                 in long clientY,
                                 in long panOffsetX,
                                 in long panOffsetY,
                                 in boolean ctrlKey,
                                 in boolean altKey,
                                 in boolean shiftKey,
                                 in boolean metaKey,
                                 in EventTarget target,
                                 in float scale,
                                 in float rotation);

        readonly attribute EventTarget target;

        readonly attribute long panOffsetX;  // delta X coordinate of the pan
        readonly attribute long panOffsetY;  // delta Y coordinate of the pan

        // Distance between fingers (since event start) as a multiplier of
        // the initial value. Initially 1.0; zoom out (pinch) < 1.0,
        // zoom in > 1.0.
        readonly attribute float scale;

        // Rotation delta (since event start) in degrees; clockwise is
        // positive. Initially 0.0.
        readonly attribute float rotation;

        readonly attribute boolean ctrlKey;
        readonly attribute boolean shiftKey;
        readonly attribute boolean altKey;
        readonly attribute boolean metaKey;
    };
}
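To illustrate how a page might consume this proposal (the "manipulate" event
name and the panOffsetX/Y, scale and rotation attributes are part of the
proposal above, not any shipping API), a single handler could drive a CSS
transform from all three manipulations at once:

```javascript
// Build a CSS transform string from a (proposed, hypothetical)
// ManipulateEvent-like object. Unused parameters can simply be ignored.
function transformFor(e) {
  return 'translate(' + e.panOffsetX + 'px,' + e.panOffsetY + 'px) ' +
         'scale(' + e.scale + ') ' +
         'rotate(' + e.rotation + 'deg)';
}

// In a page this would be wired up as, e.g.:
//   var map = document.getElementById('map');
//   map.addEventListener('manipulate', function (e) {
//     map.style.transform = transformFor(e);
//   }, false);
```

This is the point of proposal 2: one handler covers pan, zoom and rotate
instead of juggling separate pan/touch and gesture listeners.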

We'll try to hang around on the IRC channel today for discussion (khiitola
and Jonni at least), in spite of the terrible 10-hour time difference to
California. Or you can continue the discussion on the mailing list.

Best Regards,
 - Kari Hiitola