felipe at crochik.com
Mon Oct 17 04:55:17 PDT 2011
That is the problem: how can I "separate" the touch events implemented by
web sites (e.g. pinching the map in Google Maps, or swiping to switch pages
in Google Images search results) from "gestures for user interaction"
(e.g. changing the zoom factor, scrolling the page)?
It seems that I should first give the "page" a chance to handle the
gestures, and only if it does not "need" them let them act on the whole
view. Every "sample" I have seen starts by intercepting all user
mouse/touch/gesture events and forwards only the mouse clicks to "webkit".
p.s. I really enjoy reading your blog and have come across your work many times.
On Sun, Oct 16, 2011 at 11:15 PM, Ariya Hidayat <ariya.hidayat at gmail.com> wrote:
> On Sun, Oct 16, 2011 at 6:40 PM, Felipe Crochik <felipe at crochik.com>
> > I can't seem to find a definitive answer on whether QtWebKit can or
> > can't support gestures (pinch, swipe, ...)
> Do you mean the gestures for user interactions? For example, pinch can
> be used to zoom in and out, swipe to flick the view, etc. In that
> case, those gestures are application-specific gestures and should be
> handled at the application level, e.g. the browser which uses
> If what you mean is touch events (http://www.w3.org/TR/touch-events/),
> then see http://trac.webkit.org/wiki/QtWebKitFeatures21.
> Ariya Hidayat, http://ariya.ofilabs.com