Tuesday, February 7, 2012

Multitouch in X - Multitouch-touchpads

This post is part of a series on multi-touch support in the X.Org X server.
  1. Multitouch in X - Getting events
  2. Multitouch in X - Pointer emulation
  3. Multitouch in X - Touch grab handling
  4. Multitouch in X - Multitouch-touchpads
In this post, I'll outline the handling of touch events for multitouch-capable touchpads. The touchpads relevant here are those that provide position information for more than one finger. The current version of the synaptics X driver does some tricks to fake two-finger interaction on single-finger touchpads; such devices are not covered here.

Touchpads are primarily pointer devices, and any multi-touch interaction on them is usually a gesture. In the protocol, such devices are of type XIDependentTouch and the server adjusts touch event delivery accordingly. I've already hinted at this here, but this time I'll give a more detailed explanation.
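
As an aside, this is roughly how an XI 2.2 client can check a device's touch mode. The helper name print_touch_mode is made up for illustration, error handling is omitted, and it assumes the server and libXi support XI 2.2:

    /* Sketch: print whether a device reports dependent or direct touch */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/XInput2.h>

    static void print_touch_mode(Display *dpy, int deviceid)
    {
        int ndevices;
        XIDeviceInfo *info = XIQueryDevice(dpy, deviceid, &ndevices);
        int i;

        for (i = 0; i < info->num_classes; i++) {
            if (info->classes[i]->type == XITouchClass) {
                XITouchClassInfo *t = (XITouchClassInfo*)info->classes[i];
                printf("%s: %s touch device, up to %d touches\n",
                       info->name,
                       (t->mode == XIDependentTouch) ? "dependent" : "direct",
                       t->num_touches);
            }
        }

        XIFreeDeviceInfo(info);
    }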

Touch event delivery

Dependent devices use a different picking mechanism in the server than direct touch devices such as touchscreens. We assume that all gestures are semantically associated with the cursor position. For example, to scroll you would move the cursor on top of the window to be scrolled, then start scrolling. The server adjusts event delivery accordingly: whereas for direct touch devices the touch events are delivered to whichever window is at the position of the touch, touch events from dependent devices are always delivered to the window underneath the pointer (grab semantics are adjusted to follow the same rules). So if you start a gesture in the top-left corner of the touchpad, the window underneath the cursor gets the events with the top-left coordinates. Note that the event and root coordinates always reflect the pointer position.
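
To make that concrete, here is a minimal sketch of selecting motion and touch events on a window with XI 2.2. The function name select_touch_events is made up and error handling is omitted; for a dependent device, the window receives the touch events only while the pointer is inside it:

    #include <string.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/XInput2.h>

    static void select_touch_events(Display *dpy, Window win)
    {
        unsigned char m[XIMaskLen(XI_LASTEVENT)];
        XIEventMask mask;

        memset(m, 0, sizeof(m));
        XISetMask(m, XI_Motion);        /* single-finger cursor movement */
        XISetMask(m, XI_TouchBegin);
        XISetMask(m, XI_TouchUpdate);
        XISetMask(m, XI_TouchEnd);

        mask.deviceid = XIAllMasterDevices; /* or a specific device */
        mask.mask_len = sizeof(m);
        mask.mask = m;

        /* For a dependent device, the touch events arrive only while the
           pointer is inside win, regardless of where on the touchpad the
           fingers are placed. */
        XISelectEvents(dpy, win, &mask, 1);
        XFlush(dpy);
    }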

The average multi-touch touchpad has two modes of operation: single-finger operation, where the purpose is to move the visible cursor, and multi-finger operation, which is usually interpreted as a gesture by the client. These two modes are important, as they too affect event delivery. The protocol specifies that any interaction with the device that only serves to move the visible cursor should not generate touch events, and that touch events start once that interaction becomes a true multi-touch interaction. This leaves the drivers a little bit of leeway, but the current implementation in the synaptics driver is the following (a sketch of how a client might see the resulting event stream follows the list):
  1. A user places one finger on the touchpad and moves. The client will receive regular pointer events.
  2. The user places a second finger on the touchpad. The client now receives a TouchBegin event for both the first and the second touch, at their respective current positions in the device coordinate range.
  3. Movement of either finger now will generate touch events, but no pointer events.
  4. Any other fingers will generate touch events only.
  5. When one of the two remaining fingers is lifted off the touchpad, a TouchEnd is sent for both the terminating touch and the remaining touch. The remaining finger reverts to generating pointer events.
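
Here is the sketch mentioned above: a hypothetical client event loop matching that sequence. It assumes dpy is an open display, xi_opcode is the XI2 extension opcode from XQueryExtension, and that events were selected as in the earlier sketch.

    XEvent ev;
    while (1) {
        XNextEvent(dpy, &ev);

        if (ev.xcookie.type != GenericEvent ||
            ev.xcookie.extension != xi_opcode ||
            !XGetEventData(dpy, &ev.xcookie))
            continue;

        XIDeviceEvent *de = ev.xcookie.data;

        switch (ev.xcookie.evtype) {
        case XI_Motion:
            /* one finger on the touchpad: normal cursor movement */
            break;
        case XI_TouchBegin:
            /* a second (or later) finger went down: start of a possible
               gesture; de->detail is the touch id, de->event_x/y reflect
               the pointer position, the valuators carry the touch position
               in the device coordinate range */
            break;
        case XI_TouchUpdate:
            /* a finger moved while two or more fingers are down */
            break;
        case XI_TouchEnd:
            /* this touch sequence is over; once only one finger remains,
               it reverts to generating pointer events */
            break;
        }

        XFreeEventData(dpy, &ev.xcookie);
    }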

Legacy in-driver gestures

As you are likely aware, the synaptics driver currently supports several pseudo gestures such as tap-to-click or two-finger scrolling. These gestures are interpreted in the driver, thus the server and client never see the raw data for them.

With proper multi-touch support these gestures are now somewhat misplaced. On the one hand, we want the clients to interpret multitouch, on the other hand we want the gestures to be handled in the same manner in all applications. (Coincidentally, this is also a problem that we need to solve for Wayland).

We toyed with ideas of marking emulated events so clients can filter them, but since we need to stay compatible with the core and XI 1.x behaviours, we only found one solution: any in-driver gesture that alters event delivery must not generate touch events. Thus, if the driver is set to handle two-finger scrolling, clients will only see pointer events and scroll events; they will not see touch events for the two fingers. To have two-finger scrolling handled by the client, the in-driver gesture must be disabled. The obvious side-effect is that you then cannot scroll in applications that don't support the gesture. Oh well, it's the price we have to pay for having integrated gesture support in the wrong place.
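
For reference, disabling the in-driver two-finger scrolling can be done with an xorg.conf.d snippet along these lines (option names as documented in the synaptics man page; the Identifier string is arbitrary):

    Section "InputClass"
        Identifier "touchpad: hand two-finger scrolling to clients"
        MatchIsTouchpad "on"
        Driver "synaptics"
        # with these off, two-finger movement generates touch events
        # instead of being consumed by the driver's scrolling code
        Option "VertTwoFingerScroll" "off"
        Option "HorizTwoFingerScroll" "off"
    EndSection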

3 comments:

  1. Just curious: do you plan to support users that rest their thumb at the bottom of the touchpad? Mac and other systems support this mode of operation.

  2. Why can't the in-driver gestures be moved to the server, so you can do different things for XI1 and XI2 clients? (I.e. if the events would go to a non-XI2 client, do the gesture interpretation in the server, otherwise pass touch events. It sure sounds like you could do that without breaking XI2 stuff for touchpads, since the client under the pointer is always going to get all the events anyway.)

    I'm sure you've thought of something like this and discarded the idea for some reason, I'd just like to know why :P

  3. Spudd86:
    Gestures are very context-specific and the server almost never has enough context to decide. The client is the only one that can sensibly interpret whether two or more touchpoints are a gesture or just independent touchpoints moving things around.

