So you thought moving a pointer/cursor on-screen is simple? Well... no.
Having recently spent a day fixing up freedesktop Bug 31636, I figured maybe I should take you on a journey that starts with naïveté and ends in insanity. Just like Twilight.
What is actually needed to move a pointer on the screen? Move your mouse. The driver submits two relative coordinates to the server and expects the pointer to move. So we take the coordinates, add them to the last known position, and we're done.
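In code, the naive approach is about as simple as it sounds. A minimal sketch (all struct and function names here are made up for illustration, this is not actual server code):

```c
/* Naive relative motion: add the deltas to the last known position.
 * All names here are illustrative, not actual X server API. */
struct pointer {
    int x;
    int y;
};

static void move_relative(struct pointer *ptr, int dx, int dy)
{
    ptr->x += dx;
    ptr->y += dy;
}
```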
A 1:1 movement of course only works for slow mouse movements. For larger deltas, we need to accelerate the pointer movement so we can easily cover larger distances on the screen. So we look at the timestamps of the events and their deltas, and from those calculate an acceleration factor. This factor is applied to the latest deltas and decides the actual movement.
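A toy version of such an acceleration function could look like the sketch below. The threshold and the curve are invented for illustration; the real server uses configurable acceleration profiles and schemes.

```c
#include <math.h>
#include <stdint.h>

/* Toy acceleration: estimate the speed from the deltas and the time
 * between events, then scale the deltas by a factor derived from it.
 * Threshold and curve are made up; the server's are configurable. */
static void accelerate(double *dx, double *dy,
                       uint32_t now_ms, uint32_t last_ms)
{
    double dt = (now_ms > last_ms) ? (double)(now_ms - last_ms) : 1.0;
    double speed = sqrt(*dx * *dx + *dy * *dy) / dt; /* units per ms */
    double factor = (speed > 0.5) ? 1.0 + (speed - 0.5) * 2.0 : 1.0;

    *dx *= factor;
    *dy *= factor;
}
```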
Now we've pretty much covered traditional mice. Many devices however are absolute input devices. They don't send delta coordinates, they just give us the absolute position of the pointer. However, that position is not in pixels but in some device-specific coordinate system. So we need to remember the axis ranges and scale from the device coordinate system into the screen coordinate system.
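The scaling itself is a simple linear mapping from one range into another; the server's xf86ScaleAxis() helper does essentially this:

```c
/* Linear scaling of an axis value from one range into another,
 * e.g. device coordinates into screen pixels. Assumes
 * from_max > from_min. */
static int scale_axis(int value, int from_min, int from_max,
                      int to_min, int to_max)
{
    return to_min +
           (value - from_min) * (to_max - to_min) / (from_max - from_min);
}
```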
Many users have more than one screen. Two or more screens, when not in mirrored mode, create a desktop that is larger than each single screen. For absolute devices, we map the device to the desktop coordinates so that each edge of the device maps to the corresponding edge of the desktop. Absolute events from such a device must first be mapped to desktop coordinates. From those coordinates we can determine which screen the pointer is to be on and convert the coordinates back to per-screen coordinates to draw the visible cursor. For relative devices the process is similar: we add the movement to the desktop coordinates, then clip back to per-screen coordinates for the update.
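Here's a rough sketch of the desktop-to-screen step, with the screen layout simplified to a list of rectangles in desktop coordinates (names are illustrative, not server internals):

```c
#include <stddef.h>

/* A screen's geometry, expressed in desktop coordinates. */
struct screen {
    int x, y;          /* origin of the screen on the desktop */
    int width, height;
};

/* Find the screen containing a desktop position and convert that
 * position into per-screen coordinates. */
static const struct screen *
desktop_to_screen(const struct screen *screens, int nscreens,
                  int desktop_x, int desktop_y,
                  int *screen_x, int *screen_y)
{
    for (int i = 0; i < nscreens; i++) {
        const struct screen *s = &screens[i];
        if (desktop_x >= s->x && desktop_x < s->x + s->width &&
            desktop_y >= s->y && desktop_y < s->y + s->height) {
            *screen_x = desktop_x - s->x;
            *screen_y = desktop_y - s->y;
            return s;
        }
    }
    return NULL; /* dead zone between screens; the position needs clamping */
}
```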
All of the above is pretty standard and doesn't require any X specifics. Let's get into what the X11 protocol requires.
evdev has a calibration feature that allows a device to be adjusted for differences between the actual and the announced coordinates. This is needed when a device's real axis ranges differ from what the device announces. For example, a device may claim that an axis starts at 0, but really the first value you get out of it is e.g. 50. For historical reasons we cannot change the device axes once they are set up, though. So evdev's calibration scales from the calibrated device range (e.g. 50-950) into the announced device range (0-1000). That scaled coordinate is then posted to the server. The wacom driver has a similar feature (called Area).
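The calibration is another linear mapping, this time from the calibrated range back into the announced range. A sketch (names are illustrative):

```c
/* Evdev-style calibration: map from the range the device really
 * delivers (e.g. 50-950) back into the range it announced
 * (e.g. 0-1000). Names are illustrative. */
static int apply_calibration(int value,
                             int calib_min, int calib_max, /* real range */
                             int axis_min, int axis_max)   /* announced range */
{
    return axis_min +
           (value - calib_min) * (axis_max - axis_min) /
           (calib_max - calib_min);
}
```

With a calibrated range of 50-950 and an announced range of 0-1000, a raw value of 50 comes out as 0 and a raw 950 as 1000, as expected.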
The X Input Extension (XI) provides so-called "valuators" (== axes) to the clients as part of the various input events. Valuators 0 and 1 are x and y. XI requires valuator data to be in absolute device coordinates, but those are per protocol screen. In old-style multi-monitor setups with two "Device" sections in the xorg.conf, you have more than one protocol screen. The device itself however is still mapped to the whole desktop. So we convert device coordinates to desktop coordinates, then to screen coordinates on the current screen, and then that position is converted back into device coordinates. Bonus points for considering what happens in a setup with three monitors but only two protocol screens.
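Roughly, the round-trip for one axis looks like the sketch below. This is heavily simplified: it ignores calibration, subpixel precision and anything but a horizontal screen layout, and all names are made up.

```c
/* Sketch of the XI valuator round-trip on one axis: device coordinates
 * go to desktop coordinates, are clipped to the current protocol
 * screen, and are then converted back into device coordinates as if
 * the device covered only that screen. Assumes dev_max > dev_min and
 * screen_size > 0. */
static int device_to_screen_device(int dev_value,
                                   int dev_min, int dev_max,
                                   int desktop_size,  /* total desktop width */
                                   int screen_origin, /* screen x on desktop */
                                   int screen_size)   /* screen width */
{
    /* device -> desktop */
    int desktop = (dev_value - dev_min) * desktop_size / (dev_max - dev_min);
    /* desktop -> per-screen */
    int screen = desktop - screen_origin;
    /* per-screen -> device coordinates scaled to this one screen */
    return dev_min + screen * (dev_max - dev_min) / screen_size;
}
```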
If you kept counting, you should be up to 5 coordinate systems now. For an absolute device:

- device coordinate system
- adjusted device coordinate system after calibration is applied
- desktop-wide coordinate system
- per-screen coordinate system
- per-screen device coordinate system

For a relative device, the chain is:

- relative coordinates
- relative coordinates after device-specific acceleration
- desktop-wide coordinate system
- per-screen coordinate system