This time we talk trackpoints. Or pointing sticks, or whatever else you want
to call that thing between the G, H, and B keys. If you don't have one and you've
never seen one, prepare to be amazed. [1]
Trackpoints are tiny joysticks that react to pressure [2], convert that
pressure into relative x/y events and pass that on to whoever is
interested in it. The harder you push, the higher the deltas.
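If you've never looked at what such a device actually produces: the events end up on an
evdev node, and you can peek at them with, for example, the python-evdev bindings. Here's
a minimal sketch; the device path is just a placeholder for illustration.
#!/usr/bin/env python3
# Minimal sketch: print the relative deltas a trackpoint sends. Uses the
# python-evdev bindings; /dev/input/event17 is a placeholder, pick your
# trackpoint's node from `libinput list-devices` or /proc/bus/input/devices.
from evdev import InputDevice, ecodes

dev = InputDevice("/dev/input/event17")
print(f"reading from {dev.name}")

for event in dev.read_loop():
    if event.type == ecodes.EV_REL and event.code in (ecodes.REL_X, ecodes.REL_Y):
        axis = "x" if event.code == ecodes.REL_X else "y"
        print(f"delta {axis}: {event.value:+d}")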
This is where the simple and obvious stops and it gets difficult. But then
again, if it was that easy I wouldn't write this post, you wouldn't have
anything to read, so somehow everyone wins. Whoop-dee-doo.
All the data and measurements below refer to my trackpoint, a
Lenovo T440s. It may not apply to any other trackpoint, including those on
different laptop models or even on the same laptop model with different
firmware versions. I've written the below with a lot of cringing and
handwringing. I want to present data that is irrefutable, but the
universe is against me and what the universe wants, the universe gets.
Approximately every second sentence below has a footnote of "actual results
may vary". Feel free to re-create the data on your device though.
Measuring trackpoint range is highly subjective, so you'll have to trust me
when I describe how specific speeds/pressure ranges feel. There are three
ranges of pressure on my trackpoint (sort-of):
- Pressure range one: When resting the finger on the trackpoint I don't
really need to apply noticeable pressure to make the trackpoint send events.
Just moving the finger on the trackpoint makes it send events, albeit
sporadically.
- Pressure range two: Going beyond range one requires applying real
pressure and feels to me like we're getting into RSI territory. Not a
problem for short periods, but definitely not something I'd want all the
time. It's the pressure I'd use to cross the screen.
- Pressure range three: I have to push hard. I definitely
wouldn't want to do this during everyday interaction and it just feels wrong
anyway. This pressure range is for testing maximum deltas,
not one you would want to use otherwise.
The first and second ranges are easier to delineate than the second and third
because going from almost no pressure to some real pressure is easy. Going from
some pressure to too much pressure is blurrier; there is some overlap between the
second and third range. Either way, keep these ranges in mind as I'll be using
them in the explanations below.
Ok, so with the physical conditions explained, let's look at what we have to
worry about in software:
- It is impossible to provide a constant input to a trackpoint if you're
a puny human. Without a robotic setup you just cannot apply constant
pressure, so any measurement has some error. You also get to enjoy a
feedback loop - pressure influences pointer motion but that pointer motion
influences how much pressure you inadvertently apply. This makes any
comparison riddled with errors. I don't know if I'm applying the same
pressure on the two devices I'm testing, and I don't know whether a
user I'm asking to test something uses constant/the same/the right pressure.
- Not all trackpoints are created equal. Some trackpoints (mostly in
Lenovos) have configurable sensitivity - 256 levels of it. [3] So one
trackpoint measured does not equal another trackpoint unless you keep track
of the firmware-set sensitivity. Those trackpoints also have other
toggles. More importantly and AFAIK, this type of trackpoint also has a
built-in acceleration curve. [4] Other trackpoints (ALPS) just have a
fixed sensitivity; I have no idea whether those have a built-in acceleration
curve or merely a linear-ish pressure-to-delta mapping.
Due to some design choices we made years ago, systemd increases the
sensitivity on some devices (the POINTINGSTICK_SENSITIVITY property).
So even on a vanilla install, you can't actually rely on the trackpoint
being set to the manufacturer default. This was an attempt to make
trackpoints behave more consistently: systemd had the hwdb and it seemed
like the right place to put device-specific quirks. In hindsight, it was the
wrong design choice. (See the sketch after this list for how to check what
your device is currently set to.)
- Deltas are ... unreliable. At high sensitivity and high pressures you
might get a sequence of [7, 7, 14, 8, 3, 7]. At lower pressure you get the
deltas at seemingly random intervals. This could be because it's
hard to keep exactly constant pressure, or it could be a hardware issue.
- evdev has been the default driver for almost a decade and before that it
was the mouse driver for a long time. So the kernel will "Divide 4 since
trackpoint's speed is too fast" [sic] for some trackpoints. Or by 8. Or not
at all. In other words, the kernel adjusts for what the default userspace
is, and userspace is based on what the kernel provides. On the newest ALPS
trackpoints the kernel has stopped doing any in-kernel scaling (good!) but
that means that the deltas are out by a factor of 8 now.
- Trackpoints don't always have the same pressure ranges for x/y. AFAICT the
y range is usually a bit less than the x range on many, if not most, trackpoints.
That is a bit weird because the finger position would suggest that strong vertical
pressure is easier to apply than sideways pressure.
- (Some? All?) Trackpoints have built-in calibration procedures to find and
set their own center point. Without that, the trackpoint eventually ends up
ever so slightly off center over time, causing a mouse
pointer that just wanders off the screen, possibly into the woods, without
the obligatory red cape and basket full of whatever grandma eats when she's
sick.
So the calibration is required, but it can be triggered accidentally by the
user: if you push with the same pressure in the same
direction for 2-5 seconds (depending on $THINGS) you trigger the calibration
procedure and the current position becomes the new center point. When you
release, the cursor wanders off for a few seconds until the calibration sets
things straight again. If you ever see the cursor buzz off in a fixed
direction or walk backwards for a centimetre or two, you've triggered that
calibration. The only way to avoid this is to make sure the pointer
acceleration mechanism allows you to reach any target within 2
seconds and/or never forces you to apply constant pressure for more than 2
seconds. Now there's a challenge...
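As promised in the sensitivity item above, here's a rough sketch for checking what your
own device is set to. It prints whatever the psmouse trackpoint driver exposes in sysfs
and any POINTINGSTICK_* udev properties assigned from the hwdb. The sysfs glob, the use
of udevadm, and the event node are assumptions based on my T440s - adjust for your machine.
#!/usr/bin/env python3
# Rough sketch: show the firmware sensitivity from sysfs and the
# hwdb-assigned POINTINGSTICK_* properties for a trackpoint.
import glob
import subprocess

# The psmouse trackpoint driver exposes a 'sensitivity' attribute on its
# serio device. The path below is a guess that happens to work for me.
for path in glob.glob("/sys/bus/serio/devices/*/sensitivity"):
    with open(path) as f:
        print(f"{path}: {f.read().strip()}")

# udevadm shows the properties the hwdb assigned (if any) to the event node.
# Replace the node with your trackpoint's, e.g. from `libinput list-devices`.
out = subprocess.run(["udevadm", "info", "/dev/input/event17"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    if "POINTINGSTICK" in line:
        print(line)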
Ok. If you've been paying attention instead of hoping for a TLDR that's more
elusive than Godot, we're now aware of the various drawbacks of collecting data
from a trackpoint. Let's go and look at data. Sensitivity is set to the
kernel default of 128 in sysfs and the default reporting rate is 100Hz. All
observations are YMMV and whatnot, especially the latter.
Trackpoint deltas are integers, but the dynamic range of delta values is tiny. You
mostly get 1 or 2 and it requires a fair bit of pressure to get up to
5 or more. At low pressure you get deltas of 1, but less frequently.
The relationship between deltas and the interval between deltas
looks like this:
At low pressure, we get deltas of 1 but high intervals. As the pressure
increases, the interval between events shrinks until at some point the
interval between events matches the reporting rate (100Hz/10ms). Increasing the
pressure further now increases the deltas while the intervals remain at the
reporting rate. For example, here's an event sequence at low pressure:
E: 63796.187226 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +20ms
E: 63796.227912 0002 0001 0001 # EV_REL / REL_Y 1
E: 63796.227912 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +40ms
E: 63796.277549 0002 0000 -001 # EV_REL / REL_X -1
E: 63796.277549 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +50ms
E: 63796.436793 0002 0000 -001 # EV_REL / REL_X -1
E: 63796.436793 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +159ms
E: 63796.546114 0002 0001 0001 # EV_REL / REL_Y 1
E: 63796.546114 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +110ms
E: 63796.606765 0002 0000 -001 # EV_REL / REL_X -1
E: 63796.606765 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +60ms
E: 63796.786510 0002 0000 -001 # EV_REL / REL_X -1
E: 63796.786510 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +180ms
E: 63796.885943 0002 0001 0001 # EV_REL / REL_Y 1
E: 63796.885943 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +99ms
E: 63796.956703 0002 0000 -001 # EV_REL / REL_X -1
E: 63796.956703 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +71ms
This was me pressing lightly but with perceived constant pressure, and the
time stamps between events go from 20ms to 180ms. Remember what I said above
about unreliable deltas? Yeah, that.
Here's an event sequence from a trackpoint at a pressure that triggers
almost constant reporting:
E: 72743.926045 0002 0000 -001 # EV_REL / REL_X -1
E: 72743.926045 0002 0001 -001 # EV_REL / REL_Y -1
E: 72743.926045 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +10ms
E: 72743.939414 0002 0000 -001 # EV_REL / REL_X -1
E: 72743.939414 0002 0001 -001 # EV_REL / REL_Y -1
E: 72743.939414 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +13ms
E: 72743.949159 0002 0000 -002 # EV_REL / REL_X -2
E: 72743.949159 0002 0001 -002 # EV_REL / REL_Y -2
E: 72743.949159 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +10ms
E: 72743.956340 0002 0000 -001 # EV_REL / REL_X -1
E: 72743.956340 0002 0001 -001 # EV_REL / REL_Y -1
E: 72743.956340 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +7ms
E: 72743.978602 0002 0000 -001 # EV_REL / REL_X -1
E: 72743.978602 0002 0001 -001 # EV_REL / REL_Y -1
E: 72743.978602 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +22ms
E: 72743.989368 0002 0000 -001 # EV_REL / REL_X -1
E: 72743.989368 0002 0001 -001 # EV_REL / REL_Y -1
E: 72743.989368 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +11ms
E: 72743.999342 0002 0000 -001 # EV_REL / REL_X -1
E: 72743.999342 0002 0001 -001 # EV_REL / REL_Y -1
E: 72743.999342 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +10ms
E: 72744.009154 0002 0000 -001 # EV_REL / REL_X -1
E: 72744.009154 0002 0001 -001 # EV_REL / REL_Y -1
E: 72744.009154 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +10ms
E: 72744.018965 0002 0000 -002 # EV_REL / REL_X -2
E: 72744.018965 0002 0001 -003 # EV_REL / REL_Y -3
E: 72744.018965 0000 0000 0000 # ------------ SYN_REPORT (0) ---------- +9ms
Note the event in there with a 22ms interval? Maintaining constant pressure is
hard. You can re-create the above recordings by running evemu-record.
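If you want to reproduce the numbers above, here's a rough sketch of a parser for the
evemu-record format shown in those logs: it sums up the REL_X/REL_Y deltas per
SYN_REPORT frame and prints the time since the previous frame. The device node in the
usage comment is just a placeholder.
#!/usr/bin/env python3
# Rough sketch: read an evemu-record log from stdin and print, per
# SYN_REPORT frame, the accumulated REL_X/REL_Y deltas and the interval
# since the previous frame. Usage (placeholder device node):
#   sudo evemu-record /dev/input/event17 | python3 trackpoint-intervals.py
import sys

EV_SYN, EV_REL = 0x00, 0x02
REL_X, REL_Y = 0x00, 0x01

last_syn = None
dx = dy = 0

for line in sys.stdin:
    if not line.startswith("E:"):
        continue  # skip the device description header
    fields = line.split()
    ts = float(fields[1])
    evtype, code, value = int(fields[2], 16), int(fields[3], 16), int(fields[4])
    if evtype == EV_REL:
        if code == REL_X:
            dx += value
        elif code == REL_Y:
            dy += value
    elif evtype == EV_SYN:
        interval = (ts - last_syn) * 1000 if last_syn is not None else 0
        print(f"dx {dx:+d} dy {dy:+d} interval {interval:.0f}ms")
        last_syn = ts
        dx = dy = 0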
Pressing hard I get deltas of up to maybe 5. That's staying within the second
pressure range outlined above; I can force higher deltas but what's the
point. So the dynamic range for deltas alone is terrible - we have a
grand total of 5 values across the comfortable range.
Setting the sensitivity higher than the default makes the trackpoint send higher
deltas, including deltas greater than 1 before reaching the report rate. Setting
it lower than the default (does anyone do that?) makes it send smaller deltas.
But doing so means changing the hardware properties, similar to how some
gaming mice can switch dpi on the fly.
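For the curious, here's a sketch of how I'd poke at that setting, assuming a
psmouse-driven trackpoint that exposes the same sysfs attribute mine does. The glob is a
guess, writing needs root, and the value only sticks until the device is reset.
#!/usr/bin/env python3
# Sketch only: change the trackpoint sensitivity via the psmouse driver's
# sysfs attribute and watch how the deltas change. The glob below matched
# my T440s; your serio path may differ. Run as root.
import glob, sys

new_value = int(sys.argv[1]) if len(sys.argv) > 1 else 200  # valid range 0-255

paths = glob.glob("/sys/bus/serio/devices/*/sensitivity")
if not paths:
    sys.exit("no trackpoint sensitivity attribute found")

for path in paths:
    with open(path) as f:
        old = f.read().strip()
    with open(path, "w") as f:
        f.write(str(new_value))
    print(f"{path}: {old} -> {new_value}")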
I leave you with a fun thought exercise in correlation vs. causation: your
trackpoint uses PS/2, your touchpad probably uses PS/2. Your trackpoint has
a reporting rate of 100Hz, but when you touch the touchpad half the bandwidth
is used by the touchpad. So your trackpoint sends half the events when
you have the palm resting on the touchpad. From my observations, the deltas
don't double in size. In other words, your trackpoint just slows down to
roughly half the speed. I can reduce the reporting rate to approximately a
third by putting two or more fingers onto the touchpad. Trackpoints haven't changed
that much over the years but touchpads have. So the takeaway is: 10 years ago
touchpads were smaller and trackpoints were faster. Simply because you could
use them without touching the touchpad. Mind blown (if true; measuring these things is hard...)
Well, that was fun, wasn't it. I'm glad you stayed this long, because I did
and it'd feel lonely otherwise. In the next post I'll outline the pointer acceleration
curves for trackpoints and what we're going to do about them. Besides
despairing, that is.
[1] I doubt you will be, but it always pays to be prepared.
[2] In this post I'm using "pressure" to mean sideways pressure, not downwards
pressure. Some trackpoints can handle downwards pressure and modify the
acceleration based on it (or expect userland to do so).
[3] Not that this number means the same thing everywhere: the Lenovo Compact Keyboard USB
with Trackpoint has a default sensitivity of 5 - any laptop trackpoint would
be unusable at that low a value (their default is 128).
[4] I honestly don't know this for sure, but ages ago I found a hardware spec
document that actually detailed the process. Search for "TrackPoint System
Version 4.0 Engineering Specification", page 43, "2.6.2 DIGITAL TRANSFER
FUNCTION".