Heuristics for smarter Leap Motion tracking

Leap Motion is quite easy to develop for, but I find that hand and finger tracking can be tricky, especially for more precise gesture control. I’ve been doing some development with Leap that needs precise positioning (~mm accuracy) with multiple fingers. In this post I run through a few strategies I use to try to improve my Leap applications.

1) Track hand and finger identifiers

Each Finger and Hand object has a unique identifier which should remain consistent between frames, as long as the object continues to be tracked. If you can make a reasonable assumption about which finger(s) the user intends to use for interaction, you can look for a specific identifier to find the finger(s) of interest and ignore anything else in the frame. I find that when extending their index finger for interaction, most users unwittingly hold their thumb out as well, which Leap of course detects.
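
As a rough sketch of the identifier lookup (assuming the classic Leap SDK’s Python bindings, where Frame.finger() returns an invalid object rather than raising when an identifier has disappeared):

```python
import Leap

def find_tracked_finger(frame, tracked_id):
    """Look up the finger we locked onto in an earlier frame.

    Returns the Finger if it is still being tracked, or None if its
    identifier has disappeared and we need to re-acquire a target.
    """
    finger = frame.finger(tracked_id)
    return finger if finger.is_valid else None
```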

I use a simple heuristic to try to address this issue: pick the finger with the smallest z-coordinate (the furthest forward) and track its identifier in future frames. When I want to track multiple fingers for interaction, I make sure those fingers are attached to the hand whose identifier I’m tracking. This rejects other unattached “fingers” in the field of view, e.g. the cylindrical lights in my office at university! (Seriously…)
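
A sketch of both parts of the heuristic, under the same Python-binding assumption (Leap’s z axis points out of the device towards the user, so the smallest z coordinate is the furthest forward):

```python
import Leap

def pick_frontmost_finger(frame):
    """Pick the finger with the smallest tip z-coordinate (furthest
    forward); remember its id (finger.id) for subsequent frames."""
    if frame.fingers.is_empty:
        return None
    return min(frame.fingers, key=lambda f: f.tip_position.z)

def fingers_on_hand(frame, hand_id):
    """For multi-finger interaction: only accept fingers attached to
    the hand whose identifier we are tracking. Anything not attached
    to that hand (e.g. ceiling lights mistaken for fingers) is dropped."""
    hand = frame.hand(hand_id)
    return list(hand.fingers) if hand.is_valid else []
```

If your SDK version has it, frame.fingers.frontmost does the same smallest-z pick for you.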

2) Check finger lengths

Sometimes Leap sees the knuckles of a closed fist and reports them as fingers; the looser the grip, the worse the problem. Checking that each finger’s length is above a threshold can help you reject misclassified knuckles and clenched fingers. A threshold of around 2 cm should be fine; any longer and you risk rejecting small fingers, or fingers which are fully extended but which Leap just isn’t tracking properly.
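
As a filter this is essentially one line; note that the SDK reports Finger.length in millimetres, so a 2 cm threshold is 20 (same Python-binding assumption as above):

```python
MIN_FINGER_LENGTH_MM = 20.0  # ~2 cm; much higher risks rejecting small fingers

def plausible_fingers(frame):
    """Reject 'fingers' that are probably knuckles on a loose fist."""
    return [f for f in frame.fingers if f.length > MIN_FINGER_LENGTH_MM]
```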

3) Look at direction vectors for unlikely values

The biggest problem I have with Leap is that it picks up fingers in a closed fist. In the debug visualiser, I tend to see fingers pointing upwards through the fist, rather than an accurate representation of a finger in a clenched fist. These “phantom” fingers tend to have large positive y components (> 0.6) in their direction vectors. Watching some of my users, I’ve also noticed that when using a certain hand pose for input, “inactive” fingers are left to hang down, which Leap picks up perfectly. These fingers typically point downwards, with a large negative y component (< -0.6) in the direction vector.

A heuristic I use to reject “phantom” fingers, and fingers the user probably doesn’t intend to use for input, is to take the absolute y value of the direction vector and ignore the finger if it exceeds a threshold; I use a threshold of around 0.4. A limitation of this heuristic is that the user has to keep their hand quite flat. To address this, you could instead measure finger direction relative to the hand orientation and then apply the threshold, as in the second function below.
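
Both variants sketched below, with the same Python-binding caveats; MAX_ANGLE_TO_HAND is a hypothetical tolerance I’ve picked for illustration, and since direction vectors are unit length, the |y| ≤ 0.4 test accepts fingers within roughly 24 degrees of horizontal (asin 0.4 ≈ 23.6°):

```python
import math
import Leap

MAX_ABS_DIRECTION_Y = 0.4  # Direction vectors are unit length.
MAX_ANGLE_TO_HAND = math.radians(25)  # Hypothetical tolerance for the variant.

def is_intentional(finger):
    """Reject phantom fingers poking up through a fist (y > 0.6) and
    inactive fingers left hanging down (y < -0.6)."""
    return abs(finger.direction.y) <= MAX_ABS_DIRECTION_Y

def is_intentional_relative(finger, hand):
    """Tilt-tolerant variant: compare the finger's direction with the
    hand's direction (Vector.angle_to returns radians) instead of
    thresholding against the world y axis."""
    return finger.direction.angle_to(hand.direction) <= MAX_ANGLE_TO_HAND
```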

Summary

Sometimes Leap gives us crap data, and we can’t expect our users to keep a perfectly clenched fist either. A few simple checks on each frame update can help us reject fingers which either don’t exist or weren’t intended to be part of the interaction.