A small gesture

A project log for LiDAR as an input device

Experiments with using a 2D 360 LiDAR sensor as an input device.

Timescale, 10/31/2021 at 16:59

Although the project is currently on hold, I continue playing around with the LiDAR sensor to explore the possibilities of using a robot sensor as a static interface device. Last time I uploaded the basic multispot code that tracks several groups, averages them, and checks these averages against a set of pre-defined points in space. Functionally, this is about all a simple interactive activity in a space should ever need. The script could do with a little refinement, like weighting accuracy based on distance and/or position, but it basically works well enough, from desktop use to small rooms. I'm quite sure that implementing this in a big exhibition room will be just fine and will require only minor tweaks.
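
To make that concrete, here is a minimal sketch of the multispot idea as I understand it: average each tracked group of points into a center and test that center against predefined trigger spots. The spot names, coordinates, and radii below are made up for illustration; this is not the uploaded code itself.

```python
import math

# Hypothetical trigger spots in sensor space (metres), for illustration only.
TRIGGER_SPOTS = [
    {"name": "spot_a", "x": 1.20, "y": 0.40, "radius": 0.15},
    {"name": "spot_b", "x": 2.10, "y": 1.05, "radius": 0.15},
]

def group_center(points):
    """Average a group of (x, y) points into a single position."""
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def active_spots(groups):
    """Return the names of trigger spots occupied by any point group."""
    hits = []
    for points in groups:
        cx, cy = group_center(points)
        for spot in TRIGGER_SPOTS:
            if math.hypot(cx - spot["x"], cy - spot["y"]) <= spot["radius"]:
                hits.append(spot["name"])
    return hits
```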

So, having that functionality basically down, I started looking at using the sensor as a pure HID again. The proof-of-concept LiDAR mouse pointer script was quite basic: it worked on the closest point only, could not track multiple point groups, and had no smoothing. It also did not map the data nicely to screen coordinates. The obvious next step was to take the grouping method, use it for the HID, and implement all the necessities to make it a workable input device.
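
For the screen-mapping part, something as simple as a linear remap of a calibrated sensor-space rectangle onto the screen would do. A hedged sketch; the calibration bounds and resolution are placeholder assumptions, not measured values:

```python
# Assumed calibration: the rectangle of sensor space (metres) that should
# cover the whole screen.
SENSOR_AREA = {"x_min": -1.0, "x_max": 1.0, "y_min": 0.3, "y_max": 1.5}
SCREEN = (1920, 1080)  # assumed resolution

def to_screen(x, y):
    """Linearly map a sensor-space point to screen pixels, clamped to the edges."""
    u = (x - SENSOR_AREA["x_min"]) / (SENSOR_AREA["x_max"] - SENSOR_AREA["x_min"])
    v = (y - SENSOR_AREA["y_min"]) / (SENSOR_AREA["y_max"] - SENSOR_AREA["y_min"])
    px = min(max(u, 0.0), 1.0) * (SCREEN[0] - 1)
    py = min(max(v, 0.0), 1.0) * (SCREEN[1] - 1)
    return int(px), int(py)
```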

The file isn't quite ready to upload for everybody to play around with, but I thought it useful to share my ideas on it, so perhaps somebody with a better idea could comment or ask a question. Most of the choices I made while tinkering were guided by the idea that, apart from driving the cursor, I could perhaps also implement gestures in some form.

Initially I am using group 0 for the pointer, so no matter how many groups are created, only the first one will control the cursor. This is fine for testing purposes and trying out different smoothing techniques. The smoothing trick I found to be the nicest, i.e. the one that gave the least jittery cursor and the best control, was a single-step smooth with a vector and a velocity value influencing the amount of smoothing. The velocity we get for free, because in single-step smoothing you retain the old position and find the difference between that and the new one. That velocity value is also smoothed, albeit linearly, whereas the position data is smoothed based on how the smoothed velocity matches the movement vector.

This way, movement in any direction always gets a bias, meaning that quick motions do not take too long to catch up, while slow movements give fine control. All the small jitter from the sensor data is also smoothed quite aggressively, which means you get cleaner cursor movement in all ranges.
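
A rough sketch of that smoothing scheme, simplified to a speed-driven smoothing factor (the direction-matching bias is left out here). All constants are tuning guesses for illustration, not the values I actually use:

```python
import math

POS_SMOOTH_MIN = 0.05   # nearly frozen when the hand is still (kills jitter)
POS_SMOOTH_MAX = 0.80   # follow quickly during fast motion
VEL_SMOOTH = 0.30       # linear smoothing applied to the speed value itself
VEL_FULL_SPEED = 0.50   # speed (m/frame) at which we follow at full strength

class SmoothedPointer:
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y
        self.speed = 0.0

    def update(self, nx, ny):
        # The velocity comes for free: difference between the retained old
        # position and the new sample.
        dx, dy = nx - self.x, ny - self.y
        raw_speed = math.hypot(dx, dy)
        # Linear smoothing of the speed value.
        self.speed += VEL_SMOOTH * (raw_speed - self.speed)
        # Fast movement -> high factor (cursor catches up quickly);
        # slow movement -> low factor (aggressive jitter suppression).
        t = min(self.speed / VEL_FULL_SPEED, 1.0)
        alpha = POS_SMOOTH_MIN + t * (POS_SMOOTH_MAX - POS_SMOOTH_MIN)
        self.x += alpha * dx
        self.y += alpha * dy
        return self.x, self.y
```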

Now, I'm sure the vector method is overkill, but the reason I stuck with it is that I realized that if it got me precise enough results, the same could be done with the second group: based on the vectors and velocities, this could drive other events like zoom, pinch, or any spatial 2D gesture the sensor could pick up.
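
As a sketch of what that second group could enable, here is a hypothetical pinch/zoom detector that watches the distance between the two group centers frame to frame. The threshold, event names, and conventions are my own assumptions:

```python
import math

class PinchDetector:
    def __init__(self, threshold=0.02):
        # Minimum change in spread (metres per frame) before we call it a gesture.
        self.prev_dist = None
        self.threshold = threshold

    def update(self, center0, center1):
        """Return 'zoom_in', 'zoom_out' or None for this frame."""
        dist = math.hypot(center1[0] - center0[0], center1[1] - center0[1])
        event = None
        if self.prev_dist is not None:
            delta = dist - self.prev_dist
            if delta > self.threshold:
                event = "zoom_in"    # hands/groups moving apart (spread)
            elif delta < -self.threshold:
                event = "zoom_out"   # hands/groups moving together (pinch)
        self.prev_dist = dist
        return event
```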

So this is my thinking right now: get the cursor as smooth as possible, then try to implement gestures for two hands/people. Hopefully next time I will have some code to share.