I wish more people experimented with hand input, since it has essentially been solved in recent years thanks to advances in computer vision: https://mediapipe.dev/. Yeah, it might be awkward to ask the user to activate their camera, and so on. But right now I'm barely seeing any experimentation in this direction, which is a shame.
Asking for camera access is not "awkward". In 2022, it's assumed that if I say yes, you'll record absolutely everything the camera can see, store it forever, sell it to anyone who wants it, and use it to show me ads for crap no one wants. There are a whole host of amazing possible apps using camera, location, microphone, etc., but tech companies have proven that they cannot or will not deliver these apps without egregious privacy violations.
Tech companies need to prove they can limit themselves to using data for the purpose for which it was requested, then we can talk about whether or not I'll give camera access.
I think one way this could be more acceptable is to point the camera at the hands above the keyboard instead of having to wave my hands in front of my face. It would be interesting to enable gestures for swiping between desktops without using the trackpad.
The LeapMotion was basically a really accurate Kinect for your hands, and that was back in 2014. I always wondered why it never took off, and I suspect it's a market issue more than a technological one. Waving your hand in the air and doing hand gestures just didn't provide enough benefit, I guess.
I had one of the LeapMotion input devices back around when they launched, and really did try to use it in earnest - using a whole bunch of shortcuts and AutoHotkey hacks to navigate the OS with it. There were even experimental programs people wrote to input text with it, using something similar to chorded keyboards.
It didn't really work out, though. Long story short: my arms got tired. It turns out that's a fairly fundamental problem with how they designed the interface - holding your hands up above something for extended periods of time is just tiring and uncomfortable.
That seemed pretty obvious the first time I saw it, but I figured maybe I was overestimating the issue. I mean, here's this darling company with a product everyone's excited about. Surely they must have thought about that!
The Leap Motion still does hand tracking better than MediaPipe, and it's still the best hand tracker I know of (besides larger devices like the Oculus Quest).
We've got an open source library for mobile hand-object input [https://portalble.cs.brown.edu/], and the version with Leap Motion is really nice, but doesn't directly work with a phone (we had to pipe data through a compute stick to make it work).
I'd love to see MediaPipe Hands match LeapMotion precision some day, but I'm not even sure if it's possible. A real depth sensor goes a long way.
I really wanted to play with it but never did, because I had to buy a whole device that I might only use once and that not many people were developing for. With it being available on the web (without a device), things might be different, since the bar to try it out is much lower. (After all, Oculus has tons of fun games and things that use hand tracking, so I don't think you can conclude the idea doesn't have potential.)
"Really accurate"? That wasn't my experience with the Leap Motion at all. We had one alongside an Oculus Rift dev kit in a museum I worked for, and it was essentially impossible to use, even for their built-in demo games. We got a replacement and had the same experience, so we just gave up on it.
Part of me wonders if it's just that people don't know this is solved on the web. So go here, try it out, and make something if Bret's article appealed to you: https://google.github.io/mediapipe/solutions/hands#javascrip... + https://codepen.io/mediapipe/pen/RwGWYJw
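For anyone who wants a concrete starting point, here's a minimal sketch of the kind of gesture logic you'd layer on top of MediaPipe Hands. The landmark indices (4 = thumb tip, 8 = index fingertip, normalized [0, 1] coordinates) come from the official hand landmark model; the pinch threshold and the wiring into `hands.onResults` are my own assumptions, not anything from the thread.

```javascript
// Sketch: detect a "pinch" gesture from MediaPipe Hands landmarks.
// Indices 4 (thumb tip) and 8 (index fingertip) are from the official
// landmark model; coordinates are normalized to [0, 1] over the image.
// The 0.05 threshold is an assumption - tune it for your camera setup.

function isPinching(landmarks, threshold = 0.05) {
  const thumb = landmarks[4];
  const index = landmarks[8];
  return Math.hypot(thumb.x - index.x, thumb.y - index.y) < threshold;
}

// In the browser you'd feed this from the Hands results callback, roughly:
//   hands.onResults((results) => {
//     for (const lm of results.multiHandLandmarks ?? []) {
//       if (isPinching(lm)) console.log('pinch');
//     }
//   });

// Fake landmark arrays for a quick sanity check (only 4 and 8 matter here).
const pinched = [];
pinched[4] = { x: 0.50, y: 0.50 };
pinched[8] = { x: 0.52, y: 0.51 };

const spread = [];
spread[4] = { x: 0.40, y: 0.50 };
spread[8] = { x: 0.60, y: 0.30 };

console.log(isPinching(pinched)); // true
console.log(isPinching(spread));  // false
```

The nice part is that once you have the 21 landmarks per hand, gesture recognition is just geometry on a small array - no ML of your own needed.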