The LEAP Motion people have an article up (Drax also links to it in episode 21) about recent changes in their technology. They are striving to make a natural user interface for the computer. With VR becoming a reality, many think we need a new user interface that lets us use the computer the same way we manipulate real-world things.
The LEAP controller I purchased some months ago is not the answer. It can detect hand motion, but to control a game we used gestures. Trying to remember that gesture ‘A’ (say, pointing two fingers right) means turn right and gesture ‘B’ (say, moving both hands up and down) means jump is not much of an improvement. We are just changing what we manipulate to tell the computer something.
We learn to do ‘this’ with a mouse to accomplish ‘that’. With LEAP we move our hands ‘this way’ to accomplish ‘that’. Not much of a paradigm change.
Plus, there was little feedback. It was hard to know if your hands were in the right place or how the LEAP was interpreting your gestures.
Now they are in the Beta stage of adding well-articulated, rigged hands to the controller's output, hands that display right in the game. The visual feedback should make picking up and manipulating things much easier. The goal is apparently to build more of the user interface for games into the LEAP Motion Controller itself. This is probably a good idea, as it will standardize how the LEAP works from game to game.
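To give a feel for what skeletal tracking means in practice, here is a minimal sketch of the kind of data a rigged hand boils down to: each finger is a chain of bones, and each bone is a pair of joints in 3D space. The names and structure below are my own illustration, not the actual LEAP Bone API (see their article for the real thing).

```python
# Illustrative model of skeletal hand data: a finger is a chain of bones,
# each bone a pair of 3D joints. Hypothetical names, not the real LEAP SDK.
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

# Anatomical order from wrist to fingertip.
BONE_NAMES = ["metacarpal", "proximal", "intermediate", "distal"]

@dataclass
class Bone:
    name: str
    prev_joint: Vec3   # joint closer to the wrist
    next_joint: Vec3   # joint closer to the fingertip

@dataclass
class Finger:
    name: str
    bones: List[Bone]

    def tip(self) -> Vec3:
        # The fingertip is the outer joint of the last (distal) bone.
        return self.bones[-1].next_joint

def straight_finger(name: str, base: Vec3, lengths: List[float]) -> Finger:
    # Build a finger pointing along +y, one bone per segment length (mm).
    x, y, z = base
    bones = []
    for bone_name, length in zip(BONE_NAMES, lengths):
        bones.append(Bone(bone_name, (x, y, z), (x, y + length, z)))
        y += length
    return Finger(name, bones)

index = straight_finger("index", base=(20.0, 0.0, 0.0),
                        lengths=[60.0, 40.0, 25.0, 18.0])
print(len(index.bones))   # 4 bones per finger
print(index.tip())        # (20.0, 143.0, 0.0)
```

With per-bone joint positions like these, a game can render a visible hand and test whether a fingertip touches an object, which is exactly the kind of direct manipulation (and feedback) the gesture-code approach lacked.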
For more details on what LEAP is doing, see: Skeletal Tracking 101: Getting Started with the Bone API and Rigged Hands.