Monday, September 9, 2013

Progress in Hand and Body Tracking

I continue to believe that putting a display on the head, as great as that display might be, is not enough for a truly compelling virtual reality interaction, and that hand and finger tracking is a critical missing component for the VR experience.

Two new Kickstarter projects provide a step in the right direction, each taking a different approach:

Approach 1: PrioVR from YEI

PrioVR from YEI Technology launched on Kickstarter earlier this month. It uses a connected network of high-quality orientation sensors to determine the orientation of various body parts. By logically connecting these sensors to a skeletal model of the body, the system can also determine body posture and the position of hands, feet, elbows, etc. This facilitates both motion capture and real-time reporting of body position and orientation into a game engine.
Motion Capture studio from YEI technology
One of the things that I like about the PrioVR system is that it is completely portable. It does not require a stationary sensor (e.g. Kinect), it does not require the person to be facing a particular direction, and it can really be taken anywhere, assuming you are willing to walk around with the sensors strapped to your body. The system does assume a wireless link between the central station on the body and a computer, but this works over fairly substantial distances. Additionally, if the computing device is itself portable, one could imagine a simple wired connection to it for even greater portability.

The fidelity of the model is dependent on many parameters, including:
  • The number of sensors that are being used. For instance, if a sensor is placed on the back, it can be used to determine the rotation of the body and also help in determining the XYZ position of the head (leaning forward will be registered by the system and, through the skeletal model, can be used to estimate the height of the head). However, if another sensor is placed on the lower back, the combination of the two can be used to determine whether the person has turned or is twisting the back.
  • Calibration accuracy. In the YEI model, sensors are attached to the body using elastic straps. It is easy to see how a strap might be rotated so that, for instance, an arm sensor is not parallel to the ground even when the arm is. To avoid such errors, a quick calibration might be required at the beginning of a session.
  • Accuracy of the skeletal model. If the model assumes a certain distance from shoulder to elbow, but the actual distance differs from what is assumed, one can see how the computed hand position would be affected by this skeletal error.
One wonders whether this system produces 'too much information' relative to what is required. For instance, while it may be nice to know whether the arm is bent and in exactly what direction, is that information really needed by a game that only cares about the hand position?
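To make the skeletal-model idea concrete, here is a minimal forward-kinematics sketch of how a hand position could be deduced from per-bone orientations plus assumed bone lengths. This is a hypothetical simplification (rotation about a single axis, two arm segments), not YEI's actual algorithm or API:

```python
import math

def rotate_z(point, degrees):
    """Rotate a 3-D point about the Z axis; stands in for a bone
    sensor's reported orientation (simplified to one axis)."""
    r = math.radians(degrees)
    x, y, z = point
    return (x * math.cos(r) - y * math.sin(r),
            x * math.sin(r) + y * math.cos(r),
            z)

def hand_position(shoulder, segments):
    """Chain bone vectors outward from the shoulder.
    segments: list of (bone_vector_at_rest, orientation_degrees)."""
    px, py, pz = shoulder
    for bone, angle in segments:
        bx, by, bz = rotate_z(bone, angle)
        px, py, pz = px + bx, py + by, pz + bz
    return (px, py, pz)

# Upper arm 30 cm and forearm 25 cm, both pointing along +X at rest;
# the forearm sensor reports a 90-degree bend at the elbow.
segments = [((0.30, 0.0, 0.0), 0), ((0.25, 0.0, 0.0), 90)]
print(hand_position((0.0, 0.0, 1.4), segments))
```

Note how a bone-length error or a miscalibrated angle anywhere in the chain shifts the final hand position, which is exactly the accumulation of errors described in the bullets above.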

Approach 2: STEM from Sixense

The STEM system is scheduled to launch on Kickstarter later this month. It is an enhanced version of the current Razer Hydra in the sense that it adds wireless controllers as well as additional tracking points.

The STEM system uses a base station that helps track the XYZ position of various sensors/endpoints. A typical use case would be to track both hands while the user is holding wireless controllers, as well as to track additional points (head, weapon, lower body) if sensing modules are placed on them. To some extent, this is a simpler and more direct method than the PrioVR solution. With STEM, if you want to know the position of the hand, you just put a sensor on the hand. With PrioVR, if you want to know the position of the hand, you have to deduce it from the orientation of the various bones that make up the arm, together with knowledge about the upper and lower body. At the same time, STEM provides fewer data points about the exact posture and is perhaps more limited in the number of simultaneous sensors.

I have not yet had a chance to thoroughly evaluate the accuracy and response time of the STEM system.
Sixense STEM controller

Once the basic position and orientation data is delivered to the application by either the PrioVR or STEM sensors, there is still an opportunity for a higher level of processing and analysis. For instance, additional software layers can recognize gestures or hand signals. The more processing that can be done in a middleware layer, the less will be required of games and other applications to take advantage of these new sensors.
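As a sketch of what such a middleware layer might look like (hypothetical, not an actual PrioVR or STEM SDK), here is a small detector that watches a stream of hand positions and reports a left or right swipe once the hand travels far enough along X within a short window of samples:

```python
SWIPE_DISTANCE = 0.25   # meters of net horizontal travel
WINDOW = 10             # samples to look back over

class GestureDetector:
    """Turns raw hand positions into named gestures, so the game
    above this layer never sees raw coordinates."""

    def __init__(self):
        self.history = []

    def update(self, hand_pos):
        """Feed one (x, y, z) sample; return a gesture name or None."""
        self.history.append(hand_pos)
        self.history = self.history[-WINDOW:]
        if len(self.history) < WINDOW:
            return None
        dx = self.history[-1][0] - self.history[0][0]
        if dx > SWIPE_DISTANCE:
            self.history.clear()     # avoid re-reporting the same motion
            return "swipe_right"
        if dx < -SWIPE_DISTANCE:
            self.history.clear()
            return "swipe_left"
        return None

detector = GestureDetector()
for i in range(12):                  # hand drifting steadily to the right
    gesture = detector.update((0.04 * i, 0.0, 1.2))
    if gesture:
        print(gesture)               # fires once the travel exceeds 0.25 m
```

The same interface works regardless of whether the positions were measured directly (STEM) or deduced from a skeletal model (PrioVR), which is the appeal of pushing this logic into middleware.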

Another open question for me is the applicability to multi-person scenarios, assuming more than one 'instrumented' person in the same space. How many of these devices can be used in the same room without cross-interference?

Having said all that, I am excited by both of these products. They are very welcome steps toward enhancing, and potentially revolutionizing, the user experience in virtual reality.
