Two new Kickstarter projects provide a step in the right direction, each taking a different approach:
Approach 1: PrioVR from YEI
PrioVR from YEI Technology launched on Kickstarter earlier this month. It uses a connected network of high-quality orientation sensors to determine the orientation of various body parts. By logically connecting these sensors to a skeletal model of the body, the system can also determine body posture and the position of hands, feet, elbows, and so forth. This supports both motion capture and real-time reporting of body position and orientation to a game engine.
|Motion Capture studio from YEI technology|
The accuracy of such a system depends on several factors:
- The number of sensors used. For instance, a sensor placed on the back can determine the rotation of the body and also help estimate the XYZ position of the head (leaning forward is registered by the system, and the skeletal model can then be used to estimate the height of the head). If another sensor is placed on the lower back, the combination of the two can determine whether the person has turned or is twisting the back.
- Calibration accuracy. In the YEI model, sensors are attached to the body with elastic straps. It is easy to see how a strap might be rotated so that, for instance, an arm sensor is not parallel to the ground even when the arm is. To avoid such errors, a quick calibration may be required at the beginning of a session.
- Accuracy of the skeletal model. If the model assumes a certain distance from shoulder to elbow, but the actual distance differs from what is assumed, the computed hand position will be off by a corresponding amount.
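To make the skeletal-model idea concrete, here is a minimal Python sketch (not YEI's actual algorithm; the bone lengths, angles, and function names are hypothetical) that chains two bone orientations to estimate a hand position, and shows how an error in the assumed upper-arm length shifts the result:

```python
import numpy as np

def rot_z(deg):
    """Rotation matrix about the Z axis (used here for the elbow bend)."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def hand_position(shoulder_pos, upper_arm_rot, forearm_rot,
                  upper_len=0.30, fore_len=0.27):
    """Estimate the hand position by walking the kinematic chain:
    shoulder -> elbow -> hand. Bone lengths (metres) are hypothetical
    defaults; a real system would measure or calibrate them per user."""
    elbow = shoulder_pos + upper_arm_rot @ np.array([upper_len, 0, 0])
    hand = elbow + (upper_arm_rot @ forearm_rot) @ np.array([fore_len, 0, 0])
    return hand

shoulder = np.array([0.0, 1.4, 0.0])
# Upper arm straight out along X, forearm bent 90 degrees up at the elbow:
hand = hand_position(shoulder, rot_z(0), rot_z(90))

# A 3 cm error in the assumed upper-arm length shifts the whole estimate,
# illustrating how skeletal-model errors propagate to the hand:
wrong = hand_position(shoulder, rot_z(0), rot_z(90), upper_len=0.33)
```

Note that the hand estimate depends on every orientation and bone length earlier in the chain, which is why sensor count, strap calibration, and skeletal accuracy all matter.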
Approach 2: STEM from Sixense
The STEM system is scheduled to launch on Kickstarter later this month. It is an enhanced version of the current Razer Hydra, adding wireless controllers as well as additional tracking points.
The STEM system uses a base station that tracks the XYZ position of various sensors/endpoints. A typical use case would be to track both hands when the user is holding a wireless controller, as well as to track additional objects (head, weapon, lower body) if sensing modules are placed on them. To some extent, this is a simpler and more direct method than the PrioVR solution. With STEM, if you want to know the position of the hand, you just put a sensor on the hand. With PrioVR, you have to deduce it from the orientation of the various bones that make up the arm, combined with knowledge about the upper and lower body. At the same time, STEM provides fewer data points about exact posture and is perhaps more limited in the number of simultaneous sensors.
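The contrast between the two approaches can be sketched as follows; all names, coordinates, and bone lengths are invented for illustration and do not reflect either product's actual API:

```python
import math

# STEM-style: the base station reports each endpoint's XYZ position
# directly, so getting the hand position is a single lookup.
stem_frame = {"left_hand": (0.57, 1.40, 0.0)}   # metres, hypothetical
hand_stem = stem_frame["left_hand"]

# PrioVR-style: only orientations are sensed, so the hand position must
# be deduced by walking the kinematic chain from a known root (shoulder),
# adding each bone's contribution (absolute yaw in degrees, length in m).
def walk_chain(root, bones):
    x, y, z = root
    for yaw_deg, length in bones:
        x += length * math.cos(math.radians(yaw_deg))
        z += length * math.sin(math.radians(yaw_deg))
    return (x, y, z)

# Shoulder at (0, 1.40, 0); upper arm and forearm both pointing along X:
hand_priovr = walk_chain((0.0, 1.40, 0.0), [(0, 0.30), (0, 0.27)])
# Both estimates land at roughly the same point.
```

The direct read costs one sensor per tracked point, while the deduced version needs a sensor per bone but yields posture information (elbow angle, torso twist) as a by-product.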
I have not yet had a chance to thoroughly evaluate the accuracy and response time of the STEM system.
|Sixense STEM controller|
Once the basic position and orientation data is delivered to the application by either the PrioVR or STEM sensors, there is still an opportunity for higher-level processing and analysis. For instance, additional software layers could detect gestures or hand signals. The more processing that can be done in a middleware layer, the less will be required of games and other applications to take advantage of these new sensors.
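As a toy illustration of such a middleware layer, the sketch below flags a horizontal swipe from a stream of hand-position samples; the function name, thresholds, and data format are all invented for the example:

```python
def detect_swipe(samples, min_dist=0.25, max_time=0.5):
    """Flag a horizontal swipe if the hand moved at least min_dist metres
    along X within max_time seconds. Thresholds are hypothetical.
    samples: list of (timestamp_s, (x, y, z)) hand positions."""
    for i, (t0, (x0, _, _)) in enumerate(samples):
        for t1, (x1, _, _) in samples[i + 1:]:
            if t1 - t0 > max_time:
                break
            if abs(x1 - x0) >= min_dist:
                return "swipe_right" if x1 > x0 else "swipe_left"
    return None

# Hand moving 0.30 m to the right over 0.2 seconds:
samples = [(0.00, (0.00, 1.0, 0.0)),
           (0.10, (0.12, 1.0, 0.0)),
           (0.20, (0.30, 1.0, 0.0))]
```

A game that consumes "swipe_right" events from such a layer never needs to touch the raw sensor stream, which is exactly the division of labour described above.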