Photo Credit: Saad Faruque via Compfight
- Change the user's viewpoint to reflect actions such as jumping, ducking, or leaning forward.
- Show hands and other objects in the displayed image. A common complaint from virtual-reality users is that they cannot see their own hands.
- Connect the physical and virtual worlds. For instance, by detecting hand position, software can let users move virtual objects by touching them.
- Detect gestures. By analyzing position over time, a gesture can be detected. For instance, a user might draw the number "8" in the air and have the software recognize it.
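To illustrate the last point, here is a minimal sketch of gesture detection from tracked position over time. It recognizes a simple "swipe right" rather than a drawn "8"; the function name, thresholds, and sample format are all illustrative assumptions, not part of any particular tracking API.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """One tracked hand position. Field names and units are assumptions."""
    t: float  # timestamp in seconds
    x: float  # horizontal position in meters (positive = right)
    y: float
    z: float

def detect_swipe_right(samples, min_distance=0.3, max_duration=0.5):
    """Return True if the hand moved at least `min_distance` meters to the
    right within any window of at most `max_duration` seconds.
    Thresholds are illustrative; a real system would tune them per user."""
    for i, start in enumerate(samples):
        for end in samples[i + 1:]:
            if end.t - start.t > max_duration:
                break  # window too long; later samples are even later
            if end.x - start.x >= min_distance:
                return True
    return False

# A fast rightward motion is detected; the same path spread over
# more than half a second is not.
fast = [Sample(0.1 * i, 0.15 * i, 0.0, 0.0) for i in range(4)]
slow = [Sample(0.4 * i, 0.15 * i, 0.0, 0.0) for i in range(4)]
```

Recognizing a drawn shape such as an "8" works the same way in principle, but compares the whole position trace against a stored template instead of checking a single displacement.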
I've described these tracking methods, as well as others such as depth mapping, in a guest blog post at RoadToVR. Please click here to read that post on the RoadToVR site.
For additional VR tutorials on this blog, click here.
Expert interviews and tutorials can also be found on the Sensics Insight page here.