Given that Oculus/Facebook mothballed support for VR on Mac, the OSVR
team at Sensics decided to step up and fill that void.
If we succeed, this will provide a supported path for continuing to work with Oculus (and many other HMDs) on the Mac. It will also pave the way to using the many capabilities of OSVR, including positional and predictive tracking as well as device-independent integration for cameras, gestures, skeletons, locomotion devices, and more.
We are seeking community assistance in making this happen faster. Below
are short descriptions of what we know and what we are seeking.
Comments, feedback and offers to assist will be most appreciated.
What we know
OSVR is built to be multi-platform. We have OSVR working on Windows, Android, and Linux, and we have had some reports of success in building and running it on Mac.
OSVR builds on both Linux (with Clang and GCC) and Windows, and all commits must pass continuous-integration (CI) testing on Linux and Windows.
OSVR supports multiple HMDs, including those made by Oculus, OSVR, Sensics, and Vuzix. Vive support is coming, as well as better integration with Cardboard. Support includes display parameters and distortion correction. Asynchronous time warp is supported in the Windows/NVIDIA environment, and we are working to expand this to AMD and Intel as well as to other platforms.
OSVR has been integrated into WebVR. A Bugzilla patch has been submitted and reviewed by the Mozilla team, and WebVR demos work on top of OSVR-supported HMDs. This means that this OSVR work will also help support WebVR on Mac and on other platforms.
We know OSVR was successfully built on Mac last December (prior to
public release), and all external dependencies have Mac support, so
we should have a good idea of the missing pieces.
Status of Facebook/Oculus support in OSVR
Display support for Oculus in OSVR is already entirely independent of the Oculus SDK: it uses distortion parameters from the OSVR Distortionizer, our own distortion shader, and a display descriptor.
So OSVR support for Oculus on Mac comes down to sensor access (and sensor fusion, since in the Oculus case the SDK or runtime performs the IMU fusion).
Currently, the master branch of the OSVR-Oculus plugin uses a VRPN driver written to access the Rift SDK. It is built against a 0.4.x SDK release and is fully functional and multi-platform. For instance, there is a demo of OSVR running the Rift on Linux, and another demo showing the OSVR Palace demo running in Unity on Linux with OSVR/Oculus.
A branch contains a direct OSVR PluginKit driver (no VRPN involved) that builds against a newer Rift SDK, but it still has bugs and is unfinished.
To handle the Rift on platforms that are "officially unsupported" by Facebook, including Mac and Linux, we will want to develop an OSVR PluginKit driver using an open-source driver stack. Several such stacks exist, though some of them claim a non-copyleft license while incorporating well-known GPL sensor-fusion code, which makes them problematic. OpenHMD appears to aim at something broader than just open-source sensor access for the Oculus; it is licensed under the Boost Software License (BSL-1.0) and includes sensor fusion, so it could serve the purpose. There are almost surely others, and we would be happy to receive referrals.
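To give a rough sense of what such a driver involves, here is a minimal sketch of a PluginKit tracker driver wrapping an open-source sensor stack. It follows the pattern of the self-contained example plugins that ship with OSVR-Core, but it is a sketch under assumptions, not a finished driver: the JSON device descriptor is omitted, and readSensorFusion() is a purely hypothetical stand-in for whatever API the chosen open-source stack (such as OpenHMD) provides.

```cpp
// Sketch of a PluginKit tracker driver (modeled on the OSVR-Core example
// plugins; device descriptor omitted for brevity, and readSensorFusion()
// is a hypothetical stand-in for an open-source sensor stack).
#include <osvr/PluginKit/PluginKit.h>
#include <osvr/PluginKit/TrackerInterfaceC.h>
#include <osvr/Util/QuaternionC.h>

class RiftTracker {
  public:
    explicit RiftTracker(OSVR_PluginRegContext ctx) {
        OSVR_DeviceInitOptions opts = osvrDeviceCreateInitOptions(ctx);
        osvrDeviceTrackerConfigure(opts, &m_tracker);
        m_dev.initAsync(ctx, "RiftTracker", opts);
        m_dev.registerUpdateCallback(this); // calls update() for us
    }

    OSVR_ReturnCode update() {
        OSVR_OrientationState q;
        if (!readSensorFusion(q)) {
            return OSVR_RETURN_FAILURE;
        }
        // Report the fused orientation on sensor channel 0.
        osvrDeviceTrackerSendOrientation(m_dev, m_tracker, &q, 0);
        return OSVR_RETURN_SUCCESS;
    }

  private:
    // Hypothetical: a real driver would pull a fused orientation from the
    // chosen open-source HID/sensor-fusion stack here.
    bool readSensorFusion(OSVR_OrientationState &q) {
        osvrQuatSetIdentity(&q);
        return true;
    }
    osvr::pluginkit::DeviceToken m_dev;
    OSVR_TrackerDeviceInterface m_tracker;
};

OSVR_PLUGIN(com_example_RiftTracker) {
    osvr::pluginkit::registerObjectForDeletion(ctx, new RiftTracker(ctx));
    return OSVR_RETURN_SUCCESS;
}
```

Once a driver like this reports orientation, everything above it (display, distortion, game-engine integration) already works, which is why the remaining Mac work is concentrated in sensor access.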
OSVR (Open Source Virtual Reality) aims to create an open and universal standard for the discovery, configuration, and operation of VR/AR devices. Created by Sensics and Razer, OSVR includes two independent components: 1) the OSVR software platform and 2) the open-source Hacker Development Kit VR headset.
Since the OSVR launch in January of this year, nearly 250 organizations, among them Intel, NVIDIA, Xilinx, Ubisoft, and Leap Motion, have joined the OSVR ecosystem. Concurrent with the expansion of the OSVR community, the capabilities of the software platform have grown by leaps and bounds.
The Sensics team architected the OSVR software platform and is its official maintainer. Below, I describe my personal perspective on the road ahead for the OSVR software along several paths: interfaces and devices, game engines, low-latency rendering, operating systems, utilities, and high-level processing.
The OSVR HDK is modular with open-source design plans
The Big Picture
The goal of the OSVR software framework is to make it easy to create compelling, high-performance VR/AR applications that:
Work on as many VR/AR displays and peripherals as possible.
Support even those devices that were not available at the time the application was created. Just like you don’t need to upgrade your word processor when you buy a new printer, you should not have to upgrade your game when a new HMD becomes available.
If desired, can run on a wide range of operating systems and computing platforms.
Take advantage of the unique features of individual devices, as opposed to reaching for the ‘lowest common denominator’.
Are not locked into a particular device, peripheral, development environment, programming language or app store.
The OSVR framework aims to achieve these goals while open-sourcing the vast majority of the software in order to:
Encourage participation from the broader VR/AR community.
Provide adopters the security and peace of mind that they desire.
Accelerate the pace of development by including a wide community of contributors.
Allow adopters to customize the platform to their specific needs.
Last, OSVR leverages existing open-source projects (OpenCV, CMake, VRPN) and is designed with a modular, plugin-based architecture so that:
Participants can opt to keep modules closed-source so as to protect their IP.
Adopters can choose a small footprint deployment by choosing only the modules they need.
Functionality such as support for new devices can be added after the fact.
Interfaces and Devices
Overview
In “OSVR speak”, an interface is a pipe for data of a certain type (“interface class”). Devices are said to expose interfaces to the OSVR core, and in turn, to an application. Such interfaces include tracker (provides position, orientation, or full pose), button, analog (axis of a joystick, analog trigger value), eye tracker (gaze direction), gesture, imager (single image or image stream), locomotion (for omnidirectional treadmills), skeleton (bone structure) and display (output device).
A single physical device may provide data in more than one interface class, just as a multi-function printer might look to an operating system like a printer, a scanner, and a fax machine at once.
Imagine that you are developing a viewer for 360-degree videos and you want to let users interact with it using gestures. With OSVR, because you work through a universal, device-independent gesture interface, you can use any device that has an OSVR plugin exposing gestures, such as products from Leap Motion, NOD, YEI, and SoftKinetic. All these devices expose gestures to the app in a standardized way. Contrast this approach with the hassle of integrating every one of these devices individually. Moreover, when new devices such as the Intel RealSense camera get an OSVR plugin that includes a gesture interface, they immediately work with your app without a single line of code changing.
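As a hedged illustration of what this looks like in client code, here is roughly how an application consumes an interface through the ClientKit C++ wrapper. It uses the standard head-tracker path because that is the most familiar case; a gesture interface would follow the same semantic-path and callback pattern. The application ID is a placeholder.

```cpp
// Sketch of a device-independent OSVR client using the ClientKit C++
// wrapper. "/me/head" is a semantic path: the app names what it wants,
// not which physical device provides it.
#include <osvr/ClientKit/ClientKit.h>
#include <iostream>

void onOrientation(void * /*userdata*/, const OSVR_TimeValue * /*timestamp*/,
                   const OSVR_OrientationReport *report) {
    // Any device exposing a tracker interface delivers reports here,
    // regardless of vendor.
    std::cout << "orientation w = " << osvrQuatGetW(&report->rotation)
              << "\n";
}

int main() {
    osvr::clientkit::ClientContext context("com.example.Viewer360");
    osvr::clientkit::Interface head = context.getInterface("/me/head");
    head.registerCallback(&onOrientation, nullptr);
    // Pump the client loop; a real application would run until exit.
    for (int i = 0; i < 10000; ++i) {
        context.update(); // registered callbacks fire during update()
    }
    return 0;
}
```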
Current Status
To date, the OSVR team has primarily focused on creating the various types of interfaces, connecting popular devices that demonstrate that these interfaces work, and creating simulation plugins that allow developers to work with simulated data in lieu of a real device. Aside from native OSVR plugins, OSVR also inherits support for about 100 devices from VRPN, a popular open-source project.
Future Plans
In the coming months, I believe that the OSVR team, OSVR hardware partners, and other OSVR contributors will significantly expand the number of devices supported by OSVR. Specifically, I am aware of plans and work in progress to support Intel RealSense cameras, Tobii eye trackers, NOD devices, the Leap Motion camera, the HTC Vive, and others.
Game Engines
Overview
OSVR performs a very important service both for game engines and for the hardware devices that wish to be supported by those engines.
Figure 1 – Without OSVR: each device needs multiple plugins
The problem is that there are many graphics and game engines (Unity, Unreal, CryEngine, VBS, MonoGame, OpenVR, Unigine, WebVR, etc.) and numerous hardware devices. If every hardware vendor wants to be supported by every engine, a lot of drivers need to be written, and many person-years will be spent on integration and optimization. Moreover, this "many-to-many" connectivity (seen in Figure 1) puts a lot of stress on engine developers, as they need to continuously educate and support hardware developers as well as work relentlessly to keep up with the latest hardware and driver versions.
Figure 2 – With OSVR: a single OSVR plugin makes it easy to support many engines
In contrast, when using OSVR, a game engine needs a single integration with OSVR to support all OSVR-supported hardware. The OSVR game-engine integration provides a harmonized, standardized way to connect to all devices that support the relevant OSVR interfaces, whether tracker, gesture, skeleton, eye tracker, or others. The OSVR team and community can work with each engine developer to create an optimized integration, so there is no need for hardware developers to learn and re-learn the intricacies of each engine. If a new hardware device comes on the market, the hardware manufacturer (or an interested member of the OSVR community) can create a device plugin for OSVR, thus automatically achieving connectivity to all OSVR-supported game engines. Because OSVR is open-source and plugin-based, the availability of such a plugin does not depend on the priorities of Sensics or any other company; a developer can simply start from an existing open-source plugin and quickly modify it for the specific API of the new device. The result is illustrated in Figure 2.
Current Status
OSVR is integrated with Unity, Unreal Engine, and MonoGame.
Future Plans
A plugin for OpenVR (Valve) is in beta and should be completed soon. This will allow OSVR-supported displays and devices to be used with OpenVR/SteamVR games, though subject to the limitations of the OpenVR API.
An OSVR backend for WebVR in Mozilla Firefox has been submitted to the Mozilla project and we expect it will become part of the Firefox WebVR nightly build very soon.
A team of students is working on a Blender Game Engine plugin. As part of this effort, they are creating a Python wrapper for the OSVR Core client functionality, which should allow easy integration into other engines such as WorldViz Vizard.
Possible integrations into CryEngine, Bohemia VBS3 and others are being discussed.
Thanks to conscious design decisions made when developing the client (game/application) API, to the multiple language wrappers available for the native C OSVR API (C++, .NET, and soon Python), and to the fact that existing integrations are open source, it is easy to integrate OSVR into additional engines, including internal engines and custom applications. A sketch of the small API surface involved follows.
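The sketch below uses the native C API directly (the same calls the language wrappers bind), compiled here as C++. It is an illustration under assumptions rather than a verbatim excerpt from OSVR documentation, and "com.example.MyEngine" is a placeholder identifier.

```cpp
// Minimal engine-side integration against the native C client API.
// Illustrative only: error checking omitted, app ID is a placeholder.
#include <osvr/ClientKit/ContextC.h>
#include <osvr/ClientKit/InterfaceC.h>
#include <osvr/ClientKit/InterfaceCallbackC.h>
#include <cstdio>

static void onPose(void * /*userdata*/, const OSVR_TimeValue * /*timestamp*/,
                   const OSVR_PoseReport *report) {
    // An engine would update its camera transform from report->pose here.
    std::printf("pose report from sensor %d\n", (int)report->sensor);
}

int main() {
    OSVR_ClientContext ctx = osvrClientInit("com.example.MyEngine", 0);
    OSVR_ClientInterface head = nullptr;
    osvrClientGetInterface(ctx, "/me/head", &head);
    osvrRegisterPoseCallback(head, &onPose, nullptr);
    // In a real engine this runs once per frame until shutdown is requested.
    for (int i = 0; i < 10000; ++i) {
        osvrClientUpdate(ctx);
    }
    osvrClientShutdown(ctx);
    return 0;
}
```

Because the surface is this small, a wrapper for a new language or an integration into an in-house engine is mostly a matter of binding these few entry points.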
Low-latency Rendering
Overview
The elapsed time from sensing to rendering, sometimes called "motion-to-photon" latency, has been the subject of intense scrutiny in the effort to create more comfortable and immersive experiences. Latency comes from multiple sources: How often do the sensors generate data? How quickly does that data get to the application? Are data points extrapolated into the future to provide "predictive" tracking? How quickly can the application render the image through the graphics stack? Can the application make "time warping" corrections at the last instant?
Current Status
OSVR systematically addresses these points as follows:
Data rate from sensors: The OSVR HDK provides 100 Hz positional data and 400 Hz “sensor-fused” orientation data.
Speed with which data reaches the application: OSVR integrates with ETW (Event Tracing for Windows), a powerful tool for latency and performance analysis (see http://osvr.github.io/presentations/20150901-Intro-ETW-OSVR/ for a brief tutorial). ETW helps optimize the full software stack, from the game to the sensors, towards achieving optimal performance.
Predictive tracking: OSVR currently includes predictive orientation tracking. When using the OSVR HDK, this is derived from angular-velocity reports that are provided to the controlling PC at 400 Hz. When using HMDs that do not provide angular velocity, the velocity can be estimated from consecutive yaw/pitch/roll reports. Predictive tracking looks 16 milliseconds into the future and reduces the apparent latency (see the sketch after this list for the underlying math).
Direct render: The Sensics/OSVR Render Manager (supported on Windows/NVIDIA platforms, with a Unity plugin included) provides optimal low-latency rendering on any OSVR-supported device. It includes:
Direct to HMD Mode: Enables an application to treat a VR headset as a head-mounted display that is accessible only to VR applications, bypassing the rendering delays typical of Windows desktop displays. Direct to HMD Mode is available for both Direct3D and OpenGL applications.
Front-Buffer Rendering: Renders directly to the front buffer to reduce latency.
Time warping: OSVR includes Asynchronous Time Warp as part of the Render Manager. It reduces latency by making just-in-time adjustments to the rendered image based on the latest head orientation after scene rendering but before sending the pixels to the display. It includes texture overfill on all borders for both eyes and supports all translations and rotations, given an approximate depth to apply to objects in the image.
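To illustrate the predictive-tracking item above, here is a small sketch of the underlying quaternion math. It shows the general technique (integrating an angular-velocity report over the prediction horizon), not OSVR's actual source code.

```cpp
// Illustration of predictive orientation tracking (the general math, not
// OSVR's implementation): rotate the current orientation forward by the
// angular velocity integrated over the prediction horizon (~16 ms).
#include <cmath>

struct Quat { double w, x, y, z; };

// Hamilton quaternion product.
Quat mul(const Quat &a, const Quat &b) {
    return {a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
            a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
            a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
            a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w};
}

// q: current orientation; (wx, wy, wz): angular velocity in rad/s,
// expressed in the world frame; dt: prediction horizon in seconds
// (0.016 for the 16 ms horizon mentioned above).
Quat predict(const Quat &q, double wx, double wy, double wz, double dt) {
    double mag = std::sqrt(wx * wx + wy * wy + wz * wz);
    double angle = mag * dt;                 // rotation over the horizon
    if (angle < 1e-9) {
        return q;                            // effectively stationary
    }
    double s = std::sin(0.5 * angle) / mag;  // scales axis to unit length
    Quat dq{std::cos(0.5 * angle), wx * s, wy * s, wz * s};
    return mul(dq, q);                       // pre-multiply: world-frame rate
}
```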
Future Plans
The OSVR team is working to expand the range of supported “Render Manager” graphics cards to include AMD and Intel. We are also looking to add Render Manager capabilities on Android and other non-Windows platforms by collaborating with graphics vendors for these platforms.
With regard to the Render Manager itself, the following enhancements will be released in the very near future:
Integrated Distortion Correction: handling the per-color distortion found in some HMDs requires a post-rendering distortion pass. Today, OSVR performs this on the client/application side, but moving distortion correction into the Render Manager provides additional benefits: the same buffer-overfill rendering used in asynchronous time warp supplies the extra image regions that the distortion pass needs.
High-Priority Rendering: increasing the priority of the rendering thread associated with the final pixel scan-out helps ensure that every frame is displayed on time.
Time Tracking: telling the application what time the upcoming frame will be displayed lets it render the appropriate scene. This also enables the Render Manager to apply predictive tracking when producing the rendering transformations and the asynchronous time warp. The system also reports the time taken by previous rendering cycles, informing the application when to simplify the scene to maintain an optimal update rate.
Add Render Manager support to additional engines.
Use ETW to perform additional engine and plugin optimizations.
Design of a plugin-based API for advanced rendering capabilities. This would allow open-source release and public development of a cross-platform Render Manager stack with the ability to load vendor-specific code (which may be closed-source if required by those vendors).
Operating Systems
Overview
VR/AR products come in many shapes and forms and run on several different computing platforms. The obvious examples are PC-based VR, which typically uses Windows, and mobile VR, which typically uses Android phones. The goal of OSVR is both to support a wide range of operating systems and to provide a consistent programming interface regardless of the particular operating system. Ideally, OS-specific details (such as graphics drivers and file-system particulars) as they relate to creating the VR/AR experience are all abstracted away by OSVR.
Current Status
OSVR currently supports the following operating systems:
Windows, including Windows-specific support for direct rendering.
Android, including the ability to use internal sensors, camera and other on-board peripherals. Currently, this uses the CrystaX NDK to build native-code applications. Unity/Android also works on top of OSVR/Android.
Linux: The OSVR engine has complete support for Linux and the code is tested on Linux before Windows binaries are released. Unity/Linux should be possible but has not been tested yet.
Future Plans
Include Android and possibly Linux binaries in Unity plugin releases.
Add OS X support.
Add iOS support.
Add RenderManager support to other platforms, working with platform vendors and manufacturers of graphics chips as required.
Validate correct operation of additional game engines for non-Windows operating systems.
Add plugins for OS-specific/platform-specific peripherals and capabilities.
Utilities
Overview
OSVR utilities are standalone programs that perform useful functions in support of OSVR development and deployment. True to the OSVR philosophy, the utilities are also open source.
Current Status
The following utilities currently exist:
Distortionizer: The Distortionizer is an interactive tool that helps determine the RGB optical-distortion parameters of a display. Sometimes these parameters are provided by the optical design team; other times they need to be measured. The output of the Distortionizer is a set of distortion-correction coefficients that feeds directly into the Render Manager (a sketch of how such coefficients are typically applied follows this list).
Latency test: This combines open-source hardware (based on low-cost Arduino components) and open-source software to provide end-to-end latency measurements.
Tracker Viewer: a graphical utility to dynamically display position and orientation of several tracked objects.
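As referenced above, here is a hedged sketch of how per-channel radial distortion-correction coefficients of this kind are commonly applied. The coefficient layout (k1, k2 per color channel) is a standard parameterization used for illustration; it is not the Distortionizer's actual file format.

```cpp
// Illustrative per-color-channel radial distortion correction. The k1/k2
// parameterization is a common convention, hypothetical here rather than
// the Distortionizer's exact output format.
struct Vec2 { float x, y; };

// uv: texture coordinate to remap; center: distortion center; k1, k2:
// radial coefficients measured for one color channel.
Vec2 distortChannel(Vec2 uv, Vec2 center, float k1, float k2) {
    float dx = uv.x - center.x;
    float dy = uv.y - center.y;
    float r2 = dx * dx + dy * dy;
    float scale = 1.0f + k1 * r2 + k2 * r2 * r2;
    return {center.x + dx * scale, center.y + dy * scale};
}

// A distortion shader would evaluate this once each for red, green, and
// blue with that channel's coefficients, sampling the rendered image at
// three slightly different coordinates to correct chromatic aberration.
```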
Future Plans
I am aware of several additional utilities under development:
Windows installer for the runtime components of OSVR.
Interactive configuration utility to allow configuring eye separation, height and other user-specific parameters.
High-level Processing
Overview
High-level processing modules (Analysis Plugins in "OSVR speak") are software modules that convert data into higher-level information. For instance, a gesture-engine plugin can convert a stream of XYZ coordinates into a recognized gesture; an eye-tracker plugin can take a live feed from a camera pointed at the eye and provide real-time gaze direction; a sensor-fusion plugin can fuse data from various tracker interfaces into more accurate reports.
Current Status
The OSVR team and community have primarily been focused on building the lower-level infrastructure before adding optional processing layers on top of it. At present, two analysis plugins exist:
1-Euro filter: This is a data-smoothing filter that, as currently applied, improves the stability of the output from a Razer Hydra (a sketch of the algorithm appears after this list).
Predictive tracker: Estimates head orientation into the near future as a method to reduce perceived latency.
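For readers unfamiliar with it, below is a compact sketch of the 1-Euro filter itself, following the published algorithm by Casiez et al. rather than OSVR's specific plugin code: an adaptive low-pass filter whose cutoff frequency rises with the signal's speed, suppressing jitter when the tracked object is nearly still while keeping lag low during fast motion. The parameter values are illustrative, not OSVR's tuning.

```cpp
// Sketch of the 1-Euro filter (Casiez et al.): exponential smoothing whose
// cutoff frequency adapts to the estimated signal speed.
#include <cmath>
#include <cstdio>

const double kTwoPi = 6.283185307179586;

// Exponential smoothing with a per-call alpha in (0, 1].
class LowPass {
    double y_ = 0.0;
    bool init_ = false;
  public:
    double filter(double x, double alpha) {
        if (!init_) { y_ = x; init_ = true; }
        y_ = alpha * x + (1.0 - alpha) * y_;
        return y_;
    }
};

class OneEuroFilter {
    double minCutoff_, beta_, dCutoff_;
    LowPass xFilt_, dxFilt_;
    double prev_ = 0.0;
    bool hasPrev_ = false;

    // Convert a cutoff frequency (Hz) and timestep into a smoothing alpha.
    static double alpha(double cutoff, double dt) {
        double tau = 1.0 / (kTwoPi * cutoff);
        return 1.0 / (1.0 + tau / dt);
    }

  public:
    OneEuroFilter(double minCutoff, double beta, double dCutoff)
        : minCutoff_(minCutoff), beta_(beta), dCutoff_(dCutoff) {}

    // x: new sample; dt: seconds since the previous sample.
    double filter(double x, double dt) {
        double dx = hasPrev_ ? (x - prev_) / dt : 0.0;
        prev_ = x;
        hasPrev_ = true;
        // Smooth the derivative, then let speed raise the cutoff.
        double dxHat = dxFilt_.filter(dx, alpha(dCutoff_, dt));
        double cutoff = minCutoff_ + beta_ * std::fabs(dxHat);
        return xFilt_.filter(x, alpha(cutoff, dt));
    }
};

int main() {
    OneEuroFilter f(1.0, 0.05, 1.0); // illustrative parameters
    const double samples[] = {0.0, 0.01, -0.02, 0.01, 0.5, 1.0, 1.5};
    for (double s : samples) {
        std::printf("%f\n", f.filter(s, 1.0 / 120.0)); // 120 Hz input
    }
    return 0;
}
```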
Future Plans
A unified API for easily developing analysis plugins and allowing their configuration is in progress. Several analysis plugins are under consideration and in various stages of design:
Sensor fusion: to combine the output from multiple sensors into more accurate information.
Augmented reality: to detect objects in the scene.
Eye tracking: to convert pupil camera images into pupil coordinates.
We are working to simplify the process of writing analysis plugins and to provide open-source examples, and we are very open to community contributions.
Summary
OSVR, like VR in general, is a work in progress. I am very pleased with the quality of partners, the breadth of functionality, and the level of community involvement that we have seen in the eight months since launching OSVR. Having said that, there is still plenty of work to be done: in supporting new devices, in supporting a wider range of engines and development environments, in making devices more useful by creating analysis plugins, and in providing high-performance rendering across multiple platforms. I am excited to have played a role in building OSVR and look forward to the months and years ahead.