OSVR Architecture
The full OSVR architecture is shown in the figure to the right and is described in greater detail in this technical white paper.
Of course, not all of these blocks were required for the CES demos. The demos used the OSVR HDK (Hacker Development Kit, the open-source goggles) and some combination of the Razer Hydra, the Leap Motion Controller and the Nod Ring. Some of the OSVR HDKs used their built-in tracker (the Bosch BNO070) and others were equipped with the YEI 3-Space Sensor. Various software demos were written, some using Unity and some using the Unreal Engine.
CES demo architecture
OSVR has a client/server architecture. The server connects to the hardware devices and performs analysis where required; the client consumes this information. In the diagram to the right, the Unity and Unreal engines are clients, connecting through an OSVR Unity plugin and an OSVR Unreal plugin. Everything else is the OSVR server.
Often, the client and the server reside on the same machine, but there are additional possibilities, including:
- Client on one machine, server on another. For instance, you might have the client running on a mobile phone while the server would be a PC that has more processing power and better connectivity to the various peripherals
- Multiple clients connecting to one server on the same machine. For instance, OSVR includes a graphical utility called "Tracker Viewer" that shows the orientation and position of the various tracked devices and has proven to be a useful debugging tool. Tracker Viewer is an OSVR client, and it can run concurrently with other applications such as the demos, connecting to the same OSVR server.
- One machine running client and server, a second machine running a client. This can create a 'shadow' experience on a remote machine.
- One client connecting to multiple servers. This is useful in some high-end situations where OSVR applications wish to connect to devices (such as ART SmartTrack) that embed a server.
The 'device plugins' layer of OSVR connected in the demos to the various hardware devices and provided an abstract interface to the higher layers, so that the application does not care - for instance - whether the Bosch or YEI tracker is used. This proved useful during the show: our booth had a sofa used for demos, and one of the trackers did not like its metal rails. Swapping in the other tracker was extremely simple. More importantly, the same application can work across multiple trackers without change. OSVR also supports the Oculus DK2 orientation and position trackers, which is helpful for those who wish to write cross-platform applications, or to debug their OSVR code using an HMD they already own.
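The idea behind this abstraction can be pictured with a toy sketch. Note that the class and method names below are purely hypothetical, for illustration only; they are not the actual OSVR PluginKit/ClientKit API:

```python
# Toy sketch of device abstraction: every tracker plugin implements the same
# interface, and a server maps a semantic path to whichever backend is
# configured. Names are hypothetical, not the real OSVR API.

class OrientationTracker:
    """Common interface that every tracker plugin implements."""
    def read_orientation(self):
        raise NotImplementedError

class BoschBNO070(OrientationTracker):
    def read_orientation(self):
        # Placeholder quaternion (w, x, y, z); a real plugin reads hardware.
        return (1.0, 0.0, 0.0, 0.0)

class YEI3Space(OrientationTracker):
    def read_orientation(self):
        return (1.0, 0.0, 0.0, 0.0)

class Server:
    """Maps semantic paths to device backends."""
    def __init__(self):
        self.paths = {}
    def register(self, path, device):
        self.paths[path] = device
    def get_orientation(self, path):
        return self.paths[path].read_orientation()

# Swapping trackers is a one-line change on the server side;
# the application keeps asking for the same semantic path.
server = Server()
server.register("/me/head", BoschBNO070())   # or: YEI3Space()
orientation = server.get_orientation("/me/head")
```

The point of the sketch is that the application only ever sees the semantic path and the common interface, so the backend can change without touching application code.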
An interesting software component in the OSVR stack at CES was the "1 euro filter", part of the analysis layer. The analysis layer performs post-processing or higher-level analysis on data coming from the lower layers. In this case, the 1 euro filter is an adaptive low-pass filter used to smooth and improve the positional data coming from the Razer Hydra.
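For readers curious about the algorithm itself, here is a minimal sketch of a 1 euro filter (after Casiez et al.) in Python. Parameter names and defaults are illustrative, and this is a sketch of the published algorithm, not the OSVR implementation:

```python
import math

class LowPass:
    """Simple exponential low-pass filter."""
    def __init__(self):
        self.prev = None
    def apply(self, x, alpha):
        if self.prev is None:
            self.prev = x
        self.prev = alpha * x + (1.0 - alpha) * self.prev
        return self.prev

class OneEuroFilter:
    """Adaptive low-pass filter: the cutoff rises with signal speed,
    trading less lag during fast motion for more smoothing at rest."""
    def __init__(self, freq, min_cutoff=1.0, beta=0.0, d_cutoff=1.0):
        self.freq = freq            # sampling frequency in Hz
        self.min_cutoff = min_cutoff
        self.beta = beta            # speed coefficient
        self.d_cutoff = d_cutoff    # cutoff for the derivative estimate
        self.x_filt = LowPass()
        self.dx_filt = LowPass()
        self.last_x = None

    def _alpha(self, cutoff):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        te = 1.0 / self.freq
        return 1.0 / (1.0 + tau / te)

    def __call__(self, x):
        # Estimate and smooth the derivative, then adapt the cutoff to it.
        dx = 0.0 if self.last_x is None else (x - self.last_x) * self.freq
        self.last_x = x
        edx = self.dx_filt.apply(dx, self._alpha(self.d_cutoff))
        cutoff = self.min_cutoff + self.beta * abs(edx)
        return self.x_filt.apply(x, self._alpha(cutoff))
```

In this scheme, min_cutoff controls how aggressively slow motion is smoothed, while beta controls how quickly the cutoff rises with speed to reduce lag during fast motion.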
One feature of OSVR that allows this to happen is the "path tree". Similar to a URL or file system path, the path tree is how all the sensing and rendering data is made available. Aliases are configured in the server to essentially redirect from a semantic path (a path with a meaningful name) all the way back to the system-specific hardware details. For instance:
- Position of the left hand: /me/hands/left/position
- Orientation of the left hand: /me/hands/left/orientation
- All resources associated with the left hand: /me/hands/left
- Position of the “0” controller of the Hydra: /razer/hydra/position/0
- Output of the first smoothing filter: /analysis/smooth_filter/0
OSVR then allows defining the connections between the various components. In our specific example, /razer/hydra/position/0 feeds into /analysis/smooth_filter/0 (the 1 euro filter), which in turn feeds into /me/hands/left.
This makes it possible to re-route information and to insert or remove software components as necessary. For instance, if the 1 euro filter is not desired, simply map /razer/hydra/position/0 directly into /me/hands/left; if a new filter becomes available, insert it into the route.
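As a rough sketch, such a route could be expressed in a server configuration file along these lines (the key names here are illustrative, not necessarily the exact OSVR configuration schema):

```json
{
  "routes": [
    {
      "destination": "/me/hands/left",
      "source": {
        "filter": { "type": "one_euro" },
        "child": "/razer/hydra/position/0"
      }
    }
  ]
}
```

Removing the filter would amount to replacing the nested "source" object with the bare child path, leaving the application untouched.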
Each of the above software components comes with a JSON file, a human- and machine-readable descriptor. The JSON file can provide device-specific information (for an HMD, for instance, it would indicate the resolution, field of view and distortion-correction coefficients). For a software component, it can define the semantic paths of the inputs and outputs, which helps in determining available information routes or facilitates auto-routing. Over time, the JSON descriptor could grow to include other parameters, such as device-specific calibration data.
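For instance, an HMD descriptor might look roughly like this; the field names and values below are purely illustrative and are not the actual OSVR HDK descriptor schema:

```json
{
  "hmd": {
    "resolution": { "width": 1920, "height": 1080 },
    "field_of_view": { "horizontal_degrees": 90, "vertical_degrees": 100 },
    "distortion_correction": { "k1": 0.1, "k2": 0.05, "k3": 0.0 }
  }
}
```

Because such a file is declarative, both tools and humans can inspect it without running any device-specific code.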
The CES demos used only a small portion of the OSVR components, but even so, the architecture proved useful for both development and ongoing operation. The OSVR community, led by Sensics and Razer, continues to add and demonstrate new components. Whether it is a "locomotion device" (to support products from Virtuix and Cyberith), "eye tracking" or others, the capabilities of OSVR continue to grow very nicely.
The post "The OSVR Software Stack at CES" first appeared on the VRguy's blog: www.vrguy.net