This article sums up some VR R&D work I have been doing lately at Collabora, so thanks for making this possible!

## Previously on GStreamer

Three years ago, in 2013, I released an OpenGL fragment shader you could use with the GstGLShader element to view side-by-side stereoscopic video on the Oculus Rift DK1 in GStreamer. It used the headset only as a stereo viewer and didn't provide any tracking; it was just a quick way to make any use of the DK1 with GStreamer at all. Side-by-side stereoscopic video was becoming very popular at the time, due to "3D" movies and screens gaining popularity. With its 1.6 release GStreamer added support for stereoscopic video, though I haven't tested side-by-side stereo with it.

Stereoscopic video does not provide full 3D information, since the perspective is always given for a certain view, or parallax. Mapping the stereo video onto a sphere does not solve this, but it at least stores color information independent of the view angle, so it is far more immersive and gets described as a telepresence experience. A better solution for "real 3D" video would of course be to capture a point cloud with as many sensors as possible, filter it, and construct mesh data from it for rendering, but more on that later.
## A brief history of mapping imagery on anything other than a plane

Nowadays mankind projects its imagery mostly onto planes, as seen on most LCD screens, canvases and Polaroids. Although this seems like a physical limitation, there are some ways to overcome it, in particular curved LCD screens, fancy projector setups rarely seen outside of art installations and, most recently, Virtual Reality Head Mounted Displays.

Projecting our images onto shapes other than planes in virtual space is not new at all, though. Still panoramas have very commonly been projected onto cylinders, not only in modern photo viewer software, but also in monumental paintings like the Racławice Panorama, which is housed inside a cylinder-shaped building.

But to store information from every angle in 3D space we need a different geometric shape. Spherical projection is used very commonly, for example in Google Street View and of course in VR video.
As we are in 3D, a regular angle wouldn't be enough to describe all directions on a sphere, since 360° can only describe a full circle, a 2D shape. In fact we have two angles, θ and φ, also called inclination and azimuth. You can calculate Cartesian coordinates from spherical coordinates like this:

    x = r sin θ cos φ
    y = r sin θ sin φ
    z = r cos θ

For describing the angle on a sphere we can use the solid angle Ω, calculated by integrating over the two angles θ and φ:

    Ω = ∬ sin θ dθ dφ

The unit for the solid angle is the steradian: integrating θ from 0 to π and φ from 0 to 2π yields the full sphere with 4π sr, hence the hemisphere covers 2π sr.

This is why the term 360° video does not suit spherical video well, since there are other popular shapes for projecting video with φ = 360°, like cylinders. 180° video also usually uses a half cylinder. With video half spheres, or hemispheres, you could for example texture a skydome.

In Gst3D, which is a small graphics library currently supporting OpenGL 3+, I am providing a sphere built with one triangle strip, which has an equirectangular mapping of UV coordinates, which vrtestsrc visualizes in yellow to green. You can switch to wireframe rendering with Tab in the vrtestsrc element:

```
$ gst-launch-1.0 vrtestsrc ! glimagesink
```
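To make the mapping concrete, here is a minimal, self-contained C sketch of how such a single-strip sphere with equirectangular UVs can be generated. The names (`SphereVertex`, `build_sphere_strip`) are hypothetical and this is not the actual Gst3D code; it only illustrates the technique under the conventions above, u = φ / 2π and v = θ / π:

```c
#include <math.h>
#include <stdlib.h>

/* One vertex: position on the unit sphere plus equirectangular UV. */
typedef struct {
  float x, y, z;
  float u, v;
} SphereVertex;

/*
 * Emit a sphere as one triangle strip: for every band of inclination
 * we alternate between a vertex on the current ring and one on the
 * next ring while the azimuth sweeps a full turn. (Stitching between
 * bands with degenerate triangles is omitted for brevity.)
 */
SphereVertex *
build_sphere_strip (unsigned stacks, unsigned slices, unsigned *n_vertices)
{
  unsigned count = stacks * (slices + 1) * 2;
  SphereVertex *verts = malloc (count * sizeof (SphereVertex));
  unsigned i = 0;

  for (unsigned stack = 0; stack < stacks; stack++) {
    for (unsigned slice = 0; slice <= slices; slice++) {
      for (unsigned k = 0; k < 2; k++) {
        float theta = (float) (stack + k) / stacks * M_PI;   /* inclination */
        float phi = (float) slice / slices * 2.0f * M_PI;    /* azimuth */

        /* Spherical to Cartesian on the unit sphere, poles on the
         * Z axis; swap y and z if your world is Y-up. */
        verts[i].x = sinf (theta) * cosf (phi);
        verts[i].y = sinf (theta) * sinf (phi);
        verts[i].z = cosf (theta);

        /* Equirectangular mapping: normalized azimuth/inclination. */
        verts[i].u = phi / (2.0f * (float) M_PI);
        verts[i].v = theta / (float) M_PI;
        i++;
      }
    }
  }

  *n_vertices = i;
  return verts;
}
```

Each texel of an equirectangular video frame then lands at exactly one direction on the sphere, which is what makes this mesh a natural projection surface for spherical video.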
## How do you track the HMD?

For my HoVR demo in 2014 I made Python bindings for OpenHMD to use it in the Blender Game Engine and had a good experience with it. Although its HMD support is very limited (it currently only supports the IMU, the Inertial Measurement Unit, of the DK2), it is very lightweight, since it uses hidapi directly and requires no daemon. In contrast to that, the proprietary Oculus driver OVR, which I used in my demo HoloChat in 2015, is very unstable and unsupported.

This is why I decided to use OpenHMD as a minimal approach for initial VR sensor support in GStreamer. For broader headset support, and because I think it will be adopted as a standard, I will implement support for OSVR in gst-plugins-vr in the future.
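To give an idea of how lightweight that is: polling the IMU orientation through OpenHMD's public C API only takes a handful of calls. A minimal sketch (not the code used in gst-plugins-vr, and with error handling kept to a minimum):

```c
#include <stdio.h>
#include <openhmd.h>

int
main (void)
{
  ohmd_context *ctx = ohmd_ctx_create ();

  /* Enumerate devices; OpenHMD talks to the IMU via hidapi
   * directly, so no daemon has to be running. */
  if (ohmd_ctx_probe (ctx) <= 0) {
    fprintf (stderr, "No HMD found: %s\n", ohmd_ctx_get_error (ctx));
    return 1;
  }

  ohmd_device *hmd = ohmd_list_open_device (ctx, 0);

  for (int i = 0; i < 100; i++) {
    float quat[4];

    /* Pump the context to read fresh IMU samples, then fetch the
     * current head orientation as a quaternion. */
    ohmd_ctx_update (ctx);
    ohmd_device_getf (hmd, OHMD_ROTATION_QUAT, quat);
    printf ("%f %f %f %f\n", quat[0], quat[1], quat[2], quat[3]);
  }

  ohmd_ctx_destroy (ctx);
  return 0;
}
```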
## What if you do not own a VR headset?

No problem, you can view spherical videos and photos anyway. Currently you can compile gst-vr without OpenHMD and view things with an arcball camera, without stereo, so you can still view spherical video projected correctly and navigate with your mouse. This fallback mode would probably be best handled at run time. But for VR you need stereo rendering and barrel distortion, right? Right, they are the core components required in a VR renderer. Stereo rendering and projection according to the IMU sensor happen in vrcompositor, which can also be used without an HMD, with mouse controls.
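For example, viewing a spherical video with vrcompositor could look roughly like this; the decoding and GL upload elements are standard GStreamer, but treat the exact pipeline as a sketch and the file name as a placeholder:

```
$ gst-launch-1.0 filesrc location=spherical.webm ! decodebin ! \
    glupload ! glcolorconvert ! vrcompositor ! glimagesink
```

Without a headset connected, you would then navigate the projection with the mouse as described above.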