In the process of creating SceneVR, I’ve investigated a lot of different non-browser renderers. I’ve been doing this for three reasons:

App packaging

Virtual Reality APIs

Performance

App Packaging

I want to be able to make a nice mobile experience for Scene, because I believe that Google Cardboard is one of the massive levellers for access to VR. Anyone can buy a $10 Cardboard v2, or spend $20 and get something nice like the BoboVR Z3.

Currently, you access Cardboard content on SceneVR through the browser, which is pretty good, but there are some downsides: Safari doesn’t really like to go full screen, Safari has no Gamepad API (so you can’t use a Bluetooth gamepad to navigate the world), and when you touch the screen, it dims. Most of these are solvable, but I think it’d be nice to create an actual mobile app that launches a high-performance renderer when you want to view a scene.

My current prototype uses a webview, which works quite well on the iPhone, but that leads to the next issue.

Virtual Reality APIs

SceneVR is designed for virtual reality. That’s why I created it. When the first WebVR browser builds started coming out in 2014, I borrowed a friend’s DK1 and started on the journey that has taken Scene to where it is now. So, I need to have strong VR support. Currently, there are four major VR headsets I can target:

Oculus Rift

Gear VR

Google Cardboard

HTC Vive

Since the rewrite, where I migrated from my own markup language to Mozilla’s A-Frame, the only way to view SceneVR scenes in VR (hah) is using Google Cardboard (which uses the webvr-polyfill to detect the phone’s orientation), or one of the WebVR builds by Brandon Jones. The problem is that the WebVR builds of Firefox and Chrome are pretty flaky at the moment, and although WebVR support for the Rift and HTC Vive will probably land in the production versions of Firefox and Chrome eventually, you can’t rely on the timeline: it might be two months, it might be twelve.
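For context, checking whether a browser build actually exposes WebVR is straightforward. Here’s a minimal sketch against the WebVR 1.0 API (navigator.getVRDisplays), with a fallback for environments that have no WebVR at all, which is exactly the gap the webvr-polyfill fills:

```javascript
// Feature-detect WebVR 1.0 (the API exposed by the experimental
// Firefox/Chrome builds). Falls back cleanly when there is no
// navigator.getVRDisplays available.
function detectVR() {
  if (typeof navigator === 'undefined' || !navigator.getVRDisplays) {
    return Promise.resolve('no WebVR support (polyfill needed)');
  }
  return navigator.getVRDisplays().then(function (displays) {
    return displays.length
      ? 'found ' + displays.length + ' VR display(s)'
      : 'WebVR present, but no headset connected';
  });
}

detectVR().then(function (msg) { console.log(msg); });
```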

With Cardboard, there is another fly in the ointment. Google has released parts of its Cardboard SDK as open source. One of the components is a great sensor fusion package that samples the accelerometer and gyroscope at the maximum rate, then does some nice filtering to give the best possible stability it can in Cardboard apps. The webvr-polyfill does a pretty good job of this on the iPhone, since Apple clearly does some filtering at the operating-system level, but on Android, the webvr-polyfill is laggy and inaccurate compared to the Cardboard app.
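To make the idea concrete, here’s a toy one-axis complementary filter, the simplest form of sensor fusion. This is my own illustrative sketch, not Google’s algorithm (theirs is far more sophisticated): the gyro is accurate over short intervals but drifts, while the accelerometer’s gravity estimate is noisy but doesn’t drift, so blending the two keeps the drift bounded:

```javascript
// Toy complementary filter: integrate the gyro rate, then nudge the
// result toward the accelerometer's angle estimate. alpha close to 1
// trusts the gyro short-term; the remainder corrects long-term drift.
// All names here are illustrative.
function complementaryFilter(prevAngle, gyroRate, accelAngle, dt, alpha) {
  var gyroEstimate = prevAngle + gyroRate * dt;    // short-term: integrate
  return alpha * gyroEstimate + (1 - alpha) * accelAngle; // long-term: correct
}

// Simulate a stationary phone: the gyro reports a small constant bias
// (drift), while the accelerometer keeps reporting the true angle (0).
var angle = 0;
for (var i = 0; i < 1000; i++) {
  angle = complementaryFilter(angle, 0.01 /* rad/s bias */, 0, 0.016, 0.98);
}
console.log(angle.toFixed(4)); // → 0.0078 (pure gyro integration would drift to 0.1600)
```

A real implementation fuses all three axes with quaternions and tuned gains; the one-axis version is just to show why the fused signal stays stable where raw gyro integration wanders off.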

It’d be nice to use Google’s filtering algorithms in your WebVR app, which isn’t possible until the browser vendors expose WebVR support in their mobile browsers.

Performance

WebGL is rendered in the browser and goes through the browser’s compositing layer, so I naively expected that if I cut the browser out of the equation, it’d be faster.

WebGL rendered by Node

So, I installed node-webgl, which adds WebGL bindings to Node.js using GLFW. Basically, you can open a window on your Mac and render WebGL straight to the screen.
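Based on node-webgl’s bundled examples, the basic setup looks roughly like this. Treat it as a sketch: the exact API (webgl.document(), the createElement canvas signature) may differ between versions, and the native GLFW bindings have to build successfully before a window will actually open:

```javascript
// Rough node-webgl boilerplate, based on its example code. The require()
// guard just lets the script degrade gracefully where the native
// bindings aren't installed.
var webgl = null;
try {
  webgl = require('node-webgl');
} catch (e) {
  console.log('node-webgl not installed, skipping');
}

if (webgl) {
  var document = webgl.document();                  // GLFW-backed fake DOM
  var canvas = document.createElement('canvas', 800, 600);
  var gl = canvas.getContext('experimental-webgl');

  (function frame() {
    gl.clearColor(0.1, 0.1, 0.1, 1.0);              // dark grey clear
    gl.clear(gl.COLOR_BUFFER_BIT);
    document.requestAnimationFrame(frame);          // swap buffers, loop
  })();
}
```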

But it’s not much faster. With the same sized window, running the same code, Chrome pumped out 125fps and Node.js pumped out 170fps. So, a bit of a difference, but definitely not enough to make it worth porting to something like gles.js, Ejecta or Cocoon just for the performance bump. Note that this was running on my MacBook with a GTX 650M, so maybe there is a bigger performance difference on other platforms. I’m pretty interested to see how gles.js runs on my Note 4, for example.
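For reference, a comparison like the one above just counts frames over a time window. A minimal counter looks something like this (the names are mine; the clock is injected so the same code can be driven by performance.now() in the browser, Date.now() in Node, or a fake clock in a test):

```javascript
// Minimal FPS counter. now() is an injected millisecond clock, so the
// counter isn't tied to any particular environment. All names here are
// illustrative.
function makeFpsCounter(now) {
  var frames = 0;
  var last = now();
  return {
    frame: function () { frames++; },       // call once per rendered frame
    read: function () {                     // fps since the last read
      var t = now();
      var fps = frames / ((t - last) / 1000);
      frames = 0;
      last = t;
      return fps;
    }
  };
}

// Drive it with a fake clock: 170 frames over one simulated second.
var fakeTime = 0;
var counter = makeFpsCounter(function () { return fakeTime; });
for (var i = 0; i < 170; i++) counter.frame();
fakeTime = 1000;
console.log(counter.read()); // → 170
```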

Back to the APIs

However, there is one very interesting part of node-webgl: it ships example code for getting GLFW rendering to the Oculus Rift, so I could make a nice Node.js app that renders straight to the Rift. Sign the executables, put them in a nice installer, and you have a Windows app that might even make it into the Oculus store.

Also, there is NDK support for GL ES on the Gear VR, so you could use something like gles.js to render directly to the Gear VR, supporting asynchronous timewarp and working properly with Oculus Home.

A new browser?

So, while we wait for the browser vendors to add WebVR support to their browsers, I’ll keep experimenting and chipping away, because I wonder if there is a need for a new browser that is optimised for WebGL content, delivered over the web, but without the overhead of an HTML engine. Maybe in the VR future, we’ll all be browsing a web made up of A-Frame content, rendered in ultra high performance straight to our eyes, and HTML will be relegated to static content on billboards.