This is my first post sharing my experiences porting and optimizing Yana Virtual Relaxation for the Gear VR. Yana is a 10-minute getaway to your own private beach, complete with a full day-to-night cycle reflected over water, which proved quite a challenge to bring up to the required performance level on a mobile device.

Here are the top 10 things you need to think about carefully when planning and starting development for the Gear VR.

1. Know the limits of the Gear VR

This is the most important thing you need to keep in mind. The limitations of the mobile device will (and should) shape your experience.

It’s very hard to take an existing experience and change it to suit the Gear VR, and even when it is possible the end product is never as good as something designed specifically for it.

It’s best to start from scratch and base all key decisions regarding interaction, movement, and user interface around how they will function on this particular hardware.

The limits of the Gear VR are widely known, advertised by Oculus, and repeated by every developer giving advice on how to get started so... Here they are!

Max 50-100 draw calls

Max 50k-100k triangles

Must hit a consistent 60 frames per second

It is also recommended to use optimized one-pass shaders and to avoid transparency, post-processing effects, reflections, and dynamic lighting (you can get away with a single directional light if you need it).

While these are pretty hard lines and going over 100 draw calls will likely drop your app below 60 frames per second, you can work very well within these boundaries.

2. Read the Oculus mobile SDK documents religiously

Oculus has worked for years to compile these guidelines for new developers and they contain almost everything you need to know to get started developing for the Gear VR. You should read every section of their docs at least once and revisit the best practices frequently.

I know this isn’t what you wanted to hear, but knowing the recommended methods will shine through in the final product. Almost every developer who has done Gear VR development will immediately notice things like the gaze cursor being a set distance from the head instead of it being projected on the surface you’re looking at like it should be.

Developers will appreciate the attention to detail, and while users will have no idea, they won't end up with nausea, headaches, or eyestrain. The last thing you want is a user leaving your experience with simulator sickness.

So let these documents guide your development process and if you decide to stray away from the path set before you by Oculus, make sure you understand why it’s not recommended to do it that way and try to minimize any negative effects.

3. Set up your development environment

This seems very straightforward, but it can be a huge pain. Oculus has had compatibility issues with Unity 5 since it was first released, and although these have since been sorted out, you need to make sure that the versions of the mobile SDK and Unity you use are compatible.

Many people have reverted to Unity 4.6.3 and Mobile SDK version 0.5.0.1 or earlier and plan on using that environment until their current projects are completed. Unity 5.1 and Mobile SDK 0.6.0.1 seem to have most of the bugs fixed, and developers are slowly starting to port content over, but you have to decide what is right for you and ensure that the pairing you choose works.

4. Get your project settings right

Getting your project settings right early on will save you a big headache down the road. You don’t want to be at the end of a project tearing stuff out to get the required frames per second and realize that all you needed to do was change one or two things in the quality settings.

The Mobile SDK comes complete with a project settings folder that has almost all the recommended settings.

Just copy this from the Mobile SDK folder into your project folder and all that digging and manually changing every little setting is done for you.
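That copy step can be scripted so it's repeatable across projects. This is a minimal sketch; the source and destination paths are placeholders (the exact folder layout varies by SDK version), so point them at your actual locations.

```python
# Sketch: install the Mobile SDK's recommended ProjectSettings folder into
# a Unity project. Paths are placeholders -- adjust to your SDK layout.
import os
import shutil

def install_sdk_settings(sdk_settings_dir, project_dir):
    """Replace the project's ProjectSettings with the SDK-provided one."""
    dest = os.path.join(project_dir, "ProjectSettings")
    if os.path.isdir(dest):
        shutil.rmtree(dest)  # drop the old settings first
    shutil.copytree(sdk_settings_dir, dest)
    return dest
```

Keeping this in a script also documents exactly which settings your project started from, which helps later when you are troubleshooting.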

Of course this isn’t a fix-all. You should still become familiar with these settings and know enough about them that you can troubleshoot bottlenecks if you have to.

5. Set up your test environment

Setting up your test environment is extremely important to the success of your application. You will always need to know where you stand, what you can and can’t get away with adding, and when and where you need to cut back.

One very useful thing for testing is being able to run your application outside of the Gear VR headset. I have never actually used this feature, but I've read about it in the mobile docs, and on those long days of testing an app running at 45-50 frames per second, feeling the headaches and eyestrain coming on, I wished I had taken the time to set it up.

I recommend setting up a wireless debugging bridge. All you have to do is download ADB (it should come standard with Android Studio now) and install a wireless ADB app such as ADB Wireless on your phone.

Simply follow the instructions in the app, and once you're connected to your phone use adb logcat -s "UnityPlugin" to get a printout of the frames per second of any Unity application.

You live and die by this number. Keep in mind that a reading of 58 or 59 frames per second effectively means 60: the readout will never actually say 60, and the overhead of the wireless ADB connection accounts for the missing frame or two.

But anytime you see this number drop below 58 or 59 frames per second, it should be treated as an emergency.
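To make those drops harder to miss during a long session, the logcat stream can be watched by a small script. This is a sketch only: it assumes the plugin prints lines containing something like "FPS=59", and the exact format depends on your SDK version, so adjust the pattern to match your actual output.

```python
# Sketch: watch the ADB logcat stream and flag frame-rate drops.
# The "FPS=<n>" pattern is an assumption -- match it to whatever your
# SDK version actually prints under the UnityPlugin tag.
import re
import subprocess

FPS_PATTERN = re.compile(r"FPS[=:]\s*(\d+)")

def parse_fps(line):
    """Return the frames-per-second reading in a logcat line, or None."""
    match = FPS_PATTERN.search(line)
    return int(match.group(1)) if match else None

def monitor(threshold=58):
    """Stream adb logcat and warn whenever FPS falls below the threshold."""
    proc = subprocess.Popen(
        ["adb", "logcat", "-s", "UnityPlugin"],
        stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        fps = parse_fps(line)
        if fps is not None and fps < threshold:
            print(f"WARNING: dropped to {fps} fps")

# Call monitor() with a device connected to start watching.
```

Treating the 58 threshold as an alarm bell in code mirrors the advice above: anything below it is an emergency, not a note for later.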

6. Build well under the limits

Just because you know you can have up to 100 draw calls and 100k triangles doesn't mean you should use all of them.

Early on especially, you should aim well below these limits. There is currently a problem with the Note 4 running Android Lollipop that causes each draw call to take 20% more time than normal. This effectively reduces your draw-call budget to around 80, and it is definitely something to consider, since many of the devices currently being used for the Gear VR are Note 4s with Lollipop installed.

If you build for the worst-case scenario, it will run like a charm on every device that is more powerful. You should also build the basic scenes well below these guidelines and add anything not essential to the scene later on if you have some extra wiggle room.
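The ~80-call figure falls straight out of that 20% overhead; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the draw-call budget under the reported
# 20% per-call overhead on Note 4 / Lollipop devices.
DRAW_CALL_LIMIT = 100      # upper end of the Gear VR guideline
LOLLIPOP_OVERHEAD = 0.20   # each draw call takes ~20% longer

effective_limit = int(DRAW_CALL_LIMIT / (1 + LOLLIPOP_OVERHEAD))
print(effective_limit)  # 83, consistent with the ~80 calls cited above
```

Budgeting to the effective number rather than the advertised one is exactly the worst-case-first approach described above.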

7. Test early and often

If you have set up your test environment properly it shouldn’t be a problem to build your application every time you make a few changes and test it.

Once you’ve got the core mechanics set up and you’ve finished combing through all the scripts optimizing and deleting any empty update functions, you should have a pretty good idea of how much you can add to your app.

It's at this point that you should start the cycle of make changes, build, test, repeat.
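The pass over scripts for empty update functions mentioned above can be partly automated with a small tool script. This is a rough sketch: the regex only catches a literally empty Update() body, so treat any hits as candidates for manual review rather than files to delete blindly.

```python
# Sketch: find C# scripts containing an empty Update() method, which Unity
# still invokes every frame at a small but real cost. Only a literally
# empty body is matched; review hits by hand before removing anything.
import pathlib
import re

EMPTY_UPDATE = re.compile(r"void\s+Update\s*\(\s*\)\s*\{\s*\}")

def find_empty_updates(root):
    """Return paths of .cs files under root with an empty Update() body."""
    return [str(p) for p in pathlib.Path(root).rglob("*.cs")
            if EMPTY_UPDATE.search(p.read_text())]
```

Running something like this before each build/test cycle keeps dead per-frame callbacks from quietly creeping back in.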

You should also take advantage of the Unity profiler. Run the application in the Unity editor and keep an eye on the batching statistics. Staying on top of these numbers is the only way you will be able to create a compelling experience within these limitations.

8. Balancing act

Unless your experience is very simple, you will inevitably get to a point where adding one thing will make your app drop below 60 frames per second. This is when you need to consider the impact of every single object in your scene.

Is it necessary? How much does it add to the experience? Would the user's experience be drastically worse if it wasn't included?

If you really want to keep something, you will have to remove something else or find a way to make it more efficient, whether by changing the shader or by combining its mesh with other objects. This is where third-party tools can be very useful.

9. Know your tools

I am a big fan of Simplygon, a tool for optimizing meshes and textures. It does everything from reducing the number of triangles on a single object while preserving its appearance with a normal map, to combining textures into atlases and merging multiple objects into one mesh.

You can use 1 hour of processing time on their servers for free every month, but beyond that their pricing gets pretty steep.

Blender’s decimate tool is a quick, easy, and free way to reduce the triangles of an object without having any prior knowledge of 3D modeling.

There are also countless tools available on the Unity asset store that combine meshes or create vertex coloured meshes, but those shouldn’t be that hard to just write yourself.
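To illustrate how simple the core of a mesh combiner really is, here is a language-agnostic sketch in Python. A real Unity version would operate on Mesh.vertices and Mesh.triangles instead of plain lists, and the combined objects need to share a material for the reduced draw-call count to pay off.

```python
# Sketch: the core of a mesh combiner. Each mesh is (vertices, triangles),
# where triangles is a flat list of indices into vertices. Combining is
# just concatenation, with each mesh's indices offset by the number of
# vertices that came before it.
def combine_meshes(meshes):
    all_verts, all_tris = [], []
    for verts, tris in meshes:
        offset = len(all_verts)       # where this mesh's vertices start
        all_verts.extend(verts)
        all_tris.extend(i + offset for i in tris)
    return all_verts, all_tris
```

For example, combining two triangles of three vertices each yields one mesh of six vertices, with the second triangle's indices shifted up by three.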

10. Tricks of the trade

Take advice from developers that have gone through this process before you. I did hours of research and read countless developer blogs before jumping into my first Gear VR project.

The VR community is very tight knit and people will jump at the chance to share their knowledge with you. Read the forums, ask questions when necessary and you should have no problem finding someone willing to offer a helping hand as you start this journey.

In my next post I will be talking about specific problems we encountered while porting Yana Virtual Relaxation from PC to Gear VR, as well as my "solutions" (hacks, workarounds, and a few things I'm particularly proud of). Until then, good luck and remember BEST PRACTICES!!!

LANDON BUTTERWORTH Virtual Reality Developer

- by John Luxford