I recently had the chance to test out Oculus Medium with the Touch controllers at OC2. I nearly missed out on signing up, but I got lucky: a slot opened up at the last minute and I got to spend 8 minutes in the virtual reality sculpting app.

A couple of caveats here: because I had such a short time in the experience (4 minutes of instruction and the remaining 4 of free sculpt time), I can’t go into too much depth, but I can at least give my initial impressions. Also, I am going to mostly be (unfairly) critiquing it from the perspective of someone used to professional tools such as ZBrush and Mudbox.

I want to start off with what the app gets right, and I think it gets a lot right. For one, the experience is buttery smooth. I’m sure the frame rate must be locked at 90fps because everything feels incredibly responsive and comfortable. I also appreciate the considerable technical challenges they must have faced getting what appear to be high-poly sculpting meshes to run this well. In this area it very much feels like a ZBrush viewport: there is no delay between your actions and the results.

I am also really glad they decided to set the experience in an environment other than an endless void (i.e. the “grey world” of every other 3D app). For VR, that would likely get very uncomfortable very fast. Can you imagine staring into a grey, endless horizon with nothing but an infinite wireframe grid for an extended period? Lydia Choy mentioned this in her talk about the development of Medium, and it sounds like Oculus spent a considerable amount of time trying to get this area right.

Tool selection also feels very polished. I almost wish I didn’t have to listen to the person instructing me at the beginning, because it was so intuitive I would likely have figured it out quickly on my own. I have never used Tilt Brush, and I know it uses similar interface techniques, but they work flawlessly here. You use the thumbstick on either controller to open a radial tool menu, then point with the opposite hand and squeeze the trigger to select a tool from the palette. Unfortunately, I wasn’t able to explore all the tools as fully as I would have liked, but the sculpting tools all felt really fast to access.
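For anyone curious how this kind of thumbstick-driven radial menu tends to work under the hood, here is a minimal sketch (my own guess at the general technique, not Medium’s actual code): the stick’s deflection angle maps to one of N slots, with a deadzone so a resting stick selects nothing.

```python
import math

def radial_slot(stick_x, stick_y, num_slots=8, deadzone=0.3):
    """Map a thumbstick deflection to a slot index on a radial menu.
    Returns None while the stick is inside the deadzone (no selection)."""
    if math.hypot(stick_x, stick_y) < deadzone:
        return None
    # Normalize the angle to [0, 2*pi), then offset by half a slot so
    # slot 0 is centered on the +x direction rather than starting there.
    angle = math.atan2(stick_y, stick_x) % (2 * math.pi)
    slot = int((angle + math.pi / num_slots) / (2 * math.pi / num_slots))
    return slot % num_slots
```

Pushing the stick right would land on slot 0, straight up on slot 2 (with 8 slots), and a barely-deflected stick selects nothing. The slot count, deadzone, and slot-0 orientation here are all illustrative choices.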

Another tool that felt great is the smoothing tool. The coolest part was the way you could hold it really close to the model to smooth small areas, or pull it way back and use the cone-shaped area of effect to broaden your smoothing size.
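I have no idea how Medium implements this internally, but the behavior suggests a brush radius that grows with the tool’s distance from the surface, like a widening cone. A rough sketch of that idea, paired with a simple falloff-weighted smoothing step; the `spread` parameter and the crude global-mean averaging are my own simplifications, not anything from the app:

```python
import numpy as np

def cone_brush_radius(tool_pos, surface_point, base_radius=0.02, spread=0.5):
    """Hypothetical cone-shaped area of effect: the farther the tool tip
    sits from the surface, the wider the smoothing radius becomes."""
    distance = np.linalg.norm(np.asarray(tool_pos) - np.asarray(surface_point))
    return base_radius + spread * distance

def smooth_vertices(vertices, center, radius):
    """Pull vertices toward the local average, weighted by a linear
    falloff from the brush center (weight 1 at center, 0 at the edge)."""
    verts = np.asarray(vertices, dtype=float)
    dists = np.linalg.norm(verts - center, axis=1)
    weights = np.clip(1.0 - dists / radius, 0.0, 1.0)
    mean = verts.mean(axis=0)  # crude stand-in for per-vertex neighbor averages
    return verts + weights[:, None] * (mean - verts)
```

Holding the tool close gives a small radius and touches only nearby vertices; pulling it back widens the radius and softens a broader region, which matches what I felt in the demo.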

Now I want to mention some areas where it would be great to see further development. Again, this is mostly from the perspective of comparing it to the capabilities of other sculpting software.

I went into the demo wanting to sculpt a human portrait, because I think that is generally a good test of a sculpting tool and requires several different types of tools and tool sizes. I knew I wouldn’t get to finish, but I wanted to see how far I could get. By the time I got my 4 minutes to create, I felt pretty comfortable with the interface and got started. I flicked the analog stick on my left controller and a radial selection of tools popped up on a holographic palette. I lifted my hand closer to my face for a better view, then pointed at the sculpt tool with my right hand and squeezed the trigger to activate it. It was instantly loaded into my right hand (I’m right-handed, something the guide asked before I started).

I brought the tool up to the sphere in front of me, which I had already sized to about twice the size of an actual human head. I used the right analog stick to shrink the sphere floating on the tip of my tool so I would draw a thinner stream of polys, and pressed the trigger to start building up the bridge of the nose. As I did this, a huge gob of polygons trailed my tool wherever I dragged it. It was a fascinating experience; I had never used a tool like this in a 3D-tracked space before. I quickly realized it was going to be somewhat difficult to control the precision of the polygons the way I expected. It wasn’t until later that I realized why.

In most sculpting apps, the default tool gradually builds up mass along the surface of the mesh you are working on. It doesn’t actually stack on new geometry; it modifies the existing mesh, pulling vertices in and out depending on your tool settings. But in Medium, the default sculpt tool creates tubes of polygons wherever you drag it. It wasn’t until after the demo that I was told the trigger is pressure sensitive, which might have helped with finer control but would still function fundamentally differently from most sculpting apps. I don’t think this is a bad decision on Oculus’ part, but it is something that takes adjusting to.
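To make the contrast concrete, here is roughly what a traditional build-up brush does; this is a sketch of the general technique, not Medium’s or ZBrush’s actual code. It displaces vertices that already exist along their normals, with a falloff toward the brush edge, rather than depositing new tubes of geometry:

```python
import numpy as np

def build_up(vertices, normals, brush_center, radius, strength):
    """Traditional 'draw'-style sculpt brush: push existing vertices
    outward along their normals, weighted by distance from the brush
    center. The vertex count never changes; no geometry is added."""
    verts = np.asarray(vertices, dtype=float)
    norms = np.asarray(normals, dtype=float)
    dists = np.linalg.norm(verts - brush_center, axis=1)
    # Linear falloff: full strength at the center, zero at the radius.
    weights = np.clip(1.0 - dists / radius, 0.0, 1.0)
    return verts + strength * weights[:, None] * norms
```

A negative `strength` would carve inward instead, which is how most apps map the same brush to an "invert" modifier. Medium's tube-depositing tool, by contrast, appends geometry along the tool's path, so the mesh grows rather than deforms.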

There is an “inflate” tool as well, and perhaps that is closer to the default tool in other apps, but I didn’t get to experiment with it as much. The polygon tube streams are actually quite cool, and I think this style of building is a much more impressive effect in VR because you can wave the tool all around you and create ribbons and streams of geometry enveloping you. That brings up another area that could be interesting to explore. Because VR is such a spatial medium, I wanted to be able to take a few steps around my creation and look at it from the other side. There are some very simple controls to grab the model and turn it around, but I would feel more present with my model if I could literally step around it. I think this is an area that will improve as tracking spaces expand even slightly; for now the grab-and-rotate works fine, but in the future I can see this helping to further distinguish this style of creation from working on a 2D monitor.

I should also mention the other tool I felt was missing: a move topology tool. Traditional sculpting apps have a tool that lets you move large selections of vertices with a simple falloff control, so you can quickly squash and stretch large or small features on the model. It is used when you want to keep certain details intact but still pull things out or push them a little further. Imagine you sculpt eyes on your character but want to pull them open a bit and lift an eyebrow on one side.
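The usual implementation of such a tool is straightforward: translate the grabbed vertices by the drag delta, blended with a smooth falloff so detail outside the brush radius stays put. A hypothetical sketch (the smoothstep-style falloff is a common choice in sculpting apps generally, not a control Medium exposes):

```python
import numpy as np

def move_with_falloff(vertices, grab_point, delta, radius):
    """Falloff-based 'move topology' tool: translate vertices near
    grab_point by delta, blended so vertices at the brush edge and
    beyond do not move at all, keeping surrounding detail intact."""
    verts = np.asarray(vertices, dtype=float)
    dists = np.linalg.norm(verts - grab_point, axis=1)
    t = np.clip(dists / radius, 0.0, 1.0)
    # Smoothstep-like blend: 1 at the grab point, easing to 0 at the edge.
    falloff = (1.0 - t) ** 2 * (1.0 + 2.0 * t)
    return verts + falloff[:, None] * np.asarray(delta, dtype=float)
```

In the eyelid example from above, you would place `grab_point` on the lid, drag `delta` upward, and pick a `radius` small enough that the brow moves while the rest of the face stays fixed.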

Right now, I know this experience is being designed for everyone, not just modelers and graphics professionals, but it is clear there will eventually be a VR sculpting app for pros, and I can see the full potential in this early build. The possibilities feel endless, and I was very encouraged by the precision of the Touch controllers for these kinds of experiences. I didn’t test the painting features as much, but the little bit I did use was also very impressive. I’m not sure what it would take for the technology to improve to the point where I could spend 4 hours at a time sculpting in VR, but I know I wanted way more than the 8 minutes I had.