First Impression





I was very lucky today to experiment with an Oculus Rift Development Kit. By now I am probably several months late for an elaborate review of the packaging, installation and performance; you already know most of that from many other posts in the Oculus community (e.g. r/oculus ). However, the experience of playing with a Rift as a user was unprecedented, and I believe some thoughts are worth sharing.





Firstly, the Rift came in a beautiful case that protects the device very well, and everything was included: UK and US plugs, an HDMI cable, a mini-USB cable and a USB-to-DVI adapter. The professionalism the Oculus team shows with this "product" presentation of the SDK is amazing.





Unfortunately I wasn't able to try it on my Ubuntu laptop, as a DisplayPort-to-HDMI converter would have been necessary; so our first exploration was done on a Sony Vaio equipped with a mainstream NVIDIA GeForce 330, which did run the demos, but with the FPS (in the Rift) varying from 30 to 40. The installation was seamless: no quirks, no missing DLLs (except when I first built a demo that needed some DirectX components, which come with the redistributable), no surprises. We used the second pair of lenses that came with the Rift. The package includes three pairs, A, B and C, and the user is free to use whichever achieves the best focus, which matters especially for nearsighted people. The 1280x800 (two displays of 640x800 each; an HD prototype has already been presented), 32-bit colour LCD head-mounted display was then ready to have its gyroscope sensor calibrated.
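That 1280x800 panel is logically split into two side-by-side 640x800 views, one per eye. A minimal sketch of how a demo might compute the per-eye viewport rectangles (a hypothetical helper, not part of the Oculus SDK):

```python
def eye_viewports(width=1280, height=800):
    """Split a side-by-side stereo panel into (x, y, w, h) rectangles,
    one viewport per eye. Defaults match the DK1's 1280x800 display."""
    half = width // 2
    left = (0, 0, half, height)
    right = (half, 0, half, height)
    return left, right

left, right = eye_viewports()
print(left, right)  # (0, 0, 640, 800) (640, 0, 640, 800)
```

Each rectangle would then be handed to the renderer (e.g. a `glViewport` call per eye) before drawing that eye's view of the scene.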









Simulator sickness





Before trying the Rift, I was sceptical (as everyone is) about the simulator sickness that affects the user, which results from slight disorientation as a game progresses. The user builds up discomfort from in-game locomotion, rapid rotations, fast changes in elevation and so on: all forms of acceleration that the brain perceives but the body doesn't actually feel. Some things in this regard can be improved dramatically: 1) technologically (like latency and tracking precision), while 2) other things must be taken into consideration at the game-design/HCI level. The developer.oculusvr.com site has a very informative wiki page with guidelines (e.g., placement of the camera, displaying text, speed of moving elements, flashing, providing static references like a cockpit) that limit the sickness effect.
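One widely used design-level mitigation for rotation-induced discomfort (my own illustration, not taken verbatim from the Oculus guidelines) is snap turning: instead of smoothly rotating the camera, which creates vection the body doesn't feel, the view jumps in discrete yaw steps. A toy sketch with assumed parameter values:

```python
def snap_turn(yaw_deg, input_axis, step_deg=30.0, deadzone=0.5):
    """Turn the camera in discrete yaw steps rather than smoothly.

    yaw_deg    -- current camera yaw in degrees
    input_axis -- stick deflection in [-1.0, 1.0]
    step_deg   -- size of one snap (30 degrees is an assumed, common choice)
    deadzone   -- deflection below which no turn happens
    """
    if input_axis > deadzone:
        return yaw_deg + step_deg
    if input_axis < -deadzone:
        return yaw_deg - step_deg
    return yaw_deg

print(snap_turn(0.0, 0.9))   # 30.0
print(snap_turn(0.0, -0.9))  # -30.0
print(snap_turn(10.0, 0.1))  # 10.0 (inside deadzone, no turn)
```

The discrete jump trades smoothness for comfort: because no continuous rotation is shown, the brain never receives the conflicting "I am turning" signal.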





Screen-door effect

The most annoying thing in the whole evaluation is that, due to the low resolution of displays sitting so close to your eyes, you can clearly see the black lines between pixels. From what I read in the community press, this effect can be eliminated with both higher resolution and a higher pixel fill rate (to my understanding).
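To see why resolution matters so much here, it helps to express it as angular resolution: pixels per degree of field of view. A quick back-of-the-envelope calculation, using assumed, approximate DK1-like numbers (640 horizontal pixels per eye over a roughly 90-degree horizontal field of view):

```python
def pixels_per_degree(pixels, fov_deg):
    """Angular resolution: how many pixels cover one degree of the view."""
    return pixels / fov_deg

# Assumed DK1-like figures: 640 px per eye across ~90 degrees.
print(round(pixels_per_degree(640, 90), 1))  # 7.1
```

At only about 7 pixels per degree, the gaps between pixels span a visible fraction of a degree, which is exactly the screen-door pattern described above; higher-resolution panels raise this number and shrink the visible grid.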





Packaging





Lenses





USB box (transfers sensor data to pc)





Again a rear view





Side view (controlling the distance of the headset from eyes)





Final thoughts (as an end user and not API-wise yet)





Oculus Rift is definitely THE future of Virtual Reality (as a computer-science field in general). It can have many applications beyond home entertainment: boosting productivity and efficiency, enhancing (augmenting) reality, helping people with vision problems ( as in this recent project about diplopia ) and many more. The final product is expected to benefit from the market growth of smartphones and the resulting demand for better, more complex displays with vivid colours, high refresh rates and reduced prices.







