A 360-degree levitating light field display with improved light field reconstruction efficiency, full-parallax adaptive rendering, and user-friendly interaction: We present a prototype that uses a flat-plate deflected diffuser screen and a high-speed projector to create a 360-degree floating scene in the air. A panoramic annular lens omnidirectionally tracks users’ faces to provide vertical motion parallax in real-time rendering, and a Leap Motion controller tracks all ten fingers simultaneously to enhance the natural interaction experience.

Wide Field of View Augmented Reality Eyeglasses using Defocused Point Light Sources

We present a novel design for an optical see-through augmented reality display that offers a wide field of view and supports a compact form factor approaching ordinary eyeglasses. Instead of conventional optics, our design uses only two simple hardware components: an LCD panel and an array of point light sources (implemented as an edge-lit, etched acrylic sheet) placed directly in front of the eye, out of focus.

Panoptic Camera - A 360-degree field-of-view (FOV) multi-camera platform. The Panoptic camera is an omnidirectional imaging system capable of reconstructing the full-FOV panorama in real time, displaying it on a client PC or tablet, streaming it online, or showing it on a virtual reality head-mounted display. We will present real-time operation of a miniature prototype consisting of 15 image sensors connected to an Oculus Rift. In addition, we will demonstrate telepresence by connecting to our lab in Lausanne, Switzerland, where the output of a larger, higher-resolution prototype is streamed over a web server.

Augmented Reality for Museums on Epson Moverio

As part of a larger project investigating uses of augmented reality in museum environments, we will demonstrate our prototype on AR glasses (Epson Moverio BT-200). The system recognizes an artwork from a database stored on the device and augments the view in real time with a customized label providing more information about the object.
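
The recognition step described above, matching a captured view against an on-device database, can be sketched as nearest-neighbor matching of binary feature descriptors with a ratio test. This is a minimal illustration, not the prototype's actual pipeline; the descriptor values and artwork names are hypothetical stand-ins:

```python
import numpy as np

def match_artwork(query_desc, database):
    """Pick the database artwork whose binary descriptors best match the
    query, counting matches that pass Lowe's ratio test on Hamming distance."""
    best_name, best_score = None, -1
    for name, desc in database.items():
        # Hamming distance between every (query, database) descriptor pair
        dist = (query_desc[:, None, :] != desc[None, :, :]).sum(axis=2)
        two_nearest = np.sort(dist, axis=1)[:, :2]
        good = int(np.sum(two_nearest[:, 0] < 0.75 * two_nearest[:, 1]))
        if good > best_score:
            best_name, best_score = name, good
    return best_name

# Hypothetical 32-bit binary descriptors (stand-ins for e.g. ORB features)
bits = lambda vals: (np.asarray(vals)[:, None] >> np.arange(32)[None, :]) & 1
database = {"water_lilies": bits(range(10)), "starry_night": bits(range(100, 110))}
```

A real system would extract descriptors from the camera frame and the stored artwork images; the ratio test rejects ambiguous matches so the label is only shown for a confident identification.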

Near-Eye Light Field Displays enable thin, lightweight head-mounted displays (HMDs) capable of presenting nearly correct convergence, accommodation, binocular disparity, and retinal defocus depth cues.

Oculus Crescent Bay Prototype: Crescent Bay is the latest prototype headset on the path to the consumer version of the Rift. Crescent Bay features new display technology, 360° head tracking, an expanded positional tracking volume, dramatically reduced weight, improved ergonomics, and high-quality integrated audio. The upcoming Oculus Audio SDK uses Head-Related Transfer Function (HRTF) technology in conjunction with the Rift’s head tracking to achieve a sense of true 3D audio spatialization. Along with the new hardware, we’ve created original demo content, which we’re calling the “Crescent Bay Experiences,” developed in-house by our content team specifically for Oculus Connect. The demo is designed to demonstrate the power of presence and give you a glimpse of the level of VR experience you can expect to see come to life in gaming, film, and beyond.
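
The HRTF-based spatialization mentioned above amounts to convolving a mono source with a per-ear head-related impulse response (HRIR) chosen from the tracked head pose. A minimal sketch with toy HRIRs (the values below are illustrative stand-ins for measured responses, not Oculus Audio SDK code):

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal to binaural stereo by convolving it with
    per-ear head-related impulse responses (HRIRs)."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    n = max(left.size, right.size)          # pad to a common length
    left = np.pad(left, (0, n - left.size))
    right = np.pad(right, (0, n - right.size))
    return np.stack([left, right], axis=1)

# Toy HRIRs for a source on the listener's left (hypothetical values):
# the right ear hears the sound later and quieter.
hrir_left = np.array([1.0])
hrir_right = np.array([0.0, 0.0, 0.0, 0.6])  # ~3-sample interaural delay

impulse = np.zeros(16)
impulse[0] = 1.0
stereo = spatialize(impulse, hrir_left, hrir_right)
```

A real SDK interpolates measured HRIRs as the head rotates, so the interaural time and level differences update continuously with tracking.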

OTOY will demonstrate a groundbreaking immersive light field experience in virtual reality. The demo brings the Batcave from the acclaimed Emmy Award–winning Batman: The Animated Series to life through interactive holographic video. The interactive narrative gives viewers the opportunity to explore Batman’s world, letting them feel what it is like to be inside the show’s stylized universe on devices such as the Oculus Rift.

Lytro Development Kit (LDK): The LDK gives imaging researchers the highest degree of control over Lytro’s advanced light field capture devices and processing software engine, and paves the way for deeper partnerships with technical R&D teams and enterprises exploring new scientific territory.

Intel RealSense Technology

Intel® RealSense™ technology lets you interact with your devices more like you interact with people: with natural movements. The technology is powered by the real-time depth-sensing Intel RealSense 3D camera, available today in select all-in-one (AIO) and notebook devices. Pairing 3D input with a floating 3D display, Intel is designing prototype systems that enable interactive mid-air interfaces.

TruLife Optics designs and manufactures transparent, full-color holographic waveguide optics for AR, IR-based eye tracking, and floating waveguide holograms for computers and tablets. We will demonstrate our optics as well as a floating keypad hologram paired with a Leap Motion controller.

Pelican will demonstrate depth-enhanced video and still-image capture with a super-thin array camera built into a Qualcomm reference-design tablet. Along with an accurate depth map generated in real time, we’ll show photos with motion parallax and a range of post-capture capabilities such as refocus, matting, background substitution, and distance measurement.

While big-format VR and small-screen “glanceables” have successfully launched the wearables industry, Innovega’s unique eyeborne optics platform “cracks the code,” delivering any digital media (panoramic, HD, 3D, transparent) from lightweight fashion glasses.

Transmissive Head Mounted Display for Virtual, Augmented and Collaborative Reality Applications with Optical Tracking for 3D Positioning

Occipital develops state-of-the-art computer vision hardware and software. Their most recent product, the Structure Sensor, is the first 3D sensor for mobile devices.

CastAR is a magical experience that allows groups of people to see and interact with 3D objects that spring from a coffee table, workstation, wall, or other surface.

Realtime TheatriX creates immersive 360-degree, horizon-to-sky, 3D, interactive social entertainment “blended reality” experiences for 15 to 20 simultaneous participants using Technical Illusions’ CastAR technology.

Augmented reality is set to evolve. Coming soon to Kickstarter, Seebright has developed wide-field-of-view display technology for mobile virtual and augmented reality. Seebright is now welcoming pioneering developers to build innovative AR and VR experiences on one platform, ranging from information displays to interactive games and academic and professional applications.

Seebright will be demonstrating their latest prototype and mixed reality experiences.

SMI Eye Tracking Glasses: A glasses-based 60 Hz system used for real-world interactions, sports, kinesiology, hand-eye coordination, driving, biomechanics, rehabilitation, etc. Small, light, and designed for maximal peripheral and binocular view, with wireless recording and remote annotation in real time. Highly robust technology, with more than 100,000 participants recorded.

SMI Eye Tracking for Oculus DK2: Binocular, 60 Hz eye tracking integrated into the Oculus DK2 HMD. Includes an SDK for real-time streaming of eye tracking data with support for VR engine integration. Based on the popular ETG platform and used for fully immersive visual perception analysis in VR environments.