Esper experiments with real-time, room-scale 3D capture using the iPad's cameras and LIDAR sensor. It's a fun and useful way to capture a space.

In the GIF above, we scan a kitchen. Panning around helps the resolution of the LIDAR sensor immensely: you can see it forming and refining a 3D mesh of its surroundings. When you tap the capture button, it textures the 3D model using the camera data.
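The live meshing described above maps onto ARKit's scene reconstruction API on LIDAR-equipped devices. As a rough sketch of how an app like this can be set up (the class and delegate names are ARKit's; the texturing step at capture time is our own simplification, not Esper's actual implementation):

```swift
import ARKit

class ScanViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        let config = ARWorldTrackingConfiguration()
        // Scene reconstruction is only available on LIDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
        session.run(config)
    }

    // ARKit delivers the room as ARMeshAnchors and keeps refining them
    // as you pan around; that is why panning improves the scan so much.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            let geometry = meshAnchor.geometry
            // geometry.vertices and geometry.faces describe the raw mesh;
            // at capture time, camera frames can be projected onto it
            // to produce the textured model.
            _ = geometry
        }
    }
}
```

The key detail is that the mesh anchors arrive incrementally and are updated in place, which is what you see happening live in the GIF.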

Once captured, you can view the space in 3D or in AR. You can change the way it looks:

And thanks to it being a three dimensional capture, you can even walk around in it!

The smallest scale at which we found this to be feasible is about furniture-size. This chair captures pretty well:

If you’re interested in seeing how this works, download one of Ben’s chairs that he ‘captured’ in Esper. (insert ‘you wouldn’t download a car’ joke here)

You can view it on any device and even place it into the ‘real world’ using AR. Thanks to the LIDAR sensor, its scale is accurate.

That’s Esper! It’s our quick proof of concept to show that while this new LIDAR sensor can’t (currently) augment our ‘traditional’ photography, it opens the door to new applications that are powerful and creativity-enabling in their own right.

Let us know what you think of Esper on Twitter!

We think this is a super exciting thing about the way camera hardware is evolving. It’s not simply new sensors used to augment photography or video capture as we know it: it will enable entirely new ways to capture reality around us.

It’s up to us as developers to re-imagine the art of capturing the reality around us and give users the tools to explore what’s possible.

Your questions

We asked for some questions, and we got them. Let’s answer some:

It’s not better or worse, because it’s going for something different — room scale scanning. Just like having a gyroscope or GPS opens up new possibilities, it would be great for this sensor to find its way to an iPhone.


With LIDAR, this iPad is easily Apple’s best device for sensing three-dimensional space. When it comes to measurements or AR, there’s no contest. Outside of that, your several-year-old iPhone will still take better photos.

Ah, objects. Unfortunately, the mesh output by the system right now isn’t accurate enough to send to a 3D printer. If you look at Ben’s chair scan, you’ll see that the surface is rough, and it has trouble with details like chair legs. But it’s a great starting point for a 3D model, since all the proportions will be very accurate.

We’re super excited as photographers and developers to see what the next few years bring. Photography isn’t just traditionally taking photos anymore; it’s combining all the data and smarts in our devices to allow totally new interpretations of what ‘photography’ can mean. And if you’re not excited about that, we’re at a loss!

No optical image stabilization (OIS) was added, and while it’s not exactly missed at the ultra-wide end (the wider the lens, the less OIS is necessary), it’s a bummer that the camera did not improve at all on the wide end. Next year?

Ah, the hard hitting questions.

Unless the inside of your mouth is about the scale of a room (like a blue whale), you could conceivably do that a lot better with the infrared, front-facing TrueDepth camera on your iPhone X (or newer). Halide will let you capture the raw depth data. Let us know how it goes!
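If you want to experiment with that depth data yourself, the front TrueDepth camera’s depth stream is exposed through AVFoundation. A minimal sketch (the types are AVFoundation’s; error handling and the delegate implementation are omitted for brevity):

```swift
import AVFoundation

// Set up a capture session that streams depth data from the
// front-facing TrueDepth camera.
let session = AVCaptureSession()

if let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                        for: .video,
                                        position: .front),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)

    let depthOutput = AVCaptureDepthDataOutput()
    if session.canAddOutput(depthOutput) {
        session.addOutput(depthOutput)
        // Smooth the depth map over time to reduce frame-to-frame noise.
        depthOutput.isFilteringEnabled = true
    }

    session.startRunning()
    // A delegate conforming to AVCaptureDepthDataOutputDelegate then
    // receives AVDepthData buffers for every frame.
}
```

Unlike the room-scale LIDAR mesh, this gives you a per-frame depth map, which is better suited to close-range subjects like faces (or mouths).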

Wrapping up

Thanks for your questions, we hope you enjoyed our regular virtual ‘tear down’ of this camera module! We’re excited to see cameras get ever more interesting with additional sensors and modules.

Happy snapping!