Beam Goes Back To School

In 2017, Beam Imagination was introduced to Georgia State University’s Student Innovation Fellowship (SIF) program as the program embarked on an ambitious project with Emory University to map Oakland Cemetery. The SIF team, led by Brennan Collins and Spencer Roberts, set out to document both the physical state of this landmark and the remarkable stories contained within each plot.

The SIF program draws upon students from various disciplines and majors to create teams that focus on a single applied learning experience around one central project. This crew of archaeology, history, film, and computer science students was tasked with telling the story of historic Oakland Cemetery, one of Atlanta’s oldest and largest greenspaces. Now at 48 acres, the cemetery grew from an original six-acre design intended as a “garden-style” cemetery, where visitors treated the grounds as a park. SIF program lead Brennan Collins championed the idea of using the park as a sandbox for student research and innovation.

Collins and Roberts, who also worked on Unpacking Manuel’s and 3D Atlanta, connected students with the Beam team to help test and develop a workflow for photogrammetry of individual headstones. After a few meetings with the larger SIF team as well as staff from both schools, we decided to support the mapping of Oakland in a meaningful way.

Imagination at Play

With this project, we saw an opportunity to utilize our stable of tech toys to help digitally preserve Oakland’s history from the ground and air, through photographs and point clouds.

Using a DJI Inspire 2 drone and DroneDeploy software, Beam’s Logan Riely captured over 12,000 images to create an orthographic scan of the 48-acre cemetery and surrounding 12 acres.
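A figure in the thousands makes sense once you account for overlap: mapping flights shoot so that every point on the ground appears in many frames, which multiplies the image count well beyond what simple coverage would require. A hypothetical back-of-the-envelope estimate follows; the per-frame footprint and overlap percentages are illustrative assumptions, not the actual flight parameters, and real missions often add oblique passes that push counts higher still.

```python
# Rough estimate of image count for an aerial mapping mission.
# Footprint and overlap values below are illustrative assumptions.

def images_needed(area_m2: float, footprint_m2: float,
                  front_overlap: float, side_overlap: float) -> float:
    """Images required to cover `area_m2` when each frame covers
    `footprint_m2` and neighboring frames overlap by the given fractions."""
    # Only the non-overlapping sliver of each frame adds new ground coverage.
    effective_new_area = footprint_m2 * (1 - front_overlap) * (1 - side_overlap)
    return area_m2 / effective_new_area

acres = 60                      # 48-acre cemetery plus ~12 surrounding acres
area_m2 = acres * 4046.86       # acres to square meters

# Assume each frame covers ~40 m x 30 m of ground at mapping altitude,
# with 80% front overlap and 70% side overlap (common mapping defaults).
estimate = round(images_needed(area_m2, 40 * 30, 0.8, 0.7))
print(estimate)
```

Even at these conservative assumptions the count lands in the thousands of frames for a single grid pass, which is why multi-pass surveys like this one accumulate five-figure image sets.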

On the ground, we rolled out our Motion Impossible M-Series rover equipped with a Phoenix Ranger lidar system to map the smaller Jewish Flats area of the cemetery. Lidar is short for light detection and ranging: it measures the distance to a target by hitting the target with pulsed laser light and timing the reflected pulses with a sensor. In layman’s terms, it has a very good idea of where everything around it is located, which is why the technology shows up on driverless cars and in robotics. This particular unit is accurate to roughly 0.1 cm at 100 meters of distance. Anyone who is worried about driverless cars hasn’t realized that this sees a lot better than humans do.
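The pulse-timing idea behind lidar reduces to one line of arithmetic: the laser pulse travels out and back at the speed of light, so the distance is half the round trip. A minimal sketch of that principle (the 667-nanosecond example value is an illustration, not a reading from the Phoenix unit):

```python
# Sketch of the time-of-flight principle behind lidar ranging:
# distance = (speed of light * round-trip time) / 2

C = 299_792_458.0  # speed of light in meters per second

def distance_from_pulse(round_trip_seconds: float) -> float:
    """Distance to a target given a laser pulse's round-trip travel time.
    Halved because the pulse covers the distance twice: out and back."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~667 nanoseconds puts the target
# roughly 100 meters away.
print(distance_from_pulse(667e-9))
```

The timing resolution this demands is what makes lidar sensors remarkable: resolving 0.1 cm of distance means resolving a few picoseconds of pulse travel time.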

These scans, in conjunction with the SIF team’s photogrammetry scans of several headstones, were used as a small scale test before implementation across the entire 48-acre green space.

The Photogrammetry Process

The Beam team tested an array of cameras on a headless statue within this section. The first photogrammetry test used a Red Epic-W, Sony A7R II, Nikon D810, and Canon 6D with various lenses to create seven different camera/lens combinations. Approximately 100-150 photos were taken of the test statue. Ultimately, a wide selection of Canon lenses and bodies was used to capture most of the photographs for photogrammetry.
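That 100-150 photo count is in line with a common way to plan orbit-style capture: circle the subject in several elevation rings, shooting at a fixed angular step so adjacent frames overlap heavily. A hypothetical sketch of the arithmetic (the ring count and step size are illustrative assumptions, not the team's actual settings):

```python
# Illustrative photo-count estimate for orbit-style photogrammetry capture.
# Ring count and angular step are assumptions, not the team's settings.

def photos_for_orbits(rings: int, degrees_per_shot: float) -> int:
    """Photos taken when circling the subject `rings` times at different
    elevations, firing one shot every `degrees_per_shot` degrees of arc."""
    shots_per_ring = round(360 / degrees_per_shot)
    return rings * shots_per_ring

# e.g. four elevation rings with a shot every 10 degrees -> 144 photos,
# the same ballpark as the ~100-150 frames shot of the test statue.
print(photos_for_orbits(4, 10))
```

The small angular step matters because photogrammetry software reconstructs geometry from features matched across overlapping frames; too large a step between shots and the matcher loses track of the surface.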

Later, using Agisoft PhotoScan, we created 3D models (a process known as photogrammetry) of these fading pieces of our city’s history. The team of students, led on the post-production side by Nicholas Chalkley, also used a gray ball and a chrome ball to de-light the objects for use in Unreal Engine 4. Fellow SIF student and videographer Blake Lowe documented the process for their blog.

In September of 2018, using a kit of 20 Canon SL2 cameras, the Beam team will train and deploy students from SIF to capture high-detail scans of each stone and marker inside the Jewish Flats.