Players dive off a research boat, identify and classify coral reefs using satellite and drone images, and bring marine life back to reefs. In doing so, they help scientists teach a machine to learn.

An interactive video game currently in development (update: the game is now available to download for free) will teach players to classify corals using satellite images of reef systems around the world. The players' results will be used to train an algorithm that will classify corals automatically and create a global data set of reefs.

“Aquatic ecosystems, particularly coral reefs, remain quantitatively misrepresented by low-resolution remote sensing,” Ved Chirayath, the director of the Laboratory for Advanced Sensing (LAS) and a research scientist at NASA Ames Research Center, told Eos. Chirayath, who leads the game development team, said that the lack of a global reef survey is largely because of how ocean waves distort and attenuate reef images taken by satellites.

Chirayath and his team are working to solve this problem with a video game for tablets called Neural Multi-Modal Observation and Training Network, or NeMO-Net. The game will allow players to identify and classify real coral reef systems around the world from satellite images. Those results will train a machine learning algorithm to classify corals automatically and create a global data set of classified coral reefs.

“This is a 3-D citizen science video game currently in development that aims to create a global data set to classify coral reefs,” said Jarrett van den Bergh, a research scientist and graphics engineer at LAS. Van den Bergh demonstrated the game design last week at AGU’s Fall Meeting 2018 in Washington, D.C.

Peering Beneath the Waves

“For how important these ecosystems are, we know very little about where they are, what their breakdown is, and how they’re changing with time,” van den Bergh said. Most reef systems that have been studied have been mapped in detail only with on-site field campaigns, he said, which is difficult to do for isolated reefs. Remote sensing of coral reefs by satellites or drones, which can target larger areas more quickly, is often hindered by how the ocean distorts the view of what’s below, he added. New technologies, however, are helping the team get around this problem.

In the video below, Chirayath describes a technique he pioneered called fluid lensing. Fluid lensing removes the distorting effects of flowing water by characterizing how the water refracts light and reversing the distortion. This correction allows clear imaging of what’s beneath the waves.
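To see why a tilted water surface distorts the view, consider Snell's law. The sketch below is a toy illustration of the principle behind fluid lensing, not NASA's actual algorithm: if the wave slope beneath a camera ray is known, the refracted ray can be traced and the apparent position of a seafloor point shifted back toward its true location. The function name and the geometric simplifications (near-nadir ray, locally planar surface) are assumptions for illustration.

```python
import math

N_AIR, N_WATER = 1.0, 1.33  # refractive indices of air and seawater

def apparent_shift(depth_m, surface_slope_rad, view_angle_rad=0.0):
    """Horizontal error (m) a tilted water surface introduces for a
    near-nadir camera ray; subtracting it undoes the distortion."""
    # Angle of incidence measured from the tilted surface normal
    theta_i = view_angle_rad + surface_slope_rad
    # Snell's law: n_air * sin(theta_i) = n_water * sin(theta_t)
    theta_t = math.asin(math.sin(theta_i) * N_AIR / N_WATER)
    # Where a flat-surface ray would land on the seafloor...
    flat = depth_m * math.tan(view_angle_rad)
    # ...versus where the refracted ray actually lands
    refracted = depth_m * math.tan(theta_t - surface_slope_rad)
    return refracted - flat

# A 10-degree wave slope over 2 m of water displaces the apparent
# position of the bottom by several centimeters:
shift = apparent_shift(depth_m=2.0, surface_slope_rad=math.radians(10))
```

With the surface flat (zero slope), the shift is zero; fluid lensing's task is, in essence, to estimate the instantaneous wave geometry well enough to apply this kind of correction across every pixel of a video sequence.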


To expand the mapping of reef systems, the team developed FluidCam, a high-performance camera that applies fluid lensing to clear up remote sensing data. By attaching FluidCams to small drones, the team can image thousands of corals down to millimeter scales and create 3-D images of them. This technology will let the team “determine coral reef ecosystem makeup globally at unprecedented spatial and temporal scales,” Chirayath said.

Currently, FluidCams are mounted on two drones that have been mapping shallow coral reef systems in the South Pacific for the past 2 years. The team combined these FluidCam data with measurements from NASA’s Coral Reef Airborne Laboratory (CORAL) mission and lower-resolution satellites to create a combination of 2-D and 3-D pictures of thousands of corals.

Coloring Corals

The research efforts have created a wealth of images that now need to be processed. But rather than have a few scientists poring over images one by one, Chirayath and colleagues had an idea: What if classification could be crowdsourced through an interactive game?

NeMO-Net aims to do exactly that. In the game, players start on a research boat floating above a coral reef. They complete tutorials and learn to map the boundaries and textures of a coral, categorize it, and navigate around the reef. After reaching a certain accuracy threshold, they can explore the reef, classify the corals they see, and evaluate other users’ coral classifications.

To classify a 2-D or 3-D image, players trace the outlines of seafloor components—rock, sand, mounding coral, or branching coral—with different colors using an in-game drawing feature controlled by touching with a finger or stylus. The video below shows how it’s done. Once a piece of coral is classified by a player, the information is used as training data for a machine learning algorithm that will classify corals automatically.
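One plausible way to turn painted outlines into training data is to map each color a player uses to a class index and record the painted pixels as a per-pixel label mask, the standard input for a segmentation network. This is a hypothetical sketch of that encoding; the class names come from the article, but the data layout and function names are assumptions, not NeMO-Net's actual format.

```python
import numpy as np

# One class index per color the player can paint with
CLASSES = {"rock": 0, "sand": 1, "mounding_coral": 2, "branching_coral": 3}
UNLABELED = 255  # pixels the player never touched

def traces_to_mask(shape, traces):
    """Build a per-pixel label mask from player traces.

    traces: list of (class_name, [(row, col), ...]) painted pixels.
    """
    mask = np.full(shape, UNLABELED, dtype=np.uint8)
    for class_name, pixels in traces:
        rows, cols = zip(*pixels)
        mask[list(rows), list(cols)] = CLASSES[class_name]
    return mask

# A 4x4 image tile where the player painted sand along the top row
# and branching coral along the bottom row:
mask = traces_to_mask(
    (4, 4),
    [("sand", [(0, c) for c in range(4)]),
     ("branching_coral", [(3, c) for c in range(4)])],
)
```

Pairing many such masks with their source images yields the labeled examples a supervised classifier needs.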

A key behind-the-scenes aspect of the game is the frequent assessment of a player’s accuracy in classifying corals, van den Bergh said. Sprinkled throughout the game are images that have already been classified by experts. Occasionally comparing a player’s classification to those of experts helps the researchers weight the results on the basis of a player’s accuracy. Moreover, a player needs to maintain a high level of accuracy to continue playing and progress through the game, van den Bergh explained.
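The weighting idea described above can be sketched in a few lines: a player's agreement with experts on the hidden control images becomes a weight, and the crowd's labels for a given coral are combined by weighted vote. The function names and the simple majority-vote scheme are illustrative assumptions, not NeMO-Net's actual method.

```python
from collections import defaultdict

def player_accuracy(player_labels, expert_labels):
    """Fraction of control-image classifications matching the experts."""
    matches = sum(p == e for p, e in zip(player_labels, expert_labels))
    return matches / len(expert_labels)

def weighted_vote(classifications):
    """Pick the winning label from (label, player_weight) pairs."""
    votes = defaultdict(float)
    for label, weight in classifications:
        votes[label] += weight
    return max(votes, key=votes.get)

# Player A matched the experts on all three control images; player B
# missed one, so B's vote counts for less:
w_a = player_accuracy(["sand", "rock", "sand"], ["sand", "rock", "sand"])
w_b = player_accuracy(["sand", "rock", "rock"], ["sand", "rock", "sand"])
label = weighted_vote([("coral", w_a), ("coral", 0.9), ("sand", w_b)])
```

A scheme like this lets many moderately accurate players produce reliable consensus labels while limiting the influence of careless ones.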

A player’s avatar is a reef animal. As players classify more corals and improve their accuracy, their avatar advances up the coral reef food chain, from a plankton to a clown fish and beyond. When a player levels up, old avatars are left behind to populate the reef with vibrant marine life.

Toward a Global Coral Data Set

NeMO-Net, which is still in development, currently includes data from reefs near Guam, American Samoa, and Western Australia. The team is planning a mapping campaign of Puerto Rico’s reefs in January 2019 and will add those data to NeMO-Net. They plan to release NeMO-Net as an iPad game to scientists in the near future and to the public in 2019.

Until then, you can preview one of the 3-D corals that will be included in the game in the interactive system below. Click the image to rotate, zoom in on details, and add coloration and layers to the coral to explore it in detail.

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer