QUT’s remote sensing and unmanned aerial vehicle (UAV) experts are partnering with the Australian Institute of Marine Science (AIMS) to test whether small drones, machine learning and specialised hyperspectral cameras can monitor the Great Barrier Reef more quickly, more efficiently and in more detail than manned aircraft and satellite surveys.

QUT’s project leader Associate Professor Felipe Gonzalez said the team surveyed three reefs in the Great Barrier Reef Marine Park from 60 metres in the air while AIMS divers recorded precise levels of coral bleaching from under the water.

“By taking readings from the air and verifying them against the AIMS data from below the surface, we are teaching the system how to see and classify bleaching levels,” said Professor Gonzalez, an aeronautical engineer from QUT’s Institute for Future Environments and the Australian Centre for Robotic Vision.

The images captured give a spatial resolution of about 9 cm, which is sufficient to detect bleaching in individual corals. The drones can also cover more area in a day than diving surveys, and can be deployed quickly. Reefs have previously been studied using satellite- and aircraft-derived hyperspectral images, but drone-based surveys avoid limitations such as cloud cover and the high cost of deployment.
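The link between the 60-metre flight altitude mentioned above and the roughly 9 cm resolution follows from simple camera geometry. The sketch below is illustrative only: the article does not state the camera's focal length or pixel pitch, so the values here are hypothetical ones chosen to reproduce the quoted figures.

```python
def ground_sample_distance(altitude_m, pixel_pitch_um, focal_length_mm):
    """Ground footprint of one image pixel, in metres.

    GSD = altitude * pixel_pitch / focal_length (all in metres).
    """
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Hypothetical sensor: 7.5 micron pixels behind a 5 mm lens, flown at 60 m
gsd = ground_sample_distance(60, 7.5, 5.0)
print(f"{gsd * 100:.0f} cm per pixel")  # 9 cm per pixel
```

Any sensor whose pixel-pitch-to-focal-length ratio matches would give the same result; the point is only that centimetre-scale detail at 60 m is geometrically plausible.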

Key to the new aerial system are miniaturised hyperspectral cameras, which until recently were so large and expensive that only satellites and manned aircraft could carry them. Unlike traditional cameras, these drone-mounted hyperspectral cameras capture 270 bands in the visible and near-infrared portions of the spectrum, providing far more detail than the human eye can see, and at ultra-high resolution.
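A hyperspectral frame is naturally represented as a three-dimensional data cube: two spatial axes and one spectral axis with the 270 bands mentioned above. The sketch below (with made-up image dimensions and random data standing in for real reflectance values) shows how each pixel's spectral "signature" is simply its 270-element slice of that cube.

```python
import numpy as np

# Hypothetical hyperspectral cube: rows x columns x 270 spectral bands.
# Real data would hold calibrated reflectance; random values stand in here.
ROWS, COLS, BANDS = 100, 120, 270
cube = np.random.rand(ROWS, COLS, BANDS)

# The spectral signature of one ground pixel is its vector of band readings
signature = cube[50, 60, :]
print(signature.shape)  # (270,)
```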

“The more data scientists have at their fingertips during a bleaching event, the better they can address it. We see small drones with hyperspectral cameras acting as a rapid response tool for threatened reefs during and after coral bleaching events.”

Roughly the size of Japan, the Great Barrier Reef comprises around 3,000 reefs stretching 2,300 kilometres, making it slow and costly to survey using traditional methods.

Once the images are captured, they must be processed. To expedite interpretation of the images, Gonzalez and his team are building an artificial intelligence system that identifies and categorises the distinct spectral patterns of objects within the footage.

“Every object gives off a unique hyperspectral signature, like a fingerprint. The signature for sand is different to the signature for coral and, likewise, brain coral is different to soft coral.”

“More importantly, an individual coral colony will give off different hyperspectral signatures as its bleaching level changes, so we can potentially track those changes in individual corals over time.”

“The more fingerprints in our database, the more accurate and effective the system.”
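The fingerprint-matching idea described in the quotes above can be sketched with a common hyperspectral technique, the spectral angle mapper, which assigns a pixel the label of the reference signature whose spectrum points in the most similar direction. The article does not say which classifier the QUT system actually uses, and the four-band signatures and labels below are toy stand-ins (real signatures would have 270 bands, verified against AIMS dive data).

```python
import numpy as np

def spectral_angle(a, b):
    """Angle between two spectra; smaller means more similar shape."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify(pixel, library):
    """Label a pixel with the closest reference signature in the library."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Toy 4-band reference "fingerprints" (hypothetical values)
library = {
    "sand":           np.array([0.90, 0.80, 0.70, 0.60]),
    "healthy_coral":  np.array([0.20, 0.40, 0.30, 0.50]),
    "bleached_coral": np.array([0.70, 0.70, 0.70, 0.70]),
}

pixel = np.array([0.85, 0.78, 0.72, 0.61])
print(classify(pixel, library))  # sand
```

As the quote suggests, a larger library of verified signatures directly improves such a matcher, and tracking how an individual colony's signature drifts toward the "bleached" reference over repeated surveys is what would let changes in bleaching level be followed over time.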

Gonzalez and his team from QUT recently undertook a similar drone-based observation project on Ningaloo Reef in Western Australia.

QUT’s drone and remote-sensing innovations also include research advances in marine robots (COTSbot), agricultural robots (AgBot II), and in using UAVs to detect and monitor both pests and gas leaks.