We tend to associate healthy coral reefs with their visual splendor: the vibrant array of colors and shapes that populate these beautiful underwater ecosystems.

But they can also be rather noisy places. If you've ever been snorkeling in a coral reef environment, you'll know the distinctive clicking and popping sounds made underwater by marine creatures such as snapping shrimp and feeding fish.

That buzzy din of background noise – almost like the chattering hiss of radio static – is such a distinctive feature of the coral reef soundscape that it could help us monitor the health of these endangered marine habitats.

In a new study, scientists used machine learning to train an algorithm to recognize the subtle acoustic differences between a healthy, vibrant reef and a degraded coral site – an acoustic contrast so faint it may be impossible for people to discern.

Compared to other labor-intensive and time-consuming processes for monitoring reef health – having divers visit reefs to visually assess coral cover, or manually listening through reef recordings – the new tool could deliver significant advantages, the team suggests. What's more, many reef creatures conceal themselves or emerge only at night, further complicating visual surveys.

"Our findings show that a computer can pick up patterns that are undetectable to the human ear," says marine biologist Ben Williams from the University of Exeter in the UK.

"It can tell us faster, and more accurately, how the reef is doing."

To capture the coral acoustics, Williams and fellow researchers made recordings at seven different sites in the Spermonde Archipelago, located off the southwest coast of Sulawesi in Indonesia and home to the Mars Coral Reef Restoration project.

The recordings encompassed four distinct types of reef habitat – healthy, degraded, mature restored, and newly restored – each of which exhibited a different amount of coral cover and, in turn, generated a different character of noise from the aquatic creatures living and feeding in the area.

"Previously we relied on manual listening and annotation of these recordings to make reliable comparisons," Williams explains in a Twitter thread.

"However, this is a very slow process and the size of marine soundscape databases is skyrocketing given the advent of low-cost recorders."

To automate the process, the team trained a machine learning algorithm to discriminate between the different kinds of coral recordings. Subsequent tests showed the AI tool could identify reef health from audio recordings with 92 percent accuracy.
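To give a rough sense of how a classifier like this can be put together, here is a minimal, illustrative sketch of a generic soundscape-classification pipeline in Python. It is not the study's actual method: the file names, labels, acoustic features (MFCCs and simple spectral statistics via librosa), and the random-forest model are all assumptions made for the example.

```python
# Illustrative sketch only -- a generic soundscape-classification pipeline,
# not the study's actual method. File names, labels, features, and the
# choice of model are hypothetical assumptions.
import numpy as np
import librosa  # audio loading and feature extraction
from sklearn.ensemble import RandomForestClassifier

# Hypothetical labelled reef recordings (one short WAV clip per row).
recordings = [
    ("clip_healthy_01.wav", "healthy"),
    ("clip_healthy_02.wav", "healthy"),
    ("clip_degraded_01.wav", "degraded"),
    ("clip_degraded_02.wav", "degraded"),
    ("clip_restored_mature_01.wav", "mature restored"),
    ("clip_restored_new_01.wav", "newly restored"),
]

def extract_features(path, sr=16000):
    """Summarise a clip as mean MFCCs plus basic spectral/energy statistics."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)        # timbral shape
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # "brightness"
    rms = librosa.feature.rms(y=y)                            # loudness
    return np.concatenate([mfcc.mean(axis=1),
                           centroid.mean(axis=1),
                           rms.mean(axis=1)])

X = np.array([extract_features(path) for path, _ in recordings])
labels = np.array([label for _, label in recordings])

# An off-the-shelf classifier stands in for whatever model the team used.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, labels)

# Classify a new, unlabelled survey recording (hypothetical file).
print(model.predict([extract_features("new_survey_clip.wav")]))
```

The published tool would draw on far larger recording libraries and more carefully chosen acoustic features, but the overall shape the article describes is the same: summarize each clip with acoustic features, then train a model on labelled examples to sort new recordings into habitat categories.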

"This is a really exciting development," says co-author and marine biologist Timothy Lamont from Lancaster University in the UK.

"In many cases it's easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations."

According to the researchers, the algorithm's results depend on a combination of underwater soundscape factors: the abundance and diversity of fish vocalizations, sounds made by invertebrates, possibly even faint noises thought to be made by algae, and contributions from abiotic sources, such as subtle differences in how waves and wind sound across different kinds of coral habitat.

While the human ear might struggle to pick out such faint and hidden sounds, machines appear to detect the differences reliably. The researchers acknowledge the method can still be refined, with greater sound sampling in the future expected to deliver "a more nuanced approach to classifying ecostate".

Sadly, time is a commodity the world's corals are quickly running out of. We'll have to act fast if we want to save them.

The findings are reported in Ecological Indicators.