Artificial Intelligence Can Track Health of Coral Reefs by Understanding Complex Soundscape
In a new study by scientists at the University of Exeter, artificial intelligence (AI) can track the health of coral reefs by learning the “song of the reef”. The team trained a computer algorithm on recordings of healthy and degraded reefs, allowing the machine to learn the difference.
Coral reefs hide within them many secrets that are difficult for scientists to record and study. Because coral reefs have a complex soundscape, experts must carry out rigorous analyses to determine reef health from sound recordings. Now, in a new study, artificial intelligence (AI) can track the health of coral reefs by learning the “song of the reef”. Scientists at the University of Exeter trained a computer algorithm on recordings of healthy and degraded reefs, allowing the machine to learn the difference. The computer then analysed a slew of additional recordings and correctly identified reef health 92 percent of the time. The team used this approach to track the progress of reef restoration projects.
The creatures living on a reef produce a vast range of sounds. The meaning of many of these calls is unknown, but the new AI system can distinguish the overall sound of healthy reefs from that of unhealthy ones. The recordings used in the study were made at the Mars Coral Reef Restoration Project in Indonesia, which is working to restore severely damaged reefs.
The findings were published in the journal Ecological Indicators.
Ben Williams, the lead author of the study, said that coral reefs were under threat from a variety of factors, including climate change, so it was critical to keep track of their health and of the success of conservation efforts.
One major challenge the researchers faced was that visual and acoustic reef surveys are typically labour-intensive. Visual surveys are limited because many reef creatures conceal themselves or are active at night, while the complexity of reef noises makes it difficult to determine reef health from individual recordings.
To address this difficulty, the researchers used machine learning to see whether a computer could pick up the song of the reef.
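The idea can be illustrated in miniature. The sketch below is a toy illustration only, not the authors' method: it generates synthetic "recordings" (a richer, louder mix of tones standing in for a healthy reef, low-level noise standing in for a degraded one, both assumptions for demonstration), extracts two simple hand-picked acoustic features, and classifies new clips with a nearest-centroid rule, whereas the study trained a far more sophisticated model on real reef audio.

```python
import math
import random

def features(signal):
    """Two toy acoustic features: mean energy and zero-crossing rate."""
    energy = sum(x * x for x in signal) / len(signal)
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    return (energy, zcr)

def make_signal(rng, degraded):
    """Synthesize a 1000-sample clip. 'Healthy' = mix of tones plus noise;
    'degraded' = noise only. These are stand-in assumptions, not reef data."""
    n = 1000
    if degraded:
        return [rng.gauss(0, 0.2) for _ in range(n)]
    return [sum(math.sin(2 * math.pi * f * t / n) for f in (50, 120, 300))
            + rng.gauss(0, 0.2) for t in range(n)]

def centroid(points):
    """Mean position of a set of feature vectors."""
    return tuple(sum(p[i] for p in points) / len(points) for i in range(2))

def classify(feat, healthy_c, degraded_c):
    """Label a clip by whichever class centroid its features are closer to."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(feat, c))
    return "healthy" if dist(healthy_c) < dist(degraded_c) else "degraded"

rng = random.Random(0)
# "Train": compute class centroids from labelled synthetic clips
healthy_c = centroid([features(make_signal(rng, False)) for _ in range(20)])
degraded_c = centroid([features(make_signal(rng, True)) for _ in range(20)])

# Evaluate on fresh synthetic clips
tests = [(make_signal(rng, False), "healthy") for _ in range(10)] + \
        [(make_signal(rng, True), "degraded") for _ in range(10)]
correct = sum(classify(features(s), healthy_c, degraded_c) == label
              for s, label in tests)
print(f"accuracy: {correct}/{len(tests)}")
```

On this artificial data the two classes separate cleanly in feature space; real reef soundscapes are far noisier, which is why the study needed a trained model rather than hand-set rules.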
“Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing,” said Williams.
Dr Tim Lamont of Lancaster University, co-author of the study, believes the AI method will greatly improve coral reef monitoring. He said that sound recorders and artificial intelligence could be used all over the world to track the health of reefs and to see whether efforts to protect and restore them are succeeding.
In many circumstances, it is quicker and less expensive to place an underwater hydrophone on a reef and leave it there rather than have specialist divers inspect the reef on a regular basis, especially in isolated regions, said Dr Lamont.