SurfPerch can now be quickly trained to detect any new reef sound, making it more efficient to analyze new datasets without requiring expensive GPU hardware for training. The tool's performance was further enhanced by leveraging bird recordings, as common patterns were found between bird songs and fish sounds. Initial trials with SurfPerch have revealed differences between protected and unprotected reefs in the Philippines, tracked restoration outcomes in Indonesia, and shed light on relationships with the fish community on the Great Barrier Reef.
Key takeaways:
- Google, in collaboration with Google Research and DeepMind, has developed an AI tool called SurfPerch to help marine biologists understand coral reef ecosystems and their health.
- The tool was trained on thousands of hours of reef audio recordings and can track reef activity at night and in deep or murky waters. It can also be quickly trained to detect any new reef sound.
- SurfPerch's model performance was improved by leveraging bird recordings, as there were common patterns between bird songs and fish sounds that the model could learn from.
- The project continues with new audio being added to the Calling in Our Corals website, which will help to further train the AI model.