Google's new SurfPerch AI can monitor coral reef health, trained on over 400 hours of audio
It's not the only AI application in this domain
Key notes
- Google and DeepMind launched SurfPerch, an AI tool for monitoring coral reef health.
- SurfPerch analyzes underwater audio and is used in the Philippines and Indonesia.
- Trained with 400+ hours of audio, it detects new reef sounds for conservation.
Google and DeepMind researchers have announced SurfPerch, a new AI tool to help marine scientists monitor coral reef health more efficiently.
It’s another application of AI in a research field well beyond GenAI, and it arrives not long after Google’s Med-Gemini and Microsoft’s rigorous “TRAIN” standards for AI in medicine.
Google says that SurfPerch (yes, just like the fish) can process thousands of hours of underwater audio to understand reef ecosystems, and it’s already in use in countries including the Philippines and Indonesia. Volunteers analyzed over 400 hours of reef audio to train the tool, letting it detect new reef sounds even from minimal examples.
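For context, “detecting new sounds from minimal examples” typically means few-shot classification: a pretrained audio model turns clips into embedding vectors, and a small classifier is fit on a handful of labeled examples. The sketch below illustrates that general pattern only; the `embed_clip()` helper and all numbers are hypothetical assumptions, not SurfPerch’s actual interface.

```python
# A minimal sketch of few-shot sound detection on top of a pretrained audio
# encoder. This is NOT SurfPerch's real API: embed_clip() is a hypothetical
# stand-in for feeding a reef recording through a pretrained model to get a
# fixed-size embedding vector.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def embed_clip(contains_fish_sound: bool) -> np.ndarray:
    # Placeholder: simulate a 128-dim embedding. In a real pipeline the
    # embedding comes from the audio model, not from the label.
    center = 1.0 if contains_fish_sound else -1.0
    return rng.normal(loc=center, scale=1.0, size=128)

# A handful of volunteer-labeled clips ("click when you hear a fish sound").
X_train = np.stack([embed_clip(i % 2 == 0) for i in range(10)])
y_train = np.array([i % 2 == 0 for i in range(10)], dtype=int)

# With good embeddings, a small linear classifier trained on only a few
# labeled examples is often enough to pick up a new sound category.
clf = LogisticRegression().fit(X_train, y_train)

# Score a new, unlabeled clip from a reef recording.
new_clip = embed_clip(True)  # simulated here; in practice the label is unknown
prob = clf.predict_proba(new_clip.reshape(1, -1))[0, 1]
print(f"Estimated probability of a fish sound: {prob:.2f}")
```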
“The members of this open listening collective had to click when they heard a fish sound. This brought thousands of eyes and ears to data that would take bioacousticians months to analyze,” Google says of the training process, and you can still contribute by listening to new audio on the platform.
Coral reefs cover just 0.1% of the ocean’s surface yet host 25% of marine species, and they are threatened by overfishing, disease, and climate change. AI also recently helped scientists uncover some 7,000 humpback whale deaths, according to a study published in Royal Society Open Science.