Deep Learning Locally Trained Wildlife Sensing in Real Acoustic Wetland Environment

We describe ‘Tidzam’, an application of deep learning that leverages a dense, multimodal sensor network installed at a large wetland restoration at Tidmarsh, a 600-acre former industrial-scale cranberry farm in southern Massachusetts. Acoustic monitoring of wildlife is a crucial metric for post-restoration evaluation, as well as a challenge in such a noisy outdoor environment. This article presents the entire Tidzam system, which has been designed to identify, in real time, ambient sounds of weather conditions as well as sonic events such as insects, small animals, and local bird species from microphones deployed on the site. This experiment provides insight into the use of deep learning technology in a real deployment. The originality of this work lies in the system’s ability to construct its own database from local audio sampling under the supervision of human visitors and bird experts.

Citation

Clement Duhart, Gershon Dublon, Brian Mayton, and Joe Paradiso. “Deep Learning Locally Trained Wildlife Sensing in Real Acoustic Wetland Environment.” In Advances in Signal Processing and Intelligent Recognition Systems (SIRS 2018), Communications in Computer and Information Science. Springer, Singapore, January 2019. DOI: https://doi.org/10.1007/978-981-13-5758-9_1.
© 2019 Springer Nature Singapore Pte Ltd.

Authors

Clement Duhart
Gershon Dublon
Brian Mayton
Joe Paradiso

Institutions

MIT Media Lab