Automation of acoustic camera data analysis using a Convolutional Neural Network
Abstract
Acoustic cameras are high-frequency multi-beam sonars that provide images of fish from which morphological and behavioral features can be extracted. Using these tools in rivers enables the observation of the natural behavior of fish, even in turbid water and without a supporting structure. Currently, analyzing the data recorded by acoustic cameras requires operators to view the footage in full, a very time-consuming process. Automating image processing is therefore crucial to further develop the use of these tools, by detecting fish passages and identifying species based on morphological and behavioral criteria. This study aims to develop an innovative method for automating the analysis of images from DIDSON and ARIS acoustic cameras using a Convolutional Neural Network (CNN). The model was trained on a large dataset, and its performance was evaluated on a validation set of videos. Results indicate successful detection of fish passages, although performance varies with fish size. However, the false positive rate for target species such as Atlantic salmon or European eel remains too high for fully automatic use of the developed pipeline.
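To illustrate how detection performance of this kind can be scored, the sketch below compares automated fish-passage detections against operator-validated passages for one video and reports the detection rate, false positives, and missed passages. This is a minimal sketch, not the study's actual pipeline or evaluation code; representing passages by timestamps and the matching tolerance are assumptions made for illustration.

```python
# Minimal sketch (not the study's code): score automated fish-passage
# detections against operator-validated passages for one video.
# Each passage is represented here simply by its timestamp in seconds;
# the matching tolerance is an assumed, illustrative parameter.

def score_detections(detected, validated, tolerance_s=2.0):
    """Match detected passage times to operator-validated ones and
    return (true_positives, false_positives, missed)."""
    unmatched = list(validated)
    true_positives = 0
    false_positives = 0
    for t in detected:
        # A detection counts as correct if a validated passage lies
        # within the time tolerance; each validated passage is used once.
        match = next((v for v in unmatched if abs(v - t) <= tolerance_s), None)
        if match is not None:
            unmatched.remove(match)
            true_positives += 1
        else:
            false_positives += 1
    return true_positives, false_positives, len(unmatched)


if __name__ == "__main__":
    # Hypothetical timestamps, for illustration only.
    detected = [12.4, 57.9, 58.3, 130.0]
    validated = [12.0, 58.0, 241.5]
    tp, fp, missed = score_detections(detected, validated)
    print(f"detection rate: {tp / len(validated):.2f}, "
          f"false positives: {fp}, missed: {missed}")
```

In this framing, a false positive is an automated detection with no operator-validated passage nearby, which is the quantity the abstract reports as still being too high for target species.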