Detecting Marine Organisms Via Joint Attention-Relation Learning for Marine Video Surveillance
Is part of
IEEE journal of oceanic engineering, 2022-10, Vol.47 (4), p.959-974
Place / Publisher
New York: IEEE
Year of publication
2022
Source
IEEE Xplore
Descriptions/Notes
A better way to understand marine life and ecosystems is to surveil and analyze the activities of marine organisms. Recently, research on marine video surveillance has become increasingly popular. With the rapid development of deep learning (DL), convolutional neural networks (CNNs) have made remarkable progress in image/video understanding tasks. In this article, we explore a visual attention and relation mechanism for marine organism detection, and propose a new way to apply an improved attention-relation (AR) module to an efficient marine organism detector (EMOD), which effectively enhances the discrimination of organisms in complex underwater environments. We design our EMOD by integrating current state-of-the-art (SOTA) detection methods in order to detect organisms and surveil marine environments in a fast, real-time fashion for high-resolution marine video surveillance. We implement our EMOD and AR on the annotated video data sets provided by the public data challenges held in conjunction with the CVPR 2018 and 2019 workshops, which are supported by the National Oceanic and Atmospheric Administration (NOAA) and its research works (NMFS-PIFSC-83). Experimental results and visualizations demonstrate that our application of the AR module is effective and efficient, and that our EMOD equipped with AR modules surpasses SOTA performance on the experimental data sets. For application requirements, we also provide application suggestions for the EMOD framework. Our code is publicly available at https://github.com/zhenglab/EMOD.
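The abstract describes enhancing a detector with an attention mechanism that reweights features by their pairwise relations. As a rough, hypothetical illustration (not the paper's actual AR module, whose details are in the article and repository), the following sketch shows the core idea of scaled dot-product self-attention over a set of feature vectors: each feature is replaced by a softmax-weighted combination of all features, so mutually related features reinforce each other.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def self_attention(features):
    """Scaled dot-product self-attention over equal-length feature vectors.

    Each output vector is a convex combination of the inputs, weighted by
    the softmax of scaled pairwise dot products (the "relation" scores).
    """
    d = len(features[0])
    scale = math.sqrt(d)
    out = []
    for q in features:
        weights = softmax([dot(q, k) / scale for k in features])
        attended = [sum(w * k[i] for w, k in zip(weights, features))
                    for i in range(d)]
        out.append(attended)
    return out

# Toy 2-D features standing in for per-region detector features.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
attended = self_attention(feats)
```

In a detection head, such attended features would typically be fused back into the original ones (e.g., by residual addition) before classification and box regression; the paper's AR module refines this basic scheme for complex underwater scenes.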