REAL-TIME FISH DETECTION AND CLASSIFICATION IN AN UNDERWATER ENVIRONMENT USING YOLOV5: A COMPARATIVE STUDY OF DEEP LEARNING ARCHITECTURES
Rizki Multajam
Universiti Malaysia Terengganu (Malaysia)
Ahmad Faisal Mohamad Ayob
Universiti Malaysia Terengganu (Malaysia)
W.S. Mada Sanjaya
Universitas Islam Negeri Sunan Gunung Djati (Indonesia)
Aceng Sambas
Universiti Sultan Zainal Abidin (Malaysia)
Volodymyr Rusyn
rusyn_v@ukr.net
Yuriy Fedkovych Chernivtsi National University, Department of Radio Engineering and Information (Ukraine)
https://orcid.org/0000-0001-6219-1031
Andrii Samila
Yuriy Fedkovych Chernivtsi National University (Ukraine)
Abstract
This paper investigates fish detection and classification methods as an integral part of underwater environmental monitoring systems. The study focuses on developing real-time methods for highly accurate fish detection and classification. Advanced technologies such as YOLO (You Only Look Once) v5 form the basis of an efficient and responsive system. The study also evaluates several deep learning approaches, comparing their performance and accuracy in detecting and classifying fish. The results are expected to contribute to more advanced and efficient systems for monitoring water bodies, supporting the understanding of underwater ecosystems and conservation efforts.
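For orientation, the sketch below shows what a real-time YOLOv5 detection loop of the kind described in the abstract can look like in Python. It is a minimal, hypothetical example: the `yolov5s` weights loaded via `torch.hub`, the video file name, and the confidence threshold are illustrative assumptions, not the dataset, trained weights, or configuration used in this study.

```python
# Minimal sketch of real-time detection with YOLOv5 on video frames.
# Illustrative only: weights, paths and thresholds are assumptions,
# not the exact configuration reported in the paper.
import cv2
import torch

# Load a pretrained YOLOv5 model from the official repository via torch.hub;
# in practice the model would be fine-tuned on an underwater fish dataset.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
model.conf = 0.4  # confidence threshold (assumed value)

cap = cv2.VideoCapture("underwater_clip.mp4")  # hypothetical input video
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # YOLOv5's AutoShape wrapper expects RGB images; OpenCV delivers BGR frames.
    results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    # Draw the predicted boxes and class labels on the frame.
    for *xyxy, conf, cls in results.xyxy[0].tolist():
        x1, y1, x2, y2 = map(int, xyxy)
        label = f"{model.names[int(cls)]} {conf:.2f}"
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.putText(frame, label, (x1, y1 - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    cv2.imshow("YOLOv5 fish detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

For deployment, the YOLOv5 repository also provides `export.py`, which can convert trained weights to the ONNX format listed among the keywords (for example, `python export.py --weights best.pt --include onnx`), assuming the exported model is then served by an ONNX-compatible runtime.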
Keywords:
deep learning, YOLOv5, real-time methods, ONNX, automatic fish detection and classification
Bibliography
[1] Abdul Aziz M. F. et al.: Development of Smart Sorting Machine using artificial intelligence for Chili Fertigation Industries. Journal of Automation, Mobile Robotics and Intelligent Systems 28, 2022, 44–52 [https://doi.org/10.14313/jamris/4-2021/26].
[2] Ayob A. et al.: Analysis of pruned neural networks (mobilenetv2-yolo v2) for underwater object detection. 11th National Technical Seminar on Unmanned System Technology 2019 NUSYS’19, Springer Singapore, Singapore, 2021, 87–98.
[3] Boudhane M., Benayad N.: Underwater Image Processing Method for Fish Localization and Detection in Submarine Environment. Journal of Visual Communication and Image Representation 39, 2016, 226–238 [https://doi.org/10.1016/j.jvcir.2016.05.017].
[4] Brownscombe J. W. et al.: The Future of Recreational Fisheries: Advances in Science, Monitoring, Management, and Practice. Fisheries Research 211, 2019, 247–255 [https://doi.org/10.1016/j.fishres.2018.10.019].
[5] Chen P. H. C. et al.: An Augmented Reality Microscope with Real-time Artificial Intelligence Integration for Cancer Diagnosis. Nature Medicine 25(9), 2019, 1453–1457 [https://doi.org/10.1038/s41591-019-0539-7].
[6] Du J.: Understanding of Object Detection Based on CNN Family and YOLO. Journal of Physics: Conference Series 1004, 2018, 012029 [https://doi.org/10.1088/1742-6596/1004/1/012029].
[7] Fan F.-L. et al.: On Interpretability of Artificial Neural Networks: A Survey. IEEE Transactions on Radiation and Plasma Medical Sciences 5(6), 2021, 741–760 [https://doi.org/10.1109/trpms.2021.3066428].
[8] Hong S. et al.: Opportunities and Challenges of Deep Learning Methods for Electrocardiogram Data: A Systematic Review. Computers in Biology and Medicine 122, 2020, 103801 [https://doi.org/10.1016/j.compbiomed.2020.103801].
[9] Hu J. et al.: Real-time Nondestructive Fish Behavior Detecting in Mixed Polyculture System Using Deep-learning and Low-cost Devices. Expert Systems With Applications 178, 2021, 115051 [https://doi.org/10.1016/j.eswa.2021.115051].
[10] Iqbal M. A. et al.: Automatic Fish Species Classification Using Deep Convolutional Neural Networks. Wireless Personal Communications 116(2), 2019, 1043–1053 [https://doi.org/10.1007/s11277-019-06634-1].
[11] Isabelle D. A., Westerlund M.: A Review and Categorization of Artificial Intelligence-Based Opportunities in Wildlife, Ocean and Land Conservation. Sustainability 14(4), 2022, 1979 [https://doi.org/10.3390/su14041979].
[12] Ismail N., Owais A. M.: Real-time Visual Inspection System for Grading Fruits Using Computer Vision and Deep Learning Techniques. Information Processing in Agriculture 9(1), 2022, 24–37 [https://doi.org/10.1016/j.inpa.2021.01.005].
[13] Jalal A. et al.: Fish Detection and Species Classification in Underwater Environments Using Deep Learning with Temporal Information. Ecological Informatics 57, 2020, 101088 [https://doi.org/10.1016/j.ecoinf.2020.101088].
[14] Jing L. et al.: Video You Only Look Once: Overall Temporal Convolutions for Action Recognition. Journal of Visual Communication and Image Representation 52, 2018, 58–65 [https://doi.org/10.1016/j.jvcir.2018.01.016].
[15] Khan A. N. et al.: Sectorial Study of Technological Progress and CO2 Emission: Insights From a Developing Economy. Technological Forecasting and Social Change 151, 2020, 119862 [https://doi.org/10.1016/j.techfore.2019.119862].
[16] Khokher M. R. et al.: Early Lessons in Deploying Cameras and Artificial Intelligence Technology for Fisheries Catch Monitoring: Where Machine Learning Meets Commercial Fishing. Canadian Journal of Fisheries and Aquatic Sciences 79(2), 2022, 257–266 [https://doi.org/10.1139/cjfas-2020-0446].
[17] Klapp I. et al.: Ornamental Fish Counting by Non-imaging Optical System for Real-time Applications. Computers and Electronics in Agriculture 153, 2018, 126–133 [https://doi.org/10.1016/j.compag.2018.08.007].
[18] Liu H., Lang B.: Machine Learning and Deep Learning Methods for Intrusion Detection Systems: A Survey. Applied Sciences 9(20), 2019, 4396 [https://doi.org/10.3390/app9204396].
[19] Mada Sanjaya W. S.: Deep Learning Citra Medis Berbasis Pemrograman Python. Bolabot, 2023.
[20] Redmon J. et al.: You Only Look Once: Unified, Real-Time Object Detection. arXiv.org, 8 June 2015, arxiv.org/abs/1506.02640 [https://doi.org/10.1109/CVPR.2016.91].
[21] Reynard D., Shirgaokar M.: Harnessing the Power of Machine Learning: Can Twitter Data Be Useful in Guiding Resource Allocation Decisions During a Natural Disaster? Transportation Research Part D: Transport and Environment 77, 2019, 449–463 [https://doi.org/10.1016/j.trd.2019.03.002].
[22] Rico-Díaz Á. J. et al.: An Application of Fish Detection Based on Eye Search With Artificial Vision and Artificial Neural Networks. Water 12(11), 2020, 3013 [https://doi.org/10.3390/w12113013].
[23] Sanjaya W. S. et al.: The Design of Face Recognition and Tracking for Human-robot Interaction. 2nd International Conferences on Information Technology, Information Systems and Electrical Engineering (ICITISEE). IEEE, 2017 [https://doi.org/10.1109/icitisee.2017.8285519].
[24] Shafiee M. J. et al.: Fast YOLO: A Fast You Only Look Once System for Real-time Embedded Object Detection in Video. arXiv.org, 18 Sept. 2017, arxiv.org/abs/1709.05943 [https://doi.org/10.15353/vsnl.v3i1.171].
[25] Unlu E. et al.: Deep Learning-based Strategies for the Detection and Tracking of Drones Using Several Cameras. IPSJ Transactions on Computer Vision and Applications 11(1), 2019 [https://doi.org/10.1186/s41074-019-0059-x].
[26] Wang D. et al.: UAV Environmental Perception and Autonomous Obstacle Avoidance: A Deep Learning and Depth Camera Combined Solution. Computers and Electronics in Agriculture 175, 2020, 105523 [https://doi.org/10.1016/j.compag.2020.105523].
[27] Xiu L. et al.: Fast Accurate Fish Detection and Recognition of Underwater Images With Fast R-CNN. OCEANS 2015 – MTS/IEEE Washington. IEEE, 2015 [https://doi.org/10.23919/oceans.2015.7404464].
[28] Zhang L. et al.: Automatic Fish Counting Method Using Image Density Grading and Local Regression. Computers and Electronics in Agriculture 179, 2020, 105844 [https://doi.org/10.1016/j.compag.2020.105844].
[29] Zhao Z.-Q. et al.: Object Detection With Deep Learning: A Review. IEEE Transactions on Neural Networks and Learning Systems 30(11), 2019, 3212–3232 [https://doi.org/10.1109/tnnls.2018.2876865].