Research Article

Detection of the Use of Protective Glasses by Image Processing

Year 2022, Volume: 9 Issue: 1, 86-95, 31.01.2022
https://doi.org/10.31202/ecjse.945167

Abstract

In this study, the use of protective glasses, one of the Occupational Health and Safety measures, was detected with an image processing approach. Images of the glasses captured by a mounted camera were identified using image processing and deep learning. The study employed the Python programming language, the Google Colab platform and the OpenCV library, together with object recognition algorithms. First, the collected protective-glasses photographs were gathered in a single folder and labelled on the makesense.ai platform. The annotation file exported for each photograph provided its coordinate information; the resulting data were split into training and test sets at a certain ratio and transferred to the Google Colab platform with the help of Darknet. The transferred data were used to train the YOLOv4 algorithm, and all results were run through the Python OpenCV library to verify their accuracy. Having reached satisfactory results with image processing and deep learning methods, this study offers an innovative perspective by running the YOLOv4 algorithm on the Google Colab platform and, as a result, by easing the full-time supervision of Occupational Health and Safety measures in workplaces.
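The workflow summarized above can be illustrated with two short sketches. The first is a minimal sketch of the train/test split step, assuming the labelled photographs and the YOLO-format annotation files exported from makesense.ai sit together in one folder; the folder names, the 80/20 ratio and the train.txt/test.txt output files are illustrative assumptions, not the exact configuration used in the paper.

    import glob
    import random

    IMAGE_DIR = "data/obj"   # hypothetical folder holding .jpg images and their .txt labels
    TRAIN_RATIO = 0.8        # assumed split ratio ("a certain ratio" in the abstract)

    images = sorted(glob.glob(f"{IMAGE_DIR}/*.jpg"))
    random.seed(42)          # reproducible shuffle
    random.shuffle(images)

    cut = int(len(images) * TRAIN_RATIO)
    train, test = images[:cut], images[cut:]

    # Darknet expects plain text files listing one image path per line.
    with open("data/train.txt", "w") as f:
        f.write("\n".join(train) + "\n")
    with open("data/test.txt", "w") as f:
        f.write("\n".join(test) + "\n")

The second sketch shows how a Darknet/YOLOv4 model trained on Google Colab could be run through OpenCV's DNN module for live goggle detection, as described in the abstract. The configuration, weights and class-list file names, the input size and the thresholds are assumptions for illustration, not the authors' reported settings.

    import cv2
    import numpy as np

    CFG = "yolov4-goggles.cfg"               # hypothetical Darknet config used for training
    WEIGHTS = "yolov4-goggles_best.weights"  # hypothetical weights exported from Google Colab
    NAMES = "obj.names"                      # hypothetical class list, e.g. a single "goggles" class

    # Load the class labels and the trained Darknet/YOLOv4 network with OpenCV's DNN module.
    with open(NAMES) as f:
        classes = [line.strip() for line in f if line.strip()]

    net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
    model = cv2.dnn_DetectionModel(net)
    model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

    cap = cv2.VideoCapture(0)  # the camera placed to watch the work area
    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Detect goggles in the current frame; thresholds are illustrative.
        class_ids, scores, boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
        for class_id, score, box in zip(np.ravel(class_ids), np.ravel(scores), boxes):
            x, y, w, h = box
            label = f"{classes[int(class_id)]}: {float(score):.2f}"
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 5), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

        cv2.imshow("goggle detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()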

References

  • Kim, K. G., "Deep learning", Healthcare Informatics Research, 2016, 22(4): 351-354.
  • Tian, Y., Yang, G., Wang, Z., Wang, H., Li, E., & Liang, Z., "Apple detection during different growth stages in orchards using the improved YOLO-V3 model", Computers and Electronics in Agriculture, 2019, 157: 417-426.
  • Wu, D., Lv, S., Jiang, M., & Song, H., "Using channel pruning-based YOLO v4 deep learning algorithm for the real-time and accurate detection of apple flowers in natural environments", Computers and Electronics in Agriculture, 2020, 178: 105742.
  • Çağıl, G., & Yıldırım, B., "Bir Montaj Parçasının Derin Öğrenme ve Görüntü İşleme ile Tespiti", Zeki Sistemler Teori ve Uygulamaları Dergisi, 2020, 3(2): 31-37.
  • Aktaş, A., Doğan, B., & Demir, Ö., "Tactile paving surface detection with deep learning methods", Journal of the Faculty of Engineering and Architecture of Gazi University, 2020, 35(3): 1685-1700.
  • Fang, Q., Li, H., Luo, X., Ding, L., Luo, H., Rose, T. M., & An, W., "Detecting non-hardhat-use by a deep learning method from far-field surveillance videos", Automation in Construction, 2018, 85: 1-9.
  • Bo, Y., Huan, Q., Huan, X., Rong, Z., Hongbin, L., Kebin, M., ... & Lei, Z., "Helmet Detection Under the Power Construction Scene Based on Image Analysis", In 2019 IEEE 7th International Conference on Computer Science and Network Technology (ICCSNT), Dalian, China, 67-71, (2019).
  • Lopes, H. C. V., "A Robust Real-time Component for Personal Protective Equipment Detection in an Industrial Setting", PhD Thesis, PUC-Rio, (2021).
  • Coşkun, M., Yıldırım, Ö., Uçar, A., & Demir, Y., "An overview of popular deep learning methods", European Journal of Technique, 2017, 7(2): 165-176.
  • Altuntaş, Y., Cömert, Z., & Kocamaz, A. F., "Identification of haploid and diploid maize seeds using convolutional neural networks and a transfer learning approach", Computers and Electronics in Agriculture, 2019, 163: 104874.
  • Jiang, R., Lin, Q., & Qu, S., "Let Blind People See: Real-Time Visual Recognition with Results Converted to 3D Audio", Report No. 218, Stanford University, Stanford, USA, (2016).
  • Redmon, J., & Farhadi, A., "YOLO9000: Better, Faster, Stronger", In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 7263-7271, (2017).
  • Rezaei, M., & Azarmi, M., "DeepSOCIAL: Social distancing monitoring and infection risk assessment in COVID-19 pandemic", Applied Sciences, 2020, 10(21): 7514.
  • Misra, D., "Mish: A Self Regularized Non-Monotonic Activation Function", arXiv preprint arXiv:1908.08681, (2019).
  • Yu, J., & Zhang, W., "Face mask wearing detection algorithm based on improved YOLO-v4", Sensors, 2021, 21(9): 3263.
  • Guo, F., Qian, Y., & Shi, Y., "Real-time railroad track components inspection based on the improved YOLOv4 framework", Automation in Construction, 2021, 125: 103596.
  • Redmon, J., Divvala, S., Girshick, R., & Farhadi, A., "You Only Look Once: Unified, Real-Time Object Detection", In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 779-788, (2016).

Details

Primary Language Turkish
Subjects Engineering
Section Articles
Authors

Gülsemin Başaran 0000-0003-1720-626X

Gültekin Cagıl 0000-0001-8609-6178

Publication Date January 31, 2022
Submission Date May 30, 2021
Acceptance Date November 19, 2021
Published in Issue Year 2022, Volume: 9 Issue: 1

How to Cite

IEEE G. Başaran and G. Cagıl, "Koruyucu Gözlük Kullanımının Görüntü İşleme Yöntemiyle Tespit Edilmesi", ECJSE, vol. 9, no. 1, pp. 86–95, 2022, doi: 10.31202/ecjse.945167.