Research Article

Motion Control of the Robot Arm Manufactured with a Three-Dimensional Printer and Hardness Detection of Objects

Year 2022, Volume: 15, Issue: 3, 289-300, 31.07.2022
https://doi.org/10.17671/gazibtd.1059378

Abstract

In this study, a robotic arm was produced using Fused Deposition Modeling (FDM), one of the 3D printing technologies. Tactile sensing and motion planning of the produced robot arm were investigated using image processing techniques and machine learning algorithms. The aim is to investigate and apply innovative approaches based on image processing techniques and deep learning algorithms to prevent uncontrolled force application by the robotic arm and to solve tactile grip problems. Solid models of the parts were designed in a CAD program and manufactured with an FDM-type three-dimensional printer. The control system of the robotic hand consists of a Raspberry Pi control card, servo motors, pressure sensors, and a camera. Tactile sensing was performed by measuring the hardness of the product with pressure sensors placed on each fingertip of the robotic arm. The Raspberry Pi control card receives the data from the sensors and processes them; the appropriate motion and grip pressure information is then sent to the servo motors. A reference data set for the robotic arm was prepared from the possible movements of the human hand captured with the camera. The images in the data set were preprocessed using the Gaussian filtering method. In addition, the angular position of the robotic arm's motion was optimized on this data set using machine learning algorithms, and the motion planning of the robot arm was classified with an accuracy above 90% using the HitNet, CNN, Capsule Network (CapsNet), and Naive Bayes models. When these models were compared according to the performance evaluation criteria for the motion planning of the robotic arm, the accuracy rate was 97.23% with the HitNet algorithm, 97.48% with CNN, 98.58% with the CapsNet algorithm, and 98.61% with the Naive Bayes model. As a result, the Naive Bayes model was observed to give more successful results than the other models, with 98.61% accuracy, 98.63% specificity, 98.65% sensitivity, a 1.39% error rate, and a 68.64% F-measure.
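
As the abstract notes, the data set images are smoothed with a Gaussian filter and the models are compared by accuracy, specificity, sensitivity, error rate, and F-measure. The short Python sketch below is only illustrative and is not the authors' code: it applies a Gaussian filter with OpenCV and computes the listed metrics from a binary confusion matrix; the kernel size, file path, and example counts are assumptions made for the sketch.

```python
# Illustrative sketch only (not the authors' implementation).
# Assumptions: OpenCV is available, a 5x5 Gaussian kernel is used, and the
# confusion-matrix counts are made up purely to demonstrate the metric formulas.
import cv2


def gaussian_preprocess(image_path: str):
    """Load a frame from the gesture data set and smooth it with a Gaussian filter."""
    image = cv2.imread(image_path)             # BGR image as read from disk
    return cv2.GaussianBlur(image, (5, 5), 0)  # 5x5 kernel; sigma derived from kernel size


def evaluation_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    """Accuracy, specificity, sensitivity, error rate and F-measure for a binary confusion matrix."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    specificity = tn / (tn + fp)
    sensitivity = tp / (tp + fn)                # also called recall
    precision = tp / (tp + fp)
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "accuracy": accuracy,
        "specificity": specificity,
        "sensitivity": sensitivity,
        "error_rate": 1.0 - accuracy,
        "f_measure": f_measure,
    }


if __name__ == "__main__":
    # Hypothetical counts, only to show how percentages of this kind are derived.
    print(evaluation_metrics(tp=480, tn=490, fp=7, fn=8))
```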

Details

Primary Language: English
Subjects: Computer Software
Section: Articles
Authors

Bekir Aksoy 0000-0001-8052-9411

Koray Özsoy 0000-0001-8663-4466

Mehmet Yücel 0000-0002-4100-5831

Özge Ekrem 0000-0001-9142-405X

Osamah Khaled Musleh Salman 0000-0001-6526-4793

Publication Date: 31 July 2022
Submission Date: 18 January 2022
Published in Issue: Year 2022, Volume: 15, Issue: 3

How to Cite

APA: Aksoy, B., Özsoy, K., Yücel, M., Ekrem, Ö., et al. (2022). Motion Control of the Robot Arm Manufactured with a Three-Dimensional Printer and Hardness Detection of Objects. Bilişim Teknolojileri Dergisi, 15(3), 289-300. https://doi.org/10.17671/gazibtd.1059378