Research Article

A Flower Status Tracker and Self Irrigation System (FloTIS)

Year 2021, Volume: 1, Issue: 1, pp. 45-50, 30.08.2021

Abstract

The Internet of Things (IoT) provides solutions to many daily life problems. Smartphones with user-friendly applications make use of the artificial intelligence solutions offered by deep learning techniques. In this work, we provide a sustainable solution that automatically monitors and controls the irrigation process for detected flowers by combining deep learning and IoT techniques. The proposed flower status tracker and self-irrigation system (FloTIS) is implemented using a cloud-based server and an Android-based application to control the status of the flower, which is monitored by local sensor devices. The system detects changes in soil moisture and provides the necessary irrigation for the flower. To optimize water consumption, different classification algorithms are tested; performance comparisons with similar works on an example flower case show higher accuracy scores. The best-performing deep learning model is then deployed into the smartphone application, which detects the flower type in order to determine the amount of water required for the daily irrigation of each flower type. In this way, the system monitors the water content of the soil and uses water efficiently while keeping the user informed.
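
The abstract describes a closed loop: local sensors report soil moisture, the flower type recognized by the deployed model selects a daily water dose, and the valve is driven only when moisture drops below a threshold, after which the user is notified through the cloud server. Since the paper's source code is not reproduced here, the following Python sketch merely illustrates that loop; every identifier, the threshold value, and the per-type dose table are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of the moisture-driven irrigation loop described in the
# abstract. Every identifier, constant, and dose value below is a
# hypothetical placeholder, not code or data from the FloTIS paper.
import random
import time

# Assumed daily water dose (ml) per flower type; in FloTIS the type is
# predicted by the deep learning model embedded in the Android app.
WATER_DOSE_ML = {"rose": 250, "tulip": 150, "sunflower": 400}
DEFAULT_DOSE_ML = 200        # fallback when the type is not in the table
MOISTURE_THRESHOLD = 30.0    # % soil moisture below which we irrigate

def read_moisture() -> float:
    """Stand-in for the soil-moisture sensor driver (returns 0-100 %)."""
    return random.uniform(10.0, 60.0)  # simulated reading for the sketch

def open_valve(dose_ml: int) -> None:
    """Stand-in for driving the pump/valve to deliver dose_ml of water."""
    print(f"irrigating with {dose_ml} ml")

def push_status(moisture: float, watered_ml: int) -> None:
    """Stand-in for reporting to the cloud backend (the paper uses a
    cloud-based server) so the smartphone app can notify the user."""
    print(f"moisture={moisture:.1f}%, watered={watered_ml} ml")

def irrigation_loop(flower_type: str, period_s: float, cycles: int) -> None:
    """Check soil moisture periodically and irrigate with the per-type dose."""
    dose = WATER_DOSE_ML.get(flower_type, DEFAULT_DOSE_ML)
    for _ in range(cycles):
        moisture = read_moisture()
        watered = 0
        if moisture < MOISTURE_THRESHOLD:
            open_valve(dose)
            watered = dose
        push_status(moisture, watered)
        time.sleep(period_s)

if __name__ == "__main__":
    # Short period and few cycles for demonstration; a deployed node
    # would loop indefinitely with a much longer period.
    irrigation_loop("rose", period_s=1.0, cycles=5)
```

On the actual hardware, the placeholder functions would wrap the sensor read, the relay driving the pump, and a write to the cloud database used by the Android application.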

References

  • [1] A. Patil, M. Beldar, A. Naik, and S. Deshpande, "Smart farming using Arduino and data mining," In 3rd International Conference on Computing for Sustainable Global Development (INDIACom), 2016, pp. 1913-1917.
  • [2] R. Ratasuk, B. Vejlgaard, N. Mangalvedhe, and A. Ghosh, "NB-IoT system for M2M communication," In Proc. IEEE Wireless Communications and Networking Conference, 2016, pp. 428-432.
  • [3] A. Kumar, A. Surendra, H. Mohan, K. M. Valliappan, and N. Kirthika, "Internet of things based smart irrigation using regression algorithm," In Proc. International Conference on Intelligent Computing, Instrumentation and Control Technologies, 2017, pp. 1652-1657.
  • [4] M. Mancuso and F. Bustaffa, "A wireless sensors network for monitoring environmental variables in a tomato greenhouse," In IEEE International Workshop on Factory Communication Systems, 2006, pp. 107-110.
  • [5] H. H. Lee, X. H. Li, K. W. Chung, and K. S. Hong, "Flower image recognition using multi-class SVM," Applied Mechanics and Materials, vol. 284-287, pp. 3106-3110, 2013.
  • [6] W. Zhou, S. Gao, L. Zhang, and X. Lou, "Histogram of oriented gradients feature extraction from raw Bayer pattern images," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 67, no. 5, pp. 946-950, 2020.
  • [7] W. Liu, Y. Rao, B. Fan, J. Song, and Q. Wang, "Flower classification using fusion descriptor and SVM," In Proc. IEEE International Smart Cities Conference (ISC2), 2017, pp. 1-4.
  • [8] M. Hussain, J. J. Bird, and D. R. Faria, "A study on CNN transfer learning for image classification," In UK Workshop on Computational Intelligence, 2018, pp. 191-202.
  • [9] N. Sharma, V. Jain, and A. Mishra, "An analysis of convolutional neural networks for image classification," Procedia Computer Science, vol. 132, pp. 377-384, 2018.
  • [10] I. Gogul and V. S. Kumar, "Flower species recognition system using convolution neural networks and transfer learning," In 4th International Conference on Signal Processing, Communication and Networking, 2017, pp. 1-6.
  • [11] X. Xia, C. Xu, and B. Nan, "Inception-v3 for flower classification," In 2nd International Conference on Image, Vision and Computing, 2017, pp. 783-787.
  • [12] C. Szegedy, V. Vanhoucke, S. Ioffe, J. Shlens, and Z. Wojna, "Rethinking the inception architecture for computer vision," In Proc. IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 2818-2826.
  • [13] D. Sinha and M. El-Sharkawy, "Thin MobileNet: An enhanced MobileNet architecture," In IEEE 10th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference, 2019, pp. 280-285.
  • [14] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 770-778.
  • [15] W. Wang, Y. Li, T. Zou, X. Wang, J. You, and Y. Lou, "A novel image classification approach via dense-MobileNet models," Mobile Information Systems, vol. 2020, Article ID 7602384, 8 pages, 2020.
  • [16] I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning, MIT Press, Cambridge, 2016.
  • [17] Y. LeCun, B. Boser, J. Denker, D. Henderson, R. Howard, W. Hubbard, and L. Jackel, "Handwritten digit recognition with a backpropagation network," in: Advances in Neural Information Processing Systems (NIPS), vol. 2, 1989, pp. 396-404.
  • [18] L. Alzubaidi, J. Zhang, A. J. Humaidi, A. Al‑Dujaili, Y. Duan, O. Al‑Shamma, J. Santamaría, M. A. Fadhel, M. Al‑Amidie, and L. Farhan, "Review of deep learning: concepts, CNN architectures, challenges, applications, future directions," Journal of Big Data, vol. 8, no. 53, pp. 1-74, 2021.
  • [19] A. Rayes and S. Salam, Internet of Things From Hype to Reality: The Road to Digitization, 2nd ed., Springer, 2019.
  • [20] A. Al-Fuqaha, M. Guizani, M. Mohammadi, M. Aledhari, and M. Ayyash, "Internet of things: A survey on enabling technologies, protocols, and applications," IEEE Communications Surveys & Tutorials, vol. 17, no. 4, pp. 2347-2376, Fourth Quarter 2015.
  • [21] I. U. Din et al., "The Internet of things: A review of enabled technologies and future challenges," IEEE Access, vol. 7, pp. 7606-7640, 2019.
  • [22] A. Khanna and S. Kaur, "Internet of things (IoT), applications and challenges: A comprehensive review," Wireless Personal Communications, vol. 114, pp. 1687-1762, 2020.
  • [23] H. F. Atlam, R. J. Walters, and G. B. Wills, "Internet of things: State-of-the-art, challenges, applications, and open issues," International Journal of Intelligent Computing Research (IJICR), vol. 9, no. 3, pp. 928-938, 2018.
  • [24] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," in: Advances in Neural Information Processing Systems (NIPS), vol. 25, 2012, pp. 1097-1105.
  • [25] O. Russakovsky, J. Deng, H. Su, J. Krause, S. Satheesh, S. Ma, Z. Huang, A. Karpathy, A. Khosla, M. Bernstein, A. C. Berg, and L. Fei-Fei, "ImageNet large scale visual recognition challenge," International Journal of Computer Vision, vol. 115, no. 3, pp. 211-252, 2015.
  • [26] A. G. Howard, M. Zhu, B. Chen, D. Kalenichenko, W. Wang, T. Weyand, M. Andreetto, and H. Adam, "Mobilenets: Efficient convolutional neural networks for mobile vision applications," arXiv:1704.04861, 2017.
  • [27] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," In Proc. IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 770-778.
  • [28] K. Simonyan and A. Zisserman, "Very deep convolutional networks for large-scale image recognition," arXiv:1409.1556, 2014.
  • [29] A. Gulli and S. Pal, Deep Learning with Keras, Packt Publishing, 2017.
  • [30] S. M. Sam, K. Kamardin, N. N. A. Sjarif, and N. Mohamed, "Offline signature verification using deep learning convolutional neural network (CNN) architectures GoogLeNet Inception-v1 and Inception-v3," Procedia Computer Science, vol. 161, pp. 475-483, 2019.
  • [31] S. R. Bose and V. S. Kumar, "Efficient inception V2 based deep convolutional neural network for real-time hand action recognition," IET Image Processing, vol. 14, pp. 688-696, 2019.
  • [32] M. Kim and L. Rigazio, "Deep clustered convolutional kernels," In Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015, PMLR vol. 44, pp. 160-172, 2015.
  • [33] Y. A. Badamasi, "The working principle of an Arduino," In 11th International Conference on Electronics, Computer and Computation (ICECCO), 2014, pp. 1-4.
  • [34] N. S. Yamanoor and S. Yamanoor, "High quality, low cost education with the Raspberry Pi," In IEEE Global Humanitarian Technology Conference (GHTC), 2017, pp. 1-5.
  • [35] [Online]. Available: https://www.groww.fr/en/plants.
  • [36] L. Moroney, "The firebase realtime database," in The Definitive Guide to Firebase, Apress, 2017, pp. 51-71.
  • [37] M.-E. Nilsback and A. Zisserman, "A visual vocabulary for flower classification," In Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2006, pp. 1447-1454.
  • [38] L. Bottou, "Stochastic gradient descent tricks," In: Montavon G., Orr G. B., Müller K. R. (eds) Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, vol. 7700. Springer, Berlin, Heidelberg, 2012, pp. 421-436.
  • [39] O. Alsing, "Mobile object detection using TensorFlow Lite and transfer learning," Master's thesis, KTH Royal Institute of Technology, 2018.
  • [40] Y. Wu, X. Qin, Y. Pan, and C. Yuan, "Convolution neural network based transfer learning for classification of flowers," In Proc. IEEE 3rd International Conference on Signal and Image Processing, 2018, pp. 562-566.
  • [41] S. Cao and B. Song, "Visual attentional-driven deep learning method for flower recognition," Mathematical Biosciences and Engineering, vol. 18, no. 3, pp. 1981-1991, 2021.

Details

Primary Language: English
Subjects: Artificial Intelligence
Section: Research Articles
Authors

Rumeysa Keskin

Furkan Güney

M. Erdal Özbek

Publication Date: 30 August 2021
Submission Date: 16 July 2021
Published Issue: Year 2021, Volume: 1, Issue: 1

How to Cite

IEEE: R. Keskin, F. Güney, and M. E. Özbek, "A Flower Status Tracker and Self Irrigation System (FloTIS)", Journal of Artificial Intelligence and Data Science, vol. 1, no. 1, pp. 45-50, 2021.

All articles published by JAIDA are licensed under a Creative Commons Attribution 4.0 International License.
