Research Article

An Environmental Sustainable Approach to Machine Learning, Training and Development

Year 2025, Volume: 8, Issue: 3, 457-469, 30.09.2025
https://doi.org/10.35377/saucis...1661247

Abstract

Artificial intelligence has the potential to drive sustainability by minimizing the environmental impact of machine learning (ML) development. However, many ML techniques, particularly ensemble methods such as the Random Forest classifier, require substantial computational resources during hyperparameter tuning. These hyperparameters include the number of trees, the maximum depth of each tree, and the number of features considered at each split, and they considerably affect both model performance and energy consumption. This paper proposes an eco-friendly multi-objective framework (EFMOF) that optimizes hyperparameters with minimal environmental impact while retaining high model accuracy. By leveraging advanced hyperparameter optimization techniques such as Optuna, Hyperopt, and Grid Search, the framework explores the hyperparameter space efficiently, with a focus on energy efficiency and carbon reduction. Incorporating sustainable AI into ML development therefore requires monitoring energy consumption and carbon emissions at every hyperparameter tuning step, ensuring that the resulting models perform well without excessive environmental cost. Experimental results show that the number of estimators is the most dominant hyperparameter and leads to the highest energy consumption, whereas the minimum samples per leaf and per split have a moderate effect and the maximum depth has only a minor impact.
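
The snippet below is a minimal sketch of the kind of energy-aware multi-objective tuning loop described above, not the authors' EFMOF implementation: it uses Optuna to jointly maximize cross-validated Random Forest accuracy and minimize the per-trial CO2 estimate reported by CodeCarbon's EmissionsTracker. The dataset, search ranges, and trial count are illustrative assumptions.

# Sketch only: scikit-learn's built-in breast cancer dataset, CodeCarbon for
# per-trial CO2 estimates, and illustrative search ranges are assumptions,
# not the paper's EFMOF code.
import optuna
from codecarbon import EmissionsTracker
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    # Hyperparameters named in the abstract: estimators, depth, split/leaf sizes.
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 50, 400),
        "max_depth": trial.suggest_int("max_depth", 3, 20),
        "min_samples_split": trial.suggest_int("min_samples_split", 2, 10),
        "min_samples_leaf": trial.suggest_int("min_samples_leaf", 1, 5),
    }
    tracker = EmissionsTracker(save_to_file=False, log_level="error")
    tracker.start()
    try:
        # 5-fold cross-validated accuracy for this configuration.
        accuracy = cross_val_score(
            RandomForestClassifier(**params, random_state=42, n_jobs=-1),
            X, y, cv=5, scoring="accuracy",
        ).mean()
    finally:
        emissions_kg = tracker.stop()  # estimated kg CO2-eq for this trial
    return accuracy, emissions_kg

# Jointly maximize accuracy and minimize estimated emissions.
study = optuna.create_study(directions=["maximize", "minimize"])
study.optimize(objective, n_trials=25)

# Pareto-optimal accuracy/carbon trade-offs found by the search.
for t in study.best_trials:
    print(t.params, t.values)

Hyperopt or scikit-learn's GridSearchCV could be substituted for Optuna in the same loop; the point the abstract makes is that an emissions estimate is recorded for every tuning configuration, not only for the final model.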

References

  • Sharma, P., & Puri, S. “Random Forest-Based Prediction of Breast Cancer Survival: Cross-Validation and Hyperparameter Tuning,” In International Conference on Advances in Computing and Data Sciences, 138-145. 2020. DOI: 10.1007/978-981-15-0277-0_13.
  • Alghamdi, F., Alsuhaibani, R., & Albattah, K. “Breast Cancer Diagnosis and Prediction Using Machine Learning and Data Mining Techniques: A Review,” IEEE Access, 9, 18152-18164. 2021, DOI: 10.1109/ACCESS.2021.3052953.
  • Feurer, M., & Hutter, F. “Hyperparameter Optimization in Machine Learning: A Comprehensive Survey,” Journal of Machine Learning Research, 20(1), 1-45, 2019. Available at: https://www.jmlr.org/papers/v20/18-444.html.
  • Gamage, G., Samarakoon, S., & Nguyen, N. T. “Energy-Efficient Machine Learning Models for Healthcare Applications.” IEEE Access, 9, 150357-150373, 2021. DOI: 10.1109/ACCESS.2021.3124182.
  • Rolnick, D., Donti, P. L., Kaack, L. H., Kochanski, K., Lacoste, A., Sankaran, K., ... & Bengio, Y. “Sustainable AI: Environmental Implications, Challenges, and Opportunities.” In Proceedings of the 2022 Conference on Fairness, Accountability, and Transparency, 145-156, 2022, DOI: 10.1145/3442188.3445934.
  • Zheng, B., Yoon, S. W., & Lam, S. S. “Breast cancer diagnosis based on feature extraction using a hybrid of K-means and support vector machine algorithms,” Expert Systems with Applications, 41(4), 1476-1482, 2021. https://doi.org/10.1016/j.eswa.2021.08.027
  • Delen, D., Walker, G., & Kadam, A. “Predicting breast cancer survivability: A comparison of three data mining methods,” Artificial Intelligence in Medicine, 34(2), 113-127, 2020. https://doi.org/10.1016/j.artmed.2020.08.003
  • Zizaan, A., & Idri, A. “Evaluating and Comparing Bagging and Boosting of Hybrid Learning for Breast Cancer Screening,” Scientific African, 23, 2024. DOI: 10.1016/j.sciaf.2023.e01989.
  • Jegadeeswari, K., & Rathipriya, R. “Optimized Stacking Ensemble Classifier for Early Cancer Detection Using Biomarker Data,” Advance Sustainable Science Engineering and Technology, 6(4), 02404017, 2024.
  • Akiba, T., Sano, S., Yanase, T., Ohta, T., & Koyama, M. “Optuna: A next-generation hyperparameter optimization framework,” Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2623-2631, 2019. https://doi.org/10.1145/3292500.3330701
  • Bergstra, J., Yamins, D., & Cox, D. D. “Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures,” Proceedings of the 30th International Conference on Machine Learning, 28, 115-123, 2013.
  • Henderson, P., Hu, J., Romoff, J., Brunskill, E., Jurafsky, D., & Pineau, J. “Towards the systematic reporting of the energy and carbon footprints of machine learning,” Journal of Machine Learning Research, 21(1), 1-43, 2020.
  • Liu, Z., Cheng, S., Zhou, H., & You, Y. “Hanayo: Harnessing Wave-like Pipeline Parallelism for Enhanced Large Model Training Efficiency”, 2023. https://doi.org/10.1145/3581784.3607073
  • Strubell, E., Ganesh, A., & McCallum, A. “Energy and policy considerations for deep learning in NLP,” Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 3645-3650, 2019. https://doi.org/10.18653/v1/P19-1355
  • Lottick, K., Sakaguchi, K., Schwartz, R., & Smith, N. A. “Energy and policy considerations for deep learning in NLP,” Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 364-367, 2020. https://doi.org/10.18653/v1/2020.acl-main.34
  • Schwartz, R., Dodge, J., Smith, N. A., & Etzioni, O. “Green AI,” Communications of the ACM, 63(12), 54-63, 2020. https://doi.org/10.1145/3381831
  • Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L. M., Rothchild, D., Socher, R., & Dean, J. “Carbon Emissions and Large Neural Network Training,” arXiv preprint arXiv:2104.10350, 2021.
  • Lo, F., Bitz, C. M., & Hess, J. J. “Development of a Random Forest Model for Forecasting Allergenic Pollen in North America,” Science of the Total Environment, 773, 145590, 2021. DOI: 10.1016/j.scitotenv.2021.145590
  • Akiba, T., Sano, S., Yanase, T., Ohta, T., & Koyama, M. “Optuna: A next-generation hyperparameter optimization framework.” Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2623–2631, 2019.
  • Li, L., Jamieson, K., DeSalvo, G., Rostamizadeh, A., & Talwalkar, A. “Hyperband: Bandit-based configuration evaluation for hyperparameter optimization.” International Conference on Learning Representations, 2017.
  • Saranya, G., & Pravin, A. “GridSearch based optimum feature selection by tuning hyperparameters for heart disease diagnosis in machine learning,” The Open Biomedical Engineering Journal, 17(1), 2023. https://doi.org/10.2174/18741207-v17-e230510-2022-ht28-4371-8
  • Zhu, N., Zhu, C., Zhou, L., Zhu, Y., & Zhang, X. “Optimization of the Random Forest Hyperparameters for Power Industrial Control Systems Intrusion Detection Using an Improved GridSearch Algorithm,” Applied Sciences, 12, 10456, 2022. https://doi.org/10.3390/app122010456
  • K. Jegadeeswari, R. Rathipriya, "Green AI Practices in Multi-objective Hyperparameter Optimization for Sustainable Machine Learning", International Journal of Information Technology and Computer Science (IJITCS), Vol.17, No.2, pp.1-9, 2025. DOI:10.5815/ijitcs.2025.02.01.
  • Jegadeeswari, K., & Rathipriya, R. “Minimizing the carbon footprint of machine learning techniques through sustainable AI training methods.” In Sustainable information security in the age of AI and green computing. IGI Global, 2025.
  • Dodge, J., Prewitt, T., Combes, R. T. D., Odmark, E., Schwartz, R., Strubell, E., Luccioni, A. S., Smith, N. A., DeCario, N., & Buchanan, W. “Measuring the Carbon Intensity of AI in Cloud Instances,” 2022.
  • Jegadeeswari, K., Rathipriya, R., & Renugadevi, J. “Fusion Learning of Regression Models for Missing Data Imputation in Breast Cancer Dataset,” 2023 International Conference on Artificial Intelligence for Innovations in Healthcare Industries (ICAIIHI), Raipur, India, 1-14, 2023. DOI: 10.1109/ICAIIHI57871.2023.10489656.
  • Jegadeeswari, K., Ragunath, R., & Rathipriya, R. “A Prediction Model with Multi-Pattern Missing Data Imputation for Medical Dataset,” Advanced Network Technologies and Intelligent Computing (ANTIC 2022), CCIS 1798, Springer Nature Singapore, 2023. https://doi.org/10.1007/978-3-031-28183-9_38

There are 27 citations in total.

Details

Primary Language: English
Subjects: Computer Software
Journal Section: Research Article
Authors

K. Jegadeeswari (ORCID: 0000-0002-2632-4674)

Rathipriya R. (ORCID: 0000-0002-3970-262X)

Early Pub Date: September 26, 2025
Publication Date: September 30, 2025
Submission Date: March 24, 2025
Acceptance Date: June 23, 2025
Published in Issue: Year 2025, Volume: 8, Issue: 3

Cite

APA Jegadeeswari, K., & R, R. (2025). An Environmental Sustainable Approach to Machine Learning, Training and Development. Sakarya University Journal of Computer and Information Sciences, 8(3), 457-469. https://doi.org/10.35377/saucis...1661247
AMA Jegadeeswari K, R R. An Environmental Sustainable Approach to Machine Learning, Training and Development. SAUCIS. September 2025;8(3):457-469. doi:10.35377/saucis.1661247
Chicago Jegadeeswari, K., and Rathipriya R. “An Environmental Sustainable Approach to Machine Learning, Training and Development”. Sakarya University Journal of Computer and Information Sciences 8, no. 3 (September 2025): 457-69. https://doi.org/10.35377/saucis...1661247.
EndNote Jegadeeswari K, R R (September 1, 2025) An Environmental Sustainable Approach to Machine Learning, Training and Development. Sakarya University Journal of Computer and Information Sciences 8 3 457–469.
IEEE K. Jegadeeswari and R. R, “An Environmental Sustainable Approach to Machine Learning, Training and Development”, SAUCIS, vol. 8, no. 3, pp. 457–469, 2025, doi: 10.35377/saucis...1661247.
ISNAD Jegadeeswari, K - R, Rathipriya. “An Environmental Sustainable Approach to Machine Learning, Training and Development”. Sakarya University Journal of Computer and Information Sciences 8/3 (September 2025), 457-469. https://doi.org/10.35377/saucis...1661247.
JAMA Jegadeeswari K, R R. An Environmental Sustainable Approach to Machine Learning, Training and Development. SAUCIS. 2025;8:457–469.
MLA Jegadeeswari, K., and Rathipriya R. “An Environmental Sustainable Approach to Machine Learning, Training and Development”. Sakarya University Journal of Computer and Information Sciences, vol. 8, no. 3, 2025, pp. 457-69, doi:10.35377/saucis...1661247.
Vancouver Jegadeeswari K, R R. An Environmental Sustainable Approach to Machine Learning, Training and Development. SAUCIS. 2025;8(3):457-69.




The papers in this journal are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.