Artificial intelligence has the potential to drive sustainability by minimizing the environmental impact of machine learning (ML) development. However, many ML techniques, particularly ensemble methods such as the Random Forest classifier, demand substantial computational resources during hyperparameter tuning. These hyperparameters, namely the number of trees, the maximum tree depth, and the number of features considered at each split, strongly influence both model performance and energy consumption. This paper proposes an eco-friendly multi-objective framework (EFMOF) for optimizing these hyperparameters with minimal environmental impact while retaining high model accuracy. By leveraging hyperparameter optimization techniques such as Optuna, Hyperopt, and Grid Search, the framework explores the hyperparameter space effectively, with an emphasis on energy efficiency and carbon reduction. Incorporating sustainable AI into ML development therefore requires monitoring energy consumption and carbon emissions at every hyperparameter tuning trial, ensuring that the resulting models perform well without excessive environmental cost. The experimental results show that the number of estimators is the most dominant hyperparameter and drives the highest energy consumption, the minimum samples per leaf and per split have a moderate effect, and the maximum depth has only a minor impact.
Multi-objective, Ensemble Classification, Hyperparameter Optimization, Eco-friendly, Sustainable ML.
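As an illustration of the per-trial energy monitoring the abstract describes, the sketch below pairs Optuna's multi-objective API with the CodeCarbon emissions tracker to tune a Random Forest. It is a minimal sketch, not the authors' implementation: the dataset, the search ranges, and the tracker settings are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's actual setup): multi-objective
# tuning of a Random Forest, trading off cross-validated accuracy against the
# energy/carbon cost measured for each trial with CodeCarbon.
import optuna
from codecarbon import EmissionsTracker
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)  # placeholder dataset

def objective(trial):
    # Hyperparameters named in the abstract; ranges are illustrative.
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 10, 300),
        "max_depth": trial.suggest_int("max_depth", 2, 20),
        "min_samples_split": trial.suggest_int("min_samples_split", 2, 10),
        "min_samples_leaf": trial.suggest_int("min_samples_leaf", 1, 10),
    }
    tracker = EmissionsTracker(save_to_file=False, log_level="error")
    tracker.start()
    accuracy = cross_val_score(RandomForestClassifier(**params), X, y, cv=3).mean()
    emissions_kg = tracker.stop()  # estimated kg CO2-eq for this trial
    return accuracy, emissions_kg

# Maximize accuracy while minimizing each trial's carbon footprint.
study = optuna.create_study(directions=["maximize", "minimize"])
study.optimize(objective, n_trials=50)
print(study.best_trials)  # Pareto-optimal accuracy/emissions trade-offs
```

In a multi-objective study of this kind the outcome is a Pareto front of accuracy versus emissions trade-offs rather than a single best configuration.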
| Primary Language | English |
|---|---|
| Subjects | Computer Software |
| Journal Section | Research Article |
| Early Pub Date | September 26, 2025 |
| Publication Date | September 30, 2025 |
| Submission Date | March 24, 2025 |
| Acceptance Date | June 23, 2025 |
| Published in Issue | Year 2025 Volume: 8 Issue: 3 |
The papers in this journal are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License