Streamline ML-Model Training Process

Overview

Sustainability Dimension: Ecological
ML Development Phase: Modeling and Training
ML Development Stakeholders: ML Development, Software Development

Description

Prior research provides evidence that emissions from ML training can be reduced significantly by running training on servers in selected geographic regions at specific times (Dodge et al., 2022; Xu, 2022). The design pattern (DP) "Streamline ML-Model Training Process" therefore describes optimizing the ML training setup to allow flexible training schedules and to leverage renewable energy. An in-depth analysis of different techniques can be found in Xu (2022) and Radovanović et al. (2023).
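As an illustration of such a flexible, carbon-aware schedule, the following minimal Python sketch delays a training run until the grid's carbon intensity drops below a threshold, or until a deadline expires. The helpers get_carbon_intensity and run_training_job are hypothetical stubs introduced here for illustration; a real setup would query a grid-carbon data provider (e.g., Electricity Maps or WattTime) for the data center's region and launch the actual training script.

    import random
    import time

    THRESHOLD_G_CO2_PER_KWH = 200    # start training only below this intensity
    POLL_INTERVAL_SECONDS = 15 * 60  # re-check the grid every 15 minutes
    MAX_WAIT_SECONDS = 12 * 3600     # deadline: train anyway after 12 hours

    def get_carbon_intensity(region: str) -> float:
        """Stub: return the current grid carbon intensity in gCO2eq/kWh.

        Hypothetical placeholder -- replace with a call to a real
        grid-carbon API for the data center's region.
        """
        return random.uniform(100, 500)  # simulated value for illustration

    def run_training_job() -> None:
        """Stub: launch the actual ML training run."""
        print("Starting training job ...")

    def schedule_carbon_aware(region: str) -> None:
        """Delay the training start until the grid is comparatively clean."""
        waited = 0
        while waited < MAX_WAIT_SECONDS:
            intensity = get_carbon_intensity(region)
            if intensity <= THRESHOLD_G_CO2_PER_KWH:
                print(f"Grid at {intensity:.0f} gCO2eq/kWh, below threshold.")
                run_training_job()
                return
            print(f"Grid at {intensity:.0f} gCO2eq/kWh, waiting ...")
            time.sleep(POLL_INTERVAL_SECONDS)
            waited += POLL_INTERVAL_SECONDS
        run_training_job()  # deadline reached: train on the current mix

    if __name__ == "__main__":
        schedule_carbon_aware(region="eu-west")

Radovanović et al. (2023) describe a production-scale variant of this idea in Google's data centers, where flexible workloads are shifted in time toward hours with lower expected carbon intensity.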

Sources

  • Dodge, J., Prewitt, T., Tachet des Combes, R., Odmark, E., Schwartz, R., Strubell, E., Luccioni, A. S., Smith, N. A., DeCario, N., & Buchanan, W. (2022). Measuring the Carbon Intensity of AI in Cloud Instances. Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, 1877–1894. https://doi.org/10.1145/3531146.3533234
  • Xu, T. (2022). These simple changes can make AI research much more energy efficient. MIT Technology Review. https://www.technologyreview.com/2022/07/06/1055458/ai-research-emissions-energy-efficient/
  • Radovanović, A., Koningstein, R., Schneider, I., Chen, B., Duarte, A., Roy, B., Xiao, D., Haridasan, M., Hung, P., Care, N., Talukdar, S., Mullen, E., Smith, K., Cottman, M., & Cirne, W. (2023). Carbon-Aware Computing for Datacenters. IEEE Transactions on Power Systems, 38(2), 1270–1280. https://doi.org/10.1109/TPWRS.2022.3173250