Comprehensive Carbon Footprint for AI

Measuring the Carbon footprint of AI

The acceleration in the use of Artificial Intelligence raises the question of its environmental cost, given its deployment across numerous data centers and the generalisation of ever more power- and data-hungry algorithms. There is currently little work on this aspect; however, poor software optimization and user practices cause significant energy waste, which has been measured at half of the consumption of a data center. The first requirement to reverse this trend is an accurate procedure for measuring the carbon footprint, both at the scale of the whole data center and at the finer grain of individual algorithms. Our goal in this project is to answer questions such as: “Does this architecture consume more than another?” “How much energy do I save if I sacrifice a few percent of accuracy?”

We plan to build multi-scale models to measure the carbon footprint and evaluate the “environmental impact”/“algorithm accuracy” trade-off for deep learning algorithms at the scale of a data center: algorithm iteration, user, node, whole infrastructure. The power consumption measurements will come from accurate external power meters and in-built computer sensors, as well as data from the facilities such as the cooling system. The collected data will make it possible to identify new opportunities applicable to other large data centers.
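To illustrate what per-user accounting at node scale could look like, here is a minimal sketch (a hypothetical scheme for illustration, not the project's actual model) that splits a node's measured energy across concurrent users in proportion to their CPU time:

```python
def attribute_energy(node_energy_j: float,
                     cpu_seconds_by_user: dict[str, float]) -> dict[str, float]:
    """Split a node's measured energy (joules) across users,
    proportionally to each user's accumulated CPU time."""
    total = sum(cpu_seconds_by_user.values())
    if total == 0:
        # No recorded activity: nothing to attribute.
        return {user: 0.0 for user in cpu_seconds_by_user}
    return {user: node_energy_j * secs / total
            for user, secs in cpu_seconds_by_user.items()}

# Example: a node consumed 100 J while user "a" used 3 CPU-seconds
# and user "b" used 1 CPU-second.
shares = attribute_energy(100.0, {"a": 3.0, "b": 1.0})
```

A proportional split like this ignores GPU usage, memory pressure and the node's idle baseline, which is precisely why the project combines it with finer-grained sensor data.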

3 laboratories and 1 data center

This project is a collaboration between the LIP6, the LISN and the GreenAIUppa laboratories. The application context will be centred around the LabIA.


LabIA Data Center: it contains 12 stations dedicated to AI and is used by 5 other laboratories. Its size is representative enough to analyze the consumption of AI models and user practices, and the results will be directly applicable to most industrial and university data centers. See also for more details on the infrastructure.

How to measure

Measuring energy consumption is not trivial; it is more complicated than plugging a power meter into a computer. There is growing interest in the machine learning and IT communities, and several libraries have been built on top of tools such as RAPL and NVIDIA-SMI. We follow this line and are developing our IAPowerMeter. The final goal is to give engineers and data scientists the capacity to measure the consumption of deep/machine learning algorithms. However, such approaches depend on the built-in sensors and will, in any case, miss some energy spending such as fans, hard drives and so on. For this reason, CoCa4AI will complement its study with external power meters in order to obtain complete measurements and evaluate the gaps of the common libraries.
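As a rough sketch of what such sensor-based libraries do under the hood, the snippet below reads the Linux RAPL package-energy counter from sysfs and queries instantaneous GPU power via `nvidia-smi`. This is a simplified illustration, not IAPowerMeter's implementation: the sysfs path assumes a Linux machine with Intel RAPL support, and the GPU query assumes an NVIDIA driver. RAPL counters are cumulative and wrap around, so `energy_delta_uj` accounts for at most one wraparound between two reads:

```python
import subprocess
from pathlib import Path

# RAPL "package-0" domain: whole-socket CPU energy, in microjoules.
RAPL_DIR = Path("/sys/class/powercap/intel-rapl:0")

def rapl_energy_uj() -> int:
    """Read the cumulative CPU package energy counter (microjoules)."""
    return int((RAPL_DIR / "energy_uj").read_text())

def rapl_max_range_uj() -> int:
    """Counter value at which the RAPL register wraps back to zero."""
    return int((RAPL_DIR / "max_energy_range_uj").read_text())

def energy_delta_uj(before: int, after: int, max_range_uj: int) -> int:
    """Energy consumed between two counter reads, tolerating one wraparound."""
    if after >= before:
        return after - before
    return max_range_uj - before + after

def gpu_power_watts() -> list[float]:
    """Instantaneous power draw per GPU, via the nvidia-smi query interface."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [float(line) for line in out.splitlines() if line.strip()]
```

Note what this view already misses: RAPL covers only the CPU package (and DRAM on some domains), and `power.draw` reports the board's own sensor, so fans, disks, network gear and PSU losses are invisible, which is exactly the gap the external power meters are meant to quantify.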

Carbon footprint analysis pushes this one step further, as the complete life cycle of the infrastructure has to be taken into account. CoCa4AI will be the first project to perform this analysis at the scale of a data center and at the fine-grained resolution of user activity and node computation.