Wafer-Scale Systems: A Carbon Perspective

Golden, Alicia
Elgamal, Mariam
Mahmoud, Abdulrahman
Hills, Gage
Wu, Carole-Jean
Wei, Gu-Yeon
Brooks, David Michael
Department
Computer Science
Type
Journal article
Date
2025
Language
English
Abstract
The rapid rise of Large Language Models (LLMs) has prompted a re-evaluation of system architecture design, making energy efficiency and sustainability more crucial than ever. Recently, wafer-scale architectures have emerged as a viable alternative for LLM training and inference, as evidenced by the success of Cerebras Systems. In this work, we examine the carbon implications of wafer-scale architectures compared with traditional GPUs. As a case study, we profile LLMs on a Cerebras CS-3 system to quantify power and total carbon. We then analyze the total carbon delay product (tCDP) to evaluate the carbon efficiency and performance potential of these systems. We take the first step towards exploring this trade-off for wafer-scale versus traditional GPU architectures, and ultimately find that there exists a rich design space depending on workload and hardware configuration.
Citation
A. Golden et al., “Wafer-Scale Systems: A Carbon Perspective,” ACM SIGENERGY Energy Informatics Review, vol. 5, no. 2, pp. 118–124, Jul. 2025, doi: 10.1145/3757892.3757909
Source
ACM SIGENERGY Energy Informatics Review
Keywords
Wafer-Scale, Carbon Footprint, Large Language Models, Sustainability, Sustainable Computing
Publisher
Association for Computing Machinery