Glass fiber membranes, typically used in water and air filters, could be the key to cooling off the data centers that run large language models (LLMs) like ChatGPT and other artificial intelligence (AI) systems.
AI and other high-intensity computing applications are energy-hungry. A 2024 Dept. of Energy report found that data center energy load had tripled over the previous decade and was expected to double or triple again by 2028. As of 2023, data centers accounted for 4.4% of the electricity consumed in the U.S., the report found, a share expected to rise to as much as 12% by 2028.
The graphics processing units (GPUs) and central processing units (CPUs) in data centers are currently cooled either by air-based systems that circulate cool air between server racks, or by single-phase liquid cooling, in which a coolant flows through a cold plate mounted next to the electronic components. These methods can account for up to 45% of a data center’s electricity requirements. And they are not enough, says Renkun Chen, a professor of mechanical and aerospace engineering at the Univ. of California, San Diego...
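To put those figures in perspective, a rough back-of-the-envelope calculation combining the two shares quoted above (4.4% of U.S. electricity going to data centers, and up to 45% of a data center's load going to cooling) suggests that cooling alone could consume on the order of 2% of total U.S. electricity. The short Python sketch below is purely illustrative; the two input figures come from the article, and the variable names are ours.

```python
# Back-of-the-envelope estimate of cooling's share of total U.S. electricity,
# using the figures cited in the article. Illustrative only.

data_center_share_of_us_electricity = 0.044  # data centers' share of U.S. electricity (2023)
cooling_share_of_data_center_load = 0.45     # upper-bound share of a data center's load used for cooling

cooling_share_of_us_electricity = (
    data_center_share_of_us_electricity * cooling_share_of_data_center_load
)

print(f"Cooling's approximate share of U.S. electricity: {cooling_share_of_us_electricity:.1%}")
# -> Cooling's approximate share of U.S. electricity: 2.0%
```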