The environmental toll of advancing artificial intelligence (AI) technologies is a growing concern, as the increasing demand for computing power generates a variety of environmental consequences.
The burgeoning requirement for AI processing power not only leads to the heightened consumption of freshwater to cool expansive data centers but also aggravates air pollution due to the reliance on coal-fired power plants for electricity.
These findings, outlined in a recent paper by researchers at the University of California, Riverside, reveal that major technology firms are not adequately addressing the equitable distribution of these environmental impacts.
This perspective is echoed by international entities like the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the Organization for Economic Cooperation and Development (OECD), which are also advocating for efforts to mitigate AI’s environmental inequity.
The research paper also offers potential solutions: the UCR team provides models that companies such as Google and Microsoft could adopt to distribute their computing loads more equitably across the globe.
The flexibility inherent to the tech industry provides opportunities to prevent regional environmental injustices, according to Shaolei Ren, an associate professor at UCR’s Bourns College of Engineering and the corresponding author of the paper, titled “Towards Environmentally Equitable AI via Geographical Load Balancing.”
Using too much water for computing power
Professor Ren emphasizes that tech firms have the freedom to direct computing power to different locations instantaneously. However, instead of utilizing this flexibility to account for the environmental cost of AI, tech giants often use it to minimize capital and operational costs.
Ren expressed concern over the issue. “It just does not feel right,” he said. “We are exploiting the cheaper prices and amplifying the environmental impacts.”
A clear example of this dilemma is the plan by several tech companies to build large-scale data centers in the Phoenix, Arizona region. These centers are projected to consume between 1 and 5 million gallons of water daily, even though this desert region is grappling with water scarcity as the flows of the Colorado River dwindle.
Ren proposes alternatives to this practice, suggesting that firms could divert their computing workloads from water-scarce regions like Arizona to other areas.
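The idea of routing work away from water-scarce regions can be illustrated with a minimal sketch of geographical load balancing. The region names, capacities, and water-stress scores below are hypothetical, chosen for illustration; this is not the optimization model from the UCR paper.

```python
# A minimal sketch of geographical load balancing: assign batches of
# compute jobs to regions in ascending order of water stress, subject to
# each region's capacity. All figures below are illustrative assumptions.

def balance_load(total_jobs, regions):
    """Fill the least water-stressed regions first, up to capacity."""
    assignment = {}
    remaining = total_jobs
    for name, info in sorted(regions.items(),
                             key=lambda item: item[1]["water_stress"]):
        take = min(remaining, info["capacity"])
        assignment[name] = take
        remaining -= take
    return assignment, remaining

# Hypothetical regions (stress on a 0-1 scale, capacity in job units).
regions = {
    "phoenix":  {"capacity": 500, "water_stress": 0.9},  # water-scarce
    "oregon":   {"capacity": 400, "water_stress": 0.3},
    "virginia": {"capacity": 400, "water_stress": 0.4},
}

assignment, unplaced = balance_load(600, regions)
# Low-stress regions fill first; Phoenix only absorbs the overflow.
```

A real scheduler would weigh latency, electricity prices, and carbon intensity alongside water stress, but the flexibility Ren describes, shifting load between regions in real time, is the mechanism this sketch captures.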
Enormous amounts of electricity
In addition to water consumption, these data centers consume copious amounts of electricity produced at power plants. The resulting emissions include not only carbon dioxide, which contributes to global warming, but also other harmful pollutants such as particulate matter and nitrogen oxides; the latter contribute to the formation of lung-irritating ozone.
The pollutants emitted by these power plants pose significant health risks, including increased chances of cancer and heart disease, shortened lifespans, and other detriments. Residents living near the plants bear the brunt of these environmental and health impacts.
This research was led by UCR doctoral candidates Pengfei Li and Jianyi Yang, co-authored by Caltech professor Adam Wierman, with Professor Ren as corresponding author. Their work is currently available as a preprint at eScholarship Publishing.
“AI will have some negative environmental impacts,” Ren concludes, “But, what’s more concerning for me is that the environmental cost is not equitably distributed across all data centers. We should reduce the environmental inequity to make AI truly responsible.”
More about energy use and data centers
Data centers, the critical nerve centers of the modern digital economy, require a significant amount of energy resources to operate. These large-scale operations run non-stop to support Internet services, business applications, and cloud computing platforms.
Let’s dive deeper into the various energy resources that data centers use to function. These include electricity, renewable energy sources, and novel solutions being developed to increase energy efficiency.
Computing power guzzles electricity
Traditional data centers primarily rely on electricity to power their operations. This power is needed not only for servers and storage systems but also for cooling systems, power backups, and peripheral devices.
Globally, data centers use roughly 300 terawatt-hours (TWh) of electricity each year, on the order of 1 to 1.5% of worldwide electricity demand, according to 2022 estimates.
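That share can be sanity-checked with a one-line calculation. The figure of roughly 29,000 TWh for worldwide electricity generation in 2022 is an approximate assumption, not a number from this article.

```python
# Back-of-the-envelope check: what fraction of global electricity do
# data centers consume? The 29,000 TWh global figure is an approximation.

data_center_twh = 300       # estimated annual data-center consumption (TWh)
global_twh = 29_000         # approximate global electricity generation (TWh)

share = data_center_twh / global_twh * 100
print(f"Data centers: about {share:.1f}% of global electricity")
```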
Cooling systems required
Cooling systems are a major part of data centers’ energy consumption. They use electricity to reduce the heat generated by servers and other equipment.
Traditional air conditioning systems or more advanced solutions like liquid cooling are common. In some instances, companies use creative cooling solutions. Google’s data center in Finland, for instance, uses cold seawater from the Gulf of Finland for cooling.
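One common way the industry quantifies this cooling overhead is Power Usage Effectiveness (PUE): total facility energy divided by the energy that actually reaches the IT equipment. The energy figures below are hypothetical, used only to show the arithmetic.

```python
# Illustrative PUE calculation. A PUE of 1.0 would mean every kilowatt-hour
# goes to computing; cooling and other overhead push it higher. All
# kilowatt-hour figures here are hypothetical.

it_energy_kwh = 1_000_000        # energy delivered to servers and storage
cooling_energy_kwh = 350_000     # energy spent on cooling
other_overhead_kwh = 150_000     # UPS losses, lighting, etc.

total_kwh = it_energy_kwh + cooling_energy_kwh + other_overhead_kwh
pue = total_kwh / it_energy_kwh
print(f"PUE = {pue:.2f}")
```

Creative cooling strategies like Google’s seawater system aim to push PUE closer to 1.0 by shrinking the cooling term.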
Easing the environmental cost of AI with renewable energy
Many data center operators are increasingly shifting towards renewable energy sources to power their operations. This is in response to global sustainability trends.
These renewable sources include solar, wind, hydroelectric, and geothermal energy. For instance, Apple’s data center in North Carolina uses a 100-acre solar farm, providing a significant portion of the center’s power needs.
Energy efficiency and novel technologies
In recent years, there has been an increased focus on improving energy efficiency in data centers. Virtualization technologies can significantly reduce the number of physical servers required, thereby reducing energy needs. More efficient cooling technologies, such as liquid cooling, are also becoming more popular.
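The savings from virtualization come from consolidating many lightly loaded physical servers onto fewer, busier hosts. The utilization and power figures below are illustrative assumptions, not measurements from any specific data center.

```python
import math

# Back-of-the-envelope server consolidation via virtualization.
# All utilization and power figures are illustrative assumptions.

physical_servers = 100
avg_utilization = 0.10          # each lightly loaded server is ~10% busy
target_utilization = 0.60       # safe average load per virtualization host
watts_per_server = 400

# Total useful work, in "fully utilized server" units.
work_units = physical_servers * avg_utilization
hosts_needed = math.ceil(work_units / target_utilization)

power_before_w = physical_servers * watts_per_server
power_after_w = hosts_needed * watts_per_server
print(f"{hosts_needed} hosts replace {physical_servers} servers: "
      f"{power_before_w} W -> {power_after_w} W")
```

Under these assumptions, seventeen well-utilized hosts do the work of a hundred idle-heavy machines, cutting server power draw by over 80% before cooling savings are even counted.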
Some data centers are located in colder climates to take advantage of natural cooling, reducing the need for traditional cooling systems. For example, Facebook’s data center in Luleå, Sweden, uses the cold Nordic air to cool its servers.
Moreover, AI and machine learning are being used to optimize energy use in data centers. Google’s DeepMind AI, for instance, has reduced the energy used for cooling by up to 40% in their data centers.
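It is worth working out what a 40% cut in cooling energy means for a facility's total consumption. The assumption that cooling accounts for about 40% of total facility energy is illustrative; the article reports only the cooling-side reduction.

```python
# What a 40% reduction in cooling energy means facility-wide, assuming
# cooling is ~40% of total energy use (an illustrative assumption).

cooling_share = 0.40     # fraction of facility energy spent on cooling
cooling_cut = 0.40       # reported reduction in cooling energy

total_savings = cooling_share * cooling_cut   # fraction of total energy saved
print(f"Facility-wide savings: {total_savings:.0%}")
```

Under that assumption, a 40% cooling reduction translates to roughly a 16% drop in the facility's overall energy use.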
Power backup systems are necessary
Data centers often have power backup systems to ensure they stay operational in case of a power outage. These systems usually consist of uninterruptible power supplies (UPS), generators, and batteries. The most common energy resources for these backup systems include diesel generators and lithium-ion batteries.
Future solutions for meeting computing power demand
As energy demands continue to rise, data centers are exploring innovative solutions to improve efficiency and reduce environmental impact. Future directions include the expanded use of renewable energy, the development of more efficient cooling techniques, and harnessing AI to further optimize energy use.
As the digital economy and demands for computing power continue to grow, so will the importance of finding innovative and environmentally friendly energy resources for data centers.