Summary
- OpenAI’s ChatGPT has made generative AI popular and accessible, but training large language models such as GPT-3.5 and GPT-4 is expensive and resource-intensive.
- Thanks to Microsoft’s investment in OpenAI, GPT-4 was trained using massive amounts of computing power, generating heat that required a lot of water to cool.
- Microsoft’s water use for cooling data centers rose by 34% between 2021 and 2022, raising concerns about AI’s environmental impact.
OpenAI’s ChatGPT made generative AI popular with an easy-to-use chatbot interface, and within a year, competing AIs like Google Bard and Microsoft Bing found a home in everything from workspace apps to Google Chrome. Computing responses and training large language models like OpenAI’s GPT-3.5, which powers the free version of ChatGPT, can be very expensive and resource-intensive. The chatbot is now also available as an Android app, making AI even more accessible. However, these calculations also draw significant amounts of power and generate a lot of heat as a byproduct, and your favorite AI might be using a lot more water than you imagined.
Microsoft invested its first billion dollars in OpenAI in 2019, two years before ChatGPT took the world by storm. The tech giant’s data centers were used to train GPT-4 on massive amounts of human-generated content. In 2020, Microsoft revealed that the supercomputer built for this task used 285,000 conventional processor cores and 10,000 graphics processors, which are critical for AI workloads. Understandably, such hardware generates a lot more heat at full tilt than your average Android phone, which dissipates heat through its rear panel to stay cool. Large amounts of water or other liquid coolants are typically required to carry this heat away.
A recent Fortune report on the environmental impact of AI chatbots (via Windows Central) noted that Microsoft’s water use for cooling data centers rose by 34% between 2021 and 2022. It now stands at a whopping 1.7 billion gallons of water, or roughly 2,500 Olympic swimming pools. Recently, Microsoft vice chair and president Brad Smith revealed that GPT-4 was “built” in a data center in Iowa, cooled by water from the Raccoon and Des Moines rivers. Microsoft claims it only uses water when temperatures rise above 85 °F, but even so, the company pumped 11.5 million gallons of water into the cluster in the months before GPT-4 was completed, about 6% of the entire district’s water use.
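As a rough sanity check on that figure, assuming a standard 2,500-cubic-meter (roughly 660,000-gallon) Olympic pool, the back-of-envelope math works out like this:

```python
# Back-of-envelope check: how many Olympic pools is 1.7 billion gallons?
# Assumes a standard Olympic pool of 2,500 cubic meters (~660,000 US gallons).

GALLONS_PER_CUBIC_METER = 264.172
OLYMPIC_POOL_GALLONS = 2_500 * GALLONS_PER_CUBIC_METER  # ~660,430 gallons

total_gallons = 1.7e9  # Microsoft's reported annual water consumption
pools = total_gallons / OLYMPIC_POOL_GALLONS
print(f"{pools:.0f} Olympic pools")  # ~2,570 pools, in line with the ~2,500 figure
```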
Understandably, the Iowa city is wary of hosting any more Microsoft projects unless the company promises to reduce its water usage from current levels, despite the significant contribution Microsoft’s investment makes in the form of taxes. A University of California researcher estimates that, on a per-use basis, ChatGPT consumes about 16 ounces of water (roughly 500 ml) for every 5 to 50 questions it answers. That estimate also accounts for water used to generate power for the data centers, and the wide range comes down to seasonal temperatures affecting cooling needs and the location of the Microsoft facility handling the request.
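To put that estimate in perspective, here is a minimal sketch of what 500 ml per 5 to 50 questions works out to per prompt; the best-case and worst-case labels are illustrative, based on the cooling factors described above:

```python
# Rough per-prompt water estimate based on the ~500 ml per 5-50 prompts figure.
WATER_ML = 500                      # ~16 ounces per batch of prompts
PROMPTS_LOW, PROMPTS_HIGH = 5, 50   # reported range of prompts per 500 ml

best_case = WATER_ML / PROMPTS_HIGH   # ~10 ml per prompt (cooler site, less cooling)
worst_case = WATER_ML / PROMPTS_LOW   # ~100 ml per prompt (hotter site, more cooling)
print(f"Roughly {best_case:.0f}-{worst_case:.0f} ml of water per prompt")
```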
For example, an Arizona data center uses more water for the same output because the ambient temperature there is higher, while an Iowa facility runs much cooler. However, reckless use of natural resources by multinational corporations is also accelerating climate change. Despite initial secrecy surrounding the company’s environmental impact, annual reports and pledges to be carbon negative and water positive by 2030 have helped Microsoft come clean. We hope the company will continue to spread awareness about the environmental impact of AI tools that are mostly free to use.