
AI’s drinking problem may just solve itself

Comment Once an abstract subject of science fiction and academic research, the concept of artificial intelligence has become the topic of dinner table conversations over the past two years.

This shift has brought widespread awareness of the technology's environmental implications, most prominently the massive amounts of power and water required to train and deploy these models. And it's understandable why.

A recent report found that datacenter water consumption in Northern Virginia, the bit barn capital of the world, had increased by two-thirds over the past five years.

"ChatGPT needs to 'drink' a 500 ml bottle of water for a simple conversation of roughly 20-50 questions and answers, depending on when and where ChatGPT is deployed," the researchers estimated in a paper published early last year.

To make matters worse, that was for a GPT-3-class model measuring roughly 175 billion parameters, a figure that feels positively tiny by today's standards. GPT-4 is estimated to be somewhere between 1.7 and 1.8 trillion parameters in size, and, as OpenAI's Trevor Cai put it in his Hot Chips keynote last week, these models are only going to get bigger.
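
For a rough sense of scale, here's a back-of-envelope sketch using only the figures quoted above; the daily query volume is an assumed round number for illustration, not a reported one:

```python
# Back-of-envelope: water per query, using only the figures quoted above.
BOTTLE_ML = 500  # per conversation of 20-50 questions (GPT-3-class model)

for questions in (20, 50):
    print(f"{questions}-question conversation -> ~{BOTTLE_ML / questions:.0f} ml per question")

# Scaled to an assumed, purely illustrative 100 million questions a day:
daily_litres = 100_000_000 * (BOTTLE_ML / 50) / 1000
print(f"At the low end, that's ~{daily_litres:,.0f} litres per day")
```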

While that doesn't bode well for datacenter power consumption, the same may not be true for its H2O addiction. At least, it doesn't have to be.

First, let's get a couple of things out of the way before the comments point out the obvious. Datacenters don't really consume water in the sense of destroying it; the problem is that water is removed from the local environment rather than returned to its source. Second, the IT infrastructure, AI-related or otherwise, isn't actually what's using up the water.

Even when liquid cooled, these systems are usually closed loops that lose little, if any, fluid during normal operation. What's actually gobbling up all that H2O is the bit barn's cooling plant, specifically the evaporative or swamp coolers used to keep those systems from overheating.
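
The physics explains why the volumes get so large. Evaporating a kilogram of water soaks up roughly 2,260 kJ, so rejecting heat by evaporation alone chews through around 1.6 litres per kilowatt-hour. Here's a rough sketch, ignoring blowdown, drift, and the share of heat handled without evaporation, with the facility size an assumption for illustration:

```python
# How much water does evaporative cooling chew through per unit of heat rejected?
# Latent heat of vaporisation of water: ~2,260 kJ/kg; 1 kWh = 3,600 kJ.
LATENT_HEAT_KJ_PER_KG = 2260
KJ_PER_KWH = 3600

litres_per_kwh = KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG  # ~1.6 L (water is ~1 kg/L)
print(f"~{litres_per_kwh:.1f} litres evaporated per kWh of heat rejected")

# For a hypothetical 50 MW facility running flat out, that works out to roughly:
FACILITY_MW = 50  # assumed size, purely illustrative
litres_per_day = FACILITY_MW * 1000 * 24 * litres_per_kwh
print(f"~{litres_per_day / 1e6:.1f} million litres per day")
```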

However, this is a design decision, and evaporative coolers aren't used in every facility. If Meta or Amazon are setting up an AI datacenter in your backyard, they will put a strain on your local power grid, but that doesn't necessarily mean they're going to suck up a quarter of your town's water supply the way Google does out in The Dalles, Oregon.

In colder climates, dry coolers and so-called "free cooling" are adequate, while in hotter, drought-prone regions, it's not uncommon to see DC operators opt for refrigerant-based systems. Last we heard, that's exactly what Microsoft is doing with its DC developments in Goodyear, Arizona, albeit only after a wastewater dispute with the city.

Although there are alternatives to evaporative cooling, many come at the expense of higher power consumption, a commodity already in short supply, as CBRE recently reported.

While there's only so much to be done about existing DC facilities, the decision to employ evaporative cooling in new builds ultimately comes down to greed, or, to use the politically correct parlance, capitalism.

For hyperscalers in particular, everything eventually comes down to margins. If you can do something 5 percent cheaper or more efficiently than the competition, you can make that much more in profits, or undercut them and win over their customer base. And water just happens to be incredibly efficient at stripping heat from the air compared to alternative technologies. That means lower electricity costs or the ability to build larger, denser facilities in power-constrained locales.

Even at industrial rates, electricity costs add up quickly. So, in markets where evaporative coolers are viable, the technology offers a competitive advantage.
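
To see why even a modest efficiency edge matters at hyperscale, consider this rough sketch; the facility size, tariff, and PUE figures are assumptions for illustration, not reported numbers:

```python
# Why a few points of cooling efficiency translate into real money.
# All figures below are assumptions for illustration only.
IT_LOAD_MW = 100        # hypothetical facility IT load
TARIFF_USD_KWH = 0.07   # assumed industrial electricity rate
HOURS_PER_YEAR = 8760

def annual_power_cost(pue: float) -> float:
    """Total annual electricity bill at a given power usage effectiveness."""
    return IT_LOAD_MW * 1000 * HOURS_PER_YEAR * pue * TARIFF_USD_KWH

evaporative = annual_power_cost(pue=1.2)   # water-hungry but power-frugal
dry_cooled = annual_power_cost(pue=1.4)    # water-free but power-hungry

print(f"Evaporative cooling: ${evaporative / 1e6:.0f}M per year")
print(f"Dry/chiller cooling: ${dry_cooled / 1e6:.0f}M per year")
print(f"Difference:          ${(dry_cooled - evaporative) / 1e6:.0f}M per year")
```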

The argument can also be made that the evaporative cooler's water consumption is a worthwhile trade-off if it means burning fewer fossil fuels to keep the lights on, but that's heavily dependent on location. The nature of evaporative cooling means it's most efficient in arid climates where water is already a scarce resource.

Ultimately, it really boils down to this: you can either use more power or consume more water. If water is cheaper than power and, better yet, perceived to be plentiful, as it is around the Great Lakes, you can guess which one operators are going to choose.

  • Google's Irish bit barn plans denied over eco shortfall
  • Meta digs deep to strike geothermal power deal for its US datacenters
  • LiquidStack says its new CDU can chill more than 1MW of AI compute
  • LG Electronics aims to become a datacenter cooling player, with aircon and immersion tech

However, this may be changing. The pace of AI innovation doesn't look like it's going to let up any time soon. In this climate, we've seen chips grow ever hotter, passing the one kilowatt mark, and driving a transition to liquid cooling.

Nvidia's Grace Blackwell Superchips, which we looked at back at GTC, are rated for 2,700 W, with two of them designed to fit into a single rack-unit chassis. To accommodate this incredibly dense package, Nvidia unsurprisingly opted for direct liquid cooling (DLC).

While arguably better in terms of operational efficiency (DLC is substantially more energy efficient than spending power on fans), it also poses major headaches for datacenter operators, as many older facilities can't easily be retrofitted to accommodate the technology.

While this is a headache for some, widespread adoption of liquid cooling, perhaps ironically, has the potential to cut water consumption in the long run. That's because liquid carries heat far more effectively than air and can run at higher coolant temperatures, which allows the use of dry coolers that work a bit like a car's radiator, just on an industrial scale.
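
A rough comparison of how much coolant it takes to carry away the same heat shows the gap; the 10°C temperature rise is an assumed example, and the material properties are textbook values:

```python
# Coolant volume needed to carry away 1 kW of heat with a 10 C temperature rise.
# Q = m * cp * dT  ->  mass flow = Q / (cp * dT); volume flow = mass flow / density.
HEAT_W = 1000
DELTA_T_C = 10  # assumed temperature rise across the server, for illustration

coolants = {
    # name: (specific heat J/(kg*K), density kg/m^3)
    "water": (4186, 1000.0),
    "air":   (1005, 1.2),
}

for name, (cp, density) in coolants.items():
    mass_flow_kg_s = HEAT_W / (cp * DELTA_T_C)
    volume_flow_l_min = mass_flow_kg_s / density * 1000 * 60
    print(f"{name:>5}: ~{volume_flow_l_min:,.1f} litres per minute")
```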

There's also the potential for heat reuse in these scenarios. In one thought experiment presented at SC23, it was estimated that training a GPT-3-sized model could generate enough heat to support roughly 4.6 greenhouses and grow over a million tomatoes. We've seen other examples of datacenters contributing to district heating grids as well.

However, until a critical mass of liquid-cooled systems have been deployed, we're likely to continue seeing headlines about datacenter water consumption for better or worse. ®

Source: theregister.com
