Principal Uncertainty said:
HollywoodBQ said:
MouthBQ98 said:
They seem to be concentrated out west, where the power comes from wind turbines or Permian Basin gas. It's actually cost effective that way, as long as the power source of choice doesn't require much water cooling, or they don't mind operating at variable capacity based on the availability of wind power.
Before I left Australia, I did some work in a brand-new data center that was trying a new concept: using ambient air for cooling. That was probably late 2019, and I left in early 2020, so I don't know how it panned out.
It was pretty damn cold in Melbourne every time I visited there so they might have had a chance. This was in a crappy little suburb south of the Melbourne Tullamarine Airport and well west of the CBD.
Found it. Equinix ME4 in Derrimut, Victoria, Australia.
They refer to it as "free air cooling"
https://www.equinix.com.br/content/dam/eqxcorp/en_us/documents/resources/ibx-tech-specs/ibx_me4_en.pdf
Fun fact, that day in 2019, I used the 4G on my phone to download an 8 GB .iso image file while sitting in the back of a taxi on my way to the data center.
Maybe that was an option at one time, but it's not anymore. The latest server racks have so many fans whining at what seems like supersonic speed that you actually need hearing protection just to approach them. The next gen that is already being mocked up is bringing liquid cooling down into the racks and into the servers directly. We've reached the limits of what can practicably be done with air cooling of the racks. Each rack, which is about the size of a refrigerator, is now approaching 100 kW. That's like having 100 of those home electric space heaters pushed into a refrigerator-sized box. Even load banks, which are specifically designed to do nothing more than burn off heat for testing purposes, take up more space than that.
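A quick back-of-envelope check on those numbers. This is just a sketch, and the heater wattage and rack footprint here are my own assumptions, not from the post (a home space heater is usually rated 1-1.5 kW, and a 42U rack footprint is roughly 0.6 m x 1.2 m):

```python
# Back-of-envelope check on the ~100 kW-per-rack figure.
# Assumed values (not from the post): a home space heater draws
# ~1.5 kW on its high setting; a 42U rack footprint is ~0.6 m x 1.2 m.

rack_power_kw = 100.0            # claimed draw of a modern high-density rack
space_heater_kw = 1.5            # assumed typical space heater, high setting
rack_footprint_m2 = 0.6 * 1.2    # assumed 42U rack footprint

heaters_equivalent = rack_power_kw / space_heater_kw
heat_flux_kw_per_m2 = rack_power_kw / rack_footprint_m2

print(f"~{heaters_equivalent:.0f} space heaters in one rack footprint")
print(f"~{heat_flux_kw_per_m2:.0f} kW of heat per square metre of floor")
```

With 1.5 kW heaters it works out to roughly 67 heaters per rack rather than 100; the post's figure of 100 implicitly assumes 1 kW heaters. Either way, the heat flux per square metre of floor is far beyond what room air conditioning is designed for.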
I've been in the storage game for a long time and seen quite a few data centers.
The whole hot aisle / cold aisle thing is already out of control at someplace like Switch in Vegas. But they're about the only ones who really do it properly.
You can't last too long standing in their hot aisle when you've got a rack full of disks.
But most people don't do it properly, so the heat usually isn't too much of a problem for whoever's standing there.
Also, with multiple network cables, redundant power cables, out-of-band management, 1U servers, etc., a lot of the time the heat doesn't get dissipated from the back of the rack, so you can get component failures that way.
When I was in Australia, crypto mining wasn't really a thing yet, but in the 6 years since I've been back in the US, I've been to a number of data centers with racks of servers that failed from overheating while mining bitcoin. That's been kind of crazy to see.
I do agree that there's not much more that can be done with regular HVAC.
The ambient air thing I wrote about earlier was an Australian attempt to be more "green" and show that they were working towards their whack-a-doodle climate goals.
I think Australia has backed off most of their carbon targets by now but 10 years ago, they had some aggressive goals for 2030 or whatever.