• Echo Dot@feddit.uk · 1 day ago

    I don’t understand why they even need to use up water. Water cooling does not require you to evaporate the water. You can just keep it as a closed system and reuse the water.

    If nuclear power plants can manage it, it should be easy for a server farm.

      • Echo Dot@feddit.uk · 21 hours ago

        I guess water is cheap enough.

        Still kinda obnoxious though. Like they couldn’t see that the ultra-high water usage was the thing that would get the most pushback?
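
        For scale, a rough sketch in Python of what “cheap enough” actually buys, assuming an evaporative cooling tower and a hypothetical 100 MW heat load rejected entirely by evaporating water (both figures are illustrative, not from the thread):

            # Rough water consumption if all heat leaves as evaporated water.
            # The 100 MW load is a made-up example figure.
            IT_LOAD_W = 100e6        # W of heat to reject (hypothetical facility)
            LATENT_HEAT = 2.26e6     # J/kg to evaporate water

            kg_per_second = IT_LOAD_W / LATENT_HEAT    # ~44 kg/s
            litres_per_day = kg_per_second * 86_400    # ~3.8 million litres/day

            print(f"{litres_per_day / 1e6:.1f} million litres per day")

        That scale is roughly why the usage draws pushback even when the water itself is cheap.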

    • scutiger@lemmy.world · 20 hours ago

      Closed-loop water cooling is really just air cooling with extra steps. The water is heated by the devices and cooled by a large radiator with fans. Or it’s cooled with a chiller, which in turn is cooled by a radiator with fans.

      Replacing the water is the most effective (yet wasteful) way to remove the heat.
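
      A quick back-of-envelope in Python on why evaporation removes so much heat: compare the latent heat carried away per kilogram of evaporated water with the sensible heat carried by a kilogram making one pass through a closed loop (the 15 K temperature rise is an assumed figure):

          # Heat removed per kg of water: evaporation vs. one pass through a closed loop.
          # Latent and specific heat are standard values; the 15 K rise is assumed.
          LATENT_HEAT = 2.26e6      # J/kg to evaporate water
          SPECIFIC_HEAT = 4186      # J/(kg*K) to warm liquid water
          DELTA_T = 15              # K, assumed coolant temperature rise per pass

          per_kg_evaporated = LATENT_HEAT              # ~2,260 kJ
          per_kg_circulated = SPECIFIC_HEAT * DELTA_T  # ~63 kJ

          print(per_kg_evaporated / per_kg_circulated) # roughly 36x per kg

      Each evaporated kilogram does the work of dozens of closed-loop passes, which is why operators accept the water loss.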

      • d00phy@lemmy.world · 16 hours ago

        To a point, yes. While you’re still using air to cool the water, I think it’s still a little more efficient than blindly keeping the server room at a low-ish temperature.

        • scutiger@lemmy.world · 13 hours ago

          Keeping the server room cool is just using an air conditioner which is cooled by a radiator with a fan, and then using that cooled air to cool another radiator with a fan. Every step is a loss of efficiency.

          The main advantage of water loops is that you get to use a different form factor for the radiator and fan by moving it away from the source of heat and aren’t limited by the case dimensions.