The hidden cooling bottleneck inside liquid-cooled AI data centers: liquid cooling eliminates system airflow, creating a hidden thermal bottleneck for 'left-behind' components like memory and SSDs
Posted by sr_local@reddit | hardware | View on Reddit | 15 comments
Sopel97@reddit
Enterprise-grade watercooling takes all of this into account. The problem happens at the consumer level and in shitty workstations.
spicesucker@reddit
TBF this is only a new phenomenon with more recent DDR5 specs.
The fact that motherboard manufacturers placed active cooling fans on PCIe 4.0+ chipsets yet aren’t installing fan brackets near memory sockets gives a lot of builders false confidence that their big-heatsink memory modules don’t need focused airflow.
Jeep-Eep@reddit
CAMM2 will, at minimum, have accommodations for 120mm fans on top, and in many cases will ship with them.
Jeep-Eep@reddit
This is why I hate the current LCD fad. Put a RAM and VRAM fan on the fucking block, not that garbage.
rstune@reddit
Can I put a GIF of a fan on my AIO's LCD?
TwilightOmen@reddit
This is a non-issue. The two most common approaches are either air cooling plus water cooling, or immersion-based. The company I work for has both approaches, for example. The full immersion is using Novec 649, if my memory does not fail me.
I know of no company whose data center had any sections or PODs that could suffer from the issue in the thread title. I sincerely doubt any engineer at this level would make such a rookie mistake.
FullOf_Bad_Ideas@reddit
It reads like an AI-generated SEO article.
I probably should put a fan in my workstation though, since I have risers in the way of airflow and no fans besides the GPU fans and those on the CPU AIO radiator, so I'm being called out here in the comments.
exomachina@reddit
As someone who works in data centers (L3, Qualys, H5, currently QTS), I've never seen a rack that does only direct-to-hardware liquid cooling... That's actually insane to think about. Every AIO-style rack solution I've seen is designed to be deployed in a traditional hot/cold rack environment and definitely still has air cooling, not to mention they have these big chillers that are basically an entire rack's worth of radiators and fans.
When people say "liquid cooled" data centers, we're talking about the heat exchangers for the air system. Every data center I've worked in uses air cooling in a hot side/cold side rack configuration. The difference is how that air is cooled. Most I've worked in use a closed-loop refrigerant system. Smaller co-located facilities will use water/air, but these aren't generally used for AI, or at least not specifically, as they are mostly for smaller businesses that rent a couple of racks.
I'm really not sure where this article is getting its info from or what specific data centers it's talking about.
I've also never heard of a pure "AI" datacenter outside of specialized labs. Most of the datacenters you see popping up in cities are for general commercial use. We don't really dictate what type of hardware and workloads you can and cannot run in your rack. Google/Amazon/Microsoft/Oracle are not building pure AI datacenters, they are building expanded availability zones for edge computing and often lease space from larger datacenters that are geographically advantaged.
forreddituse2@reddit
In all the DC tour videos I've seen on YouTube, liquid-cooled racks always have air cooling present as a supplement.
Actually-Yo-Momma@reddit
As someone in server hardware design, there are absolutely fans in every LC server.
RumbleTheCassette@reddit
This is discussed to some degree for home OC builds, but even in those instances, there's usually enough case airflow from other fans to keep non-liquid-cooled components cool enough.
Data centers almost assuredly have fans and/or some sort of HVAC to cycle the air enough for other components. There's no way entire teams of engineers are just forgetting about other component cooling needs.
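The point above about "enough case airflow" is easy to sanity-check with a back-of-the-envelope calculation. Here's a rough sketch using Q = ṁ·cp·ΔT with approximate sea-level air properties; the component wattages and the temperature rise are my own illustrative assumptions, not figures from the thread or the article:

```python
# Rough estimate: airflow needed to carry away the "left-behind" heat from
# air-cooled components (DIMMs, SSDs, VRMs) at a given allowable air
# temperature rise. Uses Q = m_dot * cp * dT with ~sea-level air properties.

AIR_DENSITY = 1.2      # kg/m^3, approximate at sea level
AIR_CP = 1005.0        # J/(kg*K), specific heat of air
CFM_PER_M3S = 2118.88  # 1 m^3/s is about 2118.88 CFM

def required_cfm(heat_watts: float, delta_t_c: float) -> float:
    """Volumetric airflow (CFM) needed to remove heat_watts with a
    delta_t_c air temperature rise through the chassis."""
    m3_per_s = heat_watts / (AIR_DENSITY * AIR_CP * delta_t_c)
    return m3_per_s * CFM_PER_M3S

# Assumed example: 16 DDR5 DIMMs at ~6 W each plus 8 NVMe SSDs at ~8 W each,
# allowing a 10 °C air temperature rise.
residual_heat = 16 * 6 + 8 * 8   # 160 W
print(f"{required_cfm(residual_heat, 10):.0f} CFM")  # → 28 CFM
```

Tens of CFM is a small fraction of what even a single server fan moves, which is consistent with the commenters' point that incidental case or rack airflow usually covers the non-liquid-cooled components.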
Jeep-Eep@reddit
It's come up off and on in client as well - the better AIOs have onboard VRM cooling fan solutions and there's a reason that TG wire protection gadget has its own cooling fan.
billm4@reddit
i have yet to see / deploy a liquid cooled environment where this was a real issue. we specifically design for the non liquid cooled components to still be air cooled. nobody is deploying only liquid cooling without any air cooling (unless you’re talking about immersion, then the point is moot as everything is cooled).
szank@reddit
It's like the engineers actually know what they are doing when designing these systems. Unbelievable.
poorlycooked@reddit
Yeah I think this is more likely to be a problem for gaming PCs lol. Some mobo VRMs get real hot when the CPU is liquid cooled