Liquid Cooling: Use Cases, Innovation, and Deployment Readiness

Last week in our special report series, we discussed the new designs, standards, and liquid cooling systems that are helping mitigate risks and alleviate challenges around data center cooling. In our final article in the series, we’ll look at how liquid cooling can impact edge computing cooling.

Edge computing cooling

Download the full report.

The technologies supporting the next generation of immersion cooling have already come a long way. Many challenges facing integrated liquid cooling architectures are being put to rest with innovative approaches to all-in-one liquid cooling solutions.

Let’s quickly recap some of the key benefits behind liquid cooling:

  1. Density. With HPC, for example, you’re running latency-sensitive applications and workloads. Liquid cooling helps you keep systems dense while maintaining high levels of performance.
  2. Performance. On that note, use-cases around research, modeling, and finance tracking all require high-performance levels. It’s an excellent area for liquid cooling to make an impact.
  3. Economy. It’s not just about cost; it’s also about efficiency. For example, server fans represent around 20% of the total IT load, and immersion cooling removes that function entirely. A liquid cooling solution also affects power distribution, including switchgear, UPS systems, PDUs, and more. Finally, with the massive increase in rack density, liquid cooling systems can produce genuine CapEx and OpEx savings.
  4. More Accurate PUE. Although server fans are counted on the IT-load side of the PUE equation, they are still part of the cooling infrastructure. Removing them with a liquid cooling solution delivers a more accurate PUE reading, and with greater efficiency and density, it can help lower that reading too.
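The PUE effect described above can be sketched with some quick arithmetic. All the numbers below are illustrative assumptions (not measurements from any vendor), chosen so the immersion case lands on the 1.028 figure cited later in the article:

```python
# Illustrative PUE arithmetic. PUE = total facility power / IT power.
# Server fans count as IT load, so an air-cooled site's PUE hides fan
# energy inside the denominator.

def pue(total_facility_kw, it_kw):
    return total_facility_kw / it_kw

it_kw = 1000.0          # assumed air-cooled IT load, ~20% of which is fans
fan_kw = 0.20 * it_kw
overhead_kw = 500.0     # assumed cooling, UPS losses, lighting, etc.
air_pue = pue(it_kw + overhead_kw, it_kw)

# Immersion removes the fans and shrinks the cooling overhead dramatically.
it_immersed_kw = it_kw - fan_kw
overhead_immersed_kw = 0.028 * it_immersed_kw   # consistent with a 1.028 PUE
liquid_pue = pue(it_immersed_kw + overhead_immersed_kw, it_immersed_kw)

print(round(air_pue, 3), round(liquid_pue, 3))  # 1.5 1.028
```

Note that the immersion case has a smaller IT denominator (the fan load is gone), so the PUE is both lower and a truer reflection of useful compute power.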

There’s another benefit here as well. Highly dense, integrated liquid cooling systems are also a showcase of innovation. These integrated solutions combine multiple forward-leaning technologies into a single platform. These include:

  • Micro data center modules that can be deployed in several form factors, including 60kW, 120kW, and 600kW. These integrated liquid-cooled enclosures can be stacked and operated indoors or outdoors, and the system can be expanded in increments to manage capital outlay.
  • Two-phase immersion cooling, in which servers are immersed in coolant fluid that boils off as the chips generate heat, removing the heat as it changes from liquid to vapor. A closed-loop water system is used to condense the fluid and return it to the tank.
  • Custom servers that adapt OEM components in an immersion-ready chassis that can plug into a backplane providing network and power.
  • Robotics technology that can “hot swap” servers, removing a failed server from the immersion bath and replacing it with a new one.
  • A software platform to manage all elements of the data center.
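The two-phase mechanism above can be sized with a back-of-envelope calculation: heat leaves the tank as the latent heat of vaporization of the dielectric fluid, and the condenser returns the liquid. The latent-heat value below is an assumption typical of engineered dielectric fluids, not a figure from the article, so treat this as a sketch:

```python
# Rough evaporation-rate sketch for two-phase immersion cooling.
heat_load_kw = 6.0            # per-blade heat load cited later in the article
latent_heat_kj_per_kg = 88.0  # assumed heat of vaporization (kJ/kg)

# Q = m_dot * h_fg  ->  mass of fluid that must boil off per second
m_dot_kg_s = heat_load_kw / latent_heat_kj_per_kg
print(f"{m_dot_kg_s:.3f} kg/s evaporated and condensed back into the tank")
```

The takeaway is that only a modest circulation of fluid is needed, because vaporization carries far more heat per kilogram than warming air ever can.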

Integrated liquid cooling in the data center and for edge computing cooling

It’s important to note that these specific liquid cooling platforms are not just a concept. For example, edge and data center provider DartPoints recently adopted TMGCore’s robotic immersion-cooled system, OTTO. As an integrated unit, OTTO is a closed box in which specially built Hydroblades are inserted vertically into an immersion cooling platform.

“These units are not just the right size. They are modular and scalable to support any power, cooling, and density requirement that local networks or large hyperscalers may need in any market they want.” — Scott Willis, CEO of DartPoints

In terms of innovation, the robotics OTTO uses to swap out servers creates new possibilities in the industry’s long-running effort to build a “lights out,” minimal-personnel data center. The OTTO robotic arm can latch onto a server, lift it out of the immersion tank, and place it into an adjacent enclosure that houses backup servers and has open slots for the faulty unit. The system then grabs one of the backup servers, lifts it into the tank, and pops it into the plug-and-play backplane.

(Image source: TMGcore)

For DartPoints, this design delivers the highest-density, most efficient edge computing platform available, scaling from modules of 4.5kW up to 1.2MW of power within a 320 sq ft footprint. The system also supports a PUE of 1.028 during operation and requires no raised floor or external cooling.

“Partnering with DartPoints allows us to quickly deliver a turnkey solution in edge markets, allowing customers to interconnect and distribute their content and applications efficiently.” — John David Enright, CEO of TMGcore

As noted earlier, these systems aren’t simply dunking servers in liquid. They are advanced blade server platforms capable of supporting several use-cases. A key component is GPU integration and the density delivered per blade: the system can cool 6kW in a single 1U blade, and GPU units can be stacked two deep because of the heat-removal effectiveness.

Deployable tactical data centers at the edge

Liquid cooling for edge computing may be seen as a challenge due to density concerns and deployment considerations. However, new solutions like the TMGCore OTTO Edgebox 30 redefine high-density data centers at the edge through innovative two-phase liquid immersion cooling technology. With a total footprint of 16 square feet and up to 30kW of density, the Edgebox can operate at high efficiency and density while still providing a complete-system PUE of 1.028 or lower.

Under the hood, the Edgebox 30 comes equipped with high-speed, high-capacity memory. It also comes with industry-leading resources to support some of the most high-density workloads. The architecture is capable of delivering up to 1,792 cores of Intel’s highest-performing Xeon Scalable processors. Furthermore, you can deploy up to 32 Nvidia V100 GPUs for advanced graphics acceleration. This means 20,480 Tensor cores, 163,840 CUDA cores, and 4,000 TFLOPS of Tensor performance. Management of this edge ecosystem is made easy with local and remote web-based interfaces for granular control.
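Those aggregate GPU numbers follow directly from the published per-card V100 specifications (640 Tensor cores, 5,120 CUDA cores, and roughly 125 peak Tensor TFLOPS each):

```python
# Cross-check of the quoted totals using published per-card V100 specs.
gpus = 32
tensor_cores_per_v100 = 640
cuda_cores_per_v100 = 5120
tensor_tflops_per_v100 = 125

assert gpus * tensor_cores_per_v100 == 20_480
assert gpus * cuda_cores_per_v100 == 163_840
assert gpus * tensor_tflops_per_v100 == 4_000
```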

These blades push new workloads like HPC to the edge, reduce analyst lag time, and enable more operator access. The Edgebox reduces its logistics footprint by scaling down space and power requirements, and it leverages modular servers for easy maintenance and customization across many use-cases in the field.

(Image source: TMGcore)

What happens when you deploy OTTO Edgebox 30 at the edge?

  • 90-96% reduction in square footage
  • 54% reduction in CapEx
  • 60-80% reduction in OpEx
  • 77% reduction in water usage
  • 4,000x more efficient heat removal than air
  • Racks, PSUs, and fiber links all included
  • PUE of 1.028 or lower

Architectural and deployment considerations

Focusing on TMGCore, OTTO can offer significant improvements in energy efficiency and economics. For applications outside of the core, the OTTO 120 provides the same degree of highly efficient, two-phase liquid immersion-cooled data center capacity in a smaller footprint. The platform supports 120kW of power and 20 OIU of capacity. As mentioned, the 120kW version of OTTO has operated at a PUE of 1.028 (or lower) within an active Dallas data center. It can be deployed at a cost of $3.2 million per megawatt, about half the current deployment cost of the largest data center REITs.

“Today, companies seek more data processing, higher densities, and quicker deployments, yet the available land has become increasingly scarce and cost-prohibitive. Users don’t have the space to add additional data centers to handle the increasing demand.

We are scaling down the size of the traditional data center without decreasing the processing power. We’ve created a solution that can be significantly smaller for the same or greater compute capacity and still have the room within to scale up as the needs increase.” — John David Enright, CEO of TMGcore

Those data points are based on racks filled with OTTOblade servers, which pack 6kW of computing power into a single 1U blade. Since the 600kW unit has a footprint of just 160 square feet, OTTO offers the potential to cut operational costs by 80 percent while delivering ten times more processing power per square foot.
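The 600kW unit’s figures imply a striking floor density, which can be computed directly from the numbers above:

```python
# Density implied by the article's own figures for the 600kW unit.
otto_kw = 600.0
otto_sqft = 160.0
print(otto_kw / otto_sqft)   # 3.75 kW per square foot
```

For context, a conventional air-cooled raised-floor hall typically runs well under 1 kW per square foot, which is where the order-of-magnitude space savings come from.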

Micro modular data centers

With select manufacturing partners, the micro data center enclosures for the OTTO platform will help customers achieve high-performance, high-density modular data center designs, potentially as part of an edge computing initiative. Each module will require:

  • Power connectivity
  • Network connections
  • Water for the condenser (a closed loop that can be charged upfront or fed from a water supply)

Inside the module, each immersion tank features a power busbar supported by 60kW power-blade rectifiers. An OTTO tank features twelve 1U equipment slots, holding ten 1U servers and two power blades, supporting dual A and B power.
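The per-tank slot arrangement lines up neatly with the 60kW module size mentioned earlier, using the 6kW-per-blade figure cited above:

```python
# Per-tank arithmetic: ten server slots at the cited per-blade load.
servers_per_tank = 10   # 12 slots minus 2 power blades
kw_per_blade = 6.0      # 1U OTTOblade figure cited earlier
print(servers_per_tank * kw_per_blade)   # 60.0 kW per tank
```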

From there, the requirements are straightforward, and it’s smaller and more manageable than creating an entire air-cooled data center. A modular OTTO system can cool 10 megawatts of capacity in about 8,000 square feet, compared to about 90,000 square feet for a wholesale provider.
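The floor-space comparison above works out to roughly an order of magnitude, using only the figures in the text:

```python
# Floor-space comparison from the article's figures (10 MW of capacity).
modular_kw_per_sqft = 10_000 / 8_000     # modular OTTO build-out: 1.25
wholesale_kw_per_sqft = 10_000 / 90_000  # wholesale air-cooled hall: ~0.11

density_ratio = 90_000 / 8_000           # same capacity, ratio of floor space
print(density_ratio)                     # 11.25x the power per square foot
```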

It’s important to note that these designs are genuinely novel and take an entirely different approach to integrated liquid cooling solutions. In many cases, they’ve taken the complexity out of delivering robust liquid-cooled solutions. That said, let’s focus on a few points to help you get started on a liquid cooling journey.

Download the full report, “The State of Data Center Cooling: A Key Point in Industry Evolution and Liquid Cooling” courtesy of TMGcore for exclusive content on how to get started on a liquid cooling journey and best practices for integrating liquid cooling into your data center and edge computing cooling systems. 


