Cooling the Cloud: Data Center Trends Heat Up Opportunities for HVAC Contractors
As the digital world continues to expand, so does the demand for reliable, energy-efficient data centers — and that’s creating a booming business opportunity for HVAC contractors. Market research firms broadly predict double-digit annual growth (roughly 12% to 16%) over the next one to three years as data center capacity ramps up to support AI, cloud, and hyperscale infrastructure. With servers and networking equipment generating massive amounts of heat 24/7, precision cooling has become mission-critical to keep data flowing and downtime at bay.
“As new computing technologies emerge, data centers are rapidly adapting to meet changing requirements — and the high-performance computing power necessary to support them,” says Sean Crain, data center sales engineer, Johnson Controls. “This evolution typically results in three primary challenges when designing cooling systems: increasing rack densities and the cooling systems required to support them; limited equipment space — cooling equipment must also be designed with footprint in mind; and rising ambient design temperatures. Additionally, as data center development expands around the U.S., local weather conditions can also impact ambient temperature. Understanding these factors and how they may shift over time is essential for successful air-cooled heat rejection.”
Mihir Kalyani, data center business unit manager, Americas, EVAPCO, also points to greater rack density resulting in changes in data center design for next-gen AI workloads. Kalyani notes that existing data centers with air-cooled racks are being retrofitted with liquid-cooled server racks, which can be cooled by air-cooled or water-cooled chillers.
“Air-cooled chillers take up large footprints, have high connected loads and start to lose viability as the heat rejection requirements grow, as data centers get bigger,” he says. “We are seeing data centers over 1 GW in density, which is over 280,000 tons of heat rejection — truly massive and unprecedented. When water-cooled chillers are deployed, EVAPCO’s heat rejection technologies can be used to cool them — evaporative, hybrid, dry, or adiabatic methods of heat rejection, depending on site priorities and resource availability.”
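The scale Kalyani cites can be sanity-checked with a quick conversion. The sketch below (in Python, purely illustrative) uses the standard definition of a ton of refrigeration, 12,000 BTU/h, or roughly 3.517 kW:

```python
# Sanity check: convert 1 GW of heat rejection into tons of refrigeration.
# One ton of refrigeration = 12,000 BTU/h ≈ 3.517 kW (standard definition).
KW_PER_TON = 3.517

heat_load_kw = 1_000_000  # 1 GW expressed in kilowatts
tons = heat_load_kw / KW_PER_TON
print(f"{tons:,.0f} tons of heat rejection")  # roughly 284,000 tons
```

The result, about 284,000 tons, lines up with the “over 280,000 tons” figure quoted above.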
According to Nolan Foran, national sales manager for mega projects (data center, semiconductor, EV battery), Watts Water Technologies, the conversation around data centers has shifted away from environmental or space cooling, which relies on evaporative cooling for air systems, toward chilled-water solutions for data center facilities.
“The increase of chilled-water systems for direct-to-chip cooling within data centers is one of the fastest-developing trends I’ve witnessed,” he says. “For new facilities, the needs are simply built in from the outset. However, for existing data center facilities, the need for new chilled or ‘technical water’ loops is evolving quickly. The need is to accommodate faster, more effective heat removal for direct-to-chip cooling applications involving hydronically-cooled server rack cabinets containing high-density, heat-generating GPUs (graphics processing units) for high-performance computing and AI applications.”
Foran adds that he is seeing strong demand for stainless steel components, such as strainers, ball valves, check valves, balancing valves, butterfly valves, regulators, and ACVs. “The use of glycol solutions is also common to mitigate biological formations. This is important because the piping closest to the CPUs is quite small, and any contaminant can impede the flow of water for cooling. That’s why we refer to this system as ‘high purity’ piping and components, favoring Watts stainless steel flow controls, as well as thermoplastic (Orion) polypropylene and PVDF piping systems.”
A Rising Tide of Liquid Cooling
The new cooling demands in data centers often go beyond the limits of traditional air cooling, which is driving a growing interest in liquid cooling solutions, notes Steve Hueckel, director of data centers, Daikin Applied.
“Specifically, direct-to-chip single-phase liquid cooling is gaining traction, which is where a liquid coolant is sent directly to the chip to extract heat and transfer it to a cooling tower or another heat dissipation system,” he explains. “Over the next several years, we may also see increased adoption of immersion cooling. Still, air cooling isn’t going away. It will be necessary for the rest of the server rack, so we expect to see growth in both air and liquid cooling technologies. The overall market is expanding significantly because of these changes.”
As liquid cooling gains prominence due to the growing demands of AI workloads, cooling distribution units (CDUs) also play a critical role in thermal management, Hueckel adds. “These units manage the distribution of liquid coolant directly to the areas that need it, primarily the GPUs (graphics processing units). Common CDU sizes today are around 1 to 2 megawatts (MW), though there’s also demand for smaller units around 500 to 600 kW. Looking ahead, companies are now exploring larger systems with projections for future CDUs exceeding 5 MW.”
Danielle Rossi, global director of mission-critical cooling for Trane, agrees, noting that nearly every month, a new chip technology is released, which, in turn, creates a need for rapid innovation in cooling technology.
“There’s a growing demand for liquid-based cooling, including both brownfield retrofit and greenfield design,” she says. “Most current designs are hybrid, including both liquid and air cooling due to the various density requirements needed for both traditional and AI applications within the same facility. Air cooling will remain necessary for some time, but liquid cooling is quickly becoming a design norm.”
Liquid cooling has the added benefit of being more efficient than air-cooling, Kalyani notes. “This results in a higher ‘density’ of computation or storage in a smaller footprint. Air-cooled data centers will continue to exist, but any data centers set up for AI workloads will most likely begin with, or convert to, liquid-cooled systems.”
Efficiency, Sustainability Still Top-of-Mind
Many data center leaders are prioritizing energy efficiency as a means to significantly lower energy and thermal loads and accelerate their connection to the grid, according to Crain. Water conservation and refrigerant selection are additional factors that have become increasingly important to sustainability-focused data center providers, he notes.
“Most notably, we have seen this in the widespread transition to air-cooled systems using low-GWP refrigerants,” Crain explains. “Air-cooled chillers with features such as variable speed drive (VSD) and capacity control logic help deliver higher part-load efficiencies even in off-design conditions. This can help enhance energy efficiency as rack densities evolve. Additionally, the shift toward compact designs allows manufacturers to minimize material usage and reduce embodied carbon emissions.”
Sustainability in data centers is all about balance. While liquid cooling offers greater efficiencies in managing high heat densities, it raises concerns about water usage — especially in regions struggling with water scarcity, Hueckel says.
“This is one reason why air-cooled chillers are popular,” he says. “These systems use air as the cooling medium and operate in a closed-loop system, which helps conserve water. While water-cooled systems can also be designed as closed-loop, they generally see less demand in regions where water is limited. In Arizona, for example, water is at a premium, so air cooling is the preferred option. In North America overall, roughly 70% of the market has historically relied on air cooling, and we expect that trend to not only continue but to grow.”
As demand for air-cooled chillers grows, there is a parallel focus on enhancing their energy efficiency, Hueckel adds. “We anticipate increased adoption of oil-free centrifugal chillers industry-wide because their energy efficiency ratios are several points higher than traditional chillers. While the initial cost is greater, the energy savings over time makes the investment worthwhile because it offers a better lifecycle cost.”
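Hueckel’s lifecycle-cost argument can be illustrated with a simple total-cost-of-ownership comparison. The sketch below uses entirely hypothetical figures — the first costs, annual energy use, electricity rate, and time horizon are assumptions for illustration, not vendor data:

```python
# Illustrative lifecycle-cost comparison (all figures are hypothetical
# assumptions, not vendor data): an oil-free centrifugal chiller with a
# higher first cost but better efficiency vs. a traditional chiller.
def lifecycle_cost(first_cost, annual_kwh, rate_per_kwh, years):
    """Total cost of ownership: purchase price plus energy cost over `years`."""
    return first_cost + annual_kwh * rate_per_kwh * years

YEARS, RATE = 15, 0.10  # assumed 15-year horizon at $0.10/kWh

traditional = lifecycle_cost(400_000, 1_500_000, RATE, YEARS)
oil_free = lifecycle_cost(500_000, 1_275_000, RATE, YEARS)  # ~15% less energy

print(f"Traditional: ${traditional:,.0f}")  # $2,650,000
print(f"Oil-free:    ${oil_free:,.0f}")     # $2,412,500
```

Under these assumed numbers, the oil-free unit’s $100,000 premium is repaid several times over by energy savings across the equipment’s life, which is the lifecycle-cost logic Hueckel describes.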
What Contractors Need to Know
The technology changes from air cooling to chilled-water cooling are happening so fast that they are essentially being made on the fly, Foran explains.
“For instance, on data centers that were well underway in construction, suddenly there’s an urgent call for the addition of technical water loops — essentially retrofits for new facilities during construction,” he says. “We get those calls, naturally, because our expertise is in water — all facets of it. We have the breadth of product and deep expertise, so we find ourselves quite busy with this work, sometimes very urgently.”
His advice to contractors is to be willing to evolve, adapt, and diversify.
“Currently, there are labor and material challenges,” Foran adds. “Yet for those who keep an eye on the trends shaping the industry, with keen attention to how you can play an essential role to managers in this fast-paced work, the sky’s the limit. Be willing to be accessible at all times, to serve as a resource, to improve your own knowledge base, and to hire those who possess the skills and knowledge needed in this demanding work. Be on tap to jump in — because these managers need skilled resources willing to help them execute solutions.”
Hueckel notes that some data center chillers are sent to job sites that are still under construction, which means the equipment sits in a yard for months before installation.
“These are large, heavy systems often lifted by cranes and transported by flatbeds, and exposed to the outdoor elements,” he says. “During that process, bolts can loosen and components can shift, so it’s critical that contractors perform thorough inspections when the chillers arrive on site. Because of this, there’s a strong opportunity for service agreements to become part of the installation process. Daikin offers commissioning assistance to help get these systems up and running the right way because we know the equipment inside and out, and we know exactly what typically needs to be checked before startup. Ensuring the chiller is received in proper condition and then paired with a solid service plan is key to long-term performance.”
Crain adds that just as the data center market is rapidly evolving, so are innovations in cooling equipment. “Innovations like smart connected chillers and intelligent building automation solutions drive operational performance while supporting installation and maintenance practices.
“By leveraging these solutions, operators and technicians can access live dashboards and reporting tools to identify issues faster and easier than standard processes that rely on manual collection and data analysis,” Crain continues. “Features like AI troubleshooting modules can help further streamline problem-solving workflows. Using these advancements, teams can assess the root cause of issues sooner to optimize equipment performance and maximize up-time.”
Additionally, new technologies require adjustments at every stage of design, but people often forget about the corresponding adjustments to maintenance schedules, technician experience, and serviceability, Rossi notes.
“Extensive training will be needed for both IT professionals within this space for server replacement, and for HVAC and networking professionals working with or around the equipment,” she explains. “When transitioning to liquid cooling, whether it be a full transition or hybrid design, all personnel interacting within the space should be trained and comfortable with whatever technology is being used. This training and exposure should be done early in the design process to help ensure staffing by the owner and any vendors meets the design needs. Some changes may be large, like transitioning to immersion cooling at the rack or requiring wet maintenance in the white space. In other cases, a vertical rack transition direct-to-chip from air cooling may not be as large an adjustment. Training will help limit the adjustment time.”
Looking Ahead
According to Crain, intelligent building solutions powered by AI and machine learning can unlock new opportunities to optimize performance, drive efficiency, and fortify cyber security. “Today, fewer than 10% of data centers are fully integrated with intelligent BAS solutions. However, by integrating digital optimization, the average facility can reduce energy spending by as much as 30% and cut carbon emissions by up to 40%. This presents a powerful opportunity to unlock new levels of data center performance.
“These intelligent platforms go far beyond digitally connecting equipment,” Crain explains. “They unify and analyze vast amounts of data from the equipment and subsystems within the building and combine it with dynamic factors like weather conditions, space usage, or utility rates. These platforms can then autonomously adapt, in real-time, to advance the unique goals of each facility. The data center market is a fast-paced environment where trends and standards can quickly shift in new directions. Forming strong, two-way partnerships is essential to keep pace. The most successful teams often rely on vertical integration that is inclusive of all invested parties.”
Cooling technologies must adapt to the high-density chips consistently being released within the market, Rossi notes.
“Liquid cooling will likely become more prevalent, including the continued use of hybrid applications,” she says. “Associated thermal management will need to adapt to support changing requirements regarding densities, water temperatures, efficiency goals, and heat recovery requirements. Legislation will play a part in efficiency requirements with some countries in the EU requiring heat recovery for data center applications and, in the U.S., with potential tax incentives for utilizing high-efficiency designs. It’s increasingly important to be able to adapt and innovate quickly to support the market and the chips to come.”
Power is going to be one of the biggest factors shaping the future of data centers, Hueckel says.
“Right now, access to power is a driving force behind where data centers are being built, with new facilities going up in locations where power is readily available,” he says. “In areas where the infrastructure isn’t yet in place, a big part of the challenge, as well as opportunity, will be building that power access from the ground up. In the future, more data centers may be designed with on-site power generation built into their architecture and layout.”
Looking further ahead, Hueckel adds that he’s already hearing that some companies are exploring next-generation chips with heat densities as high as 600 kW. “At that level, current chiller technology likely won’t be sufficient. It could require a major architectural shift, such as full immersion cooling using dielectric fluids and even two-phase cooling systems, where the fluid changes state to extract extreme levels of heat.
“That kind of change may still be five to 10 years out, but don’t underestimate how fast things can evolve,” he continues. “The pace of growth we're seeing across the industry is accelerating, and data center vendors are already planning for much higher rack density loads. It makes for a very exciting future where we may see a complete shift in technology to accommodate these emerging demands.”
About the Author

Nicole Krawcke
Nicole Krawcke is the Editor-in-Chief of Contracting Business magazine. With over 10 years of B2B media experience across HVAC, plumbing, and mechanical markets, she has expertise in content creation, digital strategies, and project management. Nicole has more than 15 years of writing and editing experience and holds a bachelor’s degree in Journalism from Michigan State University.