AI-ready data centers are being challenged to meet the demands of AI workloads while taming the voracious power and cooling those workloads require.
Organizations are rushing to become AI-ready so they can capitalize on the AI revolution now underway. This readiness encompasses robust computational capacity, with AI-optimized systems and specialized hardware. They must also consider the expanding role of AI-capable edge devices, from personal computers to smartphones to IoT sensors, in their overall AI strategy.
At the heart of this AI ecosystem lie data centers, which face unprecedented challenges in power density and latency reduction, as well as scalability. These facilities must evolve rapidly to meet the demands of AI workloads while maintaining efficiency and reliability.
A Double-Edged Sword
Joseph Yang, HPE’s GM of HPC and AI in APAC and India, said: “Ultimately, an AI-ready data centre needs to be built for current needs, but with an eye to the demands of future systems. For example, an AI-ready data centre should take into account the increased future power supply needs of running the next generations of infrastructure, which could grow from 6 MW for traditional workloads to 42 MW for 100% AI workloads.”
AI’s potential to transform industries and solve complex global challenges is undeniable. However, the computational power it demands, and the energy needed to cool the systems that run it, come at a significant cost to our energy resources and environment. Today’s GPU-equipped server racks, essential for AI workloads, consume a staggering 72 kW each, a figure expected to reach 90 kW with the next generation of GPU servers.
Compare this to traditional CPU server racks, which typically draw 8-10 kW, and the scale of our challenge becomes clear.
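To put those figures in perspective, here is a rough, back-of-the-envelope sketch. The rack and facility power figures come from the article; the 1.3x cooling and power-distribution overhead factor is purely an illustrative assumption.

```python
# Back-of-the-envelope capacity math using the figures quoted above.
# Rack power (8-10 kW CPU, 72 kW GPU, 90 kW next-gen GPU) and facility power
# (6 MW traditional, 42 MW all-AI) come from the article; the 1.3x
# cooling/distribution overhead multiplier is an illustrative assumption.

CPU_RACK_KW = 9        # midpoint of the 8-10 kW traditional rack range
GPU_RACK_KW = 72       # current GPU-equipped rack (per the article)
NEXT_GPU_RACK_KW = 90  # projected next-generation GPU rack (per the article)
OVERHEAD = 1.3         # assumed cooling + power-distribution overhead

def racks_supported(facility_mw: float, rack_kw: float, overhead: float = OVERHEAD) -> int:
    """Number of racks a facility can power once overhead is deducted."""
    usable_kw = facility_mw * 1000 / overhead
    return int(usable_kw // rack_kw)

for label, mw in [("6 MW (traditional)", 6), ("42 MW (100% AI)", 42)]:
    print(f"{label}: {racks_supported(mw, CPU_RACK_KW)} CPU racks, "
          f"{racks_supported(mw, GPU_RACK_KW)} GPU racks, "
          f"{racks_supported(mw, NEXT_GPU_RACK_KW)} next-gen GPU racks")
```

Even with generous assumptions, a facility sized for hundreds of traditional racks supports only a few dozen AI racks at current densities, which is why Yang frames power supply as a design-time decision rather than an operational one.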
Joseph strongly believes that data center designers should collaborate with power generation suppliers to secure renewable energy sources, and should consider placing AI workloads that are not latency-sensitive in locations closer to renewable energy or with cooler climates.
These, he says, are the kinds of considerations that should shape the design of any AI-ready data center.
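One way to read that recommendation is as a simple placement policy: keep latency-sensitive work close to users, and let latency-tolerant AI jobs chase renewable supply and cooler climates. The sketch below is a minimal illustration of that idea; the site names, attributes, and thresholds are hypothetical assumptions, not anything HPE has described.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    renewable_fraction: float   # share of supply from renewables, 0-1
    avg_ambient_c: float        # cooler climates reduce cooling energy
    network_latency_ms: float   # round-trip latency from the primary user base

# Hypothetical sites, for illustration only.
SITES = [
    Site("metro-edge", renewable_fraction=0.2, avg_ambient_c=27, network_latency_ms=5),
    Site("hydro-north", renewable_fraction=0.95, avg_ambient_c=8, network_latency_ms=60),
]

def place_workload(latency_sensitive: bool, max_latency_ms: float = 20) -> Site:
    """Keep latency-sensitive jobs close; send the rest to clean, cool sites."""
    candidates = (
        [s for s in SITES if s.network_latency_ms <= max_latency_ms]
        if latency_sensitive else SITES
    )
    # Prefer the highest renewable share, breaking ties on cooler climate.
    return max(candidates, key=lambda s: (s.renewable_fraction, -s.avg_ambient_c))

print(place_workload(latency_sensitive=True).name)   # -> metro-edge
print(place_workload(latency_sensitive=False).name)  # -> hydro-north
```

In practice the scoring would also fold in grid carbon intensity, water use, and cost, but the shape of the decision stays the same: latency constraints first, then sustainability.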
The APAC Perspective
Across the Asia-Pacific region, enterprises are at a critical juncture. While AI adoption is still in its early stages, digital transformation initiatives are accelerating rapidly. This presents a unique opportunity to embed sustainability into the core of IT strategies from the outset.
The pressure to reduce carbon footprints is mounting from all sides—regulators, customers, employees, and investors. Forward-thinking organizations are realizing that environmental responsibility and business success are increasingly intertwined.
To truly optimize both business and sustainability outcomes, we must think beyond hardware. A comprehensive strategy should encompass:
- Data Efficiency: Optimize data collection, processing, and storage to minimize waste and maximize value.
- Software Efficiency: Employ best practices in software engineering to create lean, effective AI models and applications.
- Equipment Efficiency: Utilize cutting-edge hardware designed for sustainability and performance.
- Energy Efficiency: Maximize the utility of every kilowatt consumed (see the PUE sketch after this list).
- Resource Efficiency: Optimize all aspects of data center operations, from facilities to personnel.
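On the energy-efficiency point, the metric most operators track is Power Usage Effectiveness (PUE): total facility power divided by the power delivered to IT equipment. The sketch below shows the arithmetic; the 10 MW and 7 MW figures are illustrative assumptions, not numbers from the interview.

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_kw

def useful_share(pue_value: float) -> float:
    """Fraction of every consumed kilowatt that actually reaches IT equipment."""
    return 1 / pue_value

# Illustrative numbers only: a 10 MW facility delivering 7 MW to the IT load.
example = pue(total_facility_kw=10_000, it_kw=7_000)
print(f"PUE = {example:.2f}; {useful_share(example):.0%} of each kilowatt does useful IT work")
```

A PUE approaching 1.0 means nearly every kilowatt reaches the servers; heat capture of the kind discussed later in this article is one lever for pushing it down.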
Joseph added, “Ultimately, sustainability must be considered upfront at the start of any digital transformation project, and not be left as a bolted-on afterthought.”
The Path Forward: Innovation and Collaboration
While retrofitting existing data centers for AI readiness poses significant challenges, new facilities offer a blank canvas for innovation. By integrating sustainability considerations from the earliest planning stages, the industry can create data centers that are both powerful and environmentally responsible.
Examples of this approach already exist. HPE’s partnership with QScale in Quebec demonstrates the viability of near-100% renewable energy use in high-performance computing environments. Its collaboration with Danfoss on modular data centers with integrated heat capture systems points to a future where data centers could become net positive contributors to energy ecosystems.
The urgency of our climate crisis demands that the IT industry accelerate these efforts. It has a responsibility to drive innovation that not only powers AI advancements but does so in a way that preserves our planet’s resources.
The journey to truly sustainable AI-ready data centers is complex, but achievable. It requires a commitment to continuous innovation, cross-industry collaboration, and a willingness to challenge conventional thinking. As we push the boundaries of what’s possible with AI, let’s ensure we’re doing so on a foundation of environmental stewardship and responsible resource management.
The future of AI is bright—let’s make sure it’s friendly to the environment as well.
(This article is based on email interview responses from Joseph Yang)