The Big AI Bang
Data centres, get ready for the new wave of AI hyperscalers.
Artificial intelligence (AI) has gone mainstream, and its impact is being felt globally.
Like many other popular technologies in the past, such as social media and mobile apps, AI tools will scale over time as they become more sophisticated and advanced. But there is one big differentiator that makes the scalability of AI bigger than anything we’ve ever seen in the past.
AI will impact every person, at every age, in every industry around the world. Its use cases are endless, and these AI tools, once properly trained, can be deployed at scale. AI will scale faster than any other technology we’ve previously seen. Its growth trajectory is colossal, and we’re only at the very beginning.
Data centres will play a key role in the big AI bang by providing a secure and reliable home for these AI super pods. Data centres need to be prepared to adapt if they want to support the new wave of AI hyperscalers. But what exactly do they need to prepare for?
Dialling up the density.
AI engines are power-hungry. It takes a huge amount of computing power to train and run these models. As the use of AI becomes more widespread, data centre providers are going to see a significant rise in demand for increased densities within the data hall. This means that data centres will need to pack more computing power, and the associated cooling, into smaller footprints. In fact, some of the AI hyperscalers I’ve spoken to are requesting ~10x the power density of traditional racks.
Innovation in AI is also spurring innovation in infrastructure. We’re seeing new types of infrastructure and technology being designed specifically for AI tools. This includes specialised hardware, such as new Graphics Processing Units (GPUs) and Tensor Processing Units (TPUs), that is more efficient for the types of parallel calculations performed by AI training models. Alternative cooling methods such as liquid cooling and direct-to-chip cooling are also gaining traction as newer, more efficient ways to cool these highly dense GPU servers.
Providing runway to scale.
Demand for data centre infrastructure and services will undoubtedly continue to grow. To really understand the scale of this growth, we need to distinguish between training AI models and running them (also known as the inference phase).
Training AI models requires substantial computational power and storage capacity to process large datasets and optimise model parameters. It is a computationally intensive and time-consuming process. In contrast, inference tasks prioritise real-time processing with lower computational requirements and smaller storage needs, focusing on applying the trained model to new data efficiently.
Whilst both phases require compute and storage, the balance between the two shifts from one phase to the other. Data centre providers will need to be able to support both the training infrastructure and the inference infrastructure if they are not one and the same.
An environmentally conscious lens.
It is no secret that processing and storing data at this magnitude consumes a serious amount of energy. As the level of AI computing happening around the globe starts to seriously scale up, the wider tech industry needs to be conscious of the impact this will have on the environment each time they run their training models. To help these AI hyperscalers reduce their environmental footprint, data centre providers need to run their data centres as efficiently as possible.
At the heart of environmental sustainability is operational efficiency. Harnessing new ways to achieve more with fewer resources will not only benefit the environment but also reduce costs. This is something AI itself could help execute within a data centre. By analysing data on energy usage and adjusting cooling and other settings accordingly, AI systems will help data centres achieve energy savings by driving down Power Usage Effectiveness (PUE).
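For readers unfamiliar with the metric: PUE is simply total facility power divided by the power delivered to IT equipment, so a value closer to 1.0 means less energy lost to cooling and other overhead. A minimal sketch of the calculation, using illustrative figures rather than measurements from any real facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal, where every watt drawn by the facility
    reaches the IT equipment and none is spent on cooling or overhead.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative only: a 1,000 kW IT load with 500 kW of cooling and other
# overhead gives a PUE of 1.5; trimming overhead to 200 kW gives 1.2.
print(pue(1500, 1000))  # 1.5
print(pue(1200, 1000))  # 1.2
```

Driving PUE down means shrinking the numerator relative to the denominator, which is exactly where AI-tuned cooling can help.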
But the onus to protect the environment in the wake of the AI boom should not be on AI hyperscalers and their partners alone. Governments can support the tech industry in achieving their Environmental, Social and Governance (ESG) goals by encouraging the adoption of sustainable practices through policy changes and financial incentives such as grants.
At a higher level, governments around the world need to prioritise investment into renewable energy for their nation to give technology firms and digital infrastructure companies the tools they need to achieve their sustainability goals. In Australia, the Government has recently indicated its intent to support more renewable energy, including plans to increase the supply of renewable energy in the national energy grid. The Albanese Government’s ‘Rewiring the Nation’ plan will upgrade Australia’s energy grid and aims to grow the renewables share of the National Electricity Market (NEM) to 82% by 2030.
Regulating the AI risks.
The opportunities AI presents will change the world around us, for better and for worse. This is why governments and institutions are now scrambling to decide how best to regulate this fast-moving industry. Currently, governments are primarily concerned with the ethical use of AI and how its utilisation will impact different industries. Viewed through an innovation lens, security and the protection of their nation’s intellectual property also need to be high on the agenda. The best way to achieve this is to have the data and compute remain onshore, inside sovereign data centres.
Issues around data privacy, data protection and data sovereignty are ones that digital infrastructure companies are well versed in. Digital infrastructure companies, such as data centres, need to lead the way in establishing best practices around protecting the integrity of data. This will be particularly important for AI being used in critical industries such as healthcare, transport, energy and defence.
The perfect partner.
It’s now clear that this AI megatrend is going to have a bigger impact on the world than any other digital megatrend we’ve seen in the past.
AI hyperscalers will be looking for data centre partners who are flexible enough to adapt to new technology, efficient enough to run sustainably and reliable enough to quickly comply with new regulations. But, most importantly, they’ll need a data centre partner that is future-focused and innovative enough to grow with them.