Data is one of the primary commodities of our modern world: untouchable and, to most, an intangible entity that provides the information we use to navigate everyday life.
However, data is very physical.
Historically, owning and operating physical servers and hard drives was common, though the advent of cloud computing has moved the problem “out of sight, out of mind”. Now, data centres handle the world’s digital information needs.
Our increasing reliance on data means increased energy consumption and cooling demand, especially as emerging technologies such as self-driving cars and machine learning algorithms devour exponentially more bytes and bits. This means increased emissions unless we develop technology to sustain our data gluttony without harming the planet.
Data is Greedy for Energy
Data centres consume a lot of energy.
Yotta NM1, a data centre under construction in Mumbai, will draw 250 MW once complete. Operating at its peak, that’s the equivalent of ~250,000 households. Most of this energy goes into powering the servers and cooling them as they expel heat during operation. Data’s not so imaginary after all.
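As a rough sanity check on that equivalence (assuming an average continuous household draw of about 1 kW, a figure that varies widely by country):

```python
# Rough equivalence of a data centre's peak draw to household demand.
# Assumes ~1 kW average continuous draw per household (varies by country).
DATA_CENTRE_MW = 250
AVG_HOUSEHOLD_KW = 1.0

households = DATA_CENTRE_MW * 1000 / AVG_HOUSEHOLD_KW
print(f"{households:,.0f} households")  # 250,000 households
```

At a higher per-household draw (say 1.5 kW in an energy-intensive market), the same 250 MW still covers well over 150,000 homes.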
At a global scale, data centres consume 1% of all electricity produced. The low penetration of renewables in the global energy mix results in a carbon footprint that eclipses commercial aviation.
Including network devices such as smartphones and laptops, digitisation emits 4% of greenhouse gas emissions globally.
Emerging Tech Increases Data Demand
And the data problem is only growing.
A recent study by MIT found that self-driving cars, one of the many “holy grail” technical feats of the 21st century, stand to double the energy consumed from data processing. With hundreds of thousands of decisions being made onboard every minute by the embedded deep neural networks, global adoption of this kind of technology will significantly increase our reliance on data and the energy to process it.
And that brings us nicely to artificial intelligence (AI). Recent software releases have popularised language models built on neural networks that aim to mimic the function of the brain. Vast, weighted networks of artificial neurons are constructed by processing large quantities of data. The results are well known: models with impressive speech, image generation and human-like problem-solving capabilities. What’s easily forgotten is the energy required to process this data. A recent report by Stanford University found that GPT-3 consumed 1,287 MWh of electricity during training, emitting 502 tonnes of CO2. GPT-4 was later estimated to have consumed ~57,000 MWh and emitted 13,700 tonnes of CO2. That’s a 44x increase in energy consumption between subsequent models. Now, whilst it’s easily argued that training is a one-off cost, it’s also worth recognising that the number of models and iterations will only grow. More than ever, our reliance on data, and its impact, is exploding.
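Taking the cited figures at face value (the GPT-4 numbers are third-party estimates, not disclosed values), the generation-to-generation jump is easy to check:

```python
# Reported training energy and emissions, as cited above.
gpt3_mwh, gpt3_tco2 = 1_287, 502
gpt4_mwh, gpt4_tco2 = 57_000, 13_700

energy_ratio = gpt4_mwh / gpt3_mwh       # ~44x more energy
emissions_ratio = gpt4_tco2 / gpt3_tco2  # ~27x more CO2
print(f"energy: {energy_ratio:.0f}x, emissions: {emissions_ratio:.0f}x")
```

Notably, the emissions ratio is smaller than the energy ratio: a reminder that where the electricity comes from matters as much as how much of it is used.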
At this point, it’s important to make clear that the emissions from data are almost entirely derived from energy generation. This makes the data centre’s location a critical factor in determining its footprint. A fully decarbonised grid would result in effectively “green” data. However, data is a large and growing driver of demand, which adds to the complexity of renewable integration.
Solutions
Is it a recognised problem, and how are people looking to solve it?
With electricity making up most of their operational inputs, data centres want to minimise their exposure to increasingly volatile energy markets. Ultimately, this incentivises efficiency improvements, which reduce power consumption and improve hardware reliability. A win for emissions, too.
Ignoring software enhancements, these innovations broadly fall under three categories. The first two are the server hardware itself and the cooling systems, which each account for roughly 40% of a typical air-cooled facility’s total energy consumption. The third category includes optimised data centre layouts, which could involve tailored rack placement for improved cooling and connectivity. At Twynam, we look for solutions that scale. Every data centre will have a unique layout, so our focus is naturally directed at the first two areas.
A report published by McKinsey noted that chip manufacturers are becoming increasingly aware of the benefits of bespoke chips. Designed to handle only known tasks, they can be made simpler, reducing the energy consumed by up to 20%.
Other moonshots include room-temperature superconductors, like the recently hyped (and since disproven) LK-99, which promise zero-loss electron flow.
Cooling is also changing. The industry is trending from air cooling towards liquid cooling, largely due to the improved convective heat transfer, leading to significant energy efficiency gains.
How Do We Measure Success?
Power usage effectiveness (PUE) is the ratio of the total energy a data centre consumes to the energy used by the computing equipment alone. Because cooling and other overheads always add to the load, PUE is always greater than 1, and is typically around 1.4 for an air-cooled system.
Efficient technologies such as geothermal heat pumps (pulling cold air/liquid in from an underground heat sink), free cooling (using cool ambient air/water in colder climates), and immersion cooling (submerging a server in a non-conductive liquid, and so maximising the surface area for heat transfer) promise PUE values as low as 1.03. These are just some of the technical innovations having a real impact on energy consumption. Excitingly, lower temperatures result in more efficient computing, too, with improvements of up to 20%.
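A minimal sketch of what those PUE figures mean in practice, using a hypothetical 10 MW server load (an illustrative number, not drawn from any specific facility):

```python
def total_facility_power(it_load_mw: float, pue: float) -> float:
    """PUE = total facility energy / IT equipment energy,
    so total draw = IT load x PUE."""
    return it_load_mw * pue

IT_LOAD_MW = 10.0  # hypothetical server load

air_cooled = total_facility_power(IT_LOAD_MW, 1.4)   # 14.0 MW total
immersion = total_facility_power(IT_LOAD_MW, 1.03)   # ~10.3 MW total
saving = (air_cooled - immersion) / air_cooled
print(f"total draw saved by lower PUE: {saving:.0%}")  # ~26%
```

On top of that ~26% facility-level saving, the cited ~20% computing-efficiency gain from lower temperatures would shrink the IT load itself.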
Alternate Cooling
Cooling deserves a lot of attention. Large data centres can output as much as 50MW of heat, with some campuses (a group of data centres) producing multiples of this. That’s approaching the electrical power output of a small modular nuclear reactor.
There’s a lot of untapped thermal energy here, an opportunity that has not been lost on the data centre industry. The Universitat Politècnica de València was one group that looked to exploit this, using a polyvalent heat pump to extract waste heat from a local data centre and drive heating and cooling across their campus. The integration resulted in a 64% energy saving over traditional HVAC. Using this example, it’s not hard to imagine how this thermal energy may also be used in lower-temperature industrial processes.
Our focus
At Twynam, we’re excited about data. We invested in engineroom, who developed an immersion cooling system using a biodegradable dielectric fluid. Their technology doesn't rely on water, uses dramatically less energy than even other forms of liquid cooling, and delivers improved efficiency at lower cost. We’re proud to back companies solving real problems with real impact.
Whether it’s cooling, chip design, waste heat, or something we haven’t thought of, please reach out. We’d love to discuss this.