27 September 2023

Powering down: How computing could soon face an energy crisis

Daisy Wang and Jared Cole* say that computing faces an energy crunch unless new technologies are found that use less power.


There’s little doubt the information technology revolution has improved our lives.

But unless we find a new form of electronic technology that uses less energy, computing will become limited by an “energy crunch” within decades.

Even the most common events in our daily life — making a phone call, sending a text message or checking an email — use computing power.

Some tasks, such as watching videos, require a lot of processing, and so consume a lot of energy.

Because of the energy required to power the massive, factory-sized data centres and networks that connect the internet, computing already consumes 5 per cent of global electricity.

And that electricity load is doubling every decade.
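
To see why that trajectory bites, here is a rough back-of-the-envelope sketch in Python, using only the figures above (a 5 per cent share today, doubling every decade) and assuming, purely for illustration, that total electricity generation stays flat:

    # Rough projection, not a forecast: computing's share of today's global
    # electricity supply, starting at 5% and doubling every decade.
    share = 0.05
    for years in range(0, 50, 10):
        print(f"in {years} years: {share:.0%} of today's electricity supply")
        share *= 2

On those assumptions the share reaches 80 per cent of today's supply within four decades, which is the “energy crunch” described above.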

Fortunately, there are new areas of physics that offer promise for massively reduced energy use.

The end of Moore’s Law

Humans have an insatiable demand for computing power.

The smartphone, for example, has become one of the most important devices in our lives.

We use them to access weather forecasts, plot the best route through traffic, and watch the latest season of our favourite series.

And we expect our smartphones to become even more powerful in the future.

We want them to translate language in real time, transport us to new locations via virtual reality, and connect us to the Internet of Things.

The computing required to make these features a reality doesn’t actually happen in our phones.

Rather it’s enabled by a huge network of mobile phone towers, Wi-Fi networks and massive, factory-sized data centres known as “server farms”.

For the past five decades, our increasing need for computing was largely satisfied by incremental improvements in conventional, silicon-based computing technology: ever-smaller, ever-faster, ever-more efficient chips.

We refer to this constant shrinking of silicon components as “Moore’s Law”.

Moore’s Law is named after Intel co-founder Gordon Moore, who observed that the number of transistors on a chip doubles roughly every two years while the cost per transistor falls.
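
To get a feel for how quickly that compounds, here is a simple illustration (counting doublings only, assuming a two-year cadence rather than tracking real transistor tallies):

    # Illustrative compounding only: a doubling every two years gives
    # roughly 25 doublings over five decades.
    doublings = 50 // 2
    print(f"{doublings} doublings -> about {2 ** doublings:,}x more transistors per chip")
    # -> 25 doublings -> about 33,554,432x more transistors per chip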

But as we hit the limits of basic physics and economics, Moore’s Law is winding down.

The end of efficiency gains from current, silicon-based technology was forecast to arrive as early as 2020.

Our growing demand for computing capacity must be met by gains in computing efficiency; otherwise the information revolution will stall under its own hunger for power.

Achieving this sustainably means finding a new technology that uses less energy in computation.

This is referred to as a “beyond CMOS” solution, in that it requires a radical shift from the silicon-based CMOS (complementary metal–oxide–semiconductor) technology that has been the backbone of computing for the last five decades.

Why does computing consume energy at all?

Processing of information takes energy.

When you use an electronic device to watch TV, listen to music, model the weather or perform any other task that requires information to be processed, millions upon millions of binary calculations are going on in the background.

There are zeros and ones being flipped, added, multiplied and divided at incredible speeds.

The fact that a microprocessor can perform these calculations billions of times a second is exactly why computers have revolutionised our lives.

But information processing doesn’t come for free.

Physics tells us that every time we perform an operation — for example, adding two numbers together — we must pay an energy cost.
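
Physics even puts a number on that minimum cost. Landauer’s principle says that erasing a single bit of information must cost at least k_B · T · ln 2 of energy, where k_B is Boltzmann’s constant and T is the temperature; the quick calculation below (for illustration only) shows how tiny that floor is at room temperature:

    import math

    # Landauer's limit: minimum energy to erase one bit is k_B * T * ln(2).
    k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
    T = 300.0            # roughly room temperature, kelvin
    print(f"theoretical floor: {k_B * T * math.log(2):.2e} joules per bit")
    # -> about 2.87e-21 joules, vastly less than real chips spend per operation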

And the cost of doing calculations isn’t the only energy cost of running a computer.

In fact, anyone who has ever used a laptop balanced on their legs will attest that most of the energy gets converted to heat.

This heat comes from the resistance that electricity meets when it flows through a material.
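
The relationship is the familiar Joule-heating law, P = I²R: the power wasted as heat scales with the resistance the current has to push through. A tiny sketch, with arbitrary numbers chosen purely to show the dependence:

    # Joule heating: power dissipated as heat in a resistive conductor.
    # The values are arbitrary, chosen only to illustrate P = I^2 * R.
    current_amps = 0.5
    resistance_ohms = 2.0
    print(f"{current_amps ** 2 * resistance_ohms} watts lost as heat")  # -> 0.5 watts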

It is this wasted energy due to electrical resistance that researchers are hoping to minimise.

Recent advances point to solutions

Running a computer will always consume some energy, but we are a long way (several orders of magnitude) away from computers that are as efficient as the laws of physics allow.
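
To put “several orders of magnitude” in rough numbers: a single logic switching event in today’s silicon costs somewhere in the ballpark of a femtojoule (an assumed figure for this sketch, not a measurement of any particular chip), versus the roughly 3 × 10⁻²¹ joule Landauer floor calculated earlier:

    import math

    # Rough headroom between today's chips and the physical floor.
    landauer_joules = 2.87e-21    # k_B * T * ln(2) at ~300 K, from above
    switch_joules = 1e-15         # assumed ballpark for one CMOS logic switch
    gap = switch_joules / landauer_joules
    print(f"about {gap:.0e}x headroom, roughly {math.log10(gap):.1f} orders of magnitude")
    # -> about 3e+05x, roughly 5.5 orders of magnitude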

Several recent advances give us hope for entirely new solutions to this problem via new materials and new concepts.

Very thin materials

One recent step forward in physics and materials science is being able to build and control materials that are only one or a few atoms thick.

When a material forms such a thin layer, and the movement of electrons is confined to this sheet, it is possible for electricity to flow without resistance.

A range of different materials show this property, or might show it.

Our research at the Australian Research Council Centre for Future Low-Energy Electronics Technologies (FLEET) is focused on studying these materials.

The study of shapes

There is also an exciting conceptual leap that helps us understand this property of electricity flow without resistance.

This idea comes from a branch of mathematics called “topology”.

Topology tells us how to compare shapes: what makes them the same and what makes them different.

Imagine a coffee cup made from soft clay.

You could slowly squish and squeeze this shape until it looks like a doughnut.

The hole in the handle of the cup becomes the hole in the doughnut, and the rest of the cup gets squished to form part of the doughnut.

Topology tells us that doughnuts and coffee cups are equivalent because we can deform one into the other without cutting it, poking holes in it, or joining pieces together.

It turns out that the strange rules that govern how electricity flows in thin layers can be understood in terms of topology.

This insight was the focus of the 2016 Nobel Prize in Physics, and it’s driving an enormous amount of current research in physics and engineering.

We want to take advantage of these new materials and insights to develop the next generation of low-energy electronic devices, which will use topological effects to let electricity flow with minimal resistance.

This work creates the possibility of a sustainable continuation of the IT revolution — without the huge energy cost.

* Daisy Wang is a Research Fellow in the School of Physics at the University of New South Wales in Sydney. Jared Cole is Professor of Physics at RMIT University and leader of the Theoretical Chemical and Quantum Physics research group.

This article first appeared at theconversation.com.
