The Future of Computing Requires Innovation in Energy

There’s little doubt the information technology revolution has improved our lives. But unless we find a new form of electronic technology that uses less energy, computing will become limited by an “energy crunch” within decades.

Even the most common events in our daily life—making a phone call, sending a text message, or checking an email—use computing power. Some tasks, such as watching videos, require a lot of processing, and so consume a lot of energy.

Because of the energy required to power the massive, factory-sized data centers and the networks that link them, computing already consumes 5 percent of global electricity. And that electricity load is doubling every decade.
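To see why “within decades” is the right time scale, consider the arithmetic of that doubling. Below is a minimal sketch, purely illustrative, which assumes the 5 percent share and the ten-year doubling time quoted above both hold, and that total electricity supply stays flat:

```python
# Purely illustrative: if computing uses 5% of global electricity today
# and that load doubles every decade, how does its share grow?
# Assumes total electricity supply stays flat -- a simplification.

share_today = 0.05    # computing's share of global electricity (quoted above)
doubling_years = 10   # one doubling per decade (quoted above)

for n_doublings in range(1, 5):
    share = share_today * 2 ** n_doublings
    print(f"In {n_doublings * doubling_years} years: "
          f"{share:.0%} of today's electricity supply")
```

On those assumptions, computing would claim well over half of today’s electricity supply within four decades: that is the looming “energy crunch.”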

Fortunately, there are new areas of physics that offer promise for massively reduced energy use.

The End of Moore’s Law

Humans have an insatiable demand for computing power.

Smartphones, for example, have become one of the most important devices of our lives. We use them to access weather forecasts, plot the best route through traffic, and watch the latest season of our favorite series.

And we expect our smartphones to become even more powerful in the future. We want them to translate language in real time, transport us to new locations via virtual reality, and connect us to the “Internet of Things.”

The computing required to make these features a reality doesn’t actually happen in our phones. Rather, it’s enabled by a huge network of mobile phone towers, Wi-Fi networks, and massive, factory-sized data centers known as “server farms.”

For the past five decades, our increasing need for computing was largely satisfied by incremental improvements in conventional, silicon-based computing technology: ever-smaller, ever-faster, ever-more efficient chips. We refer to this constant shrinking of silicon components as “Moore’s Law.”

Moore’s Law is named after Intel co-founder Gordon Moore, who observed in 1965 that the number of transistors on a chip doubled roughly every year (a pace he later revised to every two years), while the cost per transistor kept falling. But as we hit the limits of basic physics and economics, Moore’s Law is winding down. We could see the end of efficiency gains using current, silicon-based technology as soon as 2020.

Our growing demand for computing capacity must be met by gains in computing efficiency; otherwise, the information revolution will be throttled by its own hunger for power.

Achieving this sustainably means finding a new technology that uses less energy in computation. This is referred to as a “beyond CMOS” solution, in that it requires a radical shift from the silicon-based CMOS (complementary metal-oxide-semiconductor) technology that has been the backbone of computing for the last five decades.

Why Does Computing Consume Energy at All?

Processing information takes energy. Whenever we use an electronic device to watch television, listen to music, model the weather, or perform any other task that processes information, millions upon millions of binary calculations are going on in the background: zeros and ones being flipped, added, multiplied, and divided at incredible speeds.

The fact that a microprocessor can perform these calculations billions of times a second is exactly why computers have revolutionized our lives.

But information processing doesn’t come for free. Physics tells us that every time we perform an operation—for example, adding two numbers together—we must pay an energy cost.
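Physics even puts a number on the minimum possible cost. By Landauer’s principle, a standard result of thermodynamics rather than a property of any particular technology, erasing one bit of information at temperature $T$ dissipates at least

$$E_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21} \text{ J at room temperature},$$

where $k_B$ is Boltzmann’s constant. The number is tiny, but it is a hard floor: every irreversible logic operation has to pay at least this much.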

And the cost of doing calculations isn’t the only energy cost of running a computer. In fact, anyone who has ever used a laptop balanced on their legs will attest that most of the energy gets converted to heat. This heat comes from the resistance that electricity meets when it flows through a material.
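For an ordinary conductor, this wasted power follows Joule’s law, growing with the square of the current:

$$P = I^2 R,$$

where $I$ is the current flowing through the material and $R$ is its resistance. Halve the resistance and you halve the heat; remove the resistance entirely and this loss disappears.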

It is this wasted energy due to electrical resistance that researchers are hoping to minimize.

Recent Advances Point to Solutions

Running a computer will always consume some energy, but we are still several orders of magnitude away from computers that are as energy efficient as the laws of physics allow. Several recent advances give us hope for entirely new solutions to this problem via new materials and new concepts.
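To put “several orders of magnitude” in perspective, the sketch below compares the Landauer floor from earlier with a ballpark energy for one logic operation in today’s silicon chips; the one-femtojoule figure is an illustrative assumption, not a measurement of any particular processor:

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T
k_B = 1.380649e-23                # Boltzmann's constant, J/K
T = 300.0                         # room temperature, K
landauer = k_B * T * math.log(2)  # ~2.9e-21 J per bit

# Ballpark energy of one logic operation in current silicon chips.
# ~1 femtojoule is an illustrative assumption, not a measured figure.
cmos_op = 1e-15                   # J per operation (assumed)

print(f"Landauer limit:  {landauer:.2e} J")
print(f"Assumed CMOS op: {cmos_op:.2e} J")
print(f"Gap: roughly {math.log10(cmos_op / landauer):.0f} orders of magnitude")
```

Even with generous assumptions, the gap between today’s chips and the physical limit spans five to six orders of magnitude, which is exactly the headroom researchers are chasing.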

Very Thin Materials
One recent advance in physics and materials science is the ability to build and control materials that are only one or a few atoms thick. When a material forms such a thin layer, and the movement of electrons is confined to that two-dimensional sheet, it is possible for electricity to flow without resistance.

A range of different materials show this property (or might show it). Our research at the ARC Centre of Excellence in Future Low-Energy Electronics Technologies (FLEET) is focused on studying these materials.

The Study of Shapes
There is also an exciting conceptual leap that helps us understand this resistance-free flow of electricity.

This idea comes from a branch of mathematics called “topology.” Topology tells us how to compare shapes: what makes them the same and what makes them different.

Imagine a coffee cup made from soft clay. You could slowly squish and squeeze this shape until it looks like a donut. The hole in the handle of the cup becomes the hole in the donut, and the rest of the cup gets squished to form part of the donut.

Topology tells us that donuts and coffee cups are equivalent because we can deform one into the other without cutting it, poking holes in it, or joining pieces together.
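What stays the same under all this squishing is the number of holes, which mathematicians call the genus $g$: a sphere has $g = 0$, while the donut and the coffee cup both have $g = 1$. A convenient bookkeeping device is the Euler characteristic,

$$\chi = 2 - 2g, \qquad g \in \{0, 1, 2, \dots\},$$

and because $g$ and $\chi$ can only take whole-number values, they cannot drift gradually as a shape is deformed; they can only jump. That integer stubbornness is exactly what makes topological properties so robust.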

It turns out that the strange rules that govern how electricity flows in thin layers can be understood in terms of topology. This insight was the focus of the 2016 Nobel Prize in Physics, and it is driving an enormous amount of current research in physics and engineering.
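A landmark example, part of the work honored by that prize, is the quantum Hall effect: in a thin, two-dimensional sheet in a strong magnetic field, the sideways (Hall) conductance is locked to integer multiples of a combination of fundamental constants,

$$\sigma_{xy} = \nu \, \frac{e^2}{h}, \qquad \nu = 1, 2, 3, \dots,$$

where $e$ is the electron charge, $h$ is Planck’s constant, and the integer $\nu$ is a topological invariant of the material’s electronic structure, playing the same role as the number of holes in the donut. Because $\nu$ is an integer, it is immune to small imperfections in the material, and the current it describes flows along the edges without resistance.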

We want to take advantage of these new materials and insights to develop the next generation of low-energy electronic devices, which will exploit topological effects to let electricity flow with minimal resistance.

This work creates the possibility of a sustainable continuation of the IT revolution—without the huge energy cost.

This article was originally published on The Conversation. Read the original article. Daisy Wang is a post-doctoral fellow at the UNSW School of Physics. Jared Cole is a professor of physics at RMIT University.
