The cost of computation
- Date:
- April 8, 2019
- Source:
- Santa Fe Institute
- Summary:
- There has been a rapid resurgence of interest in understanding the energy cost of computing. A new review summarizes recent advances in this 'thermodynamics of computation.'
For decades, physicists have wrestled with the thermodynamic cost of manipulating information -- what we would now call computing. How much energy does it take, for example, to erase a single bit from a computer? What about more complicated operations? These are pressing, practical questions: artificial computers are energy hogs, claiming an estimated four percent of total energy consumed in the United States.
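As a rough reference point (the figure below is not from this article, but follows from the well-known Landauer bound), erasing one bit at temperature T requires dissipating at least k_B T ln 2 of heat. A minimal Python sketch of the arithmetic, assuming room temperature of about 300 K:

```python
import math

# Landauer bound: minimum heat that must be dissipated to erase one bit
# at temperature T. Assumption: room temperature of roughly 300 K.
k_B = 1.380649e-23   # Boltzmann constant, in J/K
T = 300.0            # temperature, in K

landauer_limit = k_B * T * math.log(2)  # joules per erased bit
print(f"Landauer limit at {T:.0f} K: {landauer_limit:.2e} J per bit")
# ~2.9e-21 J -- vastly less than what today's hardware dissipates per logical operation.
```

That gap between the theoretical floor and real hardware is part of what makes the question practically pressing.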
These questions are not limited to the digital machines we build. The human brain can be seen as a computer -- one that gobbles an estimated 10 to 20 percent of all the calories a person consumes. Living cells, too, can be viewed as computers, but computers that "are many orders of magnitude more efficient" than any laptop or smartphone humans have constructed, says David Wolpert of the Santa Fe Institute.
Wolpert, a mathematician, physicist, and computer scientist, has been on the front lines of a rapid resurgence of interest in a deep understanding of the energy cost of computing. That research is now hitting its stride, thanks to revolutionary tools recently developed in statistical physics for understanding the thermodynamic behavior of nonequilibrium systems. These tools matter because computers are decidedly nonequilibrium systems. (Unplug your laptop, wait for it to reach equilibrium, and then see if it still works.) Although Wolpert primarily approaches these issues using tools from computer science and physics, there is also sharp interest from researchers in other areas, including those who study chemical reactions, cellular biology, and neurobiology.
However, research in nonequilibrium statistical physics largely happens in silos, says Wolpert. In a review published today in the Journal of Physics A, Wolpert collects recent advances in understanding the thermodynamics of computation that are grounded in computer science and physics. The review functions as a sort of state-of-the-science report for a burgeoning interdisciplinary investigation.
"It is basically a snapshot of the current state of the fields, where these ideas are starting to explode, in all directions," says Wolpert.
In the paper, Wolpert first summarizes the relevant theoretical ideas from physics and computer science. He then discusses what's known about the entropic cost of a range of computations, from erasing a single bit to running a Turing machine. He goes on to show how breakthroughs in nonequilibrium statistical physics have enabled researchers to more formally probe those cases -- moving far beyond simple bit erasure.
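One way to see how these tools generalize the bit-erasure picture is the so-called generalized Landauer bound from stochastic thermodynamics; the sketch below is our own paraphrase of that standard result, not a formula quoted from the review. For a process that maps an input distribution p over a system's states to an output distribution p', the expected heat released into a thermal reservoir at temperature T is bounded below by the drop in Shannon entropy:

```latex
% Generalized Landauer bound (standard stochastic-thermodynamics result; sketch, not quoted from the review).
% S(p) = -\sum_x p(x)\,\ln p(x) is the Shannon entropy of the state distribution, in nats.
\langle Q \rangle \;\ge\; k_B T \,\bigl[ S(p) - S(p') \bigr]
```

Erasing a single bit is the special case where p is uniform over two states and p' is concentrated on one, which recovers the familiar k_B T ln 2.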
Wolpert also touches on questions this recent research raises about real-world challenges, such as how to design algorithms with energy conservation in mind. Can biological systems, for example, serve as inspiration for designing computers with minimal thermodynamic cost?
"We are being surprised and astonished in many ways," Wolpert says. In putting together the review, and coediting a book on the topic due out later this year, "we've uncovered phenomena that no one has analyzed before that were very natural to us, as we pursue this modern version of the thermodynamics of computation."
Story Source:
Materials provided by Santa Fe Institute. Note: Content may be edited for style and length.
Journal Reference:
- David H. Wolpert. The stochastic thermodynamics of computation. Journal of Physics A: Mathematical and Theoretical, 2019. DOI: 10.1088/1751-8121/ab0850