
7.4: Brief notes on information theory and the thermodynamics of computation


    We now examine the thermodynamics of computation.

    Minimum energy dissipated per bit

    Assume we have a system, perhaps a computer, with a number of possible states. The uncertainty, or entropy, of the computer is a measure of the number of its possible states. Recall from thermodynamics that the Boltzmann-Gibbs entropy of a physical system is defined as

    \[ S = -k_{B} \sum_{i=1}^{N} p_{i} \ln p_{i} \label{7.4.1}, \]

    where the system has N possible states, each with probability \(p_{i}\), and \(k_{B}\) is the Boltzmann constant.
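
    For example, if all \(N\) states are equally probable (\(p_{i} = 1/N\)), Equation \ref{7.4.1} reduces to the familiar Boltzmann form \(S = k_{B}\ln N\).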

    Information is the opposite of entropy and uncertainty: when the uncertainty of a system decreases, it gains information.

    Now, the second law of thermodynamics can be restated as "all physical processes increase the total entropy of the universe". Let's separate the universe into the computer and everything else. The total entropy is then the sum of the entropies of these two parts:

    \[ S_{universe} = S_{computer}+S_{everything\ else} \label{7.4.2}. \]

    Thus, thermodynamics requires

    \[ \Delta S_{universe} \geq 0 \label{7.4.3}. \]

    It follows that

    \[ \Delta S_{everything\ else} \geq -\Delta S_{computer} \label{7.4.4}. \]

    That is, if the information within a computer increases during a computation, then its entropy decreases. This decrease in entropy within the computer must be at least balanced by an increase in the entropy of the remainder of the universe. The increase in entropy in the remainder of the universe is obtained by dissipating heat, \(\Delta Q\), from the computer.

    According to thermodynamics, the heat dissipated is

    \[ \Delta Q = T \Delta S_{everything\ else} \geq -T \Delta S_{computer} \label{7.4.5} \]

    Uncertainty and entropy can also be measured in bits. For example, how many bits are required to describe the computer with N states?

    \[ 2^{H} = N \label{7.4.6}. \]

    Here, H is known as the Shannon entropy. If the states are equally probable, each with probability \(p=1/N\), then the uncertainty reduces to:

    \[ H = \log_{2}N = -\log_{2}p \label{7.4.7}. \]
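
    For example, a computer with \(N = 8\) equally probable states has an uncertainty of \(H = \log_{2}8 = 3\) bits.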

    More generally, if each state of the computer has probability \(p_{i}\),

    \[ H = \left< -\log_{2}p_{i} \right> = -\sum^{N}_{i=1} p_{i} \log_{2}p_{i} \label{7.4.8} \]
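
    As a quick numerical check (an illustration, not part of the original text), the following minimal Python sketch evaluates the Boltzmann-Gibbs entropy of Equation \ref{7.4.1} and the Shannon entropy of Equation \ref{7.4.8} for an arbitrary example distribution, and verifies the relation \(S = k_{B}\ln(2)H\) used in the comparison below. The distribution and function names are illustrative choices.

        import math

        k_B = 1.380649e-23  # Boltzmann constant [J/K]

        def shannon_entropy(p):
            """Shannon entropy H in bits (Equation 7.4.8)."""
            return -sum(pi * math.log2(pi) for pi in p if pi > 0)

        def gibbs_entropy(p):
            """Boltzmann-Gibbs entropy S in J/K (Equation 7.4.1)."""
            return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

        p = [0.5, 0.25, 0.125, 0.125]  # example probability distribution
        H = shannon_entropy(p)         # 1.75 bits
        S = gibbs_entropy(p)           # J/K
        print(f"H = {H} bits")
        print(math.isclose(S, k_B * math.log(2) * H))  # True: S = k_B ln(2) H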

    Comparing Equation \ref{7.4.1} with Equation \ref{7.4.8} and noting that \(\ln p_{i} = (\ln 2)\log_{2} p_{i}\) gives \(S = k_{B}\ln(2)H\). Substituting into Equation \ref{7.4.5} then gives

    \[ \Delta Q \geq -k_{B} T \ln(2) \Delta H_{computer} \label{7.4.9} \]

    The heat must ultimately come from the power supply. Thus, the minimum energy required to generate one bit of information is:

    \[ E_{min} = k_{B}T\ln(2) \label{7.4.10}. \]
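
    At room temperature (\(T = 300\text{ K}\)), this evaluates to \(E_{min} = (1.38\times 10^{-23}\text{ J/K})(300\text{ K})\ln 2 \approx 2.9\times 10^{-21}\text{ J} \approx 18\text{ meV}\) per bit.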

    This minimum is known as the Shannon-von Neumann-Landauer (SNL) limit.

    Energy required for signal transmission

    Recall Shannon's theorem for the capacity, c, in bits per second, of a channel in the presence of noise:

    \[ c = b \log_{2}\left( 1+\frac{s}{n} \right) \label{7.4.11}, \]

    where s and n are the signal and noise power, respectively, and b is the bandwidth of the channel. The noise in the channel is at least \(n = bk_{B}T\).

    The energy required per bit transmitted is:

    \[ E_{min} = \lim_{s \rightarrow 0} \bigg\lbrace \frac{s}{c}\bigg\rbrace = \lim_{s \rightarrow 0} \bigg\lbrace \frac{s}{b\log_{2}(1+s/n)}\bigg\rbrace \label{7.4.12}. \]

    Applying L'Hôpital's rule (the derivative of the denominator with respect to s is \(b/\left[(n+s)\ln 2\right]\)) and substituting \(n = bk_{B}T\) gives

    \[ E_{min} = k_{B}T\ln(2) \label{7.4.13}. \]

    consistent with the previous calculation of \(E_{min}\).
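
    The limit can also be checked numerically with a short Python sketch (an illustration, not from the original text); the bandwidth b is an arbitrary choice and drops out as \(s \rightarrow 0\):

        import math

        k_B = 1.380649e-23   # Boltzmann constant [J/K]
        T = 300.0            # assumed room temperature [K]
        b = 1.0e9            # channel bandwidth [Hz] (arbitrary; cancels in the limit)
        n = b * k_B * T      # minimum noise power, n = b k_B T [W]

        for s in [n, 1e-3 * n, 1e-6 * n]:   # shrink the signal power toward zero
            c = b * math.log2(1.0 + s / n)  # Shannon capacity (Equation 7.4.11) [bits/s]
            print(f"s/n = {s/n:.0e}:  s/c = {s/c:.4e} J/bit")

        print(f"k_B T ln(2) = {k_B * T * math.log(2):.4e} J/bit")  # limiting value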

    Consequences of \(E_{min}\)

    It has been argued that since the uncertainty in energy, \(\Delta E\), within an individual logic element can be no greater than \(E_{min}\), we can apply the Heisenberg uncertainty relations to a system operating at the SNL limit to determine the minimum switching time, i.e.\(^{†}\)

    \[ \Delta E \Delta t \geq \hbar \label{7.4.14} \]

    Equation \ref{7.4.14} then gives a minimum switching time at room temperature of

    \[ \tau_{min} = \frac{\hbar}{\Delta E} = \frac{\hbar}{k_{B}T\ln(2)} = 0.04 \text{ ps} \label{7.4.15} \]

    Assuming that the maximum power density that we can cool is \(P_{max} \sim 100\ \text{W/cm}^{2}\), the maximum integration density is

    \[ n_{max} = \frac{P_{max}}{E_{min}/\tau_{min}} = \frac{\hbar P_{max}}{E_{min}^{2}} \label{7.4.16} \]

    At room temperature, we get \(n_{max} \lesssim 10^{10}\ \text{cm}^{-2}\), equivalent to a switch size of 100 nm × 100 nm. This is very close to the roadmap value for 2016.
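
    These estimates can be reproduced with a short Python sketch (an illustration, not from the original text), using the values assumed above (\(T = 300\text{ K}\), \(P_{max} = 100\ \text{W/cm}^{2}\)):

        import math

        hbar = 1.054571817e-34  # reduced Planck constant [J s]
        k_B = 1.380649e-23      # Boltzmann constant [J/K]
        T = 300.0               # room temperature [K]
        P_max = 100.0           # assumed coolable power density [W/cm^2]

        E_min = k_B * T * math.log(2)    # SNL limit (Equation 7.4.10) [J]
        tau_min = hbar / E_min           # minimum switching time (Equation 7.4.15) [s]
        n_max = hbar * P_max / E_min**2  # maximum integration density (Equation 7.4.16) [cm^-2]

        print(f"E_min   = {E_min:.2e} J")            # ~2.9e-21 J
        print(f"tau_min = {tau_min * 1e12:.3f} ps")  # ~0.04 ps
        print(f"n_max   = {n_max:.1e} cm^-2")        # ~1e9 cm^-2, within the ~1e10 bound above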

    At lower temperatures, the power dissipation on chip decreases, but the overall power dissipation actually increases due to the requirement for refrigeration.\(^{4}\) Since the engineering constraint is likely to be on-chip power dissipation, refrigeration may be one method for further increasing the density of electronic components.

    \(^{†}\)This argument, due to Zhirnov et al., "Limits to Binary Logic Switch Scaling - A Gedanken Model", Proceedings of the IEEE 91, 1934 (2003), has been used to argue that end-of-the-roadmap Si CMOS is as good as charge-based computing can get.


    7.4: Brief notes on information theory and the thermodynamics of computation is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by LibreTexts.
