# 8.1: Four Questions


Entropy and the Second Law of Thermodynamics are among the most misunderstood concepts in science. Outside the sphere of technology, entropy has been used to argue both for and against the existence of a god. It has found applications in thermodynamics, information theory, sociology, and economics. A detailed definition is often based on probabilities, energy distributions, and explanations of the microscopic behavior of the fundamental particles (atoms and molecules) that make up a substance. Engineering definitions of entropy often rely on extensive discussions of ideal thermodynamic cycles and investigations of what constitutes the "best" possible performance for a cycle. As we will discover shortly, there is no "conservation of entropy" principle. Experience has shown that entropy is continually produced in the world; the lack of a conservation principle, and our inability to consume (or destroy) even part of the entropy produced, places strict limitations on what processes are possible for any given system. Because the entropy accounting principle corresponds to the Second Law of Thermodynamics, a fundamental law of physics, it cannot be proven or developed from other, more fundamental principles. In the discussion that follows, we will keep the up-front explanations to a minimum and move quickly to develop an entropy accounting principle. Then, drawing on your past experience with modeling systems in terms of mass, charge, momentum, and energy, we can take our time exploring the consequences of this powerful new concept.

As before with every accounting concept for a new property, there are four questions that must be answered. When applied to entropy, the questions become:

1. What is entropy?
2. How can entropy be stored in a system?
3. How can entropy be transported?
4. How can entropy be created or destroyed?

Once we answer these questions, we will have the appropriate accounting equation for entropy.

## 8.1.1 What is entropy?

Entropy is a property that allows us to quantify the Second Law of Thermodynamics, one of the most significant laws of physics. So before we can talk about entropy, we must state the Second Law of Thermodynamics. But before we do that, we will consider some of our everyday experiences that are related to this new property called entropy.

### Everyday experiences

As we go through our everyday tasks, our actions are based on our observations and expectations about how the physical world behaves. We may be very conscious of these assumptions or we may just go with our intuition, but in either case we expect our physical world to behave in a predictable fashion. Consider the following anecdotes and see how they match with your experience:

• You buy a cup of hot coffee, return to your office and set it on your desk. If you forget it and rediscover it one hour later, what would you expect to find — a piping hot cup of coffee that is warmer than when you first set it down, a room-temperature cup of coffee, or a cup of iced coffee? None of these scenarios violate any of the physical laws we have studied to this point.
• You find some pennies lying on an asphalt parking lot next to your car. Miser that you are, you bend over and pick up a penny and notice how warm it feels. As you stand up, another penny suddenly jumps up and lands in your hand. This second penny feels noticeably cooler than the first one did. Should you believe your eyes? You hypothesize that the jumping process was adiabatic and that the penny's internal energy (and temperature) decreased with a corresponding increase in its gravitational potential energy. Does this seem reasonable from an energy standpoint?
• You are a science fair judge and come upon an interesting project describing a method for charging a battery. A student has taken a $$300 \mathrm{~k} \Omega$$ resistor and connected it in series with a rechargeable, 9-volt-DC battery. Using a propane blowtorch, he holds a flame under the resistor and claims that the battery is charging. What do you think? Is this possible? Should you give him the prize for best science project or best hoax?
• On Friday evening you and some friends get some old 8-mm films and decide to watch the story of Simon Legree, Little Nell, and the Canadian Mountie. In typical fashion, at some point in the story Little Nell has been tied to the railroad tracks at the end of a railroad trestle. As the train just reaches the trestle, the Canadian Mountie blows up the trestle and as the trestle collapses the train arcs across the ravine and dissolves into a rock wall. Nothing about this seems too unusual except the passing thought of "what am I doing watching this?" Suddenly your buddy gets the bright idea to run the entire movie backwards. As you watch the improbable actions, you find the scene with the crashing train especially amusing when it is run backwards. Why? Why does running this backwards so catch our attention?

Without even knowing it, you began your study of thermodynamics and the Second Law of Thermodynamics when you were just an infant. Chances are your parents repeated a popular Mother Goose rhyme:

"Humpty Dumpty sat on a wall.
Humpty Dumpty had a great fall.
All the king's horses and all the king's men
Couldn't put Humpty Dumpty together again."

This is a great life lesson about our experience that certain processes, like breaking an egg, cannot be reversed. (Maybe that's why you find it intriguing to watch videos or movies run in reverse.)

All of these anecdotes speak to our expectation that there is a preferred direction for certain processes, and that certain processes are just not within our experience and appear to be at least highly improbable if not impossible.

The collective experience of scientists and engineers can be distilled into four formal statements about our expectations as to the behavior of the physical world:

• Spontaneous Processes — Spontaneous processes have a preferred direction of change.
• Power Cycles (Heat Engines) — The maximum thermal efficiency of a power cycle is always less than $$100 \%$$. (This is called the Kelvin-Planck Statement of the Second Law.)
• Heat Transfer — It is impossible to operate any device in such a manner that the sole effect is the heat transfer of energy from a low-temperature body to another body at a higher temperature. (This is called the Clausius Statement of the Second Law.)
• Final Equilibrium States — A closed, adiabatic system with no work transfer of energy has a preferred final equilibrium state.

Experience has shown that if any one of these statements is false, then the other three are also false.

## Reversible and Irreversible Processes$${ }^1$$

A key concept in discussing the behavior of systems is the idea of reversibility. An internally reversible process is defined as follows:

A system executes an internally reversible process if at any time during the process the state of the system can be made to retrace its path exactly.

The concept of reversibility by its definition requires restorability. Any process that is not internally reversible is internally irreversible. When used without a qualifier, the term "reversible process" will be assumed here to refer to an internally reversible process.

In practice, what does "internally reversible" mean? Assume that a system undergoes an arbitrary process and we record the state of the system (i.e. all of its properties) and all interactions with the surroundings as a function of time. If the process is internally reversible, it should be possible to get the system to essentially run backwards in time by merely reversing the direction of the interactions at the boundary of the system.

Based on the definition of an internally reversible process, there are three additional consequences that will be stated here without proof:

• A work transfer of energy for an internally reversible process has the same magnitude but opposite direction if the process is reversed.
• A heat transfer of energy for an internally reversible process has the same magnitude but opposite direction if the process is reversed.
• An internally reversible process occurs in such a fashion that the system is always infinitesimally close to being in equilibrium, i.e. an internally reversible process is also a quasiequilibrium process.

An internally reversible process is a useful fiction in the study of real processes. A real process can only approach an internally reversible process in the limit as all sources of irreversibility (dissipative effects) within the system are eliminated.

Determining whether a given process is internally reversible or not is best done by identifying any source of irreversibility within the system. The presence of any irreversibility within the system makes the process internally irreversible. Irreversibilities arise from two sources:

1. Inherent dissipative effects within the system
2. A non-quasiequilibrium process.

Recall that a quasiequilibrium process was originally defined as a process that proceeds in such a manner that the process is infinitesimally close to a state of equilibrium at all times. Thus a quasiequilibrium process qualifies as an internally reversible process and can be recognized by its slow, carefully controlled execution. Any work interaction that can be carried out in a quasiequilibrium fashion is a candidate work transfer of energy for an internally reversible process. Examples might include any processes in which the mechanical energy of a system is conserved (i.e. processes for which the work-energy principle is valid). The behavior of a simple electrical circuit that contains only ideal capacitors and inductors would also qualify as an internally reversible process.

Most irreversibility is the result of dissipative effects that we commonly experience. Examples of these include:

1. electric resistance
2. inelastic deformation
3. viscous flow of a fluid (flow with fluid friction)
4. solid-solid friction (dry friction)
5. heat transfer across a finite temperature difference or as the result of a finite temperature gradient
6. hysteresis effects
7. shock waves
8. internal friction, e.g. internal damping of a vibrating system
9. unrestrained expansion of a fluid
10. fluid flow through valves and porous plugs (throttling)
11. spontaneous chemical reactions
12. mixing of dissimilar gases or liquids
13. osmosis
14. dissolving of one phase into another phase
15. mixing of identical fluids initially at different pressures and temperatures

Notice that all of these effects are within your everyday experience and that they cover a range of physical and chemical effects. By carefully examining a system for any of these dissipative effects it is possible to determine whether a system is internally irreversible or internally reversible.

$${ }^{1}$$ Adapted from K. Wark and D. Richards, Thermodynamics, 6th ed., McGraw-Hill, Inc., New York, 1999.

### Second Law of Thermodynamics

Sadi Carnot laid the groundwork for the Second Law of Thermodynamics in the early 1800s by studying the performance of steam engines. His thoughts have come down to us in the form of two statements about the performance of power cycles, referred to as the Carnot Principles:

• Principle I — The thermal efficiency of an internally irreversible power cycle is always less than the thermal efficiency of an internally reversible power cycle that transfers energy by heat transfer at the same boundary temperatures.
• Principle II — All internally reversible power cycles that transfer energy by heat transfer at the same boundary temperatures have the same thermal efficiency.

Although he did not give a complete statement of the Second Law of Thermodynamics, his understanding of reversible and irreversible processes was crucial. Both of these statements can be shown to be straightforward consequences of applying the entropy accounting equation that we are formulating.

We will make an axiomatic statement of the Second Law of Thermodynamics. As with the other fundamental laws of physics, the Second Law cannot be proven from more fundamental principles; it embodies the collective experience and wisdom of scientists and engineers. Our statement of the Second Law of Thermodynamics consists of the following three parts.

1. There exists an extensive property called entropy, $$S$$.
2. Entropy is transported across the boundaries of a closed system by heat transfer. The entropy transport rate with heat transfer, $$\dot{S}_{Q}$$, at a boundary is defined by the equation: $\dot{S}_{Q} \equiv \frac{\dot{Q}_{j}}{T_{b, \ j}} \nonumber$ where $$\dot{Q}_{j}$$ is the heat transfer rate at boundary $$j$$ and $$T_{b, \ j}$$ is the thermodynamic temperature of the boundary surface $$j$$.
3. Entropy can only be produced, never destroyed; only in the limit of an internally reversible process is the entropy production rate zero: $\begin{array}{c} \dot{S}_{gen} \geq 0 \\ \text { where }\left\{\begin{array}{l} \dot{S}_{gen}>0 \text { for an internally } \textit{irreversible} \text{ process } \\ \dot{S}_{gen}=0 \text { for an internally } \textit{reversible} \text{ process } \end{array}\right. \end{array} \nonumber$

#### Entropy

Entropy is an extensive property of a system that can be produced and is transported in a manner consistent with the Second Law of Thermodynamics. The dimensions of entropy are $$[ \text{Energy} ]/[ \text{Temperature} ]$$. Typical units for entropy are $$\mathrm{kJ} / \mathrm{K}$$ in SI and $$\mathrm{Btu} /{ }^{\circ} \mathrm{R}$$ in USCS.

Our study of entropy will lead to a general accounting principle that is helpful to engineers in a number of ways:

• Provides a way to establish a "thermodynamic" temperature scale that is independent of the specific thermometer used to measure temperature.
• Provides a way to determine, given a specific system, which of the many possible processes that satisfy the conservation of energy are in fact possible.
• Provides criteria for the theoretical "best" performance against which real systems can be compared.
• Provides a way to assess the usefulness (quality) of energy.
• Provides additional information to relate and predict the thermophysical properties of a substance, e.g. $$u$$, $$h$$, $$v$$, $$T$$, $$P$$ and $$s$$.

The study of the property entropy is intimately related to the study of what processes are possible and preferred and how systems can evolve with time.

## 8.1.2 How can entropy be stored in a system?

Entropy is stored with mass. The entropy of a system is calculated as follows: $S_{sys}=\int\limits_{V\kern-0.5em\raise0.3ex- _{sys}} s \rho \ d V\kern-0.8em\raise0.3ex- \nonumber$ where $$s$$ is the specific entropy, the entropy per unit mass. If the specific entropy is spatially uniform inside the system, this integral simplifies as follows:

$S_{sys} = \int\limits_{V\kern-0.5em\raise0.3ex- _{sys}} s \rho \ d V\kern-1.0em\raise0.3ex- = s \underbrace{ \int\limits_{V\kern-0.5em\raise0.3ex-_{sys}} \rho \ d V\kern-0.8em\raise0.3ex- }_{m_{sys}} = m_{sys} s \quad\quad \begin{array}{c} \text{Spatially-uniform} \\ \text{specific entropy } s \end{array} \nonumber$

The dimensions on specific entropy are $$[ \text{Entropy} ]/[ \text{Mass} ]$$. Typical units for specific entropy are $$\mathrm{kJ} /(\mathrm{kg} \cdot \mathrm{K})$$ in $$\mathrm{SI}$$ and $$\mathrm{Btu} /\left(\mathrm{lbm} \cdot{ }^{\circ} \mathrm{R}\right)$$ in USCS.
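The uniform-property simplification above reduces a volume integral to a single product, $S_{sys} = m_{sys}\, s$. The following sketch shows the arithmetic; the mass and specific-entropy values are illustrative numbers, not taken from the text.

```python
# Entropy stored in a system with spatially uniform specific entropy:
# S_sys = m_sys * s. The values below are illustrative only.

m_sys = 2.0        # system mass, kg (assumed)
s = 6.586          # specific entropy, kJ/(kg*K) (assumed)

S_sys = m_sys * s  # system entropy, kJ/K
print(f"S_sys = {S_sys:.3f} kJ/K")  # S_sys = 13.172 kJ/K
```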

## 8.1.3 How can entropy be transported?

Entropy can be transported across the boundary of a system by two different mechanisms—heat transfer and mass transfer.

### Entropy transport by heat transfer

As stated in the Second Law of Thermodynamics, entropy is carried with heat transfer and the rate of transfer is defined as follows: $\dot{S}_{Q} \equiv \frac{\dot{Q}_{j}}{T_{b, \ j}} \quad\quad \begin{array}{c} \text { Heat Transfer Rate } \\ \text { of Entropy at Surface } j \end{array} \nonumber$ where $$\dot{Q}_{j}$$ is the heat transfer rate at boundary $$\mathrm{j}$$ and $$T_{b, \ j}$$ is the thermodynamic temperature of the boundary surface $$j$$. The net transport of entropy into a system by heat transfer at $$N$$ surfaces is the sum of the heat transfer rates of entropy at all the boundary surfaces: $\dot{S}_{Q, \text { net in}}=\sum_{j=1}^{N} \frac{\dot{Q}_{j}}{T_{b, \ j}} \nonumber$

Figure $$\PageIndex{1}$$: Net rate of entropy transport with heat transfer for a system with heat transfer at four different boundary temperatures $\dot{S}_{Q, \text { net in }}=\frac{\dot{Q}_{1, \text { in}}}{T_{b, \ 1}}-\frac{\dot{Q}_{2, \text { out}}}{T_{b, \ 2}}+\frac{\dot{Q}_{3, \text { in}}}{T_{b, \ 3}}-\frac{\dot{Q}_{4, \text { out}}}{T_{b, \ 4}} \nonumber$

Figure $$\PageIndex{1}$$ shows an example of a system with four different heat transfers of entropy, each occurring at a different boundary temperature. This sign convention for heat transfer of entropy is the same as for heat transfer of energy.

The dimensions on the heat transfer rate of entropy are $$[ \text{Energy} ][ \text{Time} ]^{-1}[\text {Temperature}]^{-1}$$. Typical units are $$\mathrm{kJ} /(\mathrm{s} \cdot \mathrm{K})$$ or $$\mathrm{kW} / \mathrm{K}$$ in SI and $$\mathrm{Btu} /\left(\mathrm{s} \cdot { }^{\circ} \mathrm{R}\right)$$ in USCS. Note that the temperature of the boundary where the heat transfer occurs must be measured in absolute units, $$K$$ or $${ }^{\circ} \mathrm{R}$$. (The precise definition of what we mean by a "thermodynamic" or "absolute" temperature will be addressed shortly.)
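The sum $\dot{S}_{Q,\text{net in}} = \sum_j \dot{Q}_j / T_{b,j}$ can be evaluated directly once each heat transfer rate carries a sign (positive into the system). The sketch below mirrors the four-boundary arrangement of Figure $$\PageIndex{1}$$; all numbers are made-up illustrative values.

```python
# Net entropy transport rate by heat transfer at N boundary surfaces:
# S_dot_Q_net_in = sum(Q_dot_j / T_b_j), with heat transfer *into* the
# system counted positive. Transfers 1 and 3 are in, 2 and 4 are out,
# as in Figure 8.1.1. All values are illustrative.

Q_dot = [10.0, -4.0, 6.0, -8.0]       # heat transfer rates, kW (+ in, - out)
T_b = [500.0, 320.0, 400.0, 300.0]    # absolute boundary temperatures, K

S_dot_Q_net_in = sum(Q / T for Q, T in zip(Q_dot, T_b))  # kW/K
print(f"S_dot_Q_net_in = {S_dot_Q_net_in:.4f} kW/K")
```

Note that a negative result is perfectly legitimate: heat transfer can carry entropy out of a system faster than it carries entropy in.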

### Entropy transport by mass flow

As our previous experience with other extensive properties has shown, any mass that crosses the boundary of a system carries with it extensive properties. Entropy is no exception. The rate at which entropy is transported across a boundary by mass flow is the product of the mass flow rate and the specific entropy, $$s$$, of the mass at the boundary: $\dot{S}_{\text {mass flow}}=\dot{m} s \nonumber$ The net rate at which entropy is carried into a system by mass flow is $\dot{S}_{\text {mass flow, net in}} = \sum_{\text {in}} \dot{m}_{i} s_{i} - \sum_{\text {out}} \dot{m}_{e} s_{e} \nonumber$ The dimensions and units of the rate of entropy transport with mass flow are the same as those for the rate of entropy transport with heat transfer.
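The inlet and outlet sums above translate directly into code. This sketch evaluates $\dot{S}_{\text{mass flow, net in}}$ for a hypothetical one-inlet, two-outlet system; the flow rates and specific entropies are assumed values chosen so that mass is conserved.

```python
# Net entropy transport rate by mass flow:
# S_dot_mass_net_in = sum_in(m_dot_i * s_i) - sum_out(m_dot_e * s_e).
# All numbers are illustrative; note 1.5 kg/s enters and 1.5 kg/s leaves.

inlets = [(1.5, 1.307)]                  # (m_dot kg/s, s kJ/(kg*K)) pairs
outlets = [(1.0, 7.359), (0.5, 0.649)]   # (m_dot kg/s, s kJ/(kg*K)) pairs

S_dot_in = sum(m * s for m, s in inlets)     # entropy carried in, kW/K
S_dot_out = sum(m * s for m, s in outlets)   # entropy carried out, kW/K
S_dot_mass_net_in = S_dot_in - S_dot_out     # kW/K
print(f"S_dot_mass_net_in = {S_dot_mass_net_in:.3f} kW/K")
```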

## 8.1.4 How can entropy be generated or consumed?

Based on the Second Law of Thermodynamics, we say that entropy can only be produced within a system; only in the limit of an internally reversible process is entropy conserved. This is a very important result and gives the entropy accounting principle its power: $\begin{array}{c} \dot{S}_{gen} \geq 0 \\ \text { where }\left\{\begin{array}{c} \dot{S}_{gen}>0 \text { for an internally } \textit{irreversible} \text{ process } \\ \dot{S}_{gen}=0 \text { for an internally } \textit{reversible} \text{ process } \end{array}\right. \end{array} \nonumber$ Experience has shown us that the entropy production term is always greater than or equal to zero. The presence of any irreversibility within the system results in entropy production during the process. Experience has also shown that entropy is produced in every real process. Thus an internally reversible process can be viewed as a limiting and ideal process that can only be approached.

Internally reversible processes (any process where $$\dot{S}_{gen} \equiv 0$$) play an important role in the design and analysis of real systems. First, they serve as examples of the theoretical "best" performance possible. Second, they often represent the only conditions under which we can actually carry out the calculations needed to predict the behavior of a system. Third, when combined with experimentally determined "correction factors," the internally reversible process plays a central role in predicting the actual behavior of real systems.
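A classic illustration of $\dot{S}_{gen} > 0$ is heat transfer across a finite temperature difference (irreversibility 5 in the list above). For a steady-state closed system that receives $\dot{Q}$ at a hot boundary and rejects the same $\dot{Q}$ at a cold boundary, the entropy balance gives $0 = \dot{Q}/T_{hot} - \dot{Q}/T_{cold} + \dot{S}_{gen}$, so $\dot{S}_{gen} = \dot{Q}\,(1/T_{cold} - 1/T_{hot})$, which is positive whenever $T_{hot} > T_{cold}$. The sketch below checks this with assumed numbers.

```python
# Entropy production for steady heat transfer across a finite temperature
# difference. At steady state with no mass flow:
#   0 = Q_dot/T_hot - Q_dot/T_cold + S_dot_gen
# => S_dot_gen = Q_dot * (1/T_cold - 1/T_hot) > 0 when T_hot > T_cold.
# The numbers are illustrative.

Q_dot = 1.0     # kW, transferred from the hot boundary to the cold one
T_hot = 500.0   # K, temperature where heat enters the system
T_cold = 300.0  # K, temperature where heat leaves the system

S_dot_gen = Q_dot * (1.0 / T_cold - 1.0 / T_hot)  # kW/K
assert S_dot_gen > 0  # consistent with the Second Law
print(f"S_dot_gen = {S_dot_gen:.6f} kW/K")
```

Running the temperatures the other way (heat flowing spontaneously from cold to hot) would make $\dot{S}_{gen}$ negative, flagging the process as impossible, in agreement with the Clausius statement.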

## 8.1.5 Putting it all together — Entropy Accounting Equation

Applying the accounting framework to entropy, we know that $\frac{d S_{sys}}{dt} = \dot{S}_{Q, \text { net in}} + \dot{S}_{\text {mass flow, net in}} + \dot{S}_{gen} \nonumber$ Now collecting all of the results developed above, we have the rate form of the entropy accounting equation: $\frac{d S_{sys}}{dt} = \sum_{j=1}^{N} \frac{\dot{Q}_{j}}{T_{b, \ j}} + \sum_{\text {in}} \dot{m}_{i} s_{i} - \sum_{\text {out}} \dot{m}_{e} s_{e}+\dot{S}_{gen} \nonumber$ where $$\dot{S}_{gen} \geq 0$$ and $$\dot{S}_{gen}=0$$ for an internally reversible process.

In words, the rate form of the entropy accounting equation says the time rate of change of the entropy of the system equals the net rate of entropy transport into the system with heat transfer plus the net rate of entropy transport into the system with mass flow plus the rate of entropy generation (or production).
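One common use of the accounting equation is to solve it for $\dot{S}_{gen}$ and check its sign: a proposed process is possible only if $\dot{S}_{gen} \geq 0$. The helper function and the steady-state example below are a sketch of that workflow; the function name and all numerical inputs are assumptions made for illustration.

```python
# Rate-form entropy accounting equation rearranged for the generation rate:
#   S_dot_gen = dS_sys/dt - sum_j(Q_dot_j/T_b_j)
#               - sum_in(m_dot*s) + sum_out(m_dot*s)
# A proposed process is possible only if S_dot_gen >= 0.

def entropy_generation_rate(heat, inlets, outlets, dS_dt=0.0):
    """heat: list of (Q_dot kW, T_b K), Q_dot positive into the system;
    inlets/outlets: lists of (m_dot kg/s, s kJ/(kg*K)) pairs."""
    S_dot_Q = sum(Q / T for Q, T in heat)
    S_dot_in = sum(m * s for m, s in inlets)
    S_dot_out = sum(m * s for m, s in outlets)
    return dS_dt - S_dot_Q - S_dot_in + S_dot_out

# Hypothetical steady-state device: one inlet, one outlet, heat loss.
S_gen = entropy_generation_rate(
    heat=[(-5.0, 350.0)],   # 5 kW heat *loss* at a 350 K boundary
    inlets=[(2.0, 6.8)],    # 2 kg/s entering at s = 6.8 kJ/(kg*K)
    outlets=[(2.0, 6.9)],   # 2 kg/s leaving at s = 6.9 kJ/(kg*K)
)
print(f"S_dot_gen = {S_gen:.4f} kW/K")  # positive => process is possible
```

If the computed $\dot{S}_{gen}$ had come out negative, we would conclude that the proposed combination of heat transfer and flow states violates the Second Law, no matter how well it satisfies conservation of energy.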

This page titled 8.1: Four Questions is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Donald E. Richards (Rose-Hulman Scholar) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.