
More about entropy — reality vibrates

I studied thermodynamics as an undergrad, and online resources have really helped refresh my understanding. So here’s a summary of my notes about entropy; hopefully they agree with how the term is used in physics.

Statements

Equation:  S = k ln Ω, where k is the Boltzmann constant and Ω is the number of microstates consistent with the system’s macrostate.
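To put numbers on this, here is a minimal sketch (the two-state toy system and all values are my own illustration, not from the sources below): for N independent two-state particles, Ω = 2^N, so S = N k ln 2.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(omega: float) -> float:
    """S = k ln(Omega), for Omega equally probable microstates."""
    return k_B * math.log(omega)

print(boltzmann_entropy(2))        # one two-state particle: k ln 2

# For N two-state particles, Omega = 2^N, so S = N k ln 2.
# Compute ln(Omega) = N ln 2 directly rather than forming a huge Omega.
for n in (10, 100, 6.022e23):      # the last is roughly Avogadro's number
    s = k_B * n * math.log(2)
    print(f"N = {n:.3g} -> S = {s:.3e} J/K")
```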

[Wiki] “The second law of thermodynamics states that an isolated system’s entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium …”

Sean Carroll: “Entropy increases simply because there are more ways to be high-entropy than low-entropy.”

Huh? Well, let’s unpack that “intuitively obvious” moment — as we said in physics lectures when (as a prank) “snow” rained down on the speaker.
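Before the context notes, here is a toy illustration of Carroll’s point (the two-half-box setup and the numbers are my own, not his): counting how many microstates realize each macrostate shows why “more ways” wins.

```python
from math import comb

# Macrostate: how many of N particles sit in the left half of a box.
# Microstate: which specific particles those are. Multiplicity = C(N, n_left).
N = 50
total = 2 ** N                                   # all microstates
for n_left in (0, 10, 25, 40, 50):
    ways = comb(N, n_left)
    print(f"{n_left:2d} on the left: {ways:.3e} microstates, "
          f"{ways / total:.2%} of the total")
# The balanced (high-entropy) macrostate at 25/25 has ~1.26e14 microstates;
# the all-on-one-side (low-entropy) macrostate has exactly 1.
```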

Context

0. The system under investigation is carefully defined so as to satisfy the following characteristics:

1. The system is enclosed by “walls” that bound and connect it to its surroundings. The walls may be physical or conceptual. The type of wall determines whether the system is considered isolated (idealized) or closed or open.

2. Entropy is a measure (a function of state) of a macroscopic system containing microscopic constituents (atoms, molecules). “Unlike many other functions of state, entropy cannot be directly observed but must be calculated.”

3. The details of the system’s constituents are not directly considered — their behavior is described by macroscopically averaged properties, e.g. temperature, pressure, … [1]

4. These constituents are constantly in “motion” — that is, they fluctuate dynamically and continuously in space-time (obeying the uncertainty principle).

5. The properties of the system assume equilibrium — the state of the system is considered uniform throughout (as expressed, for example, by the equal a priori probability postulate). And “the time-courses of processes are deliberately ignored” (except sometimes for a conceptual model which employs mathematical smoothing, such as differential equations). [2]

[1] Statistical thermodynamics extends classical thermodynamics using methods (models) which permit properties to be determined without the need to actually measure them. In quantum mechanics, “a statistical ensemble … is represented by a density matrix. … In classical mechanics, an ensemble is represented by a joint probability density function.”

[2] Non-equilibrium thermodynamics is another branch, in which macroscopic properties vary locally within the system being studied; these properties (state variables [3]) are extrapolations of (or approximations to) the standard ones used to specify a system in thermodynamic equilibrium. This may require a broader conceptual framework (regarding domain of applicability), simplifying assumptions, and additional knowledge about the system; defining entropy at a point in time may also be problematic in macroscopic terms. The branch is particularly useful since “almost all systems found in nature are not in thermodynamic equilibrium.”

[3] “State functions do not depend on the path by which the system arrived at its present state.”

Additional notes regarding context

A. The [microscopic] particles [of a macroscopic system] are always in motion [“roaming freely”]. The particles go “wherever they want” (“… assuming that each of the microscopic configurations is equally probable.”) “What is forcing them to do this? Nothing — the particles simply move where they would like to move …”  (So an equally distributed case is simply statistically the most probable case. There can be small fluctuations but these average out over time.)

B. [Wiki] In classical thermodynamics (Boltzmann’s definition), the entropy of a system is defined only if it is in thermodynamic equilibrium. … In what has been called the fundamental assumption of statistical thermodynamics or the fundamental postulate in statistical mechanics, the occupation of any microstate is assumed to be equally probable (i.e. Pi = 1/Ω, where Ω is the number of microstates); this assumption is usually justified for an isolated system in equilibrium.

C. Terminology: a microstate is a specific way in which the energy of the system can be arranged (see Reference 1 below).
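Regarding note B, a small sketch of how the fundamental postulate ties the two entropy formulas together (the Gibbs form S = -k Σ Pi ln Pi is standard statistical mechanics; the code and numbers are mine): it reduces to Boltzmann’s S = k ln Ω exactly when every Pi = 1/Ω.

```python
import math

def gibbs_entropy(probs):
    """S = -sum(p_i ln p_i), in units of k."""
    return -sum(p * math.log(p) for p in probs if p > 0)

omega = 8
uniform = [1 / omega] * omega          # fundamental postulate: P_i = 1/Omega
print(gibbs_entropy(uniform))          # 2.0794... = ln(8)
print(math.log(omega))                 # Boltzmann's k ln(Omega), with k = 1

# Any non-uniform distribution over the same microstates has lower entropy:
skewed = [0.5] + [0.5 / 7] * 7
print(gibbs_entropy(skewed))           # ~1.666 < ln(8)
```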

References

1. See video at this link: https://ch301.cm.utexas.edu/thermo/index.php#second-law/microstates-boltzmann.html

An isolated system will spontaneously transition between states such that the entropy of the system is increased to its maximum value. Why is this? Is there some strange force pushing things to higher entropy states? The simple fact is that if a final state has higher entropy, it is simply more likely to exist from the myriad of possible states. These states contain distributions of molecules and energies that are the most probable. What distributions are the most probable? The ones with the greatest number of microstates. A microstate is a specific way in which we can arrange the energy of the system. Many microstates are indistinguishable from each other. The more indistinguishable microstates, the higher the entropy.

The odds [in the example presented] are the best for group ii above (6 out of 10), but not amazingly higher. However, as we move to systems with larger numbers (Avogadro’s number is very, very large), what we see is that the “most likely” configurations are overwhelmingly likely. That is, the most likely configurations are essentially the only configurations that will ever be seen. There are many very similar configurations around the average so we observe small fluctuations, but we never see the extreme “special” configurations.

… an “ordered” state is one with very little variability, possibly just a handful of microstates possible. In contrast, a “disordered” state is one with a multitude of possibilities (thousands upon thousands of microstates) – so much so that no apparent patterns are present and we perceive “disorder.” It is best to avoid the order/disorder arguments though. Order and disorder are value judgments that we humans impose on arrangements based on our perception. Entropy is a quantifiable measure of the dispersion [distribution] of energy and our personal perceptions have no place here.

The greater the number of microstates (Ω), the greater the entropy. Boltzmann found it difficult to explain such behavior to those who were not predisposed to the scientific rigor needed for such an explanation. So he “simplified” things by saying it’s like order and disorder. This was much easier for the layman and has been perpetuated ever since. The order/disorder perception of entropy must be discontinued if we are ever going to better understand entropy and the second law of thermodynamics. We must think in terms of energy dispersal: energy becomes more dispersed when more microstates are available.

[https://ch301.cm.utexas.edu/thermo/index.php#second-law/microstates-boltzmann.html]
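Two hypothetical sketches tying these excerpts together (the Einstein-solid model and the two-half-box numbers are my own choices, not the site’s): first counting Ω directly, then showing how the “most likely” configurations take over as N grows.

```python
from math import comb

# Part 1: counting microstates. As a toy model, arrange q indistinguishable
# energy quanta among N distinguishable oscillators (an "Einstein solid"):
# Omega = C(q + N - 1, q).
def microstates(N: int, q: int) -> int:
    return comb(q + N - 1, q)

for N, q in [(3, 3), (10, 10), (100, 100)]:
    print(f"N = {N:3d}, q = {q:3d} -> Omega = {microstates(N, q):.3e}")

# Part 2: why the "most likely" configurations dominate. For N particles,
# each in the left or right half of a box, count the fraction of all 2^N
# microstates whose left-half count is within 1% of an even split.
for N in (100, 1000, 10000):
    window = max(1, N // 100)
    near_even = sum(comb(N, k)
                    for k in range(N // 2 - window, N // 2 + window + 1))
    print(f"N = {N:6d}: {near_even / 2**N:.1%} within 1% of even")
# Roughly 24%, 49%, and 96%: the share grows toward certainty with N, and
# at Avogadro-scale N the most probable configurations are all we ever see.
```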

We can break down entropy change as resulting from changes in specific properties of our system. Changes in five things will lead to a change in the entropy of the system (a sketch of the first two, plus mixing, follows after this list):

  • Temperature
  • Volume
  • Phase
  • Mixing
  • Composition (chemistry)

[https://ch301.cm.utexas.edu/thermo/index.php#second-law/entropy-change.html]
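Here is the promised sketch of the first two items plus mixing, for an ideal gas (the formulas ΔS = n Cv ln(T2/T1) + n R ln(V2/V1) and ΔS_mix = -n R Σ x_i ln x_i are standard; the numbers are mine, and phase and composition changes, which need latent heats and reaction data, are omitted).

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_S_ideal(n, T1, T2, V1, V2, Cv):
    """Entropy change from temperature and volume changes of an ideal gas:
    dS = n Cv ln(T2/T1) + n R ln(V2/V1)."""
    return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

def delta_S_mixing(n_total, mole_fractions):
    """Ideal entropy of mixing: dS = -n R sum(x_i ln x_i)."""
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# 1 mol monatomic ideal gas (Cv = 3R/2), heated 300 -> 600 K, volume doubled:
print(delta_S_ideal(1.0, 300.0, 600.0, 1.0, 2.0, Cv=1.5 * R))  # ~+14.4 J/K
# Mixing equal amounts of two ideal gases (2 mol total, x = 0.5 each):
print(delta_S_mixing(2.0, [0.5, 0.5]))                          # ~+11.5 J/K
```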

2. “Entropy Explained, With Sheep – From Melting Ice Cubes to a Mystery About Time”

[https://aatishb.github.io/entropy/]


So there’s a deep mystery lurking behind our seemingly simple ice-melting puzzle. At the level of microscopic particles, nature doesn’t have a preference for doing things in one direction versus doing them in reverse. The atomic world is a two-way street.

And yet, for some reason, when we get to large collections of atoms, a one-way street emerges for the direction in which events take place, even though this wasn’t present at the microscopic level. An arrow of time emerges.

… there’s no new law of physics that ‘tells’ the energy to spread out, or the entropy to go up. There are simply more ways to spread energy out than to keep it contained, so that’s what we should expect to see happen. Higher entropy states are more probable than lower entropy ones.

When we get to an object as big as an ice cube in a glass of water, with something like 10^25 molecules, this entropy graph becomes incredibly sharply peaked, and you’re guaranteed to be right near the peak. The odds of seeing entropy decrease are effectively zero, not because any physical law compels it to be so, but because of sheer statistics — there are overwhelmingly more ways for the energy to be spread out than there are ways for the energy to be contained.
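A rough order-of-magnitude sketch of those “effectively zero” odds (a Gaussian tail estimate with my own numbers; prefactors ignored): for N particles split between two halves, counts fluctuate by about sqrt(N)/2, so a fractional excess f lies z = 2 f sqrt(N) standard deviations out, with probability roughly exp(-z^2/2).

```python
import math

def log10_tail(N: float, f: float) -> float:
    """log10 of exp(-z^2/2), the Gaussian tail estimate for a fractional
    excess f on one side of an N-particle system (z = 2 f sqrt(N))."""
    z = 2.0 * f * math.sqrt(N)
    return -(z * z / 2.0) / math.log(10)

for N in (1e2, 1e10, 1e25):
    print(f"N = {N:.0e}, 0.1% excess: P ~ 10^({log10_tail(N, 0.001):.3g})")
# At N = 1e2 such a tiny fluctuation is routine (P ~ 1); at N = 1e25 the
# exponent is about -8.7e18: not forbidden by any law, just statistically
# never going to happen.
```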

3 thoughts on “More about entropy — reality vibrates”

  1. https://en.wikipedia.org/wiki/Temperature#Plasma_physics
    The microscopic description in statistical mechanics is based on a model that analyzes a system into its fundamental particles of matter or into a set of classical or quantum-mechanical oscillators and considers the system as a statistical ensemble of microstates. As a collection of classical material particles, temperature is a measure of the mean energy of motion, called kinetic energy, of the particles, whether in solids, liquids, gases, or plasmas. … In this mechanical interpretation of thermal motion, the kinetic energies of material particles may reside in the velocity of the particles of their translational or vibrational motion or in the inertia of their rotational modes. … In condensed matter, and particularly in solids, this purely mechanical description is often less useful and the oscillator model provides a better description to account for quantum mechanical phenomena.
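    A quick numeric sketch of temperature as “a measure of the mean energy of motion” (standard kinetic-theory relations; the choice of nitrogen at room temperature is mine):

    ```python
    import math

    k_B = 1.380649e-23         # Boltzmann constant, J/K
    u = 1.66053906660e-27      # atomic mass unit, kg

    def mean_translational_ke(T):
        """Equipartition: <KE_trans> = (3/2) k_B T per particle."""
        return 1.5 * k_B * T

    def v_rms(T, mass_kg):
        """From (1/2) m v_rms^2 = (3/2) k_B T."""
        return math.sqrt(3.0 * k_B * T / mass_kg)

    T = 300.0                  # room temperature, K
    m_N2 = 28.0 * u            # molecular nitrogen
    print(f"<KE> = {mean_translational_ke(T):.3e} J")   # ~6.2e-21 J
    print(f"v_rms(N2) = {v_rms(T, m_N2):.0f} m/s")      # ~517 m/s
    ```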

  2. An interesting experiment with correlated quantum spin [1] — “How to Temporarily Undo the Universe’s Endless Chaos with Chloroform” posted today by Space.com — “showed that heat could briefly flow [locally] from a cold atom to a hot one inside a chloroform molecule.”

    … the heat flows against the normal current of entropy, but the correlation between the atoms that makes that paradoxical flow possible breaks down as the reverse flow happens.

    The counterintuitive heat flow in this experiment does violate the second law as it’s classically stated … A more complete statement of the second law goes like this: The known universe is so well-ordered, it’s overwhelmingly likely to tend toward disorder. …

    In other words, the universe is already so low in disorder that the tendency is toward more disorder, but a system without that precondition wouldn’t necessarily tend toward higher entropy. A 2008 arXiv paper examining how quantum correlations complicate entropy quotes the 19th-century physicist Ludwig Boltzmann, who stated that “the universe, considered as a mechanical system … started from a very improbable state, and is still in a very improbable state.”

    … the universe still tends toward chaos. The correlations between particles are temporary, the experimentalists wrote, and dissipate within milliseconds even as they enable these unusual heat flows.

    [1] “Two particles in a system can be correlated, meaning they share physical information — a narrower version of the effect that occurs during quantum entanglement — by aligning their spins.”

    Something further to ponder: There are quantum correlations which do not involve entanglement. So, what distinguishes entanglement is …

  3. Are there non-classical gases? Yep, quantum gases. Like the song, it’s a gas, gas, gas …

    • APS Physics > “From Quantum Quasiparticles to a Classical Gas” by Pietro Massignan (March 6, 2019) – An MIT group tracked the crossover from classical to quantum behavior in a homogeneous gas made of ultracold lithium atoms. These results will serve as a benchmark for future theory and experiments that explore the complex “boundary” between the quantum and classical regimes.

    Boltzmann gases all exhibit the same behavior, regardless of their atomic makeup (whether the atoms that form them are bosons or fermions, for example). But for quantum gases, composition matters. For example, collisions occur more frequently in a quantum gas made of identical bosons than they do in a classical gas, while collisions in gases made of identical fermions (Fermi gases) are suppressed. This behavior is a clear manifestation of Pauli’s exclusion principle, which states that no two fermions can occupy the same quantum state.

    Kotecha explains that the mathematical variables required for a hydrodynamical description of GFT condensate cosmology have statistical and thermodynamical origins. Examples of these include the number density and temperature fields of a fluid. For instance, the fluid may be colder and denser in one part than the other; in order to describe the dynamics of such systems as they evolve, one first has to be able to define these differences. This is where statistical mechanics and thermodynamics come in.
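    Returning to the boson/fermion contrast quoted above, a sketch of the three textbook occupation formulas (Maxwell-Boltzmann, Bose-Einstein, Fermi-Dirac; the sample parameters are mine):

    ```python
    import math

    def occupancy(x: float, kind: str) -> float:
        """Mean occupation of a state with (eps - mu)/kT = x.
        MB: exp(-x); BE: 1/(exp(x) - 1); FD: 1/(exp(x) + 1)."""
        if kind == "MB":
            return math.exp(-x)
        sign = -1.0 if kind == "BE" else 1.0
        return 1.0 / (math.exp(x) + sign)

    for x in (0.5, 1.0, 3.0):
        mb, be, fd = (occupancy(x, k) for k in ("MB", "BE", "FD"))
        print(f"(eps-mu)/kT = {x}: MB = {mb:.3f}, BE = {be:.3f}, FD = {fd:.3f}")
    # Bosons pile up (BE > MB) while fermions are suppressed (FD < MB),
    # echoing the enhanced/suppressed collision rates in the quoted passage.
    ```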
