E Cunningham

asked on February 1, 2026

Entropy and why it increases

What is entropy and why does it always increase?


Expert Answer

Answered on March 2, 2026 by EXPERT TUTOR


Dear E Cunningham,

Entropy is a measure of the number of possible microscopic arrangements, or microstates, available to a system. In an isolated system it always increases because disordered arrangements vastly outnumber ordered ones: nature is simply far more likely to move toward states that can be achieved in more ways. This is the essence of the Second Law of Thermodynamics.

Understanding Entropy and Why It Always Increases

The Intuitive Picture: Disorder and Probability

Think about a new deck of cards fresh from the box — perfectly ordered by suit and rank. The moment you start shuffling, the deck rapidly becomes disordered. You could shuffle a million times and almost never return to that perfectly ordered state. This isn’t magic; it’s pure probability. There is only one perfectly ordered arrangement, but there are millions upon millions of disordered ones. The deck drifts toward disorder simply because disordered states are overwhelmingly more probable.

That, in essence, is entropy. In physics, we define entropy not as "mess" but as the count of how many microscopic configurations, called microstates, correspond to the same macroscopic appearance of a system. A high-entropy state is one that can be realised in an enormous number of ways.
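You can make this counting argument concrete with a toy model (an illustrative sketch, not part of the original answer): for 100 coins, a "macrostate" is just the number of heads, while a "microstate" is a specific head/tail sequence. There is exactly one way to get all tails, but an astronomical number of ways to get a 50/50 split:

```python
from math import comb  # exact binomial coefficients

# Toy model: 100 coins. Count the microstates (sequences)
# belonging to two different macrostates (head counts).
N = 100
ordered = comb(N, 0)        # all tails: exactly 1 arrangement
balanced = comb(N, N // 2)  # 50 heads / 50 tails

print(ordered)    # 1
print(balanced)   # ~1.0e29 distinct arrangements
```

A shuffled system wanders among microstates at random, so it spends essentially all of its time in the macrostates with the most arrangements. That is why the deck never un-shuffles itself.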

The Statistical Definition: Boltzmann’s Formula

The most fundamental definition of entropy comes from Ludwig Boltzmann:

S = kB ln(Ω)

  • S — entropy of the system, measured in joules per kelvin (J K−1)
  • kB — Boltzmann’s constant, 1.38 × 10−23 J K−1
  • Ω (omega) — the number of distinct microstates consistent with the current macrostate
  • ln — the natural logarithm

When Ω is large — meaning the system can be arranged in many ways — entropy is high. When Ω is small — as in a highly ordered, constrained state — entropy is low. Because nature evolves toward the most probable configuration, and because the most probable configuration has the largest Ω, entropy increases spontaneously.
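Boltzmann's formula is easy to evaluate directly. Here is a minimal Python sketch (the function name is my own; the constant is the exact 2019 SI value, consistent with the rounded 1.38 × 10−23 J K−1 above):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(omega: float) -> float:
    """Entropy S = k_B * ln(Omega) for a macrostate with Omega microstates."""
    return k_B * math.log(omega)

# A uniquely ordered state (Omega = 1) has zero entropy;
# a state realisable in many ways has more.
print(boltzmann_entropy(1))      # 0.0
print(boltzmann_entropy(1e29))   # larger Omega -> larger S
```

Note how the logarithm tames the huge numbers: even Ω = 1e29 gives an entropy of only about 9 × 10−22 J K−1, because kB is so small.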

The Thermodynamic Definition

In classical thermodynamics, the entropy change for a reversible transfer of heat at a given temperature is defined as:

ΔS = Q / T

  • ΔS — change in entropy (J K−1)
  • Q — heat added to the system (J)
  • T — absolute temperature in kelvin (K)

For example, if 600 J of heat flows into a reservoir held at 300 K, the entropy change is:

ΔS = 600 / 300 = 2 J K−1

Notice that dividing by temperature makes sense intuitively: the same amount of heat is a proportionally larger addition to a cold system (low T), which has little thermal energy to begin with, so it produces a larger entropy jump than it would in a hot system already buzzing with energy.

The Second Law of Thermodynamics

The Second Law of Thermodynamics states formally that the total entropy of an isolated system never decreases over time:

ΔS_universe ≥ 0

The equality holds only for perfectly reversible processes, which are idealisations. All real processes are irreversible, and they always produce a net positive entropy increase in the universe. In my experience tutoring physics and mathematics, one of the most common points of confusion is students thinking entropy must increase everywhere. It doesn't: a refrigerator decreases entropy inside it, but only by increasing entropy in the surrounding room by an even larger amount. The total always rises.

Why Can’t Entropy Decrease?

The answer is statistical. Imagine releasing a drop of ink into a glass of water. The ink spreads throughout the glass and never spontaneously collects back into a drop. Both the spread-out state and the concentrated drop are physically allowed by Newton’s laws — there is no force preventing the ink from re-concentrating. The reason it never does is that there are an astronomically larger number of arrangements where ink is spread out than arrangements where it is concentrated. The system moves toward the high-Ω state and essentially never returns.

This is also why time has a direction — often called the arrow of time. Entropy increasing is what makes the future different from the past at a macroscopic level. You can find a rich treatment of this connection at the Britannica entry on entropy.

A Quick Numerical Example

Consider a hot block at 500 K losing 1000 J of heat to a cold reservoir at 250 K.

  • Entropy lost by hot block: ΔS_hot = −1000 / 500 = −2 J K−1
  • Entropy gained by cold reservoir: ΔS_cold = +1000 / 250 = +4 J K−1
  • Net entropy change: ΔS_universe = −2 + 4 = +2 J K−1

The universe’s entropy has increased by 2 J K−1, exactly as the Second Law demands.
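The same bookkeeping can be verified numerically (a sketch of the worked example above, using the plain Q/T relation):

```python
def entropy_change(q_joules: float, t_kelvin: float) -> float:
    """Entropy change dS = Q / T for heat Q at temperature T."""
    return q_joules / t_kelvin

Q = 1000.0                    # heat transferred, in joules
T_hot, T_cold = 500.0, 250.0  # temperatures in kelvin

dS_hot = entropy_change(-Q, T_hot)    # hot block loses heat
dS_cold = entropy_change(+Q, T_cold)  # cold reservoir gains it
dS_total = dS_hot + dS_cold

print(dS_hot, dS_cold, dS_total)   # -2.0 4.0 2.0
```

The hot block's loss is more than offset by the cold reservoir's gain precisely because the same Q is divided by a smaller T on the receiving end.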

Process               | Entropy of system   | Entropy of surroundings | Total
Ice melting at 0 °C   | Increases           | Decreases               | ≥ 0
Refrigerator running  | Decreases (inside)  | Increases (room)        | ≥ 0
Gas expanding freely  | Increases           | No change               | > 0

Common Mistakes Students Make with Entropy

Mistake: Believing entropy always increases for every individual object or subsystem.
Fix: Remember the Second Law applies to the total entropy of the universe. A subsystem’s entropy can decrease provided the surroundings gain more entropy to compensate.

Mistake: Confusing entropy with energy or thinking high entropy means high energy.
Fix: Entropy is about the number of microstates, not energy content. A cold gas can have high entropy, and a hot ordered crystal can have relatively low entropy.

Mistake: Applying ΔS = Q/T without checking that the process is reversible or isothermal.
Fix: The formula ΔS = Q/T is exact only for reversible processes. For irreversible processes, the actual entropy change is strictly greater: ΔS > Q/T.
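To see that third fix concretely, consider the standard textbook case of an irreversible free expansion (an illustrative example, not from the original answer): an ideal gas doubles its volume into a vacuum with Q = 0, yet its entropy rises by nR ln(V2/V1), so ΔS > Q/T:

```python
import math

R = 8.314  # ideal gas constant, J/(mol K)

def free_expansion_entropy(n_mol: float, v_ratio: float) -> float:
    """Entropy change for free expansion of an ideal gas: dS = n R ln(V2/V1)."""
    return n_mol * R * math.log(v_ratio)

# 1 mol of gas doubles its volume with no heat exchanged (Q = 0):
dS = free_expansion_entropy(1.0, 2.0)
print(round(dS, 2))   # ~5.76 J/K, while Q/T = 0, so dS > Q/T
```

The entropy still increases because the gas gains access to far more spatial microstates, even though no heat flows; Q/T only captures the change when the path is reversible.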

Exam Relevance: Entropy and the Second Law of Thermodynamics appear in IB Physics HL, A/AS Level Physics (9702), AP Physics 2, and GRE Physics. Questions range from calculating ΔS using Q/T to explaining irreversibility and the arrow of time conceptually.

Pro Tip from Ashish S: When entropy questions feel abstract, always ask yourself: “How many ways can this happen?” More ways means higher entropy — that single question unlocks almost every entropy problem.
