In thermodynamics and statistical physics, entropy is a quantitative measure of disorder, or equivalently of the energy in a system that is unavailable to do useful work.
In statistical physics, entropy is a measure of the disorder of a system. Here "disorder" refers to the number of microscopic configurations, W, that a thermodynamic system can occupy while in a state specified by certain macroscopic variables (volume, energy, pressure, and temperature). By "microscopic configurations", or microstates, we mean the exact states of all the molecules making up the system.
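To make the macrostate/microstate distinction concrete, here is a minimal sketch using a toy model that is not from the original text: N coins, where each exact head/tail arrangement is a microstate and the total number of heads plays the role of a macroscopic variable.

```python
from itertools import product
from collections import Counter

N = 4  # toy system of 4 two-state "molecules" (coins)

# Each microstate is one exact arrangement of heads (H) and tails (T);
# the macrostate keeps only the total number of heads.
counts = Counter(state.count("H") for state in map("".join, product("HT", repeat=N)))

for heads, W in sorted(counts.items()):
    print(f"macrostate: {heads} heads -> W = {W} microstates")
```

Note that the middle macrostate (2 heads out of 4) has the most microstates, W = 6: the most "disordered" macrostate is the one realizable in the most microscopic ways.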
Mathematically, the exact definition is:
Entropy = (Boltzmann's constant k_B) × natural logarithm of the number of possible microstates

S = k_B ln W
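The following sketch evaluates the formula numerically, assuming SI units; k_B is the exact SI value of Boltzmann's constant, and math.log is the natural logarithm.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact in SI)

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k_B ln W for a system with W equally likely microstates."""
    return k_B * math.log(W)

# W = 6, e.g. the "2 heads out of 4 coins" macrostate counted above
print(boltzmann_entropy(6))  # about 2.47e-23 J/K
```

The tiny result reflects the size of k_B: macroscopic entropies become appreciable only because real systems have astronomically many microstates.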
This equation, which relates the microscopic details, or microstates, of the system (via W) to its macroscopic state (via the entropy S), is the key idea of statistical mechanics. In a closed system, entropy never decreases (the second law of thermodynamics), so the entropy of the Universe as a whole is irreversibly increasing. In an open system (for example, a growing tree), entropy can decrease and order can increase, but only at the expense of an increase in entropy somewhere else (e.g. in the Sun).
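A standard textbook illustration of this irreversible increase, not taken from the original text, is the free expansion of an ideal gas: when the accessible volume doubles, each of the N molecules can be in twice as many places, so W is multiplied by 2^N and the entropy rises by N k_B ln 2. A short sketch for one mole:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number (molecules per mole)

# Free expansion of one mole of ideal gas into twice the volume:
# each molecule can be in twice as many places, so W multiplies by 2**N,
# and Delta S = k_B * ln(2**N) = N * k_B * ln 2 > 0.
delta_S = N_A * k_B * math.log(2)
print(f"Delta S = {delta_S:.2f} J/K")  # about +5.76 J/K
```

The change is strictly positive, and the reverse process (the gas spontaneously recollecting into half the volume) would require W, and hence S, to decrease, which is why it is never observed.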