# Entropy


Norbert Wiener, *The Human Use of Human Beings: Cybernetics and Society*

**Entropy** is an important concept of cybernetics and information theory.

In general, it denotes unavailable energy or disorder. In the thermodynamic sense, entropy is a measure of the energy in a system that is unavailable for doing work; in the statistical sense, it denotes variation, dispersion, or diversity.

Formally, both types of entropy share the same mathematical form: each is expressed through the logarithm of a probability (or of a count of equally likely states):

Thermodynamic entropy

*S* = *k* log *W*

is a function of the number *W* of microstates compatible with the macroscopic state (the "thermodynamic probability"), with *k* being Boltzmann's constant.
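As a numerical illustration of Boltzmann's formula, the sketch below evaluates *S* = *k* log *W* directly (the function name is ours, not standard):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact value in the 2019 SI)

def boltzmann_entropy(W):
    """Thermodynamic entropy S = k log W for W equally likely microstates."""
    return k_B * math.log(W)

# A single microstate (perfect order) has zero entropy.
print(boltzmann_entropy(1))  # 0.0
# Doubling the number of microstates adds k*ln(2) of entropy.
print(boltzmann_entropy(2) - boltzmann_entropy(1))  # ≈ 9.57e-24 J/K
```

The additivity visible here (log of a product of state counts equals a sum of entropies) is what makes the logarithm the natural choice in both the thermodynamic and the information-theoretic definition.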

The entropy of an information stream is given by Shannon's equation

*H* = −∑ *p* log *p*

where *p* denotes the probability of the associated event, the sum runs over all possible events, and *H* is referred to as the **entropy** of the information source.
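Shannon's formula can be sketched in a few lines of Python (the function name and the base-2 convention, which measures entropy in bits, are our choices for illustration):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution.

    Events with zero probability contribute nothing, since p log p -> 0
    as p -> 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

The uniform distribution maximizes *H* for a given number of events, which matches the reading of entropy as disorder or uncertainty: the more evenly the probability is spread, the less predictable the source.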