
> Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Cliches, for example, are less illuminating than great poems.
>
> Norbert Wiener, *The Human Use of Human Beings: Cybernetics and Society*

Entropy is an important concept of cybernetics and information theory.

In general, it denotes unavailable energy or disorder. In the thermodynamic sense, entropy is a measure of the energy unavailable for doing work; in the statistical sense, it denotes variation, dispersion, or diversity.

Formally, the two notions of entropy have in common that both are expressed as the logarithm of a probability (or of a count of states):

Thermodynamic entropy

$S = k \log W$

is a function of $W$, the number of microscopic states consistent with the system's macroscopic state (its thermodynamic probability), with $k$ being Boltzmann's constant.
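As a sketch of the formula above (the constant value is the CODATA figure; the function name is illustrative, not from the source):

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA 2018 exact value).
k_B = 1.380649e-23

def thermodynamic_entropy(W: int) -> float:
    """S = k log W for a system with W equally likely microstates."""
    return k_B * math.log(W)

# A single microstate (W = 1) gives zero entropy, since log 1 = 0;
# doubling W adds a fixed increment of k_B * ln(2) to S.
s_one = thermodynamic_entropy(1)
delta = thermodynamic_entropy(2) - thermodynamic_entropy(1)
```

Note that the additive behavior under doubling is what makes the logarithm the natural choice: entropies of independent subsystems add while their state counts multiply.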

The entropy of an information stream is given by Shannon's equation

$H = -K\sum\limits_{i = 1}^n p_i \log_2 p_i$

where $p_i$ denotes the probability of the $i$-th message, $K$ is a positive constant, and $H$ is referred to as the entropy of the information source; with the base-2 logarithm, $H$ is measured in bits.
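Shannon's equation can be sketched directly (a minimal illustration with $K = 1$; the function name is an assumption, and zero-probability terms are skipped since they contribute nothing to the sum):

```python
import math

def shannon_entropy(probs, K=1.0):
    """H = -K * sum(p_i * log2(p_i)) over the message probabilities.

    Terms with p_i == 0 are skipped: p * log2(p) -> 0 as p -> 0.
    """
    return -K * sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
h_coin = shannon_entropy([0.5, 0.5])
# A certain event carries no information at all.
h_sure = shannon_entropy([1.0])
```

This mirrors Wiener's point in the epigraph: the highly probable message (`h_sure`) yields the least information, while the uniform distribution yields the most.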
