

Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization.
In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the
negative logarithm of its probability. That is, the more probable the message, the less information it gives. Cliches, for
example, are less illuminating than great poems.

Norbert Wiener, The Human Use of Human Beings: Cybernetics and Society
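This relationship between probability and information can be made concrete: the self-information of a message is the negative base-2 logarithm of its probability. The following is a minimal sketch in Python; the two probabilities are chosen purely for illustration and are not from the source.

```python
import math

def self_information(p: float) -> float:
    """Information content of a message with probability p, in bits (-log2 p)."""
    return -math.log2(p)

# Assumed, illustrative probabilities: a common cliche vs. a rare, surprising message.
cliche_probability = 0.5
rare_probability = 0.001

print(f"cliche: {self_information(cliche_probability):.2f} bits")  # 1.00 bits
print(f"rare:   {self_information(rare_probability):.2f} bits")    # about 9.97 bits
```

The more probable message carries far fewer bits, which is exactly Wiener's point about clichés versus great poems.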


Entropy is an important concept in cybernetics and information theory.

In general, it denotes unavailable energy or disorder. In the thermodynamic sense, entropy measures the portion of a system's energy that is unavailable for doing work; in the statistical sense, it denotes variation, dispersion, or diversity.

Formally, the two types of entropy have in common that both are expressed through the logarithm of a probability:

Thermodynamic entropy

S = k log W

is a function of W, the number of microscopic states (the "thermodynamic probability") consistent with the macroscopic state, with k being Boltzmann's constant.
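As a rough numerical illustration of Boltzmann's formula, the sketch below plugs an arbitrary, assumed count of microstates into S = k log W; the value of W is not from the source.

```python
import math

BOLTZMANN_K = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(w: float) -> float:
    """Thermodynamic entropy S = k log W for W accessible microstates."""
    return BOLTZMANN_K * math.log(w)

# Assumed, purely illustrative number of microstates.
w = 1e23
print(f"S = {boltzmann_entropy(w):.3e} J/K")  # roughly 7.3e-22 J/K
```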

The entropy of an information stream is given by Shannon's equation

H = -K Σ p_i log₂ p_i   (summed over i = 1, …, n)

where p_i denotes the probability of the i-th event (message), K is a positive constant, and H is referred to as the entropy of the information source.
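A minimal sketch of Shannon's formula, assuming K = 1 and taking the logarithm to base 2 so that H is measured in bits; the example distributions are made up for illustration.

```python
import math

def shannon_entropy(probabilities: list[float], k: float = 1.0) -> float:
    """Entropy H = -K * sum(p_i * log2 p_i) of a discrete probability distribution."""
    return -k * sum(p * math.log2(p) for p in probabilities if p > 0)

# Assumed example distributions over four messages.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # 2.0 bits: maximal uncertainty for four messages
print(shannon_entropy(skewed))   # about 1.36 bits: more predictable, less information per message
```

The uniform distribution maximizes H, while the skewed one, being more predictable, carries less information per message, matching Wiener's observation above.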


References

  1. Web Dictionary of Cybernetics and Systems
