The term information covers signs, signals, and messages, together with their syntactic, semantic, and pragmatic aspects.

The uncertainty (and thus the information content) of a random event $i$ can be quantified as the negative dual logarithm of its probability $p_i$:

$$I_i = -\operatorname{ld} p_i$$

where $\operatorname{ld}$ denotes the dual (base-2) logarithm, $\operatorname{ld} M = \lg M / \lg 2 = \ln M / \ln 2$.
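A minimal sketch of this definition in Python (the helper name self_information is an illustrative choice, not from the source):

```python
import math

def self_information(p: float) -> float:
    """Information content I = -ld p of an event with probability p, in bits."""
    return -math.log2(p)

# An event with probability 1/8 carries ld 8 = 3 bits of information.
print(self_information(0.125))  # 3.0
```

The rarer the event, the larger its information content; a certain event ($p = 1$) carries none.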

The average information content of a stream produced by an ergodic source is given by Shannon's equation as

$$H = -K \sum_{i=1}^{n} p_i \operatorname{ld} p_i .$$

H is referred to as the entropy of the information source. The positive constant K merely fixes the unit of measure; with K = 1 and the dual logarithm, H is measured in bits per symbol.
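A minimal sketch of Shannon's equation, continuing in Python (the function name entropy and the default K = 1, which yields bits per symbol, are choices made here for illustration):

```python
import math

def entropy(probabilities, K=1.0):
    """Shannon entropy H = -K * sum(p_i * ld p_i).

    Zero-probability terms are skipped, following the usual
    convention that p * ld p -> 0 as p -> 0.
    """
    return -K * sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: 1 bit per symbol.
print(entropy([0.5, 0.5]))  # 1.0
# A biased source is more predictable, hence lower entropy.
print(entropy([0.9, 0.1]))  # ~0.469
```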
