The term information covers signs, signals and messages with their syntactic, semantic and pragmatic aspects.

Uncertainty, and thus the information content of a random event i, may be quantitatively described as the negative logarithm of its probability:

Ii = −ld pi

where ld denotes the binary (base-2) logarithm and pi the probability of the associated event.
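The definition above can be sketched in a few lines of Python (the function name is illustrative, not part of any standard library):

```python
import math

def self_information(p):
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

# A fair coin toss (p = 0.5) carries exactly 1 bit of information:
print(self_information(0.5))   # → 1.0
# A rarer event (p = 0.25) carries more:
print(self_information(0.25))  # → 2.0
```

Note that rarer events carry more information, consistent with the negative logarithm: as pi decreases, −ld pi grows.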

The average information content per symbol of a stream produced by an ergodic source (its entropy) is given by Shannon's equation:

H = −Σi pi ld pi

where the sum runs over all symbols i of the source alphabet.
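Shannon's entropy formula, H = −Σ pi ld pi, can be computed directly; the sketch below assumes the probabilities are given as a list summing to 1 (the function name is illustrative):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p)).

    Terms with p == 0 are skipped, following the convention
    0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform source over 4 symbols yields 2 bits per symbol:
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A biased source yields less:
print(entropy([0.5, 0.25, 0.25]))         # → 1.5
```

Entropy is maximal when all symbols are equally likely and drops as the distribution becomes more predictable.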
