# Information


The term **information** covers signs, signals and messages with their syntactic, semantic and pragmatic aspects.

The uncertainty, and thus the information content, of a random event i can be quantified as the negative logarithm of its probability:

I_i = - ld p_{i}

where ld denotes the dual (base-2) logarithm and p_{i} the probability of the associated event.
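As a minimal sketch, the self-information of an event can be computed directly from its probability; the function name below is illustrative, not taken from the article:

```python
import math

def self_information(p):
    """Self-information I = -ld(p), in bits, of an event with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log2(p)

# A fair coin toss (p = 0.5) carries exactly 1 bit of information.
print(self_information(0.5))   # 1.0
# A rarer event (p = 0.25) carries more information.
print(self_information(0.25))  # 2.0
```

Note that a certain event (p = 1) carries zero information, consistent with the formula.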

The average information content per symbol of the stream produced by an ergodic source — its entropy — is given by Shannon's equation:

H = - Σ_i p_{i} ld p_{i}

where the sum runs over all symbols the source can emit.
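Shannon's entropy formula can be sketched as follows, assuming a discrete memoryless source described by its symbol probabilities (the function name is a hypothetical choice for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * ld(p_i)), in bits per symbol.

    Terms with p = 0 contribute nothing to the sum (lim p*ld(p) = 0 as p -> 0),
    so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source has the maximum entropy of 1 bit per symbol.
print(entropy([0.5, 0.5]))        # 1.0
# A biased source is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))        # about 0.469
```

Entropy is maximal when all symbols are equally likely and drops toward zero as the source becomes more predictable.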