# Entropy


**Entropy** is an important concept of cybernetics and information theory.

In general, it denotes unavailable energy or disorder. In the thermodynamic sense, entropy measures the energy in a system that is unavailable for useful work; in the statistical sense, it denotes variation, dispersion, or diversity.

Formally, both types of entropy have in common that they are expressed as the logarithm of a probability:

Thermodynamic entropy

*S* = *k* log *W*

is a function of the thermodynamic probability *W* (the number of microstates consistent with the macroscopic state of the system), with *k* being Boltzmann's constant.
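As a minimal numerical sketch of Boltzmann's formula (assuming the natural logarithm and SI units; the function name is illustrative):

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value).
k = 1.380649e-23

def thermodynamic_entropy(W: int) -> float:
    """S = k ln W, for W equally probable microstates."""
    return k * math.log(W)

# A system with more accessible microstates has higher entropy.
assert thermodynamic_entropy(100) > thermodynamic_entropy(10)
```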

The entropy of an information stream is given with Shannon's equation

*H* = −*K* ∑_{*i* = 1}^{*n*} *p*_{i} log₂ *p*_{i},

where *p*_{i} denotes the probability of the *i*-th event, *K* is a positive constant fixing the unit of measure, and *H* is referred to as the **entropy** of the information source.
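Shannon's equation can be sketched as follows (a minimal illustration with *K* = 1, so that entropy is measured in bits; the function name is an assumption, not part of any standard library):

```python
import math

def shannon_entropy(probs, K=1.0):
    """H = -K * sum(p_i * log2(p_i)); terms with p_i = 0 contribute nothing."""
    return -K * sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of entropy per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```

Entropy is maximal when all *n* outcomes are equally likely (e.g., a uniform four-way choice yields 2 bits) and zero when one outcome is certain.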