Entropy is an important concept of cybernetics and information theory.
In general, it denotes unavailable energy or disorder. In the thermodynamic sense, entropy is a measure of the energy unavailable for useful work; in the statistical sense it denotes variation, dispersion, or diversity.
Formally, both types of entropy have in common that they are expressed as a logarithm. Boltzmann's equation
S = k log W
gives the thermodynamic entropy as a function of W, the number of microstates compatible with the system's macrostate (the thermodynamic probability), with k being Boltzmann's constant.
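As a small illustrative sketch (not part of the dictionary entry), Boltzmann's formula can be evaluated directly; note that in physics the logarithm is the natural one, and the CODATA value of k is assumed here:

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA 2018 value).
K_B = 1.380649e-23

def boltzmann_entropy(microstates):
    """S = k log W, with W the number of microstates (natural log)."""
    if microstates < 1:
        raise ValueError("W must be a positive count of microstates")
    return K_B * math.log(microstates)

# A single microstate means zero entropy; more microstates, more entropy.
print(boltzmann_entropy(1))   # 0.0
print(boltzmann_entropy(2))   # k * ln 2, on the order of 1e-23 J/K
```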
The entropy of an information stream is given with Shannon's equation
H = -K Σ_{i=1}^{n} p_i ld p_i,
where ld denotes the base-2 logarithm, p_i the probability of the i-th event, and H is referred to as the entropy of the information source.
- Web Dictionary of Cybernetics and Systems
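The Shannon formula above can be sketched in a few lines of Python; with K = 1 and base-2 logarithms, H comes out in bits (function name and the coin example are illustrative, not from the entry):

```python
import math

def shannon_entropy(probabilities, base=2):
    """H = -K * sum(p_i * log(p_i)) with K = 1.

    With base-2 logarithms (the 'ld' of the formula), H is
    measured in bits. Zero-probability events contribute nothing,
    since p * log(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less information per toss.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

A uniform distribution over n outcomes gives the maximum entropy log₂ n, which is why the fair coin yields exactly one bit.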