Entropy is an important concept in cybernetics and information theory.

In general, it denotes unavailable energy or disorder. In the thermodynamic sense, entropy is a measure of the energy in a system that is unavailable for doing work; in the statistical sense, it denotes variation, dispersion, or diversity.

Formally, the two notions have in common that both are expressed as the logarithm of a probability (or of a count of equally probable states):

Thermodynamic entropy

S = k log W

is a function of W, the number of microstates consistent with the system's macroscopic state, with k being Boltzmann's constant.
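As a minimal numerical sketch of Boltzmann's formula (assuming the natural logarithm, as is conventional), the entropy for a given count of microstates can be computed directly:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

def boltzmann_entropy(W):
    """S = k log W for W equally probable microstates.

    The natural logarithm is assumed here; W must be a positive count.
    """
    return k_B * math.log(W)

# A system with a single microstate (W = 1) has zero entropy:
print(boltzmann_entropy(1))  # 0.0
```

Note that entropy grows only logarithmically: doubling W adds a fixed increment k·log 2, which is why entropy is additive for independent subsystems while microstate counts multiply.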

The entropy of an information stream is given with Shannon's equation

H = -K Σ_{i=1}^{n} p_i ld p_i,

where p_i denotes the probability of the i-th event, ld denotes the binary logarithm, K is a positive constant fixing the unit of measure, and H is referred to as the entropy of the information source.
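Shannon's equation can be sketched in a few lines of code; this example assumes K = 1, so that H is measured in bits, and uses the standard convention that terms with p_i = 0 contribute nothing:

```python
import math

def shannon_entropy(probs, K=1.0):
    """Shannon entropy H = -K * sum(p_i * ld(p_i)).

    ld is the binary logarithm, so with K = 1 the result is in bits.
    Zero-probability terms are skipped (lim p->0 of p*ld(p) is 0).
    """
    return -K * sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A certain event carries no information:
print(shannon_entropy([1.0]))  # 0.0
```

For a uniform distribution over n events the formula reduces to ld(n), e.g. four equally likely outcomes yield 2 bits; any non-uniform distribution over the same events has strictly lower entropy.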

References

  1. Web Dictionary of Cybernetics and Systems
