Below, $H_q$ is the $q$-ary entropy function $H_q(x) = x\log_q(q-1) - x\log_q x - (1-x)\log_q(1-x)$ (defined for $0 < x < 1$ and extended by continuity to $x = 0$ and $x = 1$).
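As a rough illustration (the function name q_ary_entropy and the endpoint handling are assumptions for this sketch, not taken from the source), the definition can be evaluated directly, with the $x\log_q x$ and $(1-x)\log_q(1-x)$ terms set to $0$ at the endpoints to match the extension by continuity:

```python
import math

def q_ary_entropy(x: float, q: int) -> float:
    """q-ary entropy H_q(x) for 0 <= x <= 1 and an integer q >= 2."""
    if not 0.0 <= x <= 1.0:
        raise ValueError("x must lie in [0, 1]")

    def log_q(y: float) -> float:
        return math.log(y, q)

    h = x * log_q(q - 1)
    if x > 0.0:
        h -= x * log_q(x)            # this term tends to 0 as x -> 0
    if x < 1.0:
        h -= (1 - x) * log_q(1 - x)  # this term tends to 0 as x -> 1
    return h
```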
The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2: $H_b(p) = -p\log_2 p - (1-p)\log_2(1-p)$.
When $p = \tfrac{1}{2}$, the binary entropy function attains its maximum value of 1 bit.
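As a quick sanity check (a minimal sketch; the name binary_entropy is an assumption, not from the source), the function and its maximum at $p = \tfrac{1}{2}$ can be evaluated numerically:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits, with H_b(0) = H_b(1) = 0 by continuity."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# The maximum value, 1 bit, is attained at p = 1/2.
assert abs(binary_entropy(0.5) - 1.0) < 1e-12
assert binary_entropy(0.25) < 1.0 and binary_entropy(0.75) < 1.0
```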
The derivative of the binary entropy function may be expressed as the negative of the logit function: $\frac{d}{dp} H_b(p) = -\operatorname{logit}_2(p) = -\log_2\!\left(\frac{p}{1-p}\right)$.
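This identity can be checked numerically (a sketch assuming base-2 logarithms throughout; the names binary_entropy and logit2 are my own) by comparing a central-difference derivative with $-\operatorname{logit}_2(p)$:

```python
import math

def binary_entropy(p: float) -> float:
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def logit2(p: float) -> float:
    """Base-2 logit: log2(p / (1 - p))."""
    return math.log2(p / (1 - p))

# Central-difference approximation of dH_b/dp at p = 0.3 agrees with -logit2(0.3).
p, h = 0.3, 1e-6
numeric = (binary_entropy(p + h) - binary_entropy(p - h)) / (2 * h)
assert abs(numeric - (-logit2(p))) < 1e-6
```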
The entropy of such a process is given by the binary entropy function.
This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872.
To prove this, a much more sophisticated approach using the entropy function was necessary.
An important special case of this is the binary entropy function: $H_b(p) = -p\log_2 p - (1-p)\log_2(1-p)$.
The logit function is the negative of the derivative of the binary entropy function.
Here $H_q$ is the $q$-ary entropy function, defined as follows: $H_q(x) = x\log_q(q-1) - x\log_q x - (1-x)\log_q(1-x)$.
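For instance, setting $q = 2$ makes the first term vanish (since $\log_2(2-1) = 0$), and the definition reduces to the binary entropy function:

$$H_2(x) = -x\log_2 x - (1-x)\log_2(1-x) = H_b(x).$$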