Meaning of "average information"

information entropy

n.
In information theory, a mathematical measure of the degree of randomness in a data set, with greater randomness that implies a greater entropy and a greater predictability that implies a lower entropy. It is also called Shannon entropy .