Entropy

The average amount of information produced by a stochastic (random) data source. In information theory, entropy quantifies the uncertainty or unpredictability of that source: higher entropy means the outcomes are harder to predict, while lower entropy means they are more certain.
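As a minimal illustration (the formula below is the standard Shannon entropy, not something specific to this glossary), entropy for a discrete distribution is commonly computed as H = -Σ p(x) log2 p(x), measured in bits. The short Python sketch below, with the illustrative helper name shannon_entropy, shows the idea: a fair coin is maximally unpredictable (1 bit), while a biased coin is more predictable and so has lower entropy.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    # Terms with zero probability contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit (most unpredictable)
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits (more certain)
```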