Data entropy


Data entropy (information entropy) is the average rate at which information is produced by a random source of data.
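As a sketch of this definition, the entropy of an observed symbol stream can be estimated from the relative frequency of each symbol. The function name and the sample strings below are illustrative choices, not part of any standard API:

```python
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Estimate the average information content, in bits per symbol,
    of the symbol distribution observed in `data`."""
    counts = Counter(data)
    total = len(data)
    # Sum of -p * log2(p) over every distinct symbol.
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin (equally likely H and T) yields 1 bit per symbol;
# a heavily biased source yields less, and a constant source yields 0.
print(shannon_entropy("HTHTHTHT"))  # → 1.0
print(shannon_entropy("HHHHHHHT"))  # below 1.0: the source is predictable
print(shannon_entropy("HHHHHHHH"))  # → 0.0
```

The less predictable the source, the higher the rate: maximum entropy occurs when every symbol is equally likely.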

The basic model of a data communication system is composed of three elements: a source of data (the Provider), a communication channel, and a receiver (the consumer).

Data entropy provides an absolute limit on the shortest possible average length of a lossless compression encoding of the data produced by a source. If the data entropy of the source is less than the capacity of the communication channel, the data generated by the Provider can be reliably communicated to the consumer.
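The compression limit can be illustrated with a small worked example. For a hypothetical three-symbol source (the probabilities and the prefix code below are assumptions chosen so the bound is met exactly), the entropy and the average code length coincide:

```python
from math import log2

# Hypothetical source: symbol probabilities chosen so that an optimal
# prefix code exists with lengths equal to -log2(p) for every symbol.
probs = {"a": 0.5, "b": 0.25, "c": 0.25}
code = {"a": "0", "b": "10", "c": "11"}  # a valid prefix code

# Entropy of the source, in bits per symbol.
entropy = -sum(p * log2(p) for p in probs.values())

# Expected codeword length of the prefix code, in bits per symbol.
avg_len = sum(p * len(code[s]) for s, p in probs.items())

print(entropy, avg_len)  # → 1.5 1.5
```

Here the average code length equals the entropy (1.5 bits/symbol), so no lossless code for this source can do better; for general probabilities the average length of any lossless code is at least the entropy.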

Data entropy was introduced as a concept by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication". As expressed by Shannon, the "fundamental problem of communication" is for the consumer to be able to identify what data was generated by the Provider, based on the signal it receives through the channel.

More Information

There might be more information for this subject on one of the following: