!!! Overview
[{$pagename}] (information [entropy]) is the average rate at which information is produced by a [random] source of [data].
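For a discrete source, this rate is usually written as follows (a sketch of the standard definition, not taken from this page: it assumes a source emitting symbol x with probability p(x), and the base-2 logarithm gives the result in bits per symbol):
{{{
H(X) = - \sum_{x} p(x) \log_2 p(x)
}}}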
The basic model of a [data] [communication] system is composed of three elements:
* a source of [data] ([Provider|Provider of services])
* a [communication] [channel]
* a receiver ([consumer|Consumer of services])
[{$pagename}] provides an absolute limit on the shortest possible average length of a lossless compression encoding of the [data] produced by a source. If the [{$pagename}] of the source is less than the [channel] capacity of the [communication] [channel], the [data] generated by the [Provider|Provider of services] can be reliably communicated to the [consumer|Consumer of services].
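To make the compression bound concrete, here is a minimal Python sketch (not part of this page; the example distribution is hypothetical) that computes the entropy of a small source and the average codeword length of a Huffman code for it. The average length can never fall below the entropy, and the two coincide here because the probabilities are powers of two:
{{{
import math
from heapq import heappush, heappop, heapify

def entropy(probs):
    """Shannon entropy in bits per symbol of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths assigned by Huffman coding (a classical lossless code)."""
    # Each heap entry: (probability, unique tiebreaker, member symbol indices)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heappop(heap)
        p2, t, s2 = heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1  # merging two subtrees adds one bit to every member
        heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

# Hypothetical four-symbol source
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"entropy H = {H:.3f} bits/symbol")
print(f"average Huffman codeword length L = {L:.3f} bits/symbol (L >= H)")
}}}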
[{$pagename}] was introduced as a concept by Claude Shannon in his [1948|Year 1948] paper "A Mathematical Theory of Communication".
As Shannon expressed it, the "fundamental problem of communication" is for the [consumer|Consumer of services] to identify what [data] was generated by the [Provider|Provider of services], based on the signal received through the [channel].
!! More Information
There might be more information on this subject on one of the following pages:
[{ReferringPagesPlugin before='*' after='\n' }]