!!! Overview
In thermodynamics, [{$pagename}] (usual symbol S) is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. According to the second law of thermodynamics, the entropy of an isolated system never decreases; such a system spontaneously proceeds towards thermodynamic equilibrium, the configuration with maximum entropy.
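The "number of ways a system may be arranged" is made precise by Boltzmann's formula S = k_B ln W, where W is the number of equally likely microstates. A minimal sketch (the function name is ours, chosen for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (2019 SI exact value)

def boltzmann_entropy(microstates: int) -> float:
    """S = k_B * ln(W): entropy of a system with W equally likely microstates."""
    if microstates < 1:
        raise ValueError("a system has at least one microstate")
    return K_B * math.log(microstates)

# A single microstate (W = 1) means zero entropy; doubling W adds k_B * ln 2.
print(boltzmann_entropy(1))  # 0.0
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

Note how the logarithm makes entropy additive: combining two independent systems multiplies the microstate counts but adds the entropies.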

Systems that are not isolated may decrease in [{$pagename}], provided they increase the [{$pagename}] of their environment by at least that same amount. Since [{$pagename}] is a state function, the change in the entropy of a system is the same for any process that goes from a given initial state to a given final state, whether the process is reversible or irreversible. However, irreversible processes increase the combined [{$pagename}] of the system and its environment.[1]

In information theory, [{$pagename}] is the measure of uncertainty associated with a random variable. In [Cryptography], [{$pagename}] must be supplied by the [cipher] and injected into the plaintext of a [message] to mask the structure present in the insecure plaintext. How it is measured depends on the [cipher].[2]
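The information-theoretic measure is Shannon entropy, H = -Σ p(x) log₂ p(x), in bits per symbol. A short sketch showing why structured plaintext has low entropy while uniformly distributed data approaches the 8 bits/byte maximum (the helper name is ours):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A repetitive "message" is perfectly predictable: zero entropy.
print(shannon_entropy(b"aaaaaaaa"))        # 0.0
# All 256 byte values equally often: the 8 bits/byte maximum.
print(shannon_entropy(bytes(range(256))))  # 8.0
```

A good cipher's output should be statistically close to the uniform case, regardless of how redundant the plaintext was.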

In [Cryptography] discussions, we typically make a [Computational Hardness Assumption].

!! More Information
There may be more information on this subject on one of the following pages:
[{ReferringPagesPlugin before='*' after='\n' }]
----
* [#1] - [Entropy|https://en.wikipedia.org/wiki/Entropy|target='_blank'] - based on information obtained 2015-08-09
* [#2] - [What is entropy?|https://crypto.stackexchange.com/questions/378/what-is-entropy|target='_blank'] - based on information obtained 2015-08-09