Overview#

In thermodynamics, Entropy (usual symbol S) is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. According to the second law of thermodynamics, the entropy of an isolated system never decreases; such a system spontaneously proceeds towards thermodynamic equilibrium, the configuration with maximum entropy.
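
This counting view is commonly made precise by Boltzmann's relation, quoted here for illustration (it is not part of the original page):

S = k_B \ln W

where k_B is Boltzmann's constant and W is the number of microstates compatible with the system's macroscopic state. Thermodynamic equilibrium is then simply the macrostate with the largest W, and hence the largest S.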

Systems that are not isolated may decrease in Entropy, provided they increase the Entropy of their environment by at least that same amount. Since Entropy is a state function, the change in the entropy of a system is the same for any process that goes from a given initial state to a given final state, whether the process is reversible or irreversible. However, irreversible processes increase the combined Entropy of the system and its environment.[1]
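
Because only the end states matter, the entropy change of an irreversible process can be evaluated along any convenient reversible path between the same states. The standard expression for this, added here for clarity, is the Clausius integral:

\Delta S = \int_i^f \frac{\delta Q_\mathrm{rev}}{T}

where \delta Q_\mathrm{rev} is the heat exchanged along the reversible path and T is the absolute temperature at which it is exchanged.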

In information theory, Entropy is a measure of the uncertainty associated with a random variable. In terms of Cryptography, Entropy must be supplied by the cipher for injection into the plaintext of a message, so as to neutralise the structure present in the insecure plaintext. How this entropy is measured depends on the cipher.[2]
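
As a concrete illustration of the information-theoretic definition, the sketch below (Python, chosen here for illustration; it is not part of the original page) computes the Shannon entropy H(X) = -\sum_x p(x) \log_2 p(x) of a message's symbol distribution. Structured plaintext scores well below the maximum for its alphabet; that shortfall is the structure a cipher must neutralise:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy of the symbol distribution, in bits per symbol:
    H(X) = -sum over symbols x of p(x) * log2(p(x))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Structured plaintext: repeated symbols push the entropy per symbol down.
print(shannon_entropy("attack at dawn attack at dawn"))  # ~2.75 bits/symbol

# A uniform 26-letter alphabet maximises it at log2(26) ~ 4.70 bits/symbol.
print(shannon_entropy("abcdefghijklmnopqrstuvwxyz"))
```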

In Cryptography discussions we typically make a Computational Hardness Assumption: security is argued on the premise that some underlying problem, such as factoring large integers, cannot be solved efficiently.
