entropy

noun

1. (communication theory) a numerical measure of the uncertainty of an outcome

Similar word(s): information

Definition categories: attribute
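
In this communication-theory sense, the measure is Shannon entropy, H = -Σ p·log2(p), in bits. A minimal sketch in Python, assuming the outcome distribution is given as a list of probabilities (the function name and example values are illustrative, not from any particular source):

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete outcome distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

A uniform distribution maximizes uncertainty, which is why the fair coin yields the full 1 bit while the biased coin yields less.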

2. (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work

- entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity

Similar word(s): randomness

Definition categories: attribute

Sentences with entropy as a noun:

- The thermodynamic free energy is the amount of work that a thermodynamic system can perform; it is the internal energy of a system minus the amount of energy that cannot be used to perform work. That unusable energy is given by the entropy of a system multiplied by the temperature of the system.[1] (Note that, for both Gibbs and Helmholtz free energies, temperature is assumed to be fixed, so entropy is effectively directly proportional to useless energy; a numerical sketch of this relation follows this list.)

- Ludwig Boltzmann defined entropy as being directly proportional to the natural logarithm of the number of microstates yielding an equivalent thermodynamic macrostate (with the eponymous constant of proportionality). Assuming (by the fundamental postulate of statistical mechanics) that all microstates are equally probable, this means, on the one hand, that macrostates with higher entropy are more probable, and on the other hand, that for such macrostates, the quantity of information needed to specify one particular microstate is correspondingly greater (see the second sketch below).
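
The free-energy relation in the first sentence above is, in Helmholtz form, F = U - T·S: the unusable energy is the product of temperature and entropy. A minimal sketch in Python, with illustrative values rather than measured data:

```python
def helmholtz_free_energy(internal_energy, temperature, entropy):
    """F = U - T*S: work obtainable from a system at fixed temperature."""
    unusable = temperature * entropy   # energy bound up as T*S, in joules
    return internal_energy - unusable

# Illustrative values: U = 1000 J, T = 300 K, S = 2 J/K.
print(helmholtz_free_energy(1000.0, 300.0, 2.0))   # 400.0 J available for work
```

At fixed temperature, raising S lowers F one-for-one through the T·S term, matching the note above that entropy is then directly proportional to useless energy.

Boltzmann's definition in the second sentence is S = k_B ln(W), where W is the number of microstates and k_B is the eponymous constant of proportionality. A minimal sketch, assuming equally probable microstates and an illustrative W:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(microstates):
    """S = k_B * ln(W) for W equally probable microstates."""
    return K_B * math.log(microstates)

# Doubling the number of microstates adds k_B * ln(2) of entropy,
# so higher-entropy macrostates correspond to (many) more microstates.
print(boltzmann_entropy(1e23))   # ~7.31e-22 J/K
print(boltzmann_entropy(2e23))   # larger by k_B * ln(2)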
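```

Because log(W) grows with W, a macrostate realized by more microstates both has higher entropy and requires more information to pin down any one of its microstates, which is the two-sided point the sentence makes.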