Entropy

In thermodynamics, entropy (usual symbol S) is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. 

According to the second law of thermodynamics, the entropy of an isolated system never decreases; such a system will spontaneously evolve toward thermodynamic equilibrium, the configuration with maximum entropy. 

Systems that are not isolated may decrease in entropy, provided they increase the entropy of their environment by at least that same amount.

 Since entropy is a state function, the change in the entropy of a system is the same for any process that goes from a given initial state to a given final state, whether the process is reversible or irreversible.

 However, irreversible processes increase the combined entropy of the system and its environment.


The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

ΔS = ∫ dQ/T


where T is the absolute temperature of the system, dividing an incremental reversible transfer of heat into that system (dQ). (If heat is transferred out, the sign is reversed, giving a decrease in the entropy of the system.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system.
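
As an illustrative worked example (not part of the quoted entry), consider melting one mole of ice reversibly at its melting point, T ≈ 273 K, taking the molar heat of fusion to be roughly 6.0 kJ (an approximate textbook value). Because the heat enters at constant temperature, the integral reduces to Q/T:

    \Delta S = \frac{Q_{\mathrm{rev}}}{T} \approx \frac{6.0\ \mathrm{kJ}}{273\ \mathrm{K}} \approx 22\ \mathrm{J\,K^{-1}}

so the entropy of the ice increases by about 22 J K^−1 per mole as it melts.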


Source:  Entropy. (2014, December 17). In Wikipedia, The Free Encyclopedia. Retrieved 22:56, December 19, 2014, from http://en.wikipedia.org/w/index.php?title=Entropy&oldid=638475781


Comparison with the more recent definition (2018 Wikipedia entry)

In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB. Formally,

S = kB ln Ω


Macroscopic systems typically have a very large number Ω of possible microscopic configurations. For example, the entropy of an ideal gas is proportional to the number of gas molecules N. Roughly twenty liters of gas at room temperature and atmospheric pressure has N ≈ 6×10^23 (Avogadro's number). At equilibrium, each of the Ω ≈ e^N configurations can be regarded as random and equally likely.
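
As a rough numerical sketch (not from the source), the Python snippet below turns the estimate Ω ≈ e^N into an entropy scale via S = kB ln Ω. Because e^N overflows ordinary floating point, ln Ω is simplified analytically to N; the constants are the usual Boltzmann constant and an assumed Avogadro-scale molecule count.

    # Rough order-of-magnitude estimate, assuming Omega ~ e^N as stated above.
    # ln(Omega) is then ~ N, so S = k_B * ln(Omega) ~ k_B * N.
    k_B = 1.380649e-23   # Boltzmann constant, J/K
    N = 6.0e23           # assumed number of gas molecules (Avogadro-scale)

    S = k_B * N          # entropy estimate, J/K
    print(f"S ~ {S:.1f} J/K")   # prints about 8.3 J/K

This is only an order-of-magnitude illustration; the entropy of a real gas also depends on its volume and temperature.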

The second law of thermodynamics states that the entropy of an isolated system never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount so that the total entropy increases. Entropy is a function of the state of the system, so the change in entropy of a system is determined by its initial and final states. In the idealization that a process is reversible, the entropy does not change, while irreversible processes always increase the total entropy.

Because it is determined by the number of random microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder or randomness of a system, or of the lack of information about it. The concept of entropy plays a central role in information theory.
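
To make the link with information concrete, here is a small sketch (not part of the quoted entry) using the standard Gibbs form S = −kB Σ p_i ln p_i, which reduces to S = kB ln Ω when all Ω microstates are equally probable, the situation described above. The microstate count here is purely hypothetical.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(probabilities):
        # Entropy (J/K) of a discrete probability distribution over microstates.
        return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

    omega = 1_000_000                   # hypothetical number of microstates
    uniform = [1.0 / omega] * omega     # equal probabilities: maximum ignorance
    print(gibbs_entropy(uniform))       # matches k_B * ln(omega), up to rounding
    print(k_B * math.log(omega))

Any non-uniform distribution over the same microstates gives a smaller value, reflecting that partial knowledge of the state reduces the missing information.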

The Boltzmann constant, and therefore entropy, has dimensions of energy divided by temperature, with the SI unit of joules per kelvin (J K^−1), or kg m^2 s^−2 K^−1 in terms of base units. The entropy of a substance is usually given as an intensive property, either entropy per unit mass (SI unit: J K^−1 kg^−1) or entropy per unit amount of substance (SI unit: J K^−1 mol^−1).
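
As an illustrative conversion between the two intensive forms (values are approximate and not taken from the entry): liquid water has a molar entropy of roughly 70 J K^−1 mol^−1 and a molar mass of about 0.018 kg mol^−1, so dividing by the molar mass gives its entropy per unit mass:

    s \approx \frac{70\ \mathrm{J\,K^{-1}\,mol^{-1}}}{0.018\ \mathrm{kg\,mol^{-1}}} \approx 3.9\ \mathrm{kJ\,K^{-1}\,kg^{-1}}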


Source: adapted from Wikipedia contributors. (2018, November 11). Entropy. In Wikipedia, The Free Encyclopedia. Retrieved 01:51, November 26, 2018, from https://en.wikipedia.org/w/index.php?title=Entropy&oldid=868389128