Entropy

In thermodynamics, entropy (usual symbol S) is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. 

According to the second law of thermodynamics, the entropy of an isolated system never decreases; such a system will spontaneously evolve toward thermodynamic equilibrium, the configuration with maximum entropy.

Systems that are not isolated may decrease in entropy, provided they increase the entropy of their environment by at least that same amount.
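One compact way to state this constraint, using subscripts introduced here for illustration, is

$$ \Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{environment}} \geq 0, $$

with equality holding only in the reversible limit.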

Since entropy is a state function, the change in the entropy of a system is the same for any process that goes from a given initial state to a given final state, whether the process is reversible or irreversible.

However, irreversible processes increase the combined entropy of the system and its environment.
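A concrete sketch of both points: for n moles of an ideal gas expanding isothermally from volume V1 to V2, the system's entropy change is nR ln(V2/V1) regardless of path, while the environment's entropy change depends on whether the path is reversible. The Python below is a minimal illustration; the scenario, values, and function name are assumptions chosen for this example, not taken from the source.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_S_system(n, V1, V2):
    """Entropy change of an ideal gas for an isothermal volume change.

    Entropy is a state function, so this depends only on the initial
    and final states (V1, V2), not on the path taken between them.
    """
    return n * R * math.log(V2 / V1)

n, T, V1, V2 = 1.0, 300.0, 1.0, 2.0  # 1 mol at 300 K, volume doubles

dS_sys = delta_S_system(n, V1, V2)  # identical for any path

# Reversible path: heat Q = n*R*T*ln(V2/V1) flows in from the
# environment at temperature T, so the environment loses Q/T.
dS_env_reversible = -n * R * math.log(V2 / V1)

# Irreversible path (free expansion into vacuum): no heat flows,
# so the environment's entropy is unchanged.
dS_env_free = 0.0

print(f"system      : {dS_sys:+.3f} J/K (same for both paths)")
print(f"total (rev) : {dS_sys + dS_env_reversible:+.3f} J/K")
print(f"total (free): {dS_sys + dS_env_free:+.3f} J/K (increase)")
```

The reversible path leaves the combined entropy unchanged, while the free expansion increases it, as the second law requires.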


The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

$$ \Delta S = \int \frac{dQ_{\text{rev}}}{T} $$


where T is the absolute temperature of the system and dQ_rev is an incremental reversible transfer of heat into that system. (If heat is transferred out, the sign is reversed, giving a decrease in the entropy of the system.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system.
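To see the macroscopic definition in use, consider reversibly heating an object of constant heat capacity C from T1 to T2, so that dQ_rev = C dT and the integral evaluates to C ln(T2/T1). The following Python sketch checks a direct numerical sum of dQ_rev/T against that closed form; the particular values and step count are illustrative assumptions.

```python
import math

C = 4184.0          # heat capacity of the object, J/K (about 1 kg of water)
T1, T2 = 300.0, 350.0

# Approximate the integral of dQ_rev / T by summing C*dT / T
# over many small temperature steps (midpoint rule).
steps = 100_000
dT = (T2 - T1) / steps
dS_numeric = sum(C * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

dS_exact = C * math.log(T2 / T1)  # closed form of the integral

print(f"numerical integral: {dS_numeric:.4f} J/K")
print(f"C * ln(T2/T1)     : {dS_exact:.4f} J/K")
```

The two results agree to within the discretization error, confirming that the entropy change depends only on the endpoint temperatures.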


Source: Entropy. (2014, December 17). In Wikipedia, The Free Encyclopedia. Retrieved 22:56, December 19, 2014, from http://en.wikipedia.org/w/index.php?title=Entropy&oldid=638475781