where Tc is the (absolute) temperature for the phase transition and L is the heat of transition.
To take another example, we showed in class that for an ideal gas undergoing a reversible process from state i to state f (using the above definition, the ideal gas law, dEint = n CV dT, and the first law of thermodynamics):

ΔS = n CV ln(Tf/Ti) + n R ln(Vf/Vi).
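As a quick numerical check of the ideal-gas result, here is a minimal sketch that evaluates ΔS = n CV ln(Tf/Ti) + n R ln(Vf/Vi). The particular values (1 mol of a monatomic gas with CV = 3R/2, doubling both T and V) are illustrative assumptions, not part of the derivation:

```python
import math

R = 8.314          # gas constant, J/(mol K)
n = 1.0            # moles (assumed for illustration)
CV = 1.5 * R       # molar heat capacity at constant volume, monatomic gas

Ti, Tf = 300.0, 600.0    # initial and final temperature, K (assumed)
Vi, Vf = 0.010, 0.020    # initial and final volume, m^3 (assumed)

# entropy change for a reversible ideal-gas process from state i to state f
dS = n * CV * math.log(Tf / Ti) + n * R * math.log(Vf / Vi)
print(dS)  # J/K
```

Doubling both T and V for a monatomic mole gives roughly 14.4 J/K; note that ΔS depends only on the initial and final states, not on the path taken between them.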
However, you have no doubt heard that entropy is a measure of the disorder or randomness of a system. The above thermodynamic definition of entropy seems far removed from entropy as randomness. We will now relate entropy to the disorder of a system using ideas from statistical mechanics.
Statistical mechanics is a deeper look into the thermal properties of a system than the look given by thermodynamics. In statistical mechanics entropy is defined as

S = kB ln W,

where W is the number of microstates corresponding to the system's macrostate.
If the system is highly disordered, W is large. We will give an example in class that will make sense of the statistical interpretation of entropy and introduce new terms such as microstate, macrostate, and probabilities.
A bit of philosophy.
Some physicists believe the second law of thermodynamics implies the universe will end in a state of uniform temperature, with no usable energy remaining, 'the heat death of the universe'. Everybody goes to hell, so to speak.
Other physicists counter this argument by noting that the Newtonian dynamical laws are time-reversal invariant: from the dynamics alone, you could not tell whether you were watching a movie of an event running backwards or the real thing going forward in time. In the late 1800s the famous French physicist Henri Poincaré proved a theorem that says a bounded system obeying Newtonian dynamics will eventually return arbitrarily close to its initial state (provided you wait long enough). The waiting time is called a Poincaré cycle.
Finally, no one knows the answer to these dilemmas, because we now realize that Newtonian mechanics is only partly right. You have to study quantum mechanics if you really want to know how things work, or at least how we currently think things work.