One of the more curious aspects of the development of statistical mechanics in this century has been the somewhat hybrid nature of its practice, particularly in the study of equilibrium systems. Almost without exception, calculations of thermodynamic and other properties of such systems are carried out using the method of ensembles employed by Gibbs. Although these techniques have enjoyed unparalleled success in practical applications, many have sought their justification elsewhere -- for example, by hoping to show that ensemble averages are equivalent to time averages over a long period. This latter exercise comes under the general rubric of ergodic theory, which, among other things, asserts that over an infinite time interval the system will explore the entire energy surface in phase space consistent with its Hamiltonian. But even if systems were proven ergodic (or even quasi-ergodic), one would still have to demonstrate that the finite-time averages of physical measurements have approached their infinite-time limiting values -- and nothing in our history suggests that macroscopic systems continually explore the available energy surface. Indeed, although most systems with N degrees of freedom do not possess N integrals of the motion (i.e., are not completely integrable), they do exhibit some in addition to the energy, and thus the system orbit is confined to a hypersurface of much lower dimension than the (2N-1)-dimensional energy surface.
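To spell out the dimension counting behind that last remark, write $F_1=H, F_2,\dots,F_k$ for a set of functionally independent integrals of the motion of a Hamiltonian system with N degrees of freedom (the notation is introduced here only for illustration). Each integral satisfies $\{F_i,H\}=0$ and confines the orbit to one of its level surfaces, so that the orbit lies on the set
\[
  \bigl\{\,x \in \mathbb{R}^{2N} : H(x)=E,\ F_2(x)=f_2,\ \dots,\ F_k(x)=f_k \,\bigr\}\,,
\]
which generically has dimension $2N-k$. Conservation of energy alone ($k=1$) leaves the full $(2N-1)$-dimensional energy surface available, each further independent integral lowers the accessible dimension by one, and in the completely integrable case ($k=N$, with the integrals in involution) bounded orbits wind around an N-dimensional torus.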
It appears that the more thoughtful among the most avid practitioners of ergodic theory now concede that most Hamiltonian systems are not ergodic [6,7]. This is not to say, of course, that the beautiful theorems of ergodic theory are useless, for it is now equally clear that they have definite application in the modern development of nonlinear dynamical systems theory. We argue only that ergodic theory is irrelevant to an understanding of equilibrium statistical mechanics. Independently of the possible ergodicity of many-body systems, the method of ensembles still predicts correct experimental values, and it is only that bottom line that need be explained. Ironically, the notion of ensembles is finally seen to be just an artifice that Gibbs needed in order to introduce probabilities in relation to system states, and to form a theoretical entropy in terms of his `index of probability'. The only `ensemble' needed is just that of the possible alternatives for global system states, which we emphasize in modern treatments through use of the statistical operator $\rho$ employed in Eq.(1). If one accepts quantum mechanics as providing the fundamental physical description of the system, then $\rho$ contains all the information necessary to describe that system, and no further inflation of the phase space is required.
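In practice, `contains all the information' means simply that every prediction about a measurable quantity follows from $\rho$ alone; for an observable represented by an operator $A$ (the notation is generic),
\[
  \langle A\rangle \;=\; \mathrm{Tr}\,(\rho A)\,,
\]
and nothing beyond $\rho$ and the operator in question enters the calculation.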
Gibbs, in writing the very first equation of his great treatise 115 years ago [4], settled the issue for all time:
For the equilibrium of any isolated system it is necessary and sufficient that in all possible variations of the state of the system which do not alter the energy, the variation of the entropy shall either vanish or be negative.
That is, the equilibrium state of a system is characterized by having maximum entropy subject to constraints provided by observation (indeed, this is its definition). These constraints arise in the form of macroscopic data, for we have no microscopic control over the system (there are no `Maxwell demons') -- otherwise, we would have no need for a statistical description via $\rho$. Note that there is no allusion here to any dynamical state of the system, or to equations of motion. In addition, we emphasize that equilibrium is by no means a state of randomness, induced in some way by large N. The N particles surely behave according to the known laws of physics, even though the largeness of N has made it impossible for us to follow their individual motions in any way. The use of `random' in this context, as a euphemism for ignorance, is not necessarily harmful unless it is thought to have physical meaning. Unfortunately, that is usually the case, and we have commented on it in some detail elsewhere [8].
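To indicate how this prescription operates in practice, consider a sketch in notation introduced here for illustration: let $\rho$ be the statistical operator and let $F_1,\dots,F_m$ stand for the operators corresponding to whatever macroscopic data have actually been measured. One maximizes the entropy functional
\[
  S[\rho] \;=\; -\,\mathrm{Tr}\,(\rho\ln\rho)
\]
subject to the constraints
\[
  \mathrm{Tr}\,\rho \;=\; 1\,, \qquad \mathrm{Tr}\,(\rho F_k) \;=\; \langle F_k\rangle\,, \quad k=1,\dots,m\,,
\]
where the $\langle F_k\rangle$ are the measured values. Introducing a Lagrange multiplier $\lambda_k$ for each datum and carrying out the variation yields
\[
  \rho \;=\; \frac{1}{Z}\exp\Bigl(-\sum_{k}\lambda_k F_k\Bigr)\,, \qquad
  Z \;=\; \mathrm{Tr}\,\exp\Bigl(-\sum_{k}\lambda_k F_k\Bigr)\,,
\]
with the $\lambda_k$ determined by the constraint equations. When the only datum is the total energy, $F_1=H$, this reduces to the canonical distribution $\rho=e^{-\beta H}/Z$ with the single multiplier conventionally written $\beta$, and the maximum of $S$, multiplied by Boltzmann's constant, is identified with the thermodynamic entropy.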