Elementary Statistical Physics
About this ebook
Noteworthy for the philosophical subtlety of its foundations and the elegance of its problem-solving methods, statistical mechanics can be employed in a broad range of applications — among them, astrophysics, biology, chemistry, nuclear and solid state physics, communications engineering, metallurgy, and mathematics. Geared toward graduate students in physics, this text covers such important topics as stochastic processes and transport theory in order to provide students with a working knowledge of statistical mechanics.
To explain the fundamentals of his subject, the author uses the method of ensembles developed by J. Willard Gibbs. Topics include the properties of the Fermi-Dirac and Bose-Einstein distributions; the interrelated subjects of fluctuations, thermal noise, and Brownian movement; and the thermodynamics of irreversible processes.
Negative temperature, magnetic energy, density matrix methods, and the Kramers-Kronig causality relations are treated briefly. Most sections include illustrative problems. Appendix. 28 figures. 1 table.
Book preview
Elementary Statistical Physics - Charles Kittel
Part 1.
Fundamental Principles of Statistical Mechanics
1. Review of Classical Mechanics
The subject of classical statistical mechanics may be developed most naturally in terms of the conjugate coordinate and momentum variables qi and pi which are used in the classical equations of motion in the Hamiltonian form. The reason for working with coordinates and momenta, rather than coordinates and velocities, will appear when we discuss the Liouville theorem in Sec. 3 below. We now remind the reader of the definitions of the conjugate coordinate and momentum variables and of the content of the Hamilton equations.
We consider a conservative classical system with f degrees of freedom. For N point particles, f will be equal to 3N. We suppose that we have a set of generalized coordinates for the system:

$$q_1, q_2, \ldots, q_f.$$
These may be Cartesian, polar, or some other convenient set of coordinates. The generalized velocities associated with these coordinates are

$$\dot q_1, \dot q_2, \ldots, \dot q_f.$$
The expression of Newton’s second law by the Lagrangian equations of motion is

$$\frac{d}{dt}\frac{\partial L}{\partial \dot q_i} - \frac{\partial L}{\partial q_i} = 0, \qquad i = 1, 2, \ldots, f, \tag{1.1}$$
where for a simple non-relativistic system the Lagrangian L is given by

$$L = T - V. \tag{1.2}$$
Here T is the kinetic energy and V is the potential energy. Equation (1.1) is easily verified if the qi are Cartesian coordinates, for then we have

$$T = \tfrac{1}{2}m\sum_i \dot x_i^2; \qquad V = V(x_1, x_2, \ldots), \tag{1.3}$$
and, letting qi = x,

$$\frac{\partial L}{\partial \dot x} = m\dot x; \qquad \frac{\partial L}{\partial x} = -\frac{\partial V}{\partial x}; \tag{1.4}$$
but −∂V/∂x is just the x component of the force F, and we have simply

$$m\ddot x = F_x. \tag{1.5}$$
The Hamiltonian form of the equations of motion replaces the f second-order differential equations (1.1) by 2f first-order differential equations. We define the generalized momenta by

$$p_i = \frac{\partial L}{\partial \dot q_i}. \tag{1.6}$$
The Hamiltonian H(p, q) is defined as

$$H(p, q) = \sum_i p_i \dot q_i - L. \tag{1.7}$$
Then

$$dH = \sum_i \dot q_i\,dp_i + \sum_i p_i\,d\dot q_i - \sum_i \frac{\partial L}{\partial q_i}\,dq_i - \sum_i \frac{\partial L}{\partial \dot q_i}\,d\dot q_i. \tag{1.8}$$
The terms in $d\dot q_i$ cancel by the definition (1.6) of the $p_i$. Further, from the Lagrange equations (1.1) we see that

$$\frac{\partial L}{\partial q_i} = \frac{d}{dt}\frac{\partial L}{\partial \dot q_i} = \dot p_i. \tag{1.9}$$
Thus, from (1.8), we must have

$$dH = \sum_i \dot q_i\,dp_i - \sum_i \dot p_i\,dq_i,$$

so that

$$\dot q_i = \frac{\partial H}{\partial p_i}; \qquad \dot p_i = -\frac{\partial H}{\partial q_i}. \tag{1.10}$$
These are the Hamilton equations of motion.
Example 1.1. We consider the motion of a classical harmonic oscillator in one dimension. The kinetic energy is

$$T = \tfrac{1}{2}M\dot x^2. \tag{1.11}$$
The potential energy will be written as

$$V = \tfrac{1}{2}M\omega^2 x^2. \tag{1.12}$$
The Lagrangian is, from (1.2),

$$L = \tfrac{1}{2}M\dot x^2 - \tfrac{1}{2}M\omega^2 x^2. \tag{1.13}$$
The Lagrangian equation of motion is, from (1.1),

$$\ddot x + \omega^2 x = 0, \tag{1.14}$$
which describes a periodic motion with angular frequency ω.
The generalized momentum is, from (1.6),

$$p = \frac{\partial L}{\partial \dot x} = M\dot x. \tag{1.15}$$
The Hamiltonian is, from (1.7),

$$H = p\dot q - L = \frac{p^2}{2M} + \tfrac{1}{2}M\omega^2 q^2, \tag{1.16}$$
where q ≡ x. The Hamilton equations of motion are, from (1.10),

$$\dot q = \frac{\partial H}{\partial p} = \frac{p}{M}, \tag{1.17}$$
which only confirms the definition of p, and

$$\dot p = -\frac{\partial H}{\partial q} = -M\omega^2 q, \tag{1.18}$$
in agreement with the Lagrangian equation (1.14).
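As an aside (this sketch is not part of the text), the oscillator of Example 1.1 lends itself to a quick numerical check of the Hamilton equations: the code below, with arbitrary values for M and ω, integrates q̇ = p/M and ṗ = −Mω²q and verifies that the energy stays constant and that q(t) follows the exact solution cos ωt.

```python
import math

# Numerical check of Example 1.1 (not part of Kittel's text): integrate
# the Hamilton equations q' = p/M, p' = -M*w^2*q of a one-dimensional
# harmonic oscillator. M and w are arbitrary illustrative values.
M, w = 1.0, 2.0
q, p = 1.0, 0.0              # start at rest at q = 1
dt, steps = 1e-4, 100_000    # integrate up to t = 10

def energy(q, p):
    # H = p^2/(2M) + (1/2) M w^2 q^2
    return p * p / (2 * M) + 0.5 * M * w * w * q * q

E0 = energy(q, p)
for _ in range(steps):
    # leapfrog (kick-drift-kick), a symplectic integration step
    p -= 0.5 * dt * M * w * w * q
    q += dt * p / M
    p -= 0.5 * dt * M * w * w * q

t = steps * dt
assert abs(energy(q, p) - E0) < 1e-6     # energy conserved
assert abs(q - math.cos(w * t)) < 1e-3   # exact orbit is q = cos(w t)
```

A symplectic (leapfrog) step is chosen here because it respects the phase-space structure discussed in Sec. 3; a naive Euler step would show a slow, spurious energy drift.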
Example 1.2. We consider the Lagrangian (in gaussian units)

$$L = \tfrac{1}{2}Mv^2 - e\varphi + \frac{e}{c}\,\mathbf v \cdot \mathbf A. \tag{1.19}$$
We wish first to show that this describes the motion of a particle of mass M and charge e in the electrostatic potential φ and vector potential A, where A is related to the magnetic field H by

$$\mathbf H = \operatorname{curl} \mathbf A. \tag{1.20}$$
In Cartesian coordinates the Lagrangian equation of motion for the x component is

$$M\frac{dv_x}{dt} = -e\frac{\partial \varphi}{\partial x} - \frac{e}{c}\frac{\partial A_x}{\partial t} + \frac{e}{c}\left[ v_y\left( \frac{\partial A_y}{\partial x} - \frac{\partial A_x}{\partial y} \right) - v_z\left( \frac{\partial A_x}{\partial z} - \frac{\partial A_z}{\partial x} \right) \right], \tag{1.21}$$
where we have used the expressions

$$\frac{\partial L}{\partial x} = -e\frac{\partial \varphi}{\partial x} + \frac{e}{c}\left( v_x\frac{\partial A_x}{\partial x} + v_y\frac{\partial A_y}{\partial x} + v_z\frac{\partial A_z}{\partial x} \right) \tag{1.22}$$

and

$$\frac{dA_x}{dt} = \frac{\partial A_x}{\partial t} + v_x\frac{\partial A_x}{\partial x} + v_y\frac{\partial A_x}{\partial y} + v_z\frac{\partial A_x}{\partial z}. \tag{1.23}$$
The last part of (1.23) expresses the fact that, in total differentiation (d/dt) with respect to t, the vector potential A may involve the time not only explicitly through t but also through the coordinates x, y, z. On combining (1.22), (1.23), and (1.1) we obtain the result (1.21) above, which in turn may be rewritten in terms of the magnetic and electric fields as the usual Lorentz force equation

$$M\frac{d\mathbf v}{dt} = e\mathbf E + \frac{e}{c}\,\mathbf v \times \mathbf H, \tag{1.24}$$
where we have written

$$E_x = -\frac{\partial \varphi}{\partial x} - \frac{1}{c}\frac{\partial A_x}{\partial t}, \quad \text{etc.} \tag{1.25}$$
The −∂φ/∂x term involves only the electrostatic potential φ, and the −(1/c) ∂Ax/∂t term expresses the induced electric field, consistent with the Maxwell equation

$$\operatorname{curl}\mathbf E = -\frac{1}{c}\frac{\partial \mathbf H}{\partial t}.$$
The generalized momentum is

$$\mathbf p = M\mathbf v + \frac{e}{c}\mathbf A, \tag{1.26}$$
and the Hamiltonian is

$$H = \frac{1}{2M}\left( \mathbf p - \frac{e}{c}\mathbf A \right)^2 + e\varphi. \tag{1.27}$$
It is noteworthy that the Hamiltonian in the presence of a magnetic field still involves essentially the velocity, since p − eA/c = Mv. The definition p = Mv + eA/c is often described by saying that the generalized momentum in a magnetic field is the sum of a kinetic momentum Mv and a potential momentum eA/c, just as the Hamiltonian is the sum of a kinetic energy ½Mv² and a potential energy eφ.
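A numerical sketch (not part of the text) of the motion governed by the Lorentz force equation: for a uniform magnetic field H along z and no electric field, the velocity should rotate at the cyclotron frequency ω_c = eH/Mc while its magnitude stays fixed. All numerical values below are arbitrary illustrative choices in gaussian-unit style.

```python
import math

# Sketch (not from the text): integrate M dv/dt = (e/c) v x H for a
# uniform field H = (0, 0, H), E = 0. The velocity should circle at
# the cyclotron frequency w_c = e*H/(M*c) with constant speed.
M, e, c, H = 1.0, 1.0, 1.0, 2.0      # arbitrary values
wc = e * H / (M * c)
vx, vy = 1.0, 0.0                    # initial velocity in the x-y plane
dt, steps = 1e-5, 100_000            # integrate up to t = 1

for _ in range(steps):
    # components of (e/Mc) v x H with H along z
    ax = (e / (M * c)) * vy * H
    ay = -(e / (M * c)) * vx * H
    vx, vy = vx + ax * dt, vy + ay * dt

t = steps * dt
# exact solution: vx = cos(wc t), vy = -sin(wc t)
assert abs(vx - math.cos(wc * t)) < 1e-3
assert abs(vy + math.sin(wc * t)) < 1e-3
assert abs(math.hypot(vx, vy) - 1.0) < 1e-3   # speed unchanged
```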
Exercise 1.1. Using the Hamiltonian (1.27), show that in the uniform magnetic field H described by the vector potential

$$A_x = -\tfrac{1}{2}Hy; \qquad A_y = \tfrac{1}{2}Hx; \qquad A_z = 0,$$

the equation of motion of a charged particle is just the Lorentz force equation.
Exercise 1.2. Find the Hamilton equations of motion of a free particle, using cylindrical coordinates r, z, φ.
2. Systems and Ensembles
In applications of statistical mechanics we usually have in mind some particular real system, which may be, for example, a block of ice; the electrons in a length of copper wire; a reaction vessel containing H2, Cl2, and HCl molecules; a transistor; or the interior of a star. By system we shall usually mean the actual object of interest. Sometimes it proves possible to treat an individual electron or individual proton or individual molecule as a system, but as a rule, except where we specify otherwise, our systems will be of macroscopic dimensions and composed of many particles interacting among themselves in an arbitrary way. The reader should be alert to other usages of the word system which may be found in the literature.
In order to begin to discuss the thermodynamic or statistical properties of a system we must specify all the relevant parameters which are supposed fixed by an external agency. Thus, we will want to know the number of each molecular species, the volume, the energy or temperature, the magnetic field intensity, etc.
Fig. 2.1. Orbit of a system in phase space.
The ultimate extent of the knowledge we may attain concerning a system will be limited by the uncertainty principle of quantum mechanics. But apart from this limitation it is doubtful if we would often wish to examine the complete solution of the equations of motion of a macroscopic system. In the discussion of the classical dynamics of a system of macroscopic dimensions we may be concerned with a system having ~10²³ degrees of freedom. It is often difficult to contemplate solving 10²³ equations of motion. What would we do with the solutions if we had them? It might require a truck to transport the tabulation sheets describing the motion for one second of a single particle of the system. We have the further handicap that to obtain the solutions we must provide the initial conditions on all the coordinates and momenta at zero time. We might not know these.
It is an important experimental fact that we can answer simply many questions concerning systems in or near thermodynamic equilibrium without solving the equations of motion in detail. We say that a system is in thermodynamic equilibrium when the system has been placed in contact with a heat reservoir for a sufficiently long time. A large isolated system will also in time come to thermodynamic equilibrium. We know that under these conditions a system may show a simple and consistent behavior, such that many interesting and practical questions can be answered without a detailed knowledge of the motions of individual particles.
The development of a single system of N atoms in the course of time is known when we know the values of the 6N coordinate and momentum variables p and q as functions of time. We can represent the evolution graphically as a single orbit in the 6N dimensional space of the p’s and q’s. Such a space is known as the phase space or Γ space of the system. In Fig. 2.1 the notation [p], [q] indicates schematically the 3N momentum axes and 3N coordinate axes. Sometimes a six-dimensional phase space is used to represent the motion of a single particle; such a phase space is called μ space, but we shall always be concerned with the Γ space unless explicitly stated otherwise.
The physical quantities of interest to us for a system in thermodynamic equilibrium almost always are time averages over a segment of the orbit in the phase space of the system, the averages being taken over an appropriate interval of time. For example, determinations of pressure, dielectric constant, elastic moduli, and magnetic susceptibility are made normally over time intervals covering many millions of atomic collisions or vibrations. We know experimentally that such determinations are reproducible at a later date provided that the external conditions are unchanged; provided, for example, that the energy and number of particles in the system are conserved. Instead of requiring exact energy conservation it may suffice to maintain the system at a constant temperature. The experimental principle that the pressure of a fixed quantity of gas at constant temperature will be the same if measured in the year 2050 as it was in 1950 is a statement of the stability of the appropriate time average over the motion of the system. The time average itself may be taken over quite short periods: microseconds, seconds, hours, according to the requirements of the system and the measurement apparatus.
It is difficult to set up mathematical machinery to calculate the time averages of interest to us. We note, however, that the complex systems with which we are dealing appear to randomize themselves between observations, provided only that the observations follow each other by a time interval longer than a certain characteristic time called the relaxation time. The relaxation time describes approximately the time required for a fluctuation (spontaneous or arranged) in the properties of the system to damp out. The actual value of the relaxation time will depend on the particular initial non-random property: it may require a year for a crystal of copper sulfate in a beaker of water to diffuse to produce a uniform solution, yet the pressure fluctuation produced when the crystal is dropped in the beaker may damp out in a millisecond.
J. Willard Gibbs made a great advance in the problem of calculating average values of physical quantities. He suggested that instead of taking time averages we imagine a group of similar systems, but suitably randomized, and take averages over this group at one time. The group of similar systems is called an ensemble of systems and is to be viewed as an intellectual construction to simulate and represent at one time the properties of the actual system as developed in the course of time. The word ensemble is used in a special sense in statistical mechanics, a sense unrecognized by most lexicologists.
Fig. 2.2. Portion of an ensemble; the portion shown represents the part of the orbit shown in Fig. 2.1. Each dot corresponds to a system of the ensemble. In an actual ensemble the systems would usually be distributed continuously along or near the orbit.
An ensemble of systems is composed of very many systems all constructed alike as far as we can tell. Each system in the ensemble is a replica of the actual system. Each system in the ensemble is equivalent for all practical purposes to the actual system. It follows from the method of construction that every system in the ensemble is a socially acceptable system—it satisfies all external requirements placed on the actual system and is in this sense just as good as the actual system. The ensemble is randomized suitably in the sense that every configuration of coordinates and velocities accessible to the actual system in the course of time is represented in the ensemble by one or more systems at one instant of time. The ensemble is said to represent the system. In the following sections we consider methods for constructing suitable ensembles. In Fig. 2.2 we illustrate a small part of an ensemble. The part shown represents, by systems at one time, the orbit of the actual system over the time interval shown in Fig. 2.1. In Fig. 2.2 each dot is a system of the ensemble. The complete ensemble will be very much larger and more complicated than the small portion shown in the figure.
The scheme introduced by Gibbs is to replace time averages over a single system by ensemble averages, which are averages at a fixed time over all systems in an ensemble. The problem of demonstrating the equivalence of the two types of averages is the subject of ergodic theory, and is discussed in the books by Khintchine and ter Haar cited in the General References; also, the book by Tolman gives an excellent and readable discussion of the general question. It is certainly plausible that the two averages might be equivalent, but it has not been proved in general that they are exactly equivalent. It may be argued, as Tolman has done, that the ensemble average really corresponds better to the actual situation than does the time average. We never really know the initial conditions of the system, so we do not know exactly how to take the time average. The ensemble average describes our ignorance appropriately.
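Gibbs’s replacement of time averages by ensemble averages can be made concrete numerically (this sketch is not part of the text). For a harmonic oscillator of amplitude A, the time average of q² over one period and the average of q² over many replica oscillators with randomly assigned phases both come out to A²/2:

```python
import math, random

# Illustration (not from the text): compare a time average along one
# orbit with an "ensemble" average over replica systems of the same
# energy but random phase. Frequency w and amplitude A are arbitrary.
w, A = 3.0, 1.0
N = 200_000

# time average of q(t)^2 = A^2 cos^2(w t) over one period T = 2 pi / w
T = 2 * math.pi / w
time_avg = sum(A**2 * math.cos(w * (k * T / N))**2 for k in range(N)) / N

# ensemble average: each member is the same oscillator at a random phase
random.seed(0)
ens_avg = sum(A**2 * math.cos(2 * math.pi * random.random())**2
              for _ in range(N)) / N

assert abs(time_avg - A**2 / 2) < 1e-6   # exact average is A^2/2
assert abs(ens_avg - A**2 / 2) < 1e-2    # Monte Carlo estimate agrees
```

For this simple periodic system the agreement is guaranteed; the point of ergodic theory is precisely to ask when such agreement can be expected for complex many-particle systems.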
Our next problem is to find out how to construct suitable ensembles. If the system to be represented by an ensemble is in thermal equilibrium, then we require that the ensemble averages must be independent of time. This is a reasonable requirement. The macroscopic average physical properties of a system in thermal equilibrium do not change with time; therefore our representative ensemble must be such that the ensemble averages do not depend on the particular instant of time at which the averages are taken.
Exercise 2.1. Consider a simple linear harmonic oscillator; show that the orbit in phase space is an ellipse.
3. The Liouville Theorem
An ensemble may be specified by giving the number of systems

$$P(p, q)\,dp\,dq \tag{3.1}$$
in the volume element dp dq of phase space. Here, for N particles,

$$dp\,dq \equiv dp_1\,dp_2 \cdots dp_{3N}\,dq_1\,dq_2 \cdots dq_{3N}. \tag{3.2}$$
That is, we specify an ensemble by giving the density of systems in phase space (Γ space)—the systems are then said to be represented in a statistical sense by the ensemble density P(p, q).
The ensemble average of a quantity A(p, q) is defined accordingly as

$$\langle A \rangle = \frac{\int A(p, q)\,P(p, q)\,dp\,dq}{\int P(p, q)\,dp\,dq}. \tag{3.3}$$
To the extent that an ensemble represents the behavior of a physical system, the ensemble average of a quantity will give us the average value of a physical quantity for the actual system.
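A minimal numerical rendering of the ensemble-average definition above (not part of the text): the phase-space integrals are replaced by sums over a grid in a single (q, p) plane, with an assumed Gaussian density P chosen purely for illustration.

```python
import math

# Sketch (not from the text): evaluate the ensemble average of
# A(q, p) = q^2 over an assumed density P ~ exp(-(q^2 + p^2)/2),
# discretizing both integrals on the same (q, p) grid.
n, L = 400, 8.0          # grid points per axis and half-width of grid
h = 2 * L / n            # grid spacing

num = den = 0.0
for i in range(n):
    q = -L + (i + 0.5) * h
    for j in range(n):
        p = -L + (j + 0.5) * h
        P = math.exp(-(q * q + p * p) / 2)
        num += q * q * P * h * h   # integral of A * P dp dq
        den += P * h * h           # integral of P dp dq

avg = num / den
# for this Gaussian density the exact result is <q^2> = 1
assert abs(avg - 1.0) < 1e-4
```

The denominator in the code is the normalization integral of (3.3); once P is normalized, an ensemble average is just a weighted mean over phase space.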
One of the requirements for a satisfactory ensemble to represent a system in statistical equilibrium is that the composition of the ensemble should be independent of time: ∂P/∂t should be zero. We must now examine the implications of this requirement. First, we note that, if the total number of systems in an ensemble does not change with time, the function P must satisfy the usual equation of continuity:

$$\frac{\partial P}{\partial t} + \operatorname{div}(P\mathbf v) = 0. \tag{3.4}$$
This is merely a statement that what flows into an element of volume either comes out again or builds up in the volume element. Here v is the velocity $(\dot q_1, \ldots, \dot q_{3N}, \dot p_1, \ldots, \dot p_{3N})$ and div the divergence operator in the 6N-dimensional phase space. If we carry out the divergence operation, the equation of continuity becomes

$$\frac{\partial P}{\partial t} + \sum_i \left[ \frac{\partial}{\partial q_i}(P\dot q_i) + \frac{\partial}{\partial p_i}(P\dot p_i) \right] = 0,$$
or

$$\frac{\partial P}{\partial t} + \sum_i \left( \dot q_i\frac{\partial P}{\partial q_i} + \dot p_i\frac{\partial P}{\partial p_i} \right) + P\sum_i \left( \frac{\partial \dot q_i}{\partial q_i} + \frac{\partial \dot p_i}{\partial p_i} \right) = 0. \tag{3.5}$$
The factor in parentheses multiplying P is identically zero: by Hamilton’s equations

$$\frac{\partial \dot q_i}{\partial q_i} = \frac{\partial^2 H}{\partial q_i\,\partial p_i}; \qquad \frac{\partial \dot p_i}{\partial p_i} = -\frac{\partial^2 H}{\partial p_i\,\partial q_i}, \tag{3.6}$$
and the two terms cancel because the order of differentiation is immaterial:

$$\frac{\partial^2 H}{\partial q_i\,\partial p_i} = \frac{\partial^2 H}{\partial p_i\,\partial q_i}.$$
From (3.5) and (3.6) we have the Liouville theorem:

$$\frac{dP}{dt} = \frac{\partial P}{\partial t} + \sum_i \left( \dot q_i\frac{\partial P}{\partial q_i} + \dot p_i\frac{\partial P}{\partial p_i} \right) = 0. \tag{3.7}$$
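The incompressibility of the ensemble flow expressed by (3.5) and (3.6) can be checked numerically (this sketch is not part of the text): under the exact harmonic-oscillator flow, the phase-space area occupied by a small region of initial conditions is unchanged in time.

```python
import math

# Illustration of the Liouville theorem (not from the text): evolve the
# corners of a small square of initial conditions in the (q, p) plane
# of a harmonic oscillator and check that the area is preserved.
M, w = 1.0, 1.5          # arbitrary mass and frequency
t = 2.0                  # arbitrary evolution time

def evolve(q0, p0, t):
    # exact solution of q' = p/M, p' = -M*w^2*q
    q = q0 * math.cos(w * t) + (p0 / (M * w)) * math.sin(w * t)
    p = p0 * math.cos(w * t) - M * w * q0 * math.sin(w * t)
    return q, p

def area(poly):
    # shoelace formula for a polygon [(q1, p1), (q2, p2), ...]
    s = 0.0
    for (q1, p1), (q2, p2) in zip(poly, poly[1:] + poly[:1]):
        s += q1 * p2 - q2 * p1
    return abs(s) / 2

square = [(1.0, 0.0), (1.1, 0.0), (1.1, 0.1), (1.0, 0.1)]
evolved = [evolve(q, p, t) for q, p in square]
assert abs(area(evolved) - area(square)) < 1e-12
```

The flow here is linear, so the check is exact up to rounding; for a general Hamiltonian the same area (volume) preservation holds along the true trajectories, which is the content of the theorem.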