In this lecture we introduce the notion of a *state* in classical mechanics and the evolution equation of states, namely Liouville’s equation. Throughout we will denote phase space by $\mathbb{R}^{2n}$ and will use the letter $x = (q, p)$, with $q, p \in \mathbb{R}^n$, to denote a point in phase space. To begin we introduce a compact form of Hamilton’s equations.

A PDF version of this lecture is available here.

** A compact form of Hamilton’s equations. **

We may write Hamilton’s equations in the following compact form:

$$\dot{x} = J \nabla H(x) \qquad (1)$$

where

- $x = (q, p)$ is a point in phase space;
- $H = H(q, p)$ is the Hamiltonian, and $\nabla H = \left( \frac{\partial H}{\partial q}, \frac{\partial H}{\partial p} \right)$ is its gradient; and
- $J$ is the $2n \times 2n$ matrix

$$J = \begin{pmatrix} 0 & I \\ -I & 0 \end{pmatrix},$$

where $I$ denotes the $n \times n$ identity matrix.

Associated to the matrix $J$ is the *symplectic form*

$$\omega(x, y) = x \cdot J y = \sum_{j=1}^{n} \left( q_j p_j' - p_j q_j' \right), \qquad x = (q, p), \; y = (q', p').$$
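As a concrete illustration (not part of the lecture), here is a small Python sketch that builds the matrix $J$ for one degree of freedom and integrates $\dot x = J \nabla H(x)$ by explicit Euler steps; the harmonic oscillator Hamiltonian $H = (q^2 + p^2)/2$ and the step size are illustrative choices:

```python
import numpy as np

n = 1  # one degree of freedom; phase space is R^{2n}

# the matrix J in block form [[0, I], [-I, 0]]
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

def grad_H(x):
    # illustrative Hamiltonian H(q, p) = (q^2 + p^2)/2, so grad H = x
    return x

def step(x, dt):
    # one explicit Euler step of Hamilton's equations x' = J grad H(x)
    return x + dt * J @ grad_H(x)

x = np.array([1.0, 0.0])  # start at q = 1, p = 0
for _ in range(1000):
    x = step(x, 1e-3)     # integrate up to time t = 1
```

For this $H$ the exact flow is a rotation of phase space, so after time $t = 1$ the numerical solution should land near $(\cos 1, -\sin 1)$; a symplectic integrator would track the flow with far less energy drift than the Euler steps used here.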

In the previous lecture we derived the evolution equation

$$\frac{d}{dt} f_t = \{ f_t, H \} \qquad (2)$$

for the evolution $f_t = f \circ \Phi_t$ of an observable $f$ under the Hamiltonian flow $\Phi_t$, where $H$ is the Hamiltonian and

$$\{ f, g \} = \sum_{j=1}^{n} \left( \frac{\partial f}{\partial q_j} \frac{\partial g}{\partial p_j} - \frac{\partial f}{\partial p_j} \frac{\partial g}{\partial q_j} \right)$$

is the Poisson bracket. In terms of the symplectic form we have

$$\{ f, g \} = \omega(\nabla f, \nabla g) = \nabla f \cdot J \nabla g.$$

The Poisson bracket has several properties:

- Linearity: $\{ a f + b g, h \} = a \{ f, h \} + b \{ g, h \}$ for $a, b \in \mathbb{R}$ and observables $f, g, h$.
- Skew symmetry: $\{ f, g \} = - \{ g, f \}$.
- Leibniz rule: $\{ f g, h \} = f \{ g, h \} + \{ f, h \} g$.
- Jacobi identity: $\{ f, \{ g, h \} \} + \{ g, \{ h, f \} \} + \{ h, \{ f, g \} \} = 0$.

Linearity and skew-symmetry are elementary. The *Leibniz rule* follows from the corresponding fact about differentiation and the observation that

$$\{ f, h \} = D_h f,$$

where $D_h$ is the first order differential operator

$$D_h = (J \nabla h) \cdot \nabla = \sum_{j=1}^{n} \left( \frac{\partial h}{\partial p_j} \frac{\partial}{\partial q_j} - \frac{\partial h}{\partial q_j} \frac{\partial}{\partial p_j} \right).$$

The Jacobi identity follows from the equality of second order partial derivatives and can be verified by a somewhat involved but direct computation.
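The four properties can also be checked symbolically. The following sketch (assuming `sympy` is available; the three sample observables are arbitrary choices) verifies the Leibniz rule and the Jacobi identity for one degree of freedom:

```python
import sympy as sp

q, p = sp.symbols('q p')

def pb(f, g):
    # Poisson bracket {f, g} = df/dq dg/dp - df/dp dg/dq (one degree of freedom)
    return sp.diff(f, q) * sp.diff(g, p) - sp.diff(f, p) * sp.diff(g, q)

# arbitrary smooth observables for the check
f, g, h = q**2 * p, sp.sin(q) + p**2, q * p**3

# Jacobi identity: {f,{g,h}} + {g,{h,f}} + {h,{f,g}} = 0
jacobi = pb(f, pb(g, h)) + pb(g, pb(h, f)) + pb(h, pb(f, g))
assert sp.simplify(jacobi) == 0

# Leibniz rule: {fg, h} = f {g, h} + {f, h} g
assert sp.simplify(pb(f * g, h) - (f * pb(g, h) + pb(f, h) * g)) == 0
```

Since both identities hold for all smooth observables, `simplify` reduces each expression to zero exactly.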

An important observation is that the evolution *preserves the Poisson bracket,*

$$\{ f, g \} \circ \Phi_t = \{ f \circ \Phi_t, g \circ \Phi_t \}. \qquad (3)$$

To verify (3) let $F_t = \{ f \circ \Phi_t, g \circ \Phi_t \} = \{ f_t, g_t \}$ and compute

$$\frac{d}{dt} F_t = \{ \{ f_t, H \}, g_t \} + \{ f_t, \{ g_t, H \} \} = \{ F_t, H \}$$

by skew-symmetry and the Jacobi identity. Thus $F_t$ satisfies the evolution equation (2) for an observable! However $F_0 = \{ f, g \}$, so (3) follows. The evolution preserves all the other algebraic structures of the observable algebra:

$$(a f + b g) \circ \Phi_t = a \, f \circ \Phi_t + b \, g \circ \Phi_t, \qquad (f g) \circ \Phi_t = (f \circ \Phi_t)(g \circ \Phi_t).$$
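For the harmonic oscillator $H = (q^2 + p^2)/2$ the flow $\Phi_t$ is an explicit rotation of phase space, so the invariance of the bracket can be checked symbolically. A `sympy` sketch (the two observables are arbitrary choices):

```python
import sympy as sp

q, p, t = sp.symbols('q p t')

def pb(f, g):
    # Poisson bracket in the variables (q, p)
    return sp.diff(f, q) * sp.diff(g, p) - sp.diff(f, p) * sp.diff(g, q)

# exact flow of H = (q^2 + p^2)/2: a rotation of phase space
qt = q * sp.cos(t) + p * sp.sin(t)
pt = -q * sp.sin(t) + p * sp.cos(t)

f = q**2 + p    # sample observables (arbitrary choices)
g = q * p

ft = f.subs({q: qt, p: pt}, simultaneous=True)
gt = g.subs({q: qt, p: pt}, simultaneous=True)

lhs = pb(ft, gt)                                        # {f o Phi_t, g o Phi_t}
rhs = pb(f, g).subs({q: qt, p: pt}, simultaneous=True)  # {f, g} o Phi_t
assert sp.simplify(lhs - rhs) == 0
```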

** States and mean values **

The state of a classical system is specified by a point $x \in \mathbb{R}^{2n}$. It is useful to introduce a notion of state that allows for some *uncertainty* as to the specific point in phase space. Imagine that we measure a system, but that our measurements are subject to some errors. We shall think of this as resulting in a *probability density* $\rho \geq 0$ on phase space such that for any open set $A \subset \mathbb{R}^{2n}$

$$\operatorname{Prob}(x \in A) = \int_A \rho(x) \, dx.$$

For example, if we measure the system to be in a configuration $x_0$ subject to some errors it might be natural to take the Gaussian density

$$\rho(x) = \frac{1}{(2 \pi \sigma^2)^{n}} e^{- |x - x_0|^2 / 2 \sigma^2},$$

where $\sigma$ measures the size of the errors.

In statistical physics one considers other distributions, such as the Gibbs distribution

$$\rho(x) = \frac{1}{Z} e^{- \beta H(x)}, \qquad Z = \int e^{- \beta H(x)} \, dx,$$

where $H$ is the Hamiltonian and $\beta = 1 / k_B T$ with $T$ the temperature and $k_B$ Boltzmann’s constant.
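As a quick sanity check on the normalizing constant (a sketch, assuming the illustrative Hamiltonian $H = (q^2 + p^2)/2$ and units with $k_B = 1$), the integral defining $Z$ is then Gaussian and can be compared against its exact value $2\pi/\beta$:

```python
import numpy as np

beta = 2.0  # inverse temperature, beta = 1/(k_B T)

# Z = integral of e^{-beta H} over phase space, on a grid large enough
# that the Gaussian tails are negligible
s = np.linspace(-10, 10, 2001)
dq = s[1] - s[0]
Q, P = np.meshgrid(s, s)
Z = np.sum(np.exp(-beta * (Q**2 + P**2) / 2)) * dq * dq

# for this H the integral is Gaussian: Z = 2*pi/beta exactly
assert abs(Z - 2 * np.pi / beta) < 1e-6
```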

More generally we may think of a state as a *probability measure* $\mu$ on phase space, which is a map $A \mapsto \mu(A) \in [0, 1]$ from a suitable collection of subsets of phase space (including all the open sets) with the properties

- $\mu(\emptyset) = 0$ and $\mu(\mathbb{R}^{2n}) = 1$; and
- (countable additivity) $\mu\left( \bigcup_j A_j \right) = \sum_j \mu(A_j)$ for any sequence $A_1, A_2, \ldots$ of pairwise disjoint sets.

The collection of such measures is quite large and includes many measures that are rather unusual from the point of view of classical analysis. We need not trouble ourselves with this broad class of measures; we are mostly interested in states of the form

$$\mu(A) = \int_A \rho(x) \, dx, \qquad (4)$$

where $\rho$ is a nice function on phase space, and in the *pure states*, of the form

$$\delta_{x_0}(A) = \begin{cases} 1 & \text{if } x_0 \in A, \\ 0 & \text{if } x_0 \notin A. \end{cases} \qquad (5)$$

(Formally, a pure state is of the form (4) with $\rho(x) = \delta(x - x_0)$, where $\delta$ is the mythical Dirac delta function.)

Given a state $\mu$ and an observable $f$, the *distribution* of $f$ in the state $\mu$ is the probability distribution $\mu_f$ on the real line given by

$$\mu_f(B) = \mu\left( \{ x : f(x) \in B \} \right)$$

for suitable $B \subset \mathbb{R}$. The measure $\mu_f$ is determined by its *cumulative distribution function*

$$F_f(t) = \mu\left( \{ x : f(x) \leq t \} \right).$$

In particular the *mean value of $f$* is the integral

$$\langle f \rangle_\mu = \int_{\mathbb{R}} t \, dF_f(t),$$

where the integral may be defined as a Riemann–Stieltjes integral. For states of the form (4) we have

$$\langle f \rangle_\mu = \int f(x) \rho(x) \, dx,$$

and for pure states we have $\langle f \rangle_{\delta_{x_0}} = f(x_0)$. (More generally,

$$\langle f \rangle_\mu = \int f \, d\mu, \qquad (6)$$

where the integral (6) is defined in the sense of Lebesgue.) Note that the mean value is a linear function of the observable.
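Mean values of this form are easy to approximate by sampling. A sketch (the Gaussian state, the observable, and the sample size are all illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian state centered at x0 = (1, 0) with spread sigma (illustrative)
x0 = np.array([1.0, 0.0])
sigma = 0.5

def f(x):
    # observable f(q, p) = q^2 + p^2, applied to an array of samples
    return x[:, 0]**2 + x[:, 1]**2

# Monte Carlo estimate of <f> = integral of f(x) rho(x) dx, sampling from rho
samples = x0 + sigma * rng.standard_normal((200_000, 2))
mean_f = f(samples).mean()

# exact value for this state: |x0|^2 + 2 sigma^2 = 1.5
assert abs(mean_f - 1.5) < 0.02
```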

** Liouville’s Theorem **

We begin with the following

Lemma 1. Let $\Phi_t$ be the evolution map associated to a Hamiltonian $H$. For each $t \in \mathbb{R}$ and each $x$,

$$\det D_x \Phi_t = 1,$$

where $D_x \Phi_t$ is the derivative of $\Phi_t$ at $x$.

\begin{rem*} Recall that the derivative $D_x \Phi_t$ is the linear transformation on $\mathbb{R}^{2n}$ defined by

$$D_x \Phi_t \, y = \left. \frac{d}{ds} \Phi_t(x + s y) \right|_{s = 0}.$$

\end{rem*} To prove the Lemma we will need some facts about determinants. Recall for an $n \times n$ matrix $A$

$$\det(I + t A) = 1 + t \operatorname{tr}(A) + O(t^2),$$

where $I$ denotes the identity matrix and $\operatorname{tr}(A)$ the *trace*, $\operatorname{tr}(A) = \sum_{j=1}^{n} A_{jj}$. Based on this we have

Proposition 2. Let $A(t)$ be a map from an interval in the real line into the space of $n \times n$ matrices. If for some $t_0$, $A(t_0)$ is invertible and $A'(t_0)$ exists, then

$$\left. \frac{d}{dt} \det A(t) \right|_{t = t_0} = \det A(t_0) \operatorname{tr}\left( A(t_0)^{-1} A'(t_0) \right).$$

*Proof:* Note that

$$\det A(t) = \det A(t_0) \det\left( A(t_0)^{-1} A(t) \right) = \det A(t_0) \det\left( I + A(t_0)^{-1} \left( A(t) - A(t_0) \right) \right), \qquad (7)$$

where we have used the identity $\det(AB) = \det A \det B$. The proposition follows from (7), the expansion $\det(I + t B) = 1 + t \operatorname{tr}(B) + O(t^2)$, and the assumption that $A$ is differentiable at $t_0$.
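The proposition is easy to test numerically: compare a finite-difference derivative of $\det A(t)$ with the trace formula. A sketch with an arbitrarily chosen matrix curve:

```python
import numpy as np

def A(t):
    # an arbitrary smooth matrix-valued curve (illustrative choice)
    return np.array([[1.0 + t, t**2],
                     [np.sin(t), 2.0 + t**3]])

def A_prime(t):
    # derivative of A, entry by entry
    return np.array([[1.0, 2*t],
                     [np.cos(t), 3*t**2]])

t0 = 0.7
h = 1e-6

# central finite-difference derivative of det A(t) at t0
fd = (np.linalg.det(A(t0 + h)) - np.linalg.det(A(t0 - h))) / (2 * h)

# formula from the proposition: det A(t0) * tr(A(t0)^{-1} A'(t0))
formula = np.linalg.det(A(t0)) * np.trace(np.linalg.inv(A(t0)) @ A_prime(t0))

assert abs(fd - formula) < 1e-6
```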

Returning to the Lemma we have: *Proof:* Let $M(t) = D_x \Phi_t$. Since $M(0) = I$, by the proposition we have

$$\left. \frac{d}{dt} \det M(t) \right|_{t = 0} = \operatorname{tr} M'(0). \qquad (8)$$

Now, by equality of second order partials

$$M'(0) = \left. \frac{d}{dt} D_x \Phi_t \right|_{t = 0} = D_x \left( \left. \frac{d}{dt} \Phi_t \right|_{t = 0} \right),$$

so by (1) we have

$$M'(0) = D_x \left( J \nabla H \right) = J D^2 H(x),$$

where $D^2 H(x)$ is the Hessian of $H$ at $x$ and we have abused notation by identifying a linear transformation and its matrix. Returning to (8) we find

$$\left. \frac{d}{dt} \det M(t) \right|_{t = 0} = \operatorname{tr}\left( J D^2 H(x) \right) = 0,$$

since $J^T = -J$ while $D^2 H(x)$ is symmetric. For $s, t \in \mathbb{R}$ we have

$$D_x \Phi_{s + t} = D_{\Phi_t(x)} \Phi_s \, D_x \Phi_t$$

by the group property $\Phi_{s + t} = \Phi_s \circ \Phi_t$ for $\Phi$ and the chain rule. Differentiating with respect to $s$ at $s = 0$ gives $\frac{d}{dt} \det D_x \Phi_t = 0$ for every $t$, and since $\det D_x \Phi_0 = 1$ the result follows.

Thm 3 (Liouville’s Theorem). Let $D \subset \mathbb{R}^{2n}$ be a domain with finite volume. Let $D_t = \Phi_t(D)$ be the image of $D$ under a Hamiltonian flow. Then

$$\operatorname{vol}(D_t) = \operatorname{vol}(D).$$

*Proof:* This follows from the identity

$$\operatorname{vol}(D_t) = \int_{D_t} dx = \int_D \left| \det D_x \Phi_t \right| dx = \int_D dx = \operatorname{vol}(D),$$

where the middle equality is the change of variables formula and Lemma 1 gives $\left| \det D_x \Phi_t \right| = 1$.
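Volume preservation can be observed numerically even for a nonlinear flow. The sketch below (the pendulum Hamiltonian $H = p^2/2 - \cos q$, the RK4 integrator, and the finite-difference Jacobian are all illustrative choices) checks that $\det D_x \Phi_t \approx 1$:

```python
import numpy as np

def rhs(x):
    # pendulum Hamiltonian H(q, p) = p^2/2 - cos(q) (illustrative choice);
    # Hamilton's equations: q' = p, p' = -sin(q)
    q, p = x
    return np.array([p, -np.sin(q)])

def flow(x, t, steps=2000):
    # classical RK4 integration of the flow map Phi_t
    dt = t / steps
    for _ in range(steps):
        k1 = rhs(x)
        k2 = rhs(x + dt/2 * k1)
        k3 = rhs(x + dt/2 * k2)
        k4 = rhs(x + dt * k3)
        x = x + dt/6 * (k1 + 2*k2 + 2*k3 + k4)
    return x

# Jacobian of Phi_t at x0 by central finite differences
x0, t, h = np.array([1.2, 0.3]), 5.0, 1e-5
Dphi = np.empty((2, 2))
for j in range(2):
    e = np.zeros(2); e[j] = h
    Dphi[:, j] = (flow(x0 + e, t) - flow(x0 - e, t)) / (2 * h)

# Liouville: det D_x Phi_t = 1
assert abs(np.linalg.det(Dphi) - 1.0) < 1e-4
```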

** Evolution of states and Liouville’s equation. **

We started off with the flow equation on phase space, which can be seen as an evolution map on *pure states.* From this we derived the equation of motion for observables via the identification $f_t = f \circ \Phi_t$. We can think of this, instead, as a map on states, where we define

$$\mu_t(A) = \mu\left( \Phi_{-t}(A) \right) = \mu\left( \Phi_t^{-1}(A) \right).$$

What happens if $\mu = \delta_{x_0}$ is a pure state? Then we have

$$\mu_t(A) = \delta_{x_0}\left( \Phi_{-t}(A) \right) = \delta_{\Phi_t(x_0)}(A),$$

since $x_0 \in \Phi_{-t}(A)$ if and only if $\Phi_t(x_0) \in A$. That is, the evolution of $\delta_{x_0}$ is the pure state

$$\mu_t = \delta_{\Phi_t(x_0)}, \qquad (9)$$

as we should expect. What about a more general state of the form (4)? Then we have

$$\mu_t(A) = \int_{\Phi_{-t}(A)} \rho(x) \, dx = \int_A \rho\left( \Phi_{-t}(y) \right) \left| \det D_y \Phi_{-t} \right| dy = \int_A \rho\left( \Phi_{-t}(y) \right) dy,$$

where we have used the change of variables formula and Lemma 1. We conclude that

$$\rho_t(x) = \rho\left( \Phi_{-t}(x) \right). \qquad (10)$$

(Note that (9) and (10) suggest the formal identity

$$\delta\left( \Phi_{-t}(x) - x_0 \right) = \delta\left( x - \Phi_t(x_0) \right).$$

Play with this and understand why it must be true.)

Thus we may introduce the following more general evolution equation

$$\frac{\partial \rho_t}{\partial t} = \{ H, \rho_t \} = - \{ \rho_t, H \} \qquad (11)$$

for the evolution of the *state* $\rho_t$ of the system at time $t$. This equation is known as Liouville’s equation and it has solutions as in (10).
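One can check directly that transported densities of the form (10) solve Liouville's equation. For the harmonic oscillator, whose flow is an explicit rotation, the following `sympy` sketch (with an arbitrarily chosen initial density) verifies this symbolically:

```python
import sympy as sp

q, p, t = sp.symbols('q p t')
H = (q**2 + p**2) / 2  # illustrative Hamiltonian

# inverse flow Phi_{-t} for the harmonic oscillator (rotation backwards)
q_back = q * sp.cos(t) - p * sp.sin(t)
p_back = q * sp.sin(t) + p * sp.cos(t)

# an initial density (illustrative; normalization plays no role here)
rho0 = sp.exp(-((q - 1)**2 + p**2))

# transported density rho_t(x) = rho0(Phi_{-t}(x))
rho_t = rho0.subs({q: q_back, p: p_back}, simultaneous=True)

# Liouville's equation: d rho_t / dt = {H, rho_t}
pb = sp.diff(H, q) * sp.diff(rho_t, p) - sp.diff(H, p) * sp.diff(rho_t, q)
assert sp.simplify(sp.diff(rho_t, t) - pb) == 0
```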

** Two pictures **

We have introduced *two pictures* of the classical evolution:

- The state, or Liouville, picture, in which the state of the system evolves according to (11) and observables remain constant. The mean value of an observable $f$ at time $t$ is given by $\int f(x) \rho_t(x) \, dx$.
- The observable picture, in which the state of the system remains constant and observables evolve according to (2). The mean value of an observable at time $t$ is given by $\int f_t(x) \rho(x) \, dx = \int f(\Phi_t(x)) \rho(x) \, dx$.
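The agreement of the two pictures can be seen numerically. In this sketch (the harmonic oscillator flow, the Gaussian state, and the observable $f(q, p) = q$ are all illustrative choices) both mean values are computed on a grid and coincide:

```python
import numpy as np

t = 1.0

def phi(x, sign=1):
    # flow of H = (q^2 + p^2)/2: rotation by angle sign*t (illustrative)
    q, p = x[..., 0], x[..., 1]
    return np.stack([q * np.cos(sign * t) + p * np.sin(sign * t),
                     -q * np.sin(sign * t) + p * np.cos(sign * t)], axis=-1)

f = lambda x: x[..., 0]          # observable f(q, p) = q
sigma = 0.3
rho = lambda x: np.exp(-((x[..., 0] - 1)**2 + x[..., 1]**2)
                       / (2 * sigma**2)) / (2 * np.pi * sigma**2)

# grid over a region large enough to contain both densities
g = np.linspace(-4, 4, 801)
dx = g[1] - g[0]
Q, P = np.meshgrid(g, g)
X = np.stack([Q, P], axis=-1)

# Liouville (state) picture: integral of f(x) rho(Phi_{-t}(x)) dx
liouville = np.sum(f(X) * rho(phi(X, sign=-1))) * dx * dx

# observable (Heisenberg) picture: integral of f(Phi_t(x)) rho(x) dx
heisenberg = np.sum(f(phi(X)) * rho(X)) * dx * dx

# both equal cos(t) for this Gaussian state centered at (1, 0)
assert abs(liouville - np.cos(t)) < 1e-3
assert abs(liouville - heisenberg) < 1e-6
```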

These two pictures will stay with us when we move to Quantum mechanics. The first is known as the *Schroedinger picture* and the second as the *Heisenberg picture.*