4.4 Expectation Value and Standard Deviation

It is a striking consequence of quantum mechanics that physical quantities may not have a value. This occurs whenever the wave function is not an eigen­function of the quantity of inter­est. For example, the ground state of the hydrogen atom is not an eigen­function of the position operator ${\widehat x}$, so the $x$-position of the electron does not have a value. According to the orthodox inter­pretation, it cannot be predicted with certainty what a measurement of such a quantity will produce.

However, it is possible to say something if the same measurement is done on a large number of systems that are all the same before the measurement. An example would be $x$-position measurements on a large number of hydrogen atoms that are all in the ground state before the measurement. In that case, it is relatively straight­forward to predict what the average, or “expec­tation value,” of all the measurements will be.

The expectation value is certainly not a replacement for the classical value of physical quantities. For example, for the hydrogen atom in the ground state, the expectation position of the electron is in the nucleus by symmetry. Yet because the nucleus is so small, measurements will never find it there! (The typical measurement will find it a distance comparable to the Bohr radius away.) Actually, that is good news, because if the electron really were in the nucleus as a classical particle, its potential energy would be almost minus infinity instead of the correct value of about -27 eV. It would be a very different universe. Still, having an expectation value is of course better than having no information at all.

The average discrepancy between the expectation value and the actual measurements is called the “standard deviation.” In the hydrogen atom example, where typically the electron is found a distance comparable to the Bohr radius away from the nucleus, the standard deviation in the $x$-position turns out to be exactly one Bohr radius. (The same of course for the standard deviations in the $y$ and $z$ positions away from the nucleus.)

In general, the standard deviation is the quantitative measure for how much uncertainty there is in a physical value. If the standard deviation is very small compared to what you are interested in, it is probably OK to use the expectation value as a classical value. It is perfectly fine to say that the electron of the hydrogen atom that you are measuring is in your lab, but it is not OK to say that it has countless electron volts of negative potential energy because it is in the nucleus.

This section discusses how to find expec­tation values and standard deviations after a brief introduction to the underlying ideas of statistics.


Key Points
• The expectation value is the average value obtained when doing measurements on a large number of initially identical systems. It is as close as quantum mechanics can come to having classical values for uncertain physical quantities.

• The standard deviation is how far the individual measurements on average deviate from the expectation value. It is the quantitative measure of uncertainty in quantum mechanics.


4.4.1 Statistics of a die

Since it seems to us humans as if, in Einstein's words, God is playing dice with the universe, it may be a worthwhile idea to examine the statistics of a die first.

For a fair die, each of the six numbers will, on average, show up a fraction 1/6 of the number of throws. In other words, each face has a proba­bility of 1/6.

The average value of a large number of throws is called the expec­tation value. For a fair die, the expec­tation value is 3.5. After all, number 1 will show up in about 1/6 of the throws, as will numbers 2 through 6, so the average is

\begin{displaymath}
\frac{\mbox{(number of throws)}\times
\left(\frac16\,1+\frac16\,2+\frac16\,3+\frac16\,4+\frac16\,5+\frac16\,6\right)}
{\mbox{number of throws}}
= 3.5
\end{displaymath}

The general rule to get the expec­tation value is to sum the proba­bility for each value times the value. In this example:

\begin{displaymath}
{\textstyle\frac{1}{6}}\,1 + {\textstyle\frac{1}{6}}\,2 + {\textstyle\frac{1}{6}}\,3 +
{\textstyle\frac{1}{6}}\,4 + {\textstyle\frac{1}{6}}\,5 + {\textstyle\frac{1}{6}}\,6 = 3.5
\end{displaymath}

Note that the name “expec­tation value” is very poorly chosen. Even though the average value of a lot of throws will be 3.5, you would surely not expect to throw 3.5. But it is probably too late to change the name now.

The maximum possible deviation from the expectation value does of course occur when you throw a 1 or a 6; the absolute deviation is then $\vert 1-3.5\vert=\vert 6-3.5\vert=2.5$. It means that the possible values produced by a throw can deviate as much as 2.5 from the expectation value.

However, the maximum possible deviation from the average is not a useful concept for quantities like position, or for the energy levels of the harmonic oscillator, where the possible values extend all the way to infinity. So, instead of the maximum deviation from the expec­tation value, some average deviation is better. The most useful of those is called the “standard deviation”, denoted by $\sigma$. It is found in two steps: first the average square deviation from the expec­tation value is computed, and then a square root is taken of that. For the die that works out to be:

\begin{eqnarray*}
\sigma & = & \big[
{\textstyle\frac{1}{6}}(1-3.5)^2+{\textstyle\frac{1}{6}}(2-3.5)^2+{\textstyle\frac{1}{6}}(3-3.5)^2 \\
&& \quad {}
+{\textstyle\frac{1}{6}}(4-3.5)^2+{\textstyle\frac{1}{6}}(5-3.5)^2+{\textstyle\frac{1}{6}}(6-3.5)^2
\big]^{1/2} \\
& = & 1.71
\end{eqnarray*}

On average then, the throws are 1.71 points off from 3.5.
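For readers who want to check such numbers, the two-step recipe is easy to verify numerically. The following is a minimal Python sketch of the fair-die statistics above; it simply evaluates the two sums.

\begin{verbatim}
import numpy as np

# Fair die: six equally likely face values.
faces = np.arange(1, 7, dtype=float)
probs = np.full(6, 1.0/6.0)

# Expectation value: sum of probability times value.
expectation = np.sum(probs * faces)                        # 3.5

# Standard deviation: square root of the average square deviation.
sigma = np.sqrt(np.sum(probs * (faces - expectation)**2))  # about 1.71

print(expectation, sigma)
\end{verbatim}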


Key Points
• The expectation value is obtained by summing the possible values times their probabilities.

• To get the standard deviation, first find the average square deviation from the expectation value, then take a square root of that.

4.4.1 Review Questions
  1. Suppose you toss a coin a large number of times, and count heads as one, tails as two. What will be the expec­tation value?

    Solution esda-a

  2. Continuing this example, what will be the maximum deviation?

    Solution esda-b

  3. Continuing this example, what will be the standard deviation?

    Solution esda-c

  4. Have I got a die for you! By means of a small piece of lead integrated into its light-weight structure, it does away with that old-fashioned uncertainty. It comes up six every time! What will be the expec­tation value of your throws? What will be the standard deviation?

    Solution esda-d


4.4.2 Statistics of quantum operators

The expec­tation values of the operators of quantum mechanics are defined in the same way as those for the die.

Consider an arbitrary physical quantity, call it $a$, and assume it has an associated operator $A$. For example, if the physical quantity $a$ is the total energy $E$, $A$ will be the Hamiltonian $H$.

The equivalent of the face values of the die are the values that the quantity $a$ can take, and according to the orthodox interpretation, those are the eigenvalues

\begin{displaymath}
a_1,\, a_2,\, a_3,\, \ldots
\end{displaymath}

of the operator $A$.

Next, the probabilities of getting those values are, according to quantum mechanics, the square magnitudes of the coefficients when the wave function is written in terms of the eigenfunctions of $A$. In other words, if $\alpha_1$, $\alpha_2$, $\alpha_3$, $\ldots$ are the eigenfunctions of operator $A$, and the wave function is

\begin{displaymath}
\Psi = c_1 \alpha_1 + c_2 \alpha_2 + c_3 \alpha_3 + \ldots
\end{displaymath}

then $\vert c_1\vert^2$ is the proba­bility of value $a_1$, $\vert c_2\vert^2$ the proba­bility of value $a_2$, etcetera.

The expec­tation value is written as $\big\langle a\big\rangle $, or as $\big\langle A\big\rangle $, whatever is more appealing. Like for the die, it is found as the sum of the proba­bility of each value times the value:

\begin{displaymath}
\big\langle a\big\rangle = \vert c_1\vert^2 a_1 + \vert c_2\vert^2 a_2 + \vert c_3\vert^2 a_3 + \ldots
\end{displaymath}
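Numerically, this sum is a one-liner once the coefficients and eigenvalues are known. The Python sketch below uses a hypothetical set of three eigenvalues and coefficients, made up purely for illustration; any normalized set would do.

\begin{verbatim}
import numpy as np

# Hypothetical eigenvalues a_1, a_2, a_3 of the operator A and the
# corresponding coefficients c_1, c_2, c_3 of Psi (made-up numbers,
# normalized so that the probabilities sum to one).
a = np.array([1.0, 2.0, 3.0])
c = np.array([0.6, 0.8j, 0.0])

probs = np.abs(c)**2             # probabilities |c_i|^2
a_avg = np.sum(probs * a)        # <a> = |c_1|^2 a_1 + |c_2|^2 a_2 + ...

print(probs.sum(), a_avg)        # 1.0 and 1.64
\end{verbatim}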

Of course, the eigen­functions might be numbered using multiple indices; that does not really make a difference. For example, the eigen­functions $\psi_{nlm}$ of the hydrogen atom are numbered with three indices. In that case, if the wave function of the hydrogen atom is

\begin{displaymath}
\Psi =
c_{100} \psi_{100}+
c_{200} \psi_{200}+
c_{210} \psi_{210}+
c_{211} \psi_{211}+
c_{21-1} \psi_{21-1}+
c_{300} \psi_{300}+
\ldots
\end{displaymath}

then the expectation value for energy will be, noting that $E_1=-13.6$ eV, $E_2=-3.4$ eV, ...:

\begin{displaymath}
\big\langle E\big\rangle =
- \vert c_{100}\vert^2\, 13.6 \mbox{ eV}
- \vert c_{200}\vert^2\, 3.4 \mbox{ eV}
- \vert c_{210}\vert^2\, 3.4 \mbox{ eV}
- \vert c_{211}\vert^2\, 3.4 \mbox{ eV}
- \vert c_{21-1}\vert^2\, 3.4 \mbox{ eV}
- \ldots
\end{displaymath}

Also, the expec­tation value of the square angular momentum will be, recalling that its eigen­values are $l(l+1)\hbar^2$,

\begin{displaymath}
\langle L^2\rangle =
\vert c_{100}\vert^2\, 0 +
\vert c_{200}\vert^2\, 0 +
\vert c_{210}\vert^2\, 2\hbar^2 +
\vert c_{211}\vert^2\, 2\hbar^2 +
\vert c_{21-1}\vert^2\, 2\hbar^2 +
\vert c_{300}\vert^2\, 0 +
\ldots
\end{displaymath}

Similarly, the expectation value of the $z$-component of angular momentum will be, recalling that its eigenvalues are $m\hbar$,

\begin{displaymath}
\langle L_z\rangle =
\vert c_{100}\vert^2\, 0 +
\vert c_{200}\vert^2\, 0 +
\vert c_{210}\vert^2\, 0 +
\vert c_{211}\vert^2\, \hbar -
\vert c_{21-1}\vert^2\, \hbar +
\vert c_{300}\vert^2\, 0 +
\ldots
\end{displaymath}
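The same bookkeeping can be automated by storing the coefficients under their quantum numbers $(n,l,m)$ and using $E_n=E_1/n^2$, $l(l+1)\hbar^2$, and $m\hbar$ for the eigenvalues. The Python sketch below does this for a made-up, normalized set of coefficients; the particular values are purely illustrative.

\begin{verbatim}
import numpy as np

hbar = 1.0                       # work in units in which hbar is one
E1   = -13.6                     # hydrogen ground state energy in eV

# Hypothetical expansion coefficients c_nlm, keyed by (n, l, m);
# made-up values, normalized so the probabilities add up to one.
c = {(1, 0, 0):  np.sqrt(0.5),
     (2, 1, 1):  np.sqrt(0.25),
     (2, 1, -1): np.sqrt(0.25)}

E_avg = L2_avg = Lz_avg = 0.0
for (n, l, m), c_nlm in c.items():
    p = abs(c_nlm)**2            # probability |c_nlm|^2 of this eigenfunction
    E_avg  += p * E1 / n**2      # energy eigenvalue E_n = E_1 / n^2
    L2_avg += p * l*(l + 1) * hbar**2
    Lz_avg += p * m * hbar

print(E_avg, L2_avg, Lz_avg)     # -8.5 eV, 1.0 hbar^2, 0.0 hbar
\end{verbatim}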


Key Points
• The expectation value of a physical quantity is found by summing its eigenvalues times the probability of measuring that eigenvalue.

• To find the probabilities of the eigenvalues, the wave function $\Psi$ can be written in terms of the eigenfunctions of the physical quantity. The probabilities will be the square magnitudes of the coefficients of the eigenfunctions.

4.4.2 Review Questions
  1. The 2p$_x$ pointer state of the hydrogen atom was defined as

    \begin{displaymath}
\frac 1{\sqrt 2}\left(-\psi_{211}+\psi_{21-1}\right).
\end{displaymath}

    What are the expec­tation values of energy, square angular momentum, and $z$ angular momentum for this state?

    Solution esdb-a

  2. Continuing the previous question, what are the standard deviations in energy, square angular momentum, and $z$ angular momentum?

    Solution esdb-b


4.4.3 Simplified expressions

The procedure described in the previous section to find the expec­tation value of a quantity is unwieldy: it requires that first the eigen­functions of the quantity are found, and next that the wave function is written in terms of those eigen­functions. There is a quicker way.

Assume that you want to find the expec­tation value, $\big\langle a\big\rangle $ or $\big\langle A\big\rangle $, of some quantity $a$ with associated operator $A$. The simpler way to do it is as an inner product:

\begin{displaymath}
\fbox{$\displaystyle
\big\langle A\big\rangle = \langle \Psi\vert A \vert \Psi\rangle.
$}
\end{displaymath} (4.43)

(Recall that $\langle\Psi\vert A\vert\Psi\rangle$ is just the inner product $\langle\Psi\vert A\Psi\rangle$; the additional separating bar is often visually convenient, though.) This formula for the expec­tation value is easily remembered as “leaving out $\Psi$” from the inner product bracket. The reason that $\langle\Psi\vert A\vert\Psi\rangle$ works for getting the expec­tation value is given in derivation {D.17}.

The simplified expression for the expec­tation value can also be used to find the standard deviation, $\sigma_A$ or $\sigma_a$:

\begin{displaymath}
\fbox{$\displaystyle
\sigma_A =
\sqrt{\langle(A - \big\langle A\big\rangle )^2\rangle}
$}
\end{displaymath} (4.44)

where $\big\langle(A-\langle{A}\rangle)^2\big\rangle $ is the inner product $\big\langle\Psi\big\vert(A-\langle{A}\rangle)^2\Psi\big\rangle $.
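In a finite basis, where the operator is represented by a Hermitian matrix and the wave function by a column of coefficients, both boxed formulas reduce to straightforward matrix-vector inner products. The Python sketch below illustrates this with an arbitrary made-up $3\times3$ Hermitian matrix and state; it demonstrates the formulas only, not any particular physical system.

\begin{verbatim}
import numpy as np

# A made-up Hermitian operator matrix A and a normalized state Psi,
# purely to illustrate the two boxed formulas.
A   = np.array([[2.0, 1.0, 0.0],
                [1.0, 3.0, 0.5],
                [0.0, 0.5, 1.0]])
Psi = np.array([1.0, 1.0j, 0.0]) / np.sqrt(2.0)

# <A> = <Psi | A Psi>, equation (4.43)
A_avg = np.vdot(Psi, A @ Psi).real

# sigma_A = sqrt(<(A - <A>)^2>), equation (4.44)
dA      = A - A_avg * np.eye(3)
sigma_A = np.sqrt(np.vdot(Psi, dA @ (dA @ Psi)).real)

print(A_avg, sigma_A)
\end{verbatim}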


Key Points
• The expectation value of a quantity $a$ with operator $A$ can be found as $\big\langle A\big\rangle = \big\langle\Psi\big\vert A\Psi\big\rangle$.

• Similarly, the standard deviation can be found using the expression $\sigma_A = \sqrt{\big\langle(A-\big\langle A\big\rangle )^2\big\rangle}$.

4.4.3 Review Questions
  1. The 2p$_x$ pointer state of the hydrogen atom was defined as

    \begin{displaymath}
\frac 1{\sqrt 2}\left(-\psi_{211}+\psi_{21-1}\right)
\end{displaymath}

    where both $\psi_{211}$ and $\psi_{21-1}$ are eigenfunctions of the total energy Hamiltonian $H$ with eigenvalue $E_2$ and of square angular momentum $L^2$ with eigenvalue $2\hbar^2$; however, $\psi_{211}$ is an eigenfunction of $z$ angular momentum $L_z$ with eigenvalue $\hbar$, while $\psi_{21-1}$ is one with eigenvalue $-\hbar$. Evaluate the expectation values of energy, square angular momentum, and $z$ angular momentum in the 2p$_x$ state using inner products. (Of course, since 2p$_x$ is already written out in terms of the eigenfunctions, there is no simplification in this case.)

    Solution esdb2-a

  2. Continuing the previous question, evaluate the standard deviations in energy, square angular momentum, and $z$ angular momentum in the 2p$_x$ state using inner products.

    Solution esdb2-b


4.4.4 Some examples

This section gives some examples of expec­tation values and standard deviations for known wave functions.

First consider the expectation value of the energy of the hydrogen atom in its ground state $\psi_{100}$. The ground state is an energy eigenfunction with the lowest possible energy level $E_1=-13.6$ eV as eigenvalue. So, according to the orthodox interpretation, energy measurements of the ground state can only return the value $E_1$, with 100% certainty.

Clearly, if all measurements return the value $E_1$, then the average value must be that value too. So the expec­tation value $\big\langle E\big\rangle $ should be $E_1$. In addition, the measurements will never deviate from the value $E_1$, so the standard deviation $\sigma_E$ should be zero.

It is instructive to check those conclusions using the simplified expressions for expec­tation values and standard deviations from the previous subsection. The expec­tation value can be found as:

\begin{displaymath}
\big\langle E\big\rangle = \big\langle H\big\rangle = \langle\Psi\vert H\vert\Psi\rangle
\end{displaymath}

In the ground state

\begin{displaymath}
\Psi = c_{100} \psi_{100}
\end{displaymath}

where $c_{100}$ is a constant of magnitude one, and $\psi_{100}$ is the ground state eigen­function of the Hamiltonian $H$ with the lowest eigenvalue $E_1$. Substituting this $\Psi$, the expec­tation value of the energy becomes

\begin{displaymath}
\big\langle E\big\rangle = \langle c_{100}\psi_{100}\vert H\vert c_{100}\psi_{100}\rangle
= c_{100}^* c_{100} E_1 \langle\psi_{100}\vert\psi_{100}\rangle
\end{displaymath}

since $H\psi_{100}=E_1\psi_{100}$ by the definition of eigenfunction. Note that constants come out of the inner product bra as their complex conjugate, but unchanged out of the ket. The final expression shows that $\big\langle E\big\rangle=E_1$ as it should, since $c_{100}$ has magnitude one, while $\langle\psi_{100}\vert\psi_{100}\rangle=1$ because proper eigenfunctions are normalized to one. So the expectation value checks out OK.

The standard deviation

\begin{displaymath}
\sigma_E = \sqrt{\langle(H-\big\langle E\big\rangle )^2\rangle}
\end{displaymath}

checks out OK too:

\begin{displaymath}
\sigma_E = \sqrt{\langle\psi_{100}\vert(H-E_1)^2\psi_{100}\rangle}
\end{displaymath}

and since $H\psi_{100}=E_1\psi_{100}$, you have that $(H-E_1)\psi_{100}$ is zero, so $\sigma_E$ is zero as it should be.
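In a finite-basis matrix picture, as in the sketch of the previous subsection, this conclusion is immediate to check numerically: if the coefficient column is an eigenvector of the operator matrix, the expectation value comes out equal to the eigenvalue and the standard deviation comes out zero. A minimal Python check, with a made-up diagonal “Hamiltonian”:

\begin{verbatim}
import numpy as np

# Made-up diagonal "Hamiltonian" matrix written in its own eigenfunction
# basis, with eigenvalues in eV; for illustration only.
H   = np.diag([-13.6, -3.4, -1.5])

Psi = np.array([1.0, 0.0, 0.0])   # the state is the first eigenvector

E_avg   = np.vdot(Psi, H @ Psi).real                    # -13.6, the eigenvalue
dH      = H - E_avg * np.eye(3)
sigma_E = np.sqrt(np.vdot(Psi, dH @ (dH @ Psi)).real)   # 0.0

print(E_avg, sigma_E)
\end{verbatim}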

In general,

If the wave function is an eigen­function of the measured variable, the expec­tation value will be the eigenvalue, and the standard deviation will be zero.
To get uncertainty, in other words, a non­zero standard deviation, the wave function should not be an eigen­function of the quantity being measured.

For example, the ground state of the hydrogen atom is an energy eigen­function, but not an eigen­function of the position operators. The expec­tation value for the position coordinate $x$ can still be found as an inner product:

\begin{displaymath}
\big\langle x\big\rangle = \langle\psi_{100}\vert{\widehat x}\vert\psi_{100}\rangle
= {\displaystyle\int\!\!\!\int\!\!\!\int} x\, \vert\psi_{100}\vert^2 {\,\rm d}x\,{\rm d}y\,{\rm d}z.
\end{displaymath}

This integral is zero. The reason is that $\vert\psi_{100}\vert^2$, shown as grey scale in figure 4.9, is symmetric around $x=0$; it has the same value at a negative value of $x$ as at the corresponding positive value. Since the factor $x$ in the integrand changes sign, integration values at negative $x$ cancel out against those at positive $x$. So $\big\langle x\big\rangle=0$.

The position coordinates $y$ and $z$ go the same way, and it follows that the expectation value of position is at $(x,y,z)=(0,0,0)$; the expectation position of the electron is in the nucleus.

In fact, all basic energy eigen­functions $\psi_{nlm}$ of the hydrogen atom, like figures 4.9, 4.10, 4.11, 4.12, as well as the combin­ation states 2p$_x$ and 2p$_y$ of figure 4.13, have a symmetric proba­bility distribution, and all have the expec­tation value of position in the nucleus. (For the hybrid states discussed later, that is no longer true.)

But don’t really expect to ever find the electron in the negligibly small nucleus! You will find it at locations that are on average one standard deviation away from it. For example, in the ground state

\begin{displaymath}
\sigma_x=\sqrt{\langle(x-\big\langle x\big\rangle )^2\rangle}
= \sqrt{{\displaystyle\int\!\!\!\int\!\!\!\int} x^2\, \vert\psi_{100}(x,y,z)\vert^2 {\,\rm d}x\,{\rm d}y\,{\rm d}z}
\end{displaymath}

which is positive since the integrand is everywhere positive. So, the results of $x$-position measurements are uncertain, even though they average out to the nominal position $x=0$. The negative experimental results for $x$ average away against the positive ones. The same is true in the $y$ and $z$ directions. Thus the expectation position becomes the nucleus even though the electron will really never be found there.

If you actually do the integral above (it is not difficult in spherical coordinates), you find that the standard deviation in $x$ equals the Bohr radius. So on average, the electron will be found at an $x$-distance equal to the Bohr radius away from the nucleus. Similar deviations will occur in the $y$ and $z$ directions.
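That claim is also easy to check numerically. Taking the normalized ground state $\psi_{100}=e^{-r/a_0}/\sqrt{\pi a_0^3}$, by symmetry $\langle x^2\rangle=\langle r^2\rangle/3$, and $\langle x\rangle=0$, so a single radial integral suffices. A minimal Python sketch, with the Bohr radius set to one:

\begin{verbatim}
import numpy as np
from scipy.integrate import quad

a0 = 1.0                                   # Bohr radius, set to one

# Square magnitude of the normalized ground state,
# |psi_100|^2 = exp(-2 r / a0) / (pi a0^3).
def psi100_sq(r):
    return np.exp(-2.0*r/a0) / (np.pi * a0**3)

# <r^2>: integrate r^2 |psi_100|^2 over all space (4 pi r^2 dr).
r2_avg, _ = quad(lambda r: r**2 * psi100_sq(r) * 4.0*np.pi*r**2, 0.0, np.inf)

# By symmetry <x^2> = <r^2>/3, and <x> = 0, so sigma_x = sqrt(<x^2>).
sigma_x = np.sqrt(r2_avg / 3.0)
print(sigma_x)                             # 1.0: one Bohr radius
\end{verbatim}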

The expectation value of linear momentum in the ground state can be found from the linear momentum operator ${\widehat p}_x=\hbar\partial/{\rm i}\partial x$:

\begin{displaymath}
\langle p_x\rangle =
\langle \psi_{100}\vert{\widehat p}_x\vert\psi_{100}\rangle
= \frac{\hbar}{{\rm i}} {\displaystyle\int\!\!\!\int\!\!\!\int}
\psi_{100} \frac{\partial\psi_{100}}{\partial x} {\,\rm d}x\,{\rm d}y\,{\rm d}z
= \frac{\hbar}{{\rm i}} {\displaystyle\int\!\!\!\int\!\!\!\int}
\frac{\partial\frac12\psi_{100}^2}{\partial x} {\,\rm d}x\,{\rm d}y\,{\rm d}z
\end{displaymath}

This is again zero, since differentiation turns a symmetric function into an antisymmetric one, one which changes sign between negative and corresponding positive positions. Alternatively, just perform the integration with respect to $x$, noting that the wave function is zero at infinity.

More generally, the expec­tation value for linear momentum is zero for all the energy eigen­functions; that is a consequence of Ehrenfest's theorem covered in chapter 7.2.1. The standard deviations are again non­zero, so that linear momentum is uncertain like position is.

All these observ­ations carry over in the same way to the eigen­functions $\psi_{n_xn_yn_z}$ of the harmonic oscillator. They too all have the expec­tation values of position at the origin, in other words in the nucleus, and the expec­tation linear momenta equal to zero.

If combin­ations of energy eigen­functions are considered, it changes. Such combin­ations may have non­trivial expec­tation positions and linear momenta. A discussion will have to wait until chapter 7.


Key Points
• Examples of definite and uncertain quantities were given for example wave functions.

• A quantity has a definite value when the wave function is an eigenfunction of the operator corresponding to that quantity.