The Talk.Origins Archive: Exploring the Creation/Evolution Controversy

The Constancy of Constants
From the thread "Flood dating discrepancies"

Post of the Month: October 2001
by Steve Carlip

Subject:    Re: Flood dating discrepancies
Date:       October 14, 2001
Author:     Steve Carlip
Message-ID: 9qd6m4$4h2$

Steve Schulin <> wrote:

> [...] you don't mention a single assumption that you make.
> I can name one assumption that's coming under attack
> from many directions -- the assumption that radioactive
> decay rates have remained relatively constant since, well, I
> call it Creation. The scientific presumption of such constants
> has many practical benefits, but claiming that such assumptions
> somehow preclude any possibility of other assumptions being
> true is very unscientific.

I don't know whether to laugh or cry. Laugh, I guess. But I wonder if Steve Schulin has any idea how frustrating it is to hear a nonphysicist tell physicists what their ``assumptions'' are without first learning some of the relevant physics, and without trying to find out what the experimental evidence is.

Physicists would *love* to find evidence that radioactive decay rates are not constant, partly because it would make life more interesting (physics is the most fun when there's some mystery to be understood), and partly because the right rate of change could explain some coincidences in cosmology. The idea that ``constants'' may not be constant goes back at least to Dirac (Nobel Prize, 1933), and it has spawned a huge effort to search for evidence of change.

So far, the evidence is clear: the constants of nature really are constant. There's been some excitement recently over some tentative indications that the fine structure constant, which determines some decay rates (and lots of other things), may have changed, but if it has changed, it's been by less than about one part in 10^15 per year---see Webb et al., Phys. Rev. Lett. 87, 091301 (2001).
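A quick back-of-the-envelope sketch of what that bound means in practice. The rate limit of about one part in 10^15 per year is from the post; the time spans below (the 170,000-year light-travel time mentioned later, and a rough ~13.8-billion-year age of the universe) are illustrative values I've supplied:

```python
# How much could the fine structure constant have drifted under a
# bound of ~1e-15 fractional change per year? Simple accumulation:
# max drift = rate bound * elapsed time.

RATE_BOUND = 1e-15  # maximum fractional change per year (from the post)

for label, years in [("since SN1987A's light left us (~170,000 yr)", 1.7e5),
                     ("over ~13.8 billion years (assumed age)", 1.38e10)]:
    max_drift = RATE_BOUND * years
    print(f"{label}: |change/alpha| < {max_drift:.2e}")
```

Even over the whole history of the universe, that bound allows a total change of only about one part in 100,000.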

Let me give an example:

The supernova SN1987A was observed in 1987, when we saw a star ``explode'' about 170,000 light years from Earth. This distance is unambiguous---it can be obtained by trigonometry, with no assumptions except that Euclidean geometry is nearly right in and near our galaxy.
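The trigonometry here is just the small-angle relation: distance = physical size / angular size. As a hedged illustration (the specific numbers below are my rough values for SN1987A's circumstellar ring, not figures from the post): the ring's physical radius, timed from its light echo, is roughly 0.66 light years, and its angular radius on the sky is roughly 0.83 arcseconds.

```python
import math

# Small-angle triangulation: d = R / theta, with theta in radians.
# Both input numbers are approximate, illustrative values.

ring_radius_ly = 0.66           # assumed physical radius of the ring (light years)
angular_radius_arcsec = 0.83    # assumed angular radius (arcseconds)

# Convert arcseconds to radians: 1 arcsec = pi / (180 * 3600) rad.
theta_rad = angular_radius_arcsec * math.pi / (180 * 3600)
distance_ly = ring_radius_ly / theta_rad

print(f"distance = {distance_ly:,.0f} light years")
```

With these rough inputs the result comes out in the neighborhood of 160,000-170,000 light years, consistent with the distance quoted above.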

After the initial supernova, much of the energy produced by SN1987A came from the radioactive decays of cobalt-56 and cobalt-57. These decays can be identified because they emit gamma rays of very precise frequencies, which are easily detectable. We've looked at the decay rates, and they're exactly the same as the ones we observe in the laboratory. So there's been no change in at least the 170,000 years it took for the light to reach us.
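One way to see how the comparison works: exponential radioactive decay predicts a fixed decline rate for the supernova's late-time brightness, set entirely by the half-life. A hedged sketch, using the standard laboratory half-life of cobalt-56 (about 77.3 days, a textbook value not quoted in the post):

```python
import math

# If the late light curve is powered by Co-56 decay, luminosity falls
# as exp(-t / tau). On the astronomers' magnitude scale
# (mag = -2.5 * log10(luminosity)), that is a straight line with
# slope 2.5 / (tau * ln(10)) magnitudes per day.

half_life_days = 77.3                     # lab half-life of Co-56 (assumed value)
tau = half_life_days / math.log(2)        # mean lifetime in days

decline_mag_per_day = 2.5 / (tau * math.log(10))
print(f"predicted decline = {decline_mag_per_day:.4f} mag/day")
```

The observed decline of SN1987A's light curve matched this laboratory prediction, which is the point of the paragraph above: the decay rate 170,000 years ago was the same as the one measured on Earth today.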

Note that you don't have to assume a constant speed of light here---the supernova gives an independent check. That's because many of the features of a supernova, from the amount of energy and the number of neutrinos emitted to the spectral lines of the elements in the ``afterglow,'' depend sensitively on the speed of light. If, for example, the speed of light had been different when the supernova occurred, we wouldn't have seen the cobalt decays at all, since the frequency of the gamma rays emitted in the decay depends on the speed of light.

I use this example because it's relatively simple to understand. But there have been *lots* of other searches for changes in physical constants, using methods ranging from astrophysical observations of the spectra of distant stars, to searches for anomalous luminosities of faint stars, to studies of abundance ratios of radioactive nuclides, to (for current variations) direct laboratory measurements.

The result is a net of observations that fit together quite rigidly---you can't tweak one without contradicting many others. For instance, if you suppose the speed of light varies, that affects spectral lines in distant stars. It affects different lines in different ways, and so would be easy to see. (That's what Webb et al. were looking for.)

You can try to compensate by allowing the charge of the electron to vary in sync with the speed of light. But that requires that the charge of the proton vary as well, since otherwise hydrogen gas wouldn't be neutral (which would have dramatic and easily observable effects). But if the charge of the proton varies, the rates of nuclear reactions will change, affecting the production of energy by stars in a way we don't see. You might then propose that the strength of the nuclear interaction could change exactly in sync with the speed of light and the charge of the electron and proton. But nuclear interactions affect neutrons as well, and again you'd end up with drastic changes in the behavior of stars that we would see (and don't). People have gone through this kind of argument carefully and quantitatively. It just doesn't work.
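To illustrate how sensitive the hydrogen-neutrality step is: suppose the proton and electron charges differed by some tiny fraction delta. Every hydrogen atom would then carry a net charge, and pairs of atoms would repel electrostatically. A hedged sketch comparing that repulsion to their gravitational attraction (the constants are standard SI values I've supplied, not numbers from the post):

```python
# Ratio of electrostatic repulsion to gravitational attraction between
# two hydrogen atoms, if each carried a net charge of delta * e.
# Both forces fall off as 1/r^2, so the ratio is independent of distance.

K = 8.988e9      # Coulomb constant (N m^2 / C^2)
E = 1.602e-19    # elementary charge (C)
G = 6.674e-11    # gravitational constant (N m^2 / kg^2)
M_H = 1.673e-27  # hydrogen atom mass (kg)

def force_ratio(delta):
    """Electric repulsion / gravitational attraction for net charge delta*e."""
    return K * (delta * E) ** 2 / (G * M_H ** 2)

for delta in (1e-18, 1e-20):
    print(f"delta = {delta:.0e}: F_elec/F_grav = {force_ratio(delta):.2e}")
```

Even a charge mismatch of one part in 10^18 would make electrical repulsion comparable to gravity between hydrogen atoms, with easily observable consequences for gas clouds and stars---which is why the proton and electron charges can't drift out of step.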

I suggest that you look at the sci.physics FAQs on this question, and look at the references before you say much more about this particular ``assumption.''

Steve Carlip
