Introduction
In a recently published technical report entitled The Atomic Constants, Light, and Time, Trevor Norman and Barry Setterfield put forward their most recent evidence in favor of the hypothesis that the speed of light, c, has been decreasing with time. This hypothesis has received much acclaim in some sectors of the creationist community since Barry Setterfield first introduced it a few years ago.
Much of the impetus behind the decay of c hypothesis stems from its implications, which are generally viewed as favorable to the young earth creationist position. In particular, the knotty problem of how starlight from distant galaxies could reach the earth within a short, biblical time frame seems to be solved in a straightforward, naturalistic manner. Implications, however, no matter how favorable, cannot substitute for the clear empirical basis necessary to substantiate the hypothesis.
Measurements of the speed of light have been made for the past three hundred years and could provide the required empirical basis. Norman and Setterfield tabulate the results of 163 speed of light determinations in The Atomic Constants, Light, and Time, and claim that this data set clearly supports the decay of c hypothesis. When I subjected the data set to appropriate, objective analyses, I was unable to verify that claim; this failure is the motivation for the present article, which is intended to caution creationists against a wholesale, uncritical acceptance of the Norman and Setterfield hypothesis. At the present time, it appears that general support by the creationist community for the decay of the speed of light hypothesis is not warranted by the data upon which the hypothesis rests.
Norman and Setterfield's Data Analysis
The university text, Data Reduction and Error Analysis for the Physical Sciences, by Philip R. Bevington, opens with the following words:
It is a well-established rule of scientific investigation that the first time an experiment is performed the results bear all too little resemblance to the "truth" being sought. As the experiment is repeated, with successive refinements of technique and method, the results gradually and asymptotically approach what we may accept with some confidence to be a reliable description of events (1969, p. 1).
Thus, any gradual and asymptotic approach of measured values of c toward its present-day value needs to be carefully and critically scrutinized to determine whether the effect is due to real, physical changes in the structure of the universe which have altered c, or whether it is merely the result of "refinements of technique and method" of measurement.
It is also well known that a given body of data can be inadvertently manipulated due to subjective bias in such a way as to yield unwarranted conclusions. The best way to avoid this problem in the current context is to treat the entire data set as a whole. This minimizes the effects of systematic experimental error and enhances the possibility of discerning any real, overall trend in the value of c.
Unfortunately, the authors of the technical report devote great effort to discussing and analyzing the data in separate, small groups, searching for a c decay trend within each group, and they report changes which can be explained simply as refinements of technique as if those changes were unequivocal support for c decay. In one place, however, they do consider the whole body of data collectively. In this one instance, they use a non-weighted least squares technique to find the straight line which best fits the data (ignoring the relative uncertainties of the different data points), and conclude:
When all 163 values involving 16 different experimental methods are used, the linear fit to the data gives a decay of 38 km/s per year (p. 25).
If this were the end of the matter, it would certainly seem to provide powerful evidence in favor of the c decay hypothesis. Unfortunately, even a cursory glance at the data reveals that the above analysis is inappropriate for the given data set, and, hence, that the conclusions drawn from it are not valid.
The Data Reanalyzed
The graph on the next page displays the percent difference between the 163 measured values of c and the modern value of the speed of light. The vertical lines on some of the data points are error bars, which express the range of uncertainty in the measurement as reported by the researcher. The range of uncertainty was not reported for many of the earliest measurements, so some of the data points are plotted without error bars. Most of the data points after 1850 do have error bars, but in most cases they are too small to be seen on the scale of the graph. The relatively few data points between 1850 and 1900 which have very large error bars result from two indirect methods of measuring c that inherently yield low-precision results.
In a non-weighted least squares fit, every data point has equal weight in determining where the best fit straight line should be drawn through the data. For a data set consisting of measurements with error bars of widely varying lengths, it is not appropriate to give every data point equal weight, as Norman and Setterfield have done. Standard practice is to weight each data point by the inverse square of its reported uncertainty (a weight of 1/σ² for an uncertainty of ±σ), so that data points with large error bars (greater uncertainty) have less influence on where the best fit straight line is drawn than data points with small error bars. This is especially important for the current data set, since the reported uncertainties range from ± 20,000 km/s to ± 0.0003 km/s.
When I analyzed the entire data set of 163 points using the standard, weighted, linear least squares method, the result was:
decay of c = 0.0000140 ± 0.0000596 km/s/year.
This result says quite plainly that there is no discernible decay trend in the data set presented by Norman and Setterfield: the fitted slope is smaller than its own uncertainty and is therefore statistically indistinguishable from zero.
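To make the weighting procedure concrete, the short program below sketches an inverse-variance weighted linear fit of the kind described above, using the closed-form formulas given in Bevington's text. The data arrays are hypothetical placeholders only, not the actual measurements, which must be taken from Norman and Setterfield's tabulation.

    import numpy as np

    # Hypothetical placeholder data: year of measurement, measured value
    # of c (km/s), and reported uncertainty sigma (km/s). The actual 163
    # values must be taken from Norman and Setterfield's tabulation.
    year  = np.array([1740.0, 1875.0, 1935.0, 1975.0])
    c_obs = np.array([301000.0, 299990.0, 299774.0, 299792.5])
    sigma = np.array([10000.0, 200.0, 2.0, 0.01])

    # Standard inverse-variance weights: w_i = 1 / sigma_i**2.
    w = 1.0 / sigma**2

    # Closed-form weighted linear least squares for the model
    # c = a + b*year (see Bevington 1969, ch. 6 for the formulas).
    S, Sx, Sy = w.sum(), (w * year).sum(), (w * c_obs).sum()
    Sxx, Sxy  = (w * year**2).sum(), (w * year * c_obs).sum()
    delta = S * Sxx - Sx**2

    b = (S * Sxy - Sx * Sy) / delta    # slope in km/s/year (negative = decay)
    a = (Sxx * Sy - Sx * Sxy) / delta  # intercept
    sigma_b = (S / delta) ** 0.5       # standard error of the slope

    print(f"slope = {b:.7f} +/- {sigma_b:.7f} km/s/year")

Because the reported uncertainties span nearly eight orders of magnitude, such a fit is dominated by the modern, high-precision measurements, which is precisely why its result differs so sharply from the non-weighted fit.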
The Data
Though an objective analysis of the data does not reveal any overall decay trend, the one-sidedness of the data before 1800 seems odd. In this regard, there are some peculiarities in Norman and Setterfield's selection of data of which the reader needs to be aware.
The data point which stands out by itself in the upper left-hand corner of the graph is most striking. It is attributed to uncorrected observations of the Roemer type made by Cassini in 1693. To obtain the speed of light by this method, the earth's orbital radius (i.e., its distance from the sun) is divided by the measured time required for light to cross that radius (about 8 minutes, 20 seconds, by modern measurement). The following quote from Norman and Setterfield is illuminating:
Observations by Cassini (1693 and 1736) gave the orbit radius delay as 7 minutes 5 seconds. Roemer in 1675 gave it as 11 minutes from selected observations. Halley in 1694 noted that Roemer's 1675 figure for the time delay was too large while Cassini's was too small (p. 11).
Norman and Setterfield have chosen to use a reworked or "corrected" value for Roemer's c determination (this is the earliest measurement shown on the graph), but an uncorrected value for Cassini's. It is peculiar that Norman and Setterfield were content to use an uncorrected value for Cassini, given the comments by the eminent and talented Halley quoted above. It is also unfortunate, since this single, anomalous point is responsible for most of the apparent 38 km/s/year decay which they report. Furthermore, Roemer's uncorrected c determination would plot below the line at -24%, more than offsetting the uncorrected Cassini value.
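The arithmetic is simple enough to check directly. The sketch below uses the modern value of the earth's orbital radius (the historical observers worked from cruder contemporary estimates, so the figures are illustrative only) and reproduces both the anomalously high uncorrected Cassini value and the -24% uncorrected Roemer value:

    # Arithmetic check of the Roemer-type determinations discussed above.
    # c is obtained by dividing the earth's orbital radius by the measured
    # light-travel time across that radius.

    AU_KM    = 1.496e8      # earth's orbital radius, km (modern value)
    C_MODERN = 299792.458   # modern value of c, km/s

    def roemer_c(minutes, seconds):
        """Speed of light implied by a given orbital-radius light delay."""
        return AU_KM / (minutes * 60 + seconds)

    for name, m, s in [("Cassini, 7 min 5 s ", 7, 5),
                       ("Roemer, 11 min     ", 11, 0),
                       ("Modern, 8 min 20 s ", 8, 20)]:
        c = roemer_c(m, s)
        pct = 100.0 * (c - C_MODERN) / C_MODERN
        print(f"{name}: c = {c:9.0f} km/s ({pct:+5.1f}% from modern value)")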
Conclusion
A number of creationist scientists have been subjecting The Atomic Constants, Light, and Time to careful scrutiny since its release in August 1987. It is anticipated that the results of the investigations of these scientists will soon be available to the creationist community. In the interim, caution is clearly in order.
It seems doubtful, however, that the creation of stars and the appearance of starlight will ever be adequately explained within a totally naturalistic framework. We understand from the Bible that the creation of the universe, and this includes the stars, was a supernatural event which God accomplished by the power of His spoken Word in six solar days about 10,000 years ago. Since God has not chosen to reveal to us in exhaustive detail just how or why He created stars the way He did, the theories we construct to answer these questions will always be, like all scientific theories, tentative and subject to revision. Some will, doubtless, find this uncertainty intolerable; others will feel that their intellects are offended by any reference to the supernatural. But God has not commanded us to understand all of His infinite works, much less to squeeze them into the smallness of a naturalistic framework; He has simply commanded us to trust Him.**
References
Bevington, Philip R. 1969. Data Reduction and Error Analysis for the Physical Sciences, McGraw-Hill, Inc., New York.
Norman, Trevor and Barry Setterfield. 1987. The Atomic Constants, Light, and Time, Flinders University of South Australia, School of Mathematical Sciences, Technical Report.
* Dr. Aardsma is Head of the Astro/Geophysics Department in the ICR Graduate School.
** A more extensive critique of The Atomic Constants, Light, and Time, by Dr. Aardsma, can be obtained by contacting The Institute for Creation Research, P.O. Box 2667, El Cajon, CA 92021; phone, (619) 448-0900. Publication of a number of papers on this topic is currently anticipated in the June 1988 issue of the Creation Research Society Quarterly (vol. 25, no. 1), published by the Creation Research Society, P.O. Box 14016, Terre Haute, Indiana, 47803.