Absolute and relative ages of all structured objects found in Nature provide vital clues that help calibrate the cosmic-evolutionary story. Not only do we seek to know the specific ages of many ordered systems and key events in universal history; we especially want to be sure that those ages fall into a consistent temporal sequence along the arrow of time. Recall that in the PARTICLE EPOCH, we were concerned that stars had better be younger than the Universe, for the obvious reason that nothing can be older than its parent. Likewise, here in the PLANETARY EPOCH, we want to be sure that Earth is younger than the oldest stars, indeed contemporaneous with the Sun, lest something be amiss in the planetary-system models described here. In turn, in the next CHEMICAL EPOCH, we shall need to be alert that fossilized life is in all cases younger than the rocky Earth (unless life arrived intact from beyond our planet). These kinds of “sanity checks” are useful periodically while developing the scenario of cosmic evolution. A reasonable and consistent sequence of objects’ relative ages is just as important as their absolute ages.

Take Earth, for example. Estimates of Earth’s age have increased dramatically over the past few centuries. The most widely quoted of the early estimates is attributed to Anglican Archbishop James Ussher, who in the mid-17th century used the Bible to reason that Earth had been created in 4004 B.C. (on October 23, by most accounts!). Other researchers at the time, however, preferring to ponder Earth itself rather than beliefs about it, were convinced that our planet must be a good deal older than the 6000-year value implied by a literal reading of Scripture.

Although few scholars claimed precision, by ~200 years ago most had come to agree with the pioneering French naturalist Georges Buffon, who argued that Earth is at least 100,000 years old. Reportedly, and heretically for the time, he maintained in his unpublished diary that Earth likely formed several million years ago. Gradually during the early 19th century, led especially by the Englishman Charles Lyell (who heavily influenced Darwin’s thinking), most geologists came to accept an age for Earth spanning millions of years, though that is still fully a thousand times younger than the value we know today. As for a specific number, many were then content to see, in the words of the father of modern geology, the Scottish farmer James Hutton, “no vestige of a beginning, no prospect of an end.”

Lord Kelvin was an exception. By the mid-19th century, this British physicist had become familiar with the then-new subject of thermodynamics (the science of moving heat, or changing energy) and he used it to try to calculate Earth’s age. Arguing that a once-molten object cools at a calculable rate, he reasoned that our planet would have been molten hot sometime between tens of millions and hundreds of millions of years in the past. However, as noted in the PARTICLE EPOCH, even these longer durations fall short of our planet’s true age. While Kelvin did the calculation correctly, he was unaware of a most important phenomenon: radioactivity.
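
To see the spirit of Kelvin’s reasoning, here is a minimal sketch of a conductive-cooling age estimate in Python. The parameter values below are illustrative stand-ins chosen for this sketch (Kelvin’s own inputs, expressed in 19th-century units, were comparable in spirit), not his exact figures.

```python
# A minimal sketch of a Kelvin-style cooling-Earth estimate, assuming
# illustrative parameter values (not Kelvin's exact inputs).
import math

# Model: a semi-infinite solid, initially molten at uniform temperature
# T0, cools by conduction from its surface.  The surface temperature
# gradient then decays as G(t) = T0 / sqrt(pi * kappa * t), so measuring
# G today fixes the elapsed time:  t = T0**2 / (pi * kappa * G**2).

T0 = 3900.0       # assumed initial (molten-rock) temperature, deg C
kappa = 1.2e-6    # assumed thermal diffusivity of rock, m^2/s
G = 0.037         # assumed present geothermal gradient, deg C per meter
                  # (roughly 1 degree per 27 m, as measured in mines)

t_seconds = T0**2 / (math.pi * kappa * G**2)
t_years = t_seconds / 3.156e7     # seconds per year

print(f"Kelvin-style age estimate: {t_years / 1e6:.0f} million years")
# -> on the order of 100 million years.  Radioactive heating, unknown
#    to Kelvin, breaks the fixed-heat-budget assumption and permits a
#    far older Earth.
```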

Not until the early 20th century did French scientists, mainly the Curies as noted earlier, isolate radium from pitchblende, thereby learning how that heavy element decays into lighter elements. Such decays, occurring naturally throughout Earth, provide an additional source of energy on our planet and thus extend Kelvin’s inferred value for Earth’s age. Soon thereafter, the pioneering British atomic physicist Ernest Rutherford championed the idea of using radioactive elements to date Earth materials directly. Finally, the true age of Earth could be found; well, almost.

Technically, many heavy nuclei, such as those of uranium, thorium, and plutonium, are inherently unstable (Figure 4.10). If left alone, they gradually break down into lighter nuclei, in the process emitting elementary particles and releasing energy. This change happens spontaneously, without any external influence. (The energy released by the disintegration of radioactive elements also drives nuclear fission, whether in controlled nuclear reactors or uncontrolled atomic bombs.) The decay from “parent” nuclei to more stable “daughter” offspring is not immediate, however; rather, it proceeds at a characteristic pace, gauged by the half-life, the time needed for half of any sample to decay. Half-lives, as measured in the laboratory, vary greatly; for example, half a sample of plutonium-239 decays in ~24,000 years, while half a sample of uranium-235 needs ~700 million years. Thus, if we can measure the amount of unstable parent nuclei of a given element remaining today in, say, a rock, and also measure the amount of its stable decay product, then we can compute the time during which the decay occurred. This method is widely used by geologists and gives age estimates with an accuracy of a few percent, but there is a caveat.
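
To make the bookkeeping concrete, here is a minimal sketch of the parent/daughter arithmetic in Python. It assumes an idealized closed system (no daughter atoms present at solidification, and none gained or lost since); the helper function, isotope choice, and abundances are illustrative only, not a geologist’s full procedure.

```python
# A minimal sketch of radiometric-age arithmetic, assuming a closed
# system since solidification.  Function name and numbers are
# illustrative, not a standard library API.
import math

def age_from_ratio(parent, daughter, half_life):
    """Time since solidification, given today's parent and daughter
    abundances and the parent's half-life (same time units returned)."""
    # The original parent population was parent + daughter; each halving
    # of the parent marks one elapsed half-life.
    return half_life * math.log2((parent + daughter) / parent)

# Example: uranium-235 (half-life ~704 million years), which decays
# through a chain of short-lived steps to stable lead-207.
half_life_u235 = 704e6                            # years
print(age_from_ratio(parent=1.0, daughter=1.0,
                     half_life=half_life_u235))   # ~704 million years
print(age_from_ratio(parent=1.0, daughter=3.0,
                     half_life=half_life_u235))   # ~1.4 billion years
```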

FIGURE 4.10 – Radioactive nuclei naturally decay into lighter nuclei after certain periods of time, called half-lives. By knowing the half-life and by measuring the relative abundances of parent and daughter nuclei in a given sample of rock, estimates can be made of the age of the rock since it last solidified. (Prentice Hall)

The radioactive-dating technique rests on the assumption that the rock has remained solid while the radioactive matter decayed. If the rock melts, there’s no particular reason to expect the daughter nuclei to remain in the same locations their parents had occupied, and the whole method fails. Therefore, radioactive dating measures the time elapsed since the rock in question last solidified. In many cases, this will be a lower limit, given that most rocks underwent some heating in their past.

Throughout the first half of the 20th century, radioactive methods gave Earth ages variously in the range of 1-3 billion years. As understanding of nuclear physics advanced during the second half of the century, progressively older rocks were found on our planet. Today, the oldest known rocks, found in Greenland and Labrador, are dated at nearly 4 billion years, proving that our planet is at least that old. Furthermore, since Earth is highly differentiated, with its heaviest elements concentrated mainly at the core and its lightweight elements at or near the crust, it must have been molten at some earlier time; otherwise the heavy elements could never have sunk toward the center. A combination of thermodynamic tests of rock cooling rates, radioactive dates of stray meteorites and the lunar highlands, and theoretical studies of the Sun’s evolutionary state all converge on an age for planet Earth of 4.6 billion years.

This episode in the changing estimates of Earth’s age is a good example of how the scientific method eventually yields a real measure of objectivity, even though it is sometimes affected by the subjective whims and biases of individual researchers. Over the course of time, groups of scientists checking, confirming, and refining experimental tests neutralize the subjectivism of single workers. Often a few years of intensely focused research suffice to bring much objectivity to bear on a given problem, although some particularly thorny issues, such as Earth’s great age here, or Earth’s orbit around the Sun in Galileo’s day, were swamped for generations by cultural and institutional bias fostered by tradition, religion, and even politics.

Today, with an open mind and a readiness to revise our models to reflect new theoretical ideas and better experimental tests, scientists maintain that Nature yields a certain measure of objectivity through empirical facts, thus granting us a progressively better “approximation of reality.” It is in this sense that science claims to make progress, both in quantitative terms of a fuller, more accurate knowledge and in qualitative terms of a richer understanding of how we know what we know.
