Age of the Earth

The age of the Earth is estimated to be 4.6 billion (4.6 × 10⁹) years. This estimate represents a compromise between the oldest known terrestrial rocks – small crystals of zircon from the Jack Hills of Western Australia – and astronomers' estimates of the age of the solar system. It is clear from the rocks that the Earth is at least 4.4 billion years old. Comparing the mass and luminosity of the Sun to those of a multitude of other stars, it appears that the solar system cannot be much older than those rocks. Since no known meteorites are older than 4.6 billion years, it is assumed that the Earth is probably about the same age. This article describes the modern dating methods used to arrive at the age of the Earth, and outlines their history.

Prescientific notions

In the centuries preceding the scientific revolution, the age of the Earth was drawn from religious creation tales. The Han Chinese thought the Earth was created and destroyed in cycles of over 23 million years. Westerners were more conservative. In a book published in 1654, not long before his death, Archbishop James Ussher (often spelt with one "s") of Armagh, Ireland, calculated from the Bible, augmented by some astronomy and numerology, that the Earth was created on October 23, 4004 BC. One John Lightfoot had actually published this date about a decade before Ussher, and pinned the time of creation down to around 9:00 am (GMT, presumably).

In both cases, the estimates were gross understatements. Few people had conceived the idea of a time that stretched out far into the past before the arrival of humankind, or for that matter stretched out far into the future beyond the end of humankind. One who did was Aristotle, who thought that the Earth and the universe had existed from eternity.

First concepts

By the 18th century, a few naturalists were trying to place the age of the Earth on a more scientific basis. The naturalist Mikhail Lomonosov, regarded as the founder of Russian science, was one of the first to undertake the exercise, suggesting in the mid-18th century that the Earth had been created separately from the rest of the Universe several hundred thousand years earlier.

Lomonosov's ideas were mostly speculative, but in 1779 the French naturalist the Comte de Buffon tried to obtain a value for the age of the Earth using an experiment. He created a small globe that resembled the Earth in composition and then measured its rate of cooling. This led him to estimate that the Earth was about 75,000 years old.

Very few of their colleagues paid them much mind, either leaving the question of the age of the Earth to creation tales or, if they were impious, simply assuming that the Earth always had been and always would be. However, there were many naturalists whose studies of strata, the layering of rock and earth, gave them an appreciation that the Earth had been through many changes during its existence, however long that might be.

These layers often contained fossilized remains of unknown creatures, and there seemed to be a progression of types of such creatures from layer to layer. In the 1790s, the British naturalist William Smith pointed out that if two layers of rock at widely differing locations contained similar fossils, then it was very plausible that the layers were the same age.

Other naturalists used this idea to construct a past history of the Earth, though their timelines were inexact as they did not know how long it took to lay down such layers. Smith's nephew and student, John Phillips, later calculated by such means that the Earth was about 96 million years old.

In 1830, the geologist Charles Lyell took the next step and proposed that the features of the Earth were in perpetual change, eroding and reforming continuously, and the rate of this change was roughly constant. This was a challenge to the traditional view, which saw the past history of the Earth as static, with changes brought about by intermittent catastrophes. Many naturalists were influenced by Lyell to become "uniformitarians" who believed that changes were constant and uniform.

William Thomson's calculations

In 1862, the physicist William Thomson of Glasgow published calculations that fixed the age of the Earth at between 20 million and 400 million years. He assumed that the Earth had been created as a completely molten ball of rock, and determined the amount of time it took for the ball to cool to its present temperature.
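
The flavor of Thomson's method can be reproduced with a simple heat-conduction model. The sketch below is illustrative only: it treats the Earth as a uniformly hot half-space that cools by conduction, and the initial temperature, geothermal gradient, and thermal diffusivity are assumed round numbers rather than Thomson's own figures.

    # Kelvin-style conductive cooling estimate (illustrative sketch).
    # All numerical inputs are assumed round values, not Thomson's data.
    import math

    T0 = 2000.0        # initial temperature of the molten Earth, kelvin (assumed)
    gradient = 0.025   # present near-surface temperature gradient, K per metre (assumed)
    kappa = 1.0e-6     # thermal diffusivity of rock, m^2/s (typical order of magnitude)

    # For a half-space cooling from uniform temperature T0, the surface
    # gradient after time t is T0 / sqrt(pi * kappa * t); solving for t:
    t_seconds = T0**2 / (math.pi * kappa * gradient**2)
    t_years = t_seconds / (365.25 * 24 * 3600)
    print(f"cooling age ~ {t_years / 1e6:.0f} million years")  # ~65 million years, within Thomson's range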

Geologists had trouble accepting a finite age for the Earth. Biologists could accept that the Earth might have a finite age, but even 100 million years seemed much too short to be plausible. Charles Darwin, who had studied Lyell's work, had proposed his theory of the evolution of organisms by natural selection, a semi-random process that implies great expanses of time. Even 400 million years didn't seem long enough.

In a lecture in 1869, Darwin's great advocate, Thomas H. Huxley, attacked Thomson's calculations, suggesting they appeared precise in themselves but were based on faulty assumptions. Huxley was correct, and in fact Thomson's estimates would prove far too short, but Thomson had at least attempted to root the debate in facts rather than speculation, and apply a little rigor to it.

He certainly managed to provoke a long and productive debate. The German physicist Hermann von Helmholtz and the American astronomer Simon Newcomb joined in by independently calculating the amount of time it would take for the Sun to condense down to its current diameter and brightness from the nebula of gas and dust from which it was born. They came up with a value of 100 million years, which seemed to set an upper limit on the age of the Earth that was consistent with Thomson's calculations. However, they assumed that the Sun was only glowing from the heat of its gravitational contraction. They knew of no other ways for it to produce its energy.

Other scientists backed up Thomson's figures as well. Charles Darwin's son, the astronomer George H. Darwin of the University of Cambridge, proposed that the Earth and Moon had broken apart in their early days when they were both molten. He calculated the amount of time it would have taken for tidal friction to give the Earth its current 24-hour day, and concluded that Thomson was on the right track.

In 1899, John Joly of the University of Dublin calculated the rate at which the oceans should have accumulated salt from erosion processes, and determined that the oceans were about 90 million years old.
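
Joly's "salt clock" was straightforward arithmetic: divide the sodium dissolved in the oceans by the amount delivered by rivers each year. The sketch below uses round, order-of-magnitude figures chosen only to illustrate the method; they are not Joly's published values.

    # Order-of-magnitude sketch of Joly's ocean-salt argument.
    # Both figures are illustrative assumptions, not Joly's data.
    sodium_in_ocean = 1.4e16   # tonnes of sodium dissolved in the oceans (assumed)
    sodium_per_year = 1.6e8    # tonnes of sodium delivered by rivers per year (assumed)

    age_years = sodium_in_ocean / sodium_per_year
    print(f"ocean 'age' ~ {age_years / 1e6:.0f} million years")  # roughly 90 million years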

The discovery of radioactivity

By the turn of the 20th century, Thomson had been made Lord Kelvin in appreciation of his many scientific accomplishments. He had reason to feel confident: multiple independent attempts to determine the age of the Earth had converged on roughly 100 million years, which left him very certain that his estimates were correct. The geologists could only suggest that Kelvin didn't have all the facts, and they still believed that the Earth was far older than 100 million years.

The breakthrough that would ultimately resolve the conflict took place in 1896, when the French physicist A. Henri Becquerel discovered radioactivity. Three types of radioactive emission were eventually identified: alpha, beta, and gamma rays.

In 1898, two other French researchers, Marie and Pierre Curie, discovered the radioactive elements polonium and radium. Working on this research, in 1900 the New Zealand physicist Ernest Rutherford and the Canadian physicist R. K. McClung of McGill University showed that the various rays emitted by radioactive substances carried a great deal of energy. In 1902 and 1903, Rutherford and his British colleague Frederick Soddy described radioactivity as the spontaneous degeneration of one atomic element into another.

Little attention was paid to the work of Rutherford and his colleague at the time, but the Curies had greater public stature. In 1903 Pierre Curie and his associate Albert Laborde announced that radium produces enough heat to melt its own weight in ice in less than an hour. Rutherford and Howard T. Barnes showed that this heat was due to alpha radiation. The relatively massive helium nuclei collided with molecules in the ice crystal, creating heat that melted the ice.

Geologists quickly realized that the discovery of radioactivity upset the assumptions on which most calculations of the age of the Earth were based. These calculations assumed that the Earth and Sun had been created at some time in the past and had been steadily cooling since that time. Radioactivity provided a process that generated energy. George Darwin and Joly were the first to point this out, also in 1903.

There was the issue of whether the Earth contained enough radioactive material to significantly affect its rate of cooling. In 1901 two German schoolteachers, Julius Elster and Hans F. Geitel, had detected radioactivity in the air and then in the soil. Other investigators found it in rainwater, snow, and groundwater. Robert J. Strutt of Imperial College, London, found traces of radium in many rock samples, and concluded that the Earth contained more than enough radioactive material to keep it warm for a long, long time.

Strutt's work created controversy in the scientific community. Lord Kelvin spoke for those who still believed in the older estimates, fighting a stubborn rear-guard action in public against the new findings up to his death in 1907, though he admitted in private that his calculations had been shown to be incorrect. Geologists and biologists who had stuck by their gut feelings that the Earth was much older than 100 million years were relieved to have been at least partly vindicated.

The invention of radioactive dating

To feel completely vindicated, they needed to come up with new and more rigorous estimates of the age of the Earth. Radioactivity, which had overthrown the old calculations, yielded a bonus by providing a basis for new calculations, in the form of radioactive dating.

Rutherford and Soddy had continued their work on radioactive materials and concluded that radioactivity was due to a spontaneous transmutation of atomic elements. An element broke down into another, lighter element, releasing alpha, beta, or gamma radiation in the process. They also determined that a particular radioactive element decays into another element at a distinctive rate. This rate is given in terms of a "half-life", or the amount of time it takes half of a mass of that radioactive material to break down into its "decay product".
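
In modern notation, a half-life corresponds to simple exponential decay: after time t, the fraction of the parent isotope remaining is (1/2)^(t / half-life). A minimal sketch, using radium-226's half-life of roughly 1,600 years as the example:

    # Fraction of a parent isotope remaining after time t, from its half-life.
    def fraction_remaining(t_years, half_life_years):
        return 0.5 ** (t_years / half_life_years)

    # Radium-226 has a half-life of about 1,600 years.
    for t in (1600, 3200, 16000):
        print(t, "years ->", round(fraction_remaining(t, 1600), 4))
    # 1600 years -> 0.5, 3200 years -> 0.25, 16000 years -> 0.001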

Some radioactive materials have short half-lives, some have long half-lives. Uranium, thorium, and radium have long half-lives, and so persist in the Earth's crust, but radioactive elements with short half-lives have generally disappeared. This suggested that it might be possible to measure the age of the Earth by determining the relative proportions of radioactive materials in geological samples.

In reality, radioactive elements do not always decay into nonradioactive ("stable") elements directly, instead decaying into other radioactive elements that have their own half-lives, and so on, until they reach a stable element. Such "decay series", such as the uranium-radium and thorium series, were known within a few years of the discovery of radioactivity, and provided a basis for constructing techniques of radioactive dating.

The pioneers were Bertram B. Boltwood, a young chemist just out of Yale, and the energetic Rutherford. Boltwood had conducted studies of radioactive materials as a consultant, and when Rutherford lectured at Yale in 1904, Boltwood was inspired to describe the relationships between elements in various decay series.

Late in 1904, Rutherford took the first step toward radioactive dating by suggesting that the alpha particles released by radioactive decay could be trapped in a rocky material as helium atoms. At the time, Rutherford was only guessing at the relationship between alpha particles and helium atoms, but he would prove the connection four years later.

Soddy and Sir William Ramsay, then at University College in London, had just determined the rate at which radium produces alpha particles, and Rutherford proposed that he could determine the age of a rock sample by measuring its concentration of helium. He dated a rock in his possession to an age of 40 million years by this technique. This assumed that the rate of decay of radium as determined by Ramsay and Soddy was accurate, and that helium didn't escape from the sample over time. Rutherford's scheme was inaccurate, but it was a useful first step.
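
Rutherford's helium age was, in essence, an accumulation clock: measure how much helium is trapped in the rock, divide by the rate at which its radioactive content produces helium, and assume none has leaked away. The numbers in the sketch below are purely hypothetical, chosen only to show the arithmetic behind a 40-million-year result.

    # Helium-accumulation clock in the spirit of Rutherford's 1905 estimate.
    # All quantities are hypothetical, for illustration only.
    helium_trapped = 4.0e-5    # cm^3 of helium per gram of rock (hypothetical)
    helium_per_year = 1.0e-12  # cm^3 of helium produced per gram per year (hypothetical)

    age_years = helium_trapped / helium_per_year
    print(f"helium age ~ {age_years / 1e6:.0f} million years")  # ~40 million years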

Boltwood focused on the end products of decay series. In 1905, he suggested that lead was the final stable product of the decay of radium. It was already known that radium was an intermediate product of the decay of uranium. Rutherford joined in, outlining a decay process in which radium emitted five alpha particles through various intermediate products to end up with lead, and speculated that the radium-lead decay chain could be used to date rock samples.

Boltwood did the legwork, and by the end of 1905 had provided dates for 26 separate rock samples, ranging from 92 to 570 million years. He did not publish these results, which was fortunate because they were flawed by measurement errors and poor estimates of the half-life of radium. Boltwood refined his work and finally published the results in 1907.

Boltwood's paper pointed out that samples taken from comparable layers of strata had similar lead-to-uranium ratios, and that samples from older layers had a higher proportion of lead, except where there was evidence that lead had leached out of the sample. However, his studies were flawed by the fact that the decay series of thorium was not understood, which led to incorrect results for samples that contained both uranium and thorium. Still, his calculations were far more accurate than any that had been performed to that time. Refinements in the technique would later give ages for Boltwood's 26 samples of 250 million to 1.3 billion years.

Arthur Holmes and the vindication of radioactive dating

Although Boltwood published his paper in a prominent geological journal, for whatever reasons the geological community had little interest in radioactivity. Boltwood gave up work on radioactive dating and went on to investigate other decay series. Rutherford remained mildly curious about the issue of the age of the Earth but did little work on it.

Robert Strutt tinkered with Rutherford's helium method until 1910 and then gave up on it. However, Strutt's student Arthur Holmes became interested in radioactive dating and continued to work on it after everyone else had given up.

Holmes focused on lead dating, because he regarded the helium method as unpromising. He performed measurements on rock samples in 1911 and concluded that the oldest was about 1.6 billion years old. These calculations were not particularly trustworthy. For example, he assumed that the samples had contained only uranium and no lead when they were formed.

More importantly, research published in 1913 showed that elements generally exist in multiple variants with different masses, or "isotopes". (In the 1930s, isotopes would be shown to have nuclei with differing numbers of neutral particles known as "neutrons".) In the same year, other research was published establishing the rules for radioactive decay, allowing decay series to be identified more precisely.

Many geologists felt these new discoveries made radioactive dating so complicated as to be worthless. Holmes felt that they gave him tools to improve his techniques, and he plodded ahead with his research, publishing before and after the First World War.

His work was generally ignored until the 1920s, though in 1917 Joseph Barrell, a professor of geology at Yale, redrew geological history as it was understood at the time to conform to Holmes's findings in radioactive dating. Barrell's research determined that the layers of strata had not all been laid down at the same rate, and so current rates of geological change could not be used to provide accurate timelines of the past history of the Earth.

Holmes's persistence finally began to pay off in 1921, when the speakers at the yearly meeting of the British Association for the Advancement of Science came to a rough consensus that the Earth was a few billion years old, and that radioactive dating was credible.

No great push to embrace radioactive dating followed, however, and the die-hards in the geological community stubbornly resisted. They had never cared for attempts by physicists to intrude in their domain, and had successfully ignored them so far.

The growing weight of evidence finally tilted the balance in 1926, when the National Research Council of the US National Academy of Sciences finally decided to resolve the question of the age of the Earth by appointing a committee to investigate. Holmes, being one of the few people on Earth who was trained in radioactive dating techniques, was a committee member, and in fact wrote most of the final report.

The report concluded that radioactive dating was the only reliable means of pinning down geological time scales. Questions of bias were deflected by the great and exacting detail of the report. It described the care with which measurements were made, and defined not only the methods used, but also their error bars and limitations.

Modern dating techniques

The decay of radioactive materials can take place through "alpha decay", in which a nucleus emits an alpha particle made up of two protons and two neutrons, or through "beta decay", in which a nucleus emits an electron as one of its neutrons converts into a proton (or, in the related process of electron capture, absorbs an electron and converts a proton into a neutron).

Radioactive or "radiometric" dating works by determining the relative concentrations of the "parent" and "daughter" isotopes in a decay process. Radiometric dating schemes are regarded as accurate because radioactive decay is not influenced by any normal physical process. It is neither slowed nor accelerated by heat, pressure, or magnetic and electric fields. It can be accelerated by radioactive bombardment, but such bombardment tends to leave evidence of its occurrence.

Furthermore, the processes that form specific materials are often conveniently selective as to what elements they incorporate during their formation. In the ideal case, the material will incorporate the parent isotope and reject the daughter isotope. In that case, any daughter isotope found in a sample must have been produced by decay since the sample was formed.

However, if a material that selectively rejects the daughter isotope is heated, any daughter isotopes that have accumulated over time will be lost through diffusion, resetting the isotopic "clock" to zero. The temperature at which this happens is known as the "blocking temperature" and is specific to a particular material.

Although radiometric dating is accurate in principle, the accuracy is very dependent on the care with which the procedure is performed. The possible confounding effects of initial contamination of parent and daughter isotopes have to be considered, as do the effects of any loss or gain of such isotopes since the sample was created. Accuracy is enhanced if measurements are taken on different samples taken from the same rock body but at different locations. This permits some compensation for variations.

Of course, the half-life of the parent isotope has to be known, and the measurements of the parent and daughter isotopes have to be accurate as well. Radiometric dating can be performed on samples as small as a billionth of a gram using a mass spectrometer. The mass spectrometer was invented in the 1940s and began to be used in radiometric dating in the 1950s.

The mass spectrometer operates by generating a beam of ionized atoms from the sample under test. The ions then travel through a magnetic field, which diverts them into different sampling sensors, known as "Faraday cups", depending on their mass and level of ionization. On impact in the cups, the ions set up a very weak current that can be measured to determine the rate of impacts and the relative concentrations of different atoms in the beams.
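
The separation of isotopes in the instrument follows from basic magnetism: an ion of mass m and charge q, accelerated through a potential V and bent by a field B, travels on a circle of radius sqrt(2mV/q) / B, so heavier isotopes follow slightly wider arcs and land in different cups. A rough sketch, with an arbitrary accelerating voltage and field strength chosen purely for illustration:

    import math

    E_CHARGE = 1.602e-19  # coulombs per elementary charge
    AMU = 1.661e-27       # kilograms per atomic mass unit

    def deflection_radius(mass_amu, charge_units, accel_volts, b_field_tesla):
        """Radius of an ion's circular path in a uniform magnetic field:
        r = sqrt(2 * m * V / q) / B."""
        m = mass_amu * AMU
        q = charge_units * E_CHARGE
        return math.sqrt(2.0 * m * accel_volts / q) / b_field_tesla

    # Singly charged lead-206 and lead-207 ions, 10 kV and 0.5 T (illustrative values):
    for mass in (206, 207):
        print(mass, "->", round(deflection_radius(mass, 1, 10000.0, 0.5), 4), "m")
    # 206 -> 0.4134 m, 207 -> 0.4144 m: about a millimetre of separation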

The uranium-lead radiometric dating scheme is one of the oldest available, as well as one of the most highly respected. It has been refined to the point that the error in dates of rocks about three billion years old is no more than two million years.

Uranium-lead dating is best performed on the mineral "zircon" (ZrSiO₄), though it can be used on other materials. Zircon incorporates uranium atoms into its crystalline structure as substitutes for zirconium, but strongly rejects lead. It has a very high blocking temperature, and is very chemically inert.

One of its great advantages is that any sample provides two clocks, one based on uranium-235's decay to lead-207 with a half-life of about 700 million years, and one based on uranium-238's decay to lead-206 with a half-life of about 4.5 billion years, providing a built-in crosscheck that allows accurate determination of the age of the sample even if some of the lead has been lost.
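
A sketch of the two-clock idea follows. It is a simplified illustration that assumes no initial lead, no lead loss, and rounded half-lives; the measured ratios are hypothetical numbers chosen so that the two clocks agree.

    import math

    HALF_LIFE_U238 = 4.47e9  # years, U-238 -> Pb-206 (approximate)
    HALF_LIFE_U235 = 7.04e8  # years, U-235 -> Pb-207 (approximate)

    def age_from_ratio(daughter_to_parent, half_life):
        """Age assuming no initial lead and no lead loss."""
        return math.log(1.0 + daughter_to_parent) * half_life / math.log(2)

    # Hypothetical measured ratios in a single zircon grain (illustrative only):
    pb206_per_u238 = 0.59
    pb207_per_u235 = 18.2

    print("U-238 clock:", round(age_from_ratio(pb206_per_u238, HALF_LIFE_U238) / 1e9, 2), "Gyr")
    print("U-235 clock:", round(age_from_ratio(pb207_per_u235, HALF_LIFE_U235) / 1e9, 2), "Gyr")
    # Both clocks give about 3.0 Gyr ("concordant" ages); a mismatch would flag lead loss.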

Two other radiometric techniques are used for long-term dating. Potassium-argon dating involves the decay of potassium-40 to argon-40 by electron capture. Potassium-40 has a half-life of 1.3 billion years, and so this method is applicable to the oldest rocks. Radioactive potassium-40 is common in micas, feldspars, and hornblendes, though the blocking temperature is fairly low in these materials, about 125°C.
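
Because potassium-40 also decays (by beta emission) to calcium-40, only a fraction of its decays produce argon, and the conventional age equation carries a branching factor. The sketch below uses rounded decay constants and a hypothetical measured ratio, and assumes that all radiogenic argon has been retained.

    import math

    LAMBDA_EC = 0.581e-10    # per year, K-40 -> Ar-40 by electron capture (approximate)
    LAMBDA_BETA = 4.962e-10  # per year, K-40 -> Ca-40 by beta decay (approximate)
    LAMBDA_TOTAL = LAMBDA_EC + LAMBDA_BETA  # corresponds to a half-life of ~1.25 billion years

    def k_ar_age(ar40_per_k40):
        """Conventional K-Ar age, assuming no argon loss or excess argon."""
        return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_per_k40) / LAMBDA_TOTAL

    # Hypothetical measured ratio of radiogenic Ar-40 to remaining K-40:
    print(round(k_ar_age(0.08) / 1e6), "million years")  # ~1,000 million years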

Rubidium-strontium dating is based on the beta decay of rubidium-87 to strontium-87, with a half-life of 50 billion years. This scheme is used to date old igneous and metamorphic rocks, and has also been used to date lunar samples. Blocking temperatures are so high that they are not a concern. Rubidium-strontium dating is not as precise as the uranium-lead method, with errors of 30 to 50 million years for a 3-billion-year-old sample.
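
In practice, rubidium-strontium ages are usually obtained with the isochron method: several minerals from the same rock are plotted as strontium-87/strontium-86 against rubidium-87/strontium-86, and the slope of the best-fit line equals e^(λt) − 1, which removes the need to assume how much strontium-87 the rock started with. A sketch with fabricated, purely illustrative data points:

    import math

    LAMBDA_RB87 = 1.42e-11  # per year, decay constant of Rb-87 (approximate)

    # Hypothetical mineral analyses from one rock: (Rb-87/Sr-86, Sr-87/Sr-86) pairs.
    points = [(0.5, 0.7218), (1.0, 0.7435), (2.0, 0.7870), (5.0, 0.9176)]

    # Ordinary least-squares slope of Sr-87/Sr-86 versus Rb-87/Sr-86 (the isochron).
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in points) / \
            sum((x - mean_x) ** 2 for x, _ in points)

    age = math.log(1.0 + slope) / LAMBDA_RB87
    print(round(age / 1e9, 2), "billion years")  # ~3.0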

Footnote -- Short-range dating techniques

There are a number of other dating techniques that have short ranges and so are used for historical or archaeological studies. One of the best-known is the carbon-14 (C14) radiometric technique.

Carbon-14 is a radioactive isotope of carbon, with a very short half-life of 5,730 years. In other radiometric dating methods, the heavy parent isotopes were synthesized in the explosions of massive stars that scattered materials through the Galaxy, to be formed into planets and other stars. The parent isotopes have been decaying since that time, and so any parent isotope with a short half-life should be extinct by now.

Carbon-14 is an exception. It is continuously created through collisions of neutrons generated by cosmic rays with nitrogen in the upper atmosphere. The carbon-14 ends up as a trace component in atmospheric carbon dioxide (CO2).

An organism acquires carbon from carbon dioxide during its lifetime. Plants acquire it through photosynthesis, and animals acquire it from consumption of plants and other animals. When the organism dies, it stops taking in new carbon and its carbon-14 decays away; the proportion of carbon-14 left when the remains of the organism are examined provides an indication of the date of its death. Carbon-14 radiometric dating has a range of about 50,000 years.
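
The arithmetic is the same exponential-decay relation used in the other methods: a sample retaining a fraction f of the atmospheric carbon-14 level has an age of half-life × ln(1/f) / ln 2. The sketch below is the bare physics; it ignores the calibration corrections (for the variations described below) that are applied in practice.

    import math

    C14_HALF_LIFE = 5730.0  # years

    def radiocarbon_age(fraction_remaining):
        """Simplified age from the surviving fraction of carbon-14.
        Real laboratories apply calibration curves; this is uncalibrated."""
        return C14_HALF_LIFE * math.log(1.0 / fraction_remaining) / math.log(2)

    # A sample with 25% of the modern carbon-14 level is two half-lives old:
    print(round(radiocarbon_age(0.25)), "years")  # 11460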

The rate of creation of carbon-14 appears to be roughly constant, as cross-checks of carbon-14 dating with other dating methods show it gives consistent results. However, local eruptions of volcanoes or other events that give off large amounts of carbon dioxide can reduce local concentrations of carbon-14 and give inaccurate dates. The releases of carbon dioxide into the biosphere as a consequence of industrialization have also depressed the proportion of carbon-14 by a few percent; conversely, the amount of carbon-14 was increased by above-ground nuclear bomb tests that were conducted into the early 1960s.

Another relatively short-range dating technique is based on the decay of uranium-238 into thorium-230, a process with a half-life of 80,000 years. It is accompanied by a sister process, in which uranium-235 decays into protactinium-231, which has a half-life of 34,300 years.

While uranium is water-soluble, thorium and protactinium are not, and so they are selectively precipitated into ocean-floor sediments, from which their ratios are measured. The scheme has a range of several hundred thousand years.

Archaeologists use tree-ring dating (dendrochronology) to determine the age of old pieces of wood. Trees grow rings on a yearly basis, with the spacing of rings being wider in good growth years than in bad growth years. These spacings can be used to help pin down the age of old wood samples, and also give some hints to climate change. The technique is only useful to about 4,000 years in the past, however, because it requires overlapping tree ring series.

Although determining geologic time by measuring the rate of deposition of sediments is not reliable over the large scale, it is still useful for certain scenarios, such as the deposition of layers of sediment on the bottom of a stable lake. The approach is now known as "varve analysis" (a "varve" is an annual layer of sediment).

Another technique used by archaeologists is to inspect the depth of penetration of water vapor into chipped obsidian (volcanic glass) artifacts. The water vapor creates a "hydration rind" in the obsidian, and so this approach is known as "hydration dating" or "obsidian dating", and is useful for determining dates as far back as 200,000 years.

Natural sources of radiation in the environment knock loose electrons in, say, a piece of pottery, and these electrons accumulate in defects in the material's crystal lattice structure. When the sample is heated, at a certain temperature it will glow from the emission of electrons released from the defects, and this glow can be used to estimate the age of the sample to a threshold of a few hundred thousand years. The technique is known as "thermoluminescence dating".

Finally, "fission track dating" involves inspection of a polished slice of a material to determine the density of "track" markings left in it by radioactive decay of uranium-238 impurities.

The uranium content of the sample has to be known as well. It can be determined by placing a thin plastic film over the polished slice of the material and then bombarding it with slow neutrons. This causes induced fission of U235 (as opposed to the spontaneous fission of U238), and the resulting fission tracks are recorded in the plastic film. The uranium content of the material can then be calculated, so long as the neutron dose is known.

This scheme has a maximum range of about a million years and works best with micas, tektites (natural glass formed in meteorite impacts), and meteorites. However, the dates may be inaccurate if the sample was heated to high temperatures in the past, as blocking temperatures are generally low, or if the sample was exposed on the surface of the Earth, where it was bombarded with cosmic rays.


The initial version of this article was based on a public domain text by Greg Goebel, http://www.vectorsite.net/.


