Thursday 20 February 2020

What is radiometric dating?

The discovery that natural radiant energy was a much more complex phenomenon than previously thought, with various sources, dates to the last decade of the nineteenth century. In 1895 Röntgen discovered a new kind of radiation, which he called X-rays, a discovery that, according to J. J. Thomson, head of the Cavendish Laboratory in Cambridge, reporting to professional colleagues the following year, ‘aroused an amount of interest unprecedented in the history of physical science’.
We now know that it was indeed a remarkable breakthrough, one that was to bring great benefits as well as dangers that were not initially appreciated, dangers that were also to arise with the other newly discovered forms of radiation. In 1896 the brilliant French physicist Henri Becquerel discovered that crystals of a uranium salt, accidentally placed on top of a wrapped and unexposed glass photographic plate, caused the plate to blacken as if it had been exposed to light. Becquerel realized that the crystals were spontaneously emitting some unknown type of energy, similar to but distinct from X-rays.
The radiation was due solely to the uranium and, unlike light energy, could not be reflected. Becquerel also discovered that a sample of radium he later carried around in his pocket burned his skin. But it was the young Polish scientist Marie Curie and her French husband Pierre who made a systematic study of these strange ‘Becquerel rays’. Marie Curie made the all-important discovery that the radiant energy emitted by the uranium salt was an inherent property of the element uranium, and together the couple named the new phenomenon ‘radioactivity’. In addition, Marie Curie found that the element thorium also emitted similar radiation.
When the Curies examined two naturally occurring uranium ores, pitchblende and chalcolite, they discovered that the radiation emitted was more intense than could be accounted for by the uranium or thorium content of the ores, indicating the presence of other radioactive elements. Following the laborious separation processes of fractional crystallization, they managed to isolate two new elements: polonium and the much more radioactive radium. However, it was a brilliant New Zealander, Ernest Rutherford, working with the British chemist Frederick Soddy, who made the breakthroughs that were to lead to the development of radiometric dating.
From experiments on thorium compounds in 1902, Rutherford and Soddy discovered that the activity of a substance is directly proportional to the number of atoms present. From this observation, they formulated a general theory that predicted the rates of radioactive decay and went on to suggest that the gaseous element helium might be a ‘decay’ product of a radioactive element.
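In modern notation, which Rutherford and Soddy themselves did not yet use, their observation is the first-order decay law; the decay constant and half-life shown here are the standard textbook forms rather than quantities from their 1902 papers:

\[
A = \lambda N, \qquad \frac{dN}{dt} = -\lambda N \;\Rightarrow\; N(t) = N_0\,e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda},
\]

where \(N\) is the number of radioactive atoms present, \(A\) the activity, and \(\lambda\) the decay constant characteristic of the element.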
At that time, it was not known how many elements were radioactive, nor what their decay products might be, since radioisotopes had yet to be discovered and there was no instrument available that could measure them. Nevertheless, Rutherford’s brilliant insights allowed him to suggest that radioactivity might be used as a ‘clock’ to date the formation of some naturally occurring minerals, and therefore the rocks that contain them. Rutherford had enormous respect for Kelvin and, when addressing a meeting that the great man was attending, referred to Kelvin’s 1862 claim that the Sun could not keep shining unless ‘the great storehouse of creation’ contained some unknown source of energy.
Rutherford and others had now discovered that hidden source: the energy emitted by radioactive elements as they decay within the rocks of the Earth, which is enough to counteract and significantly slow the rate of cooling. He tried to placate Kelvin by portraying the old man’s prescient disclaimer as the hallmark of a very great scientist: ‘that prophetic utterance refers to what we are now considering tonight, radium!’ But Kelvin never really accepted the role that radioactive elements played in the creation of the Solar System, elements forged by a process we now understand as stellar nucleosynthesis.
In 1905, Rutherford wrote that ‘if the rate of the production of helium by radium (or other radioactive substance) is known, the age of the mineral can at once be estimated from the observed volume of helium stored in the mineral and the amount of radium present’. On this basis, he determined the very first radiometric date: a uranium–helium age of around 497 million years for a fergusonite mineral, and one of 500 million years for a uraninite mineral from Glastonbury, Connecticut.
But Rutherford wisely cautioned that these were minimum ages, because some of the helium gas would undoubtedly have escaped during the processing of the materials. He suggested that calculations based on lead might be superior: if the production of lead from radium were well established, the percentage of lead in radioactive minerals should be a far more accurate method of deducing the age of a mineral than the calculation based on the volume of helium, for the lead formed in a compact mineral has no possibility of escape.
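In modern form, the reasoning behind both the helium and the lead methods reduces to the standard parent–daughter age equation (the notation is today’s textbook convention, not Rutherford’s):

\[
t = \frac{1}{\lambda}\,\ln\!\left(1 + \frac{D}{P}\right),
\]

where \(P\) is the number of parent atoms remaining, \(D\) the number of daughter atoms accumulated, and \(\lambda\) the parent’s decay constant. The formula assumes no daughter atoms at formation and, as Rutherford stressed for helium, no loss of the daughter from the mineral.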
In the same year, an American radiochemist, Bertram Boltwood, went on to provide the first reasonably accurate means of dating the formation of certain minerals within the Earth. Boltwood studied at Yale then in Germany and, on returning to America, worked to improve the analytical techniques of radiochemistry pioneered by his friend Rutherford, who at this time was at McGill University in Montreal. Boltwood made a systematic analysis of radioactive uranium-bearing rocks and noticed that generally both helium and lead were present, with the lead being the stable product of the decay chain from uranium.
Boltwood went on to develop a technique that allowed him to measure decay rates and, with some chemical apparatus, to analyse the remaining lead and uranium concentrations and their ratio. Initially, he tried out his new method on ten uranium minerals from rocks whose relative geological ages were roughly known, publishing the results in 1907.
These samples ranged in age from 2,200 million years for a thorianite (thorium and uranium oxide) from Ceylon (now known as Sri Lanka) to 410 million years for a uraninite (uranium oxide) from Glastonbury, Connecticut. The oldest date increased the accepted age of the Earth by an order of magnitude. Although by modern standards these results were not very accurate (the age of the Glastonbury uraninite, for instance, has since been recalculated to 265 Ma), Boltwood’s technical developments were of enormous importance.
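For readers who want to see the arithmetic behind such a date, here is a minimal sketch in Python using the modern value of the uranium-238 decay constant, a figure Boltwood did not have; the function name and the example ratio are illustrative only, not Boltwood’s actual data:

```python
import math

# Modern decay constant of 238U in 1/year (half-life ~4.468 billion years)
LAMBDA_U238 = 1.55125e-10

def lead_uranium_age(pb_u_ratio: float) -> float:
    """Age in years from a radiogenic 206Pb/238U atomic ratio,
    assuming a closed system and no lead present at formation."""
    return math.log(1.0 + pb_u_ratio) / LAMBDA_U238

# Example: a measured ratio of about 0.069 gives roughly 430 million years
print(f"{lead_uranium_age(0.069) / 1e6:.0f} Ma")
```

Boltwood worked with a bulk chemical lead/uranium ratio and an imprecise decay rate rather than isotopic measurements, which is one reason his dates have since been substantially recalculated.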
Now the physicists really had to take notice and admit that Kelvin was way off the mark. It began to seem that the geologists had been right all along to argue that the Earth must be much, much older than 20 million years or so. By 1910, a British geologist, Arthur Holmes, was pursuing a similar approach and embarked on a lifetime’s quest ‘to graduate the geological column with an ever-increasingly accurate time scale’.
He calculated the age of a Norwegian rock, which contained several radioactive minerals, at about 370 million years. As the rock was known to have originated within the Devonian geological system, he thus provided the first date for that geological system and period. In retrospect, this was the most accurate of the early radiometric dates and, if Holmes had had the resources to continue his work, radiometric dating would have progressed much faster than it did. Holmes also recalculated some of Boltwood’s published data and arranged them to produce the first geological timescale, one he was to improve continuously for the rest of his professional life.
