Radiocarbon dating determines the age of ancient objects by measuring the amount of carbon-14 remaining in a sample.
Willard F. Libby pioneered the technique at the University of Chicago in the late 1940s. It is now the most widely used method of age estimation in archaeology.
Another commonly used radiometric dating technique relies on the decay of potassium-40 into argon-40: the amount of argon trapped in an igneous rock tells us how much time has passed since the rock crystallized.
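The potassium-argon idea above can be sketched numerically. This is an illustrative calculation, not the article's own method; the function name is mine, and the decay constants are the conventional values widely used in geochronology:

```python
import math

# Conventional decay constants for potassium-40 (per year).
LAMBDA_TOTAL = 5.543e-10  # total decay constant of K-40
LAMBDA_AR = 0.581e-10     # partial constant for the K-40 -> Ar-40 branch

def k_ar_age(ar40_over_k40):
    """Age in years from the measured ratio of radiogenic Ar-40 to K-40.

    Only a fraction of K-40 decays produce Ar-40, hence the
    (LAMBDA_TOTAL / LAMBDA_AR) factor inside the logarithm.
    """
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_AR) * ar40_over_k40
    )
```

A rock with no trapped argon dates to zero; the more argon has accumulated relative to potassium, the older the crystallization age.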
Libby produced the first radiocarbon dates in 1949 and was later awarded the Nobel Prize in Chemistry for his efforts.
Radiocarbon dating works by comparing the three naturally occurring isotopes of carbon: the stable isotopes carbon-12 and carbon-13, and the radioactive isotope carbon-14.
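The comparison of isotopes boils down to a simple decay equation. As a minimal sketch (the function name is mine, and it uses the modern half-life of about 5,730 years rather than the conventional Libby half-life), the age follows from the ratio of carbon-14 to stable carbon in the sample relative to the atmospheric ratio:

```python
import math

HALF_LIFE_C14 = 5730.0  # half-life of carbon-14, in years

def radiocarbon_age(ratio_sample_to_modern):
    """Age in years from the sample's C-14 / stable-carbon ratio,
    expressed relative to the modern atmospheric ratio."""
    return -(HALF_LIFE_C14 / math.log(2)) * math.log(ratio_sample_to_modern)
```

A sample retaining half the atmospheric proportion of carbon-14 dates to one half-life, about 5,730 years; a quarter dates to two half-lives, and so on.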
These various chronologies and their inherent inconsistencies, known as ‘relative dates’, have been a constant hurdle for historians and archaeologists seeking to record humankind’s existence on Earth.
In the 1940s, however, the organization of time was transformed by the advent of radiometric dating and the subsequent creation of a scientific chronology of humankind, known as ‘absolute dating’.
But other timekeeping methods exist and are still used in the modern world, complicating the easy comparison of dates and histories between cultures.
Throughout history, time has been defined in a variety of ways: by the reign of a ruler, by the founding of an empire, or not formally defined at all.
Radioactive elements were incorporated into the Earth when the Solar System formed.
All rocks and minerals contain tiny amounts of these radioactive elements.
Common materials for radiocarbon dating include wood, charcoal, bone, and shell. The radiocarbon formed in the upper atmosphere is mostly in the form of carbon dioxide, which plants absorb during photosynthesis. Because the carbon present in a plant comes from the atmosphere in this way, the ratio of radiocarbon to stable carbon in the plant is virtually the same as that in the atmosphere.
Plant-eating animals (herbivores and omnivores) get their carbon by eating plants.
As explained below, the radiocarbon date tells us when the organism died and stopped exchanging carbon with its environment (not when the material was used).
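This is why death starts the clock: once carbon exchange stops, the carbon-14 already in the tissues simply decays away. A small sketch of that decay (the function name and example ages are my own):

```python
def fraction_remaining(years_since_death, half_life=5730.0):
    """Fraction of the original carbon-14 left after a given time.

    Each half-life elapsed halves the amount of carbon-14 remaining.
    """
    return 0.5 ** (years_since_death / half_life)
```

After one half-life (about 5,730 years) half the original carbon-14 remains; after two half-lives, a quarter. Beyond roughly 50,000 years so little is left that radiocarbon dating becomes impractical.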