For better or worse, we humans have been measuring things for a long time. In Genesis Chapter 6, God provides Noah with detailed plans for building a very large wooden boat – 300 x 50 x 30 cubits, to be exact. Noah presumably knew how to measure a cubit, the distance from the elbow to the tip of the middle finger. Some 4,500 years ago, the Egyptians built their pyramids using the cubit as their standard measure. Long before there was an accurate way to measure time, Galileo (1564-1642) used musicians to supply a steady beat and help determine the acceleration due to gravity. Technology has come a long way since, and we now have ridiculously accurate atomic clocks, along with lasers for precise length measurement.
By the Middle Ages, trade had expanded, and a need for recognized standards arose. In the late 18th century, the French Academy of Sciences decided that the standard for length should be based on the distance from the North Pole to the Equator along the meridian passing (of course) through Paris. One ten-millionth of this distance, which would be surveyed by a pair of French mathematician-astronomers, was christened the “meter.” While less dependent on human anatomy than the cubit, it posed its own difficulties of accuracy and replication. Thanks to modern science, we now know that there were errors in those original calculations, and the meter they produced is about 0.2 mm shorter than one ten-millionth of the true pole-to-equator distance. In testimony to the somewhat arbitrary nature of “standards”, that error has never been corrected. The meridian through Paris, after all, hasn’t changed.
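For the curious, here is a rough back-of-the-envelope sketch of where that 0.2 mm figure comes from, assuming a modern value of about 10,001.966 km for the pole-to-equator arc (the exact number depends on the reference ellipsoid used):

```python
# Back-of-the-envelope check of the ~0.2 mm discrepancy.
# The arc length below (pole to equator, ~10,001.966 km) is an assumed
# modern reference value, not the original 18th-century survey result.

QUADRANT_M = 10_001_966.0           # approximate pole-to-equator distance, in meters
intended_meter = QUADRANT_M / 1e7   # "one ten-millionth" of that distance

shortfall_mm = (intended_meter - 1.0) * 1000
print(f"One ten-millionth of the arc: {intended_meter:.6f} m")
print(f"The original meter falls short by about {shortfall_mm:.2f} mm")
```

Running this gives a shortfall of roughly 0.20 mm, which is the discrepancy mentioned above.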
Today, in the United States, the maintenance of standards is the responsibility of the National Institute of Standards and Technology (NIST). You would expect them to have insanely accurate standards for length, weight, and time, and you would not be disappointed. As an example, the NIST strontium atomic clock is accurate to roughly one second in 15 billion years: it would not have gained or lost even a second had it been running since the dawn of the Universe.
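To put that in perspective, a quick calculation (taking the quoted 15-billion-year figure at face value) converts one second over the age of the Universe into a fractional accuracy:

```python
# Convert "one second in ~15 billion years" into a fractional accuracy.
# The 15-billion-year figure is the one quoted above; the seconds-per-year
# value is approximate.

SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 seconds
years = 15e9                            # roughly the age of the Universe
fractional_error = 1.0 / (years * SECONDS_PER_YEAR)
print(f"Fractional accuracy: about {fractional_error:.0e}")  # ~2e-18
```

That works out to a few parts in a billion billion.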