
Kees van Deemter



Vague Measurements

The genesis of metrology

Published: Sunday, March 28, 2010 - 17:34

Consider the most fundamental of measurements: the measurement of physical distances, as when we use feet and yards, or meters. Once upon a time, a foot must have been thought of as the size of, well, a human foot, without worrying whose foot exactly. These days, we are no longer satisfied with this level of imprecision, of course, and concepts such as the foot and the meter have long been standardized—at least to an extent. Since meters are the international currency in most areas of science and engineering, let us briefly look at the history of the meter.

How do you define a meter or a yard in an objective way, so that anyone can check whether their measurement instruments are accurate? It doesn’t matter what we decide, say, a meter to be, as long as everyone understands the concept in the same way. The natural idea is to relate the unit in question to some known and invariable thing that is accessible to anyone: a bedrock, as we shall say. An early and interesting example is Huygens’s idea, way back in 1673, to base measurements on the pendulum clock: the length of a pendulum that swings once per second. Others chose the distance travelled by a falling object in one second as their bedrock; both proposals, however, suffered from the fact that gravity varies across the surface of the Earth, because the Earth is not perfectly round. Accordingly, none of them managed to gain general acceptance.

As accurate measurement became more and more important, the French government declared, in 1790, the ideal of establishing a system of measurements suitable “pour tous les temps, pour tous les peuples” (“for all times, for all peoples”). To arrive at such a system, the French government initiated a research project whose ultimate bedrock was mother Earth itself: a meter was equated to 1/40,000,000 of the length of the meridian circle that runs from the North to the South Pole, passing exactly through the Panthéon, a building in Paris that had started its career as a church but had graduated to a “temple of reason” during the French Revolution.
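The meridian definition can be sanity-checked with simple arithmetic. The sketch below assumes a modern geodetic figure for the Earth’s meridional circumference (about 40,007.86 km), which is not given in the text:

```python
# Sanity check of the 1790s definition: 1 meter = 1/40,000,000 of the
# full meridian circle through Paris.
# The circumference value below is an assumption taken from modern
# geodesy, not from the original survey.
meridian_circumference_m = 40_007_863.0

historical_meter = meridian_circumference_m / 40_000_000

# The surveyors' meter comes out within roughly 0.02% of today's meter,
# reflecting the limits of 18th-century geodesy.
print(f"1 'meridian meter' ≈ {historical_meter:.5f} modern meters")
```

The small discrepancy is exactly the kind of residual imprecision the rest of the article is concerned with: the bedrock (the Earth) turned out to be measured slightly wrong.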

Naturally, it took scientists a while before they figured out the implications of the Parisian definition. In the end, they produced a bar of platinum that they argued to have just the right size. From this moment onwards, this bar became their de facto bedrock.

It was of the essence, of course, that the size of the bar was kept as constant as possible. For this reason, it was to be kept always at the same temperature, at 0°C, the melting point of ice. Platinum was chosen so as to minimize corrosion and any remaining fluctuations caused by temperature. Unimaginable care was taken to give the bar an optimal shape, mainly to make sure that it bent as little as possible between its two supports. A lot of work was done in subsequent decades to make this standard meter bar as stable as possible, and to allow it to be copied for the benefit of people unfortunate enough to live far away from Paris. But tests in the 20th century indicated a lack of precision, in measurements based on the Parisian meter, of about 0.00005 mm (0.05 μm).

A measurement error of 0.00005 mm may sound like a trifle, but in microscopy, for example, it isn’t. Even for the construction of a humble watch, one needs machinery that produces tiny nuts and bolts whose dimensions match their specification very closely, or else the two won’t fit together. The Parisian approach had been highly successful (having been in use for about a century), but new technologies demanded greater precision, and it was difficult to see how the error of 0.00005 mm could be reduced much further. All this meant that researchers kept looking for a bedrock more stable and clearly defined than the Earth or any object on it.

One improvement, proposed around 1900, was made by choosing the wavelength of a particular kind of light (multiplied by some suitably large number, of course, to arrive at a distance close to the Parisian meter) as the new bedrock. Not only did this lead to a considerable reduction in measurement errors, it was more elegant as well. In comparison with the properties of light, the size of the Earth is a mere accident! Perhaps most important of all, the new definition had practical advantages. In the old situation, in which a platinum bar in Paris called the shots, a producer of measurement tools living in Boston, say, would have had to travel to Paris, unless he was confident enough to base himself on American equivalents. In the new situation, where light is the bedrock, it doesn’t matter whether you are in Paris or Boston: The distance is always the same. The basic idea of using wavelengths as the bedrock of measurement was refined in subsequent years, for example by using wavelengths emitted by cadmium (around 1927) and krypton-86 (around 1960). Interestingly, some recent definitions combine the idea of using light as the bedrock with Huygens’s old idea of using time: the 17th General Conference on Weights and Measures, in 1983, introduced a new definition whereby a meter was equated with the distance traveled by light through a vacuum in 1/299,792,458 of a second.
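The 1983 definition has a neat arithmetic consequence: because the speed of light is fixed exactly, the definition is self-consistent by construction. A minimal check, using exact rational arithmetic to avoid floating-point rounding:

```python
from fractions import Fraction

# Under the 1983 definition, the speed of light is fixed at exactly
# 299,792,458 m/s, so the distance light travels in 1/299,792,458 s
# is exactly one meter -- by definition, not by measurement.
c = Fraction(299_792_458)      # meters per second, exact
t = Fraction(1, 299_792_458)   # seconds, exact

meter = c * t                  # distance = speed * time
print(meter)                   # 1
```

Note the inversion: the meter no longer calibrates the speed of light; instead, the (fixed) speed of light now defines the meter, so any residual imprecision lives in the realization of the second.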

It is safe to assume that the search for stabler and stabler definitions will go on, and that new definitions will make increasingly sophisticated use of physics. Some sobering conclusions are worth drawing. First, some of the basic notions underlying our everyday activities (as well as science) are subject to change and reinterpretation: A meter is a different beast from what it was in the year 1700, 1800, or 1900. Second, these notions are defined in increasingly sophisticated ways, which require the work of skilled physicists, leaving the rest of us with mere approximations.

But, crucially, even the most sophisticated definitions have some residual imprecision, and it is unclear how this imprecision could ever be eliminated completely.

Excerpt reprinted from Not Exactly: In Praise of Vagueness by Kees van Deemter (Oxford University Press, 2010). This excerpt originally appeared on the OUPblog.


About The Author


Kees van Deemter

Kees van Deemter is a reader in computing science at the University of Aberdeen. His book, Not Exactly: In Praise of Vagueness, looks at how vagueness dominates the way we speak and think – the “tall” woman, the “obese” man. Using mathematical logic, philosophy, linguistics, and artificial intelligence, van Deemter weaves an intriguing account of the nature and importance of vagueness in our lives, and the efforts of scientists to capture and represent it.