Australian Government: National Measurement Institute

Significant Achievements

Redefinition of the Metre

In the period 1960–1983 the metre was redefined twice, as new physical techniques became available. NMI made significant contributions to each of these changes.

The availability of discharge lamps containing single isotopes of various elements led to a proposal to replace the International Prototype Metre (a bar of platinum-iridium alloy) with the wavelength of a specified spectrum line. The candidate lines were a red line of cadmium-114, a green line of mercury-198, and an orange line of krypton-86. Colin Bruce, with Bert Hill, made detailed studies of the stability and reproducibility of lamps based on these lines. Bruce was a member of the international committee that drew up the formal redefinition that was adopted in 1960. The metre was defined as a multiple of the wavelength (606 nm) of the orange spectrum line of krypton-86. Bruce and Philip Ciddor developed a new interferometer that they used to measure line-standard scales in terms of the new definition. Their work established that there had been only a marginally significant change in the metre as a result of the change of definition.

In the 1950s, Bruce and Barry Thornton developed the first general formulas for one of the major corrections needed in the calibration of length standards by interferometry, and these were adopted internationally. Experimental and theoretical work by Bruce and Ciddor resolved a long-standing problem in correcting for the wavelength-dependent phase change at optical surfaces; this correction was important in the determination of the wavelength of the new krypton standard.

Shortly after the new definition of the metre was adopted, lasers with highly reproducible wavelengths were developed, and Bruce made accurate comparisons of these new wavelength standards with the krypton standard. For several years it seemed that one or other of the new lasers would be used to replace the krypton lamp in a new definition of the metre, and John Magyar, Nick Brown and Philip Ciddor developed a range of frequency-stabilised lasers and made detailed studies of several of the candidate lines.

However, yet another new technology appeared that made it possible to measure the very high frequency of light waves. This opened up the exciting prospect of measuring both the frequency of a laser line (in terms of the atomically defined second) and its wavelength (in terms of the krypton wavelength). This would provide a measurement of the speed of light, a fundamental physical quantity. From this, the metre could be redefined as the distance travelled by light in a specified fraction of a second, thereby tying it to the SI unit of time. NMI did not have the enormous resources required for the frequency measurements, but studied intensively the frequency differences between various components of the absorption lines in iodine that were proposed as reference frequencies. Ciddor participated in the meeting of international specialists that drew up the formal definition of the metre in terms of the atomic second and an agreed value of the speed of light in a vacuum.
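Because the 1983 definition fixes the speed of light by convention, a laser whose frequency has been measured against the atomic second yields its vacuum wavelength directly. A minimal sketch of that arithmetic (the frequency below is a rounded figure for an iodine-stabilised helium-neon laser near 633 nm, quoted for illustration only):

```python
# Under the 1983 SI definition the speed of light is exact by convention.
C = 299_792_458.0  # m/s, exact

def vacuum_wavelength(frequency_hz: float) -> float:
    """Vacuum wavelength of a laser whose frequency is known in SI hertz."""
    return C / frequency_hz

# Iodine-stabilised He-Ne laser near 633 nm (rounded, illustrative frequency).
f_hene = 473.612e12  # Hz
lam = vacuum_wavelength(f_hene)
print(f"{lam * 1e9:.3f} nm")  # about 632.992 nm
```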

Although the definition of the metre is now unlikely to change, its practical implementation through agreed frequencies for various lasers led to the development by Nick Brown, Barry Ward, Rod Duffy, Chris Walsh and Esa Jaatinen of increasingly precise and versatile stabilisation techniques for lasers in both the visible and infrared regions of the spectrum. These lasers are used in-house for the calibration of length standards, in various other standards facilities (e.g. capacitance, geodetic instrumentation, pressure) and also in the project to develop an atomic definition of the kilogram. They have also been used in astronomical instruments such as the Sydney University Stellar Interferometer. Since most practical measurements are made in ordinary atmospheric conditions, it is necessary to determine the speed of light in those conditions. In a series of papers, Ciddor completed a major revision of the equations used to calculate the speed of light in different atmospheric conditions; his new equations have been adopted internationally for use when the highest accuracy is required. These applications are as diverse as the calibration of length standards, precision geodetic surveying and satellite laser ranging.
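The size of the correction for air can be sketched with a simple Edlén-type dispersion formula for standard dry air (15 °C, 101.325 kPa); this is only a one-condition approximation, not Ciddor's full equations, which additionally handle temperature, pressure, humidity and carbon dioxide content:

```python
def n_standard_air(wavelength_um: float) -> float:
    """Refractive index of standard dry air (15 degC, 101.325 kPa) from an
    Edlen-type dispersion formula. Illustrative only: Ciddor's equations,
    adopted for the highest accuracy, also correct for temperature,
    pressure, humidity and CO2."""
    s2 = (1.0 / wavelength_um) ** 2  # squared wavenumber in um^-2
    n_minus_1 = (8342.54
                 + 2406147.0 / (130.0 - s2)
                 + 15998.0 / (38.9 - s2)) * 1e-8
    return 1.0 + n_minus_1

# A 633 nm He-Ne beam is slowed, and its wavelength shortened, by ~0.028%.
n = n_standard_air(0.633)
print(f"n = {n:.8f}")  # about 1.00027653
```

An error of even a few parts in 10⁸ in this index translates directly into the same fractional error in any interferometric length measured in air, which is why the equations matter for surveying and satellite ranging.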

The Calculable Capacitor

For about twenty years from 1968 the standard of resistance, the ohm, was maintained solely by NMI on behalf of all international standards laboratories. The development of the required instrumentation was a striking example of the benefits of combining NMI's expertise in several fields: electrical theory and measurements, mechanical engineering and optical metrology. It also involved a very surprising theoretical discovery.

The accepted method of establishing a standard of impedance or realising the ohm started with the calculated inductance of a coil; this required the measurement of many dimensions of the coil. In the 1950s, Keith Clothier was pursuing an alternative scheme that started with a parallel-plate capacitor whose value could be calculated from a single dimension. Mel Thompson was investigating other designs based on the cross-capacitances between the sides of a cylinder of known length.

The theory of electrostatics was one of the triumphs of 18th- and 19th-century physics, and was generally considered complete. It was therefore a considerable surprise when Thompson and Douglas Lampard announced a new theorem. They had considered a series of designs based on a cluster of parallel metal bars and found, to their surprise, that the particular shape of the structure had only a very small effect on the capacitance for a given length. They were able to establish a precise theory of this result, now known as the Thompson-Lampard Theorem. This approach was adopted instead of Clothier's original design. Careful experimentation on test models by Clothier and Hugh Bairnsfather led to the establishment of design tolerances that would produce a capacitance of small, but useful, size (about 1/6 pF). The precise value would be controlled by moving a single electrode through a distance that was measured by an optical interferometer. The light source for the interferometer was originally a mercury-198 discharge lamp, and later a stabilised helium-neon laser.
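The theorem states that for a symmetric cross-capacitor the capacitance per unit length is ε₀ ln 2 / π, independent of the cross-sectional shape. A short sketch of the numbers (the electrode-travel figure is a rough deduction from the 1/6 pF quoted above, not the actual NMI geometry):

```python
import math

EPS0 = 8.8541878128e-12  # F/m, vacuum permittivity

# Thompson-Lampard theorem: the mean cross-capacitance per unit length of a
# symmetric cross-capacitor is eps0 * ln(2) / pi, whatever the exact shape.
c_per_m = EPS0 * math.log(2) / math.pi   # F/m
print(f"{c_per_m * 1e12:.5f} pF per metre")  # about 1.95354 pF/m

# Electrode travel corresponding to a change of ~1/6 pF
# (a rough figure for illustration, not the NMI design value):
travel = (1.0 / 6.0) * 1e-12 / c_per_m
print(f"travel ~ {travel * 1e3:.1f} mm")  # about 85 mm
```

Because the capacitance depends only on a single length, measuring that electrode travel interferometrically ties the unit of capacitance to the metre.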

A complex series of comparisons, with successively larger transfer standard capacitors (developed by Malcolm MacGregor), yielded the impedance of a 5000 pF capacitor, which was compared to the ac resistance of a 10 kΩ resistor. Dennis Gibbings transferred this to a dc value by theoretical calculations from the geometry of the resistor and extrapolation of its resistance-frequency curve to zero frequency. Bob Richardson then compared the dc resistance to a 1 Ω resistor, which constituted the laboratory's working standard.
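A capacitor and a resistor are most naturally compared near the frequency at which their impedance magnitudes are equal, i.e. where ωRC = 1. The sketch below shows only that operating-point arithmetic for the values quoted above; the bridge design and actual working frequency of the NMI chain are not described here:

```python
import math

# Comparing a 5000 pF capacitance with a 10 kilo-ohm ac resistance:
# the impedance magnitudes |1/(omega*C)| and R are equal when omega*R*C = 1.
# (Operating point shown for illustration; not the documented NMI frequency.)
R = 10e3       # ohm
C = 5000e-12   # farad
omega = 1.0 / (R * C)        # rad/s at balance
f = omega / (2.0 * math.pi)  # hertz
print(f"balance frequency ~ {f:.0f} Hz")  # about 3183 Hz
```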

This last comparison relied on the invention by Bruce Hamon of a 'build-up resistor', which allowed a single-step comparison between 4-terminal resistors differing in value by a factor of 100. Initially two stages were involved: from 10 kΩ to 100 Ω, and from 100 Ω to 1 Ω. Later a single build-up resistor with 100 resistors of 100 Ω provided a direct transfer over the ratio of 10⁴ to 1. Hamon's invention, which had the additional desirable feature that the dissipation in each resistor was the same in series and parallel connections, has been widely adopted in precise electrical measurements. Its success was dependent on extraordinarily skillful construction work and electronic developments by those mentioned above, and many others. MacGregor subsequently assisted the US National Bureau of Standards in building a similar calculable capacitor, and other laboratories adopted similar designs. However, NMI's implementation, with its associated electrical measurement system, was judged to be the best suited for use as an international working standard.
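The series-parallel principle behind the build-up resistor is easily sketched: n equal resistors give n·r in series and r/n in parallel, an exact ratio of n², and since the n resistors share the total power equally in either connection, each runs under the same conditions in both uses:

```python
# Hamon build-up resistor: n nominally equal resistors of value r give
# n*r in series and r/n in parallel, an exact n**2 ratio between the two
# configurations (n = 100, r = 100 ohm, as in the single-step NMI transfer).
n, r = 100, 100.0
series = n * r       # 10 000 ohm
parallel = r / n     # 1 ohm
ratio = series / parallel
print(ratio)  # 10000.0 -> an exact 10^4 : 1 transfer
```

The ratio is insensitive to small errors in the individual resistors: a fractional error e in one resistor perturbs the ratio only at order e², which is what makes the single-step transfer so accurate.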

Because of the substantial number of steps involved in the transfer from the calculable capacitor to the 1 Ω dc standard, extremely high accuracy was required at every stage. It is evidence of the quality of this work that until 1983, one of the dominant sources of uncertainty was the speed of light, which is needed to calculate the capacitance of the basic capacitor in SI units. With the adoption in 1983 of a defined value for the speed of light, this component of uncertainty was removed, and the overall uncertainty of the 1 Ω standard was reduced to about 1 part in ten million. The very precise electromechanical controls for the interferometer mirrors that Clothier had developed for this work were crucial to several of the other major instruments described in these pages.

The NMI calculable capacitor has been continually refined by Greig Small and Peter Coogan as new optical and electronic techniques became available, and NMI is now collaborating with the BIPM in the design of an entirely new version.

For later developments in resistance standards, see quantised Hall resistance.

Gravitational Acceleration

The value of the local acceleration due to gravity, usually written 'g', is required in precise measurements of mass, pressure, and force. A world-wide network of 'gravity stations' was established many years ago, at which local gravity had been measured by comparison with a reference station in Potsdam. The absolute measurement of g in terms of the basic quantities length and time is a difficult enterprise and very few values were known. However, in the 1960s there was strong evidence that local values based on the Potsdam value were seriously in error (by 140 µm/s², or about 14 parts per million). In 1968 an adjusted value was adopted internationally, but there was a suggestion that the discrepancy might vary at different sites, so a new measurement was undertaken at NMI, far removed from all other absolute sites.

One method of measuring g is to drop an object and time its passage between two points, a method traditionally associated with Galileo. This method is subject to various errors that were difficult to estimate, so George Bell, Dennis Gibbings and Jim Patterson adopted a symmetrical technique in which the object was thrown upwards and the times of the upward passage between two points and the return downward passage were measured. This sounds simple, but given that the object was a very precise trihedral reflector that had to be both launched and caught gently, the mechanical design was a tour de force.
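The symmetric method has the further virtue that the launch velocity need never be known: if T₁ and T₂ are the intervals between the upward and downward crossings of the lower and upper timing levels, separated by a height H, the symmetry of free flight gives g = 8H/(T₁² − T₂²). A sketch with synthetic timings (the numbers are invented, not NMI data):

```python
import math

def g_from_symmetric_flight(H: float, T_lower: float, T_upper: float) -> float:
    """Symmetric rise-and-fall method: H is the separation of the two timing
    levels, T_lower and T_upper the intervals between the upward and downward
    crossings of each level; then g = 8*H / (T_lower**2 - T_upper**2)."""
    return 8.0 * H / (T_lower**2 - T_upper**2)

# Synthetic check: simulate a throw at 4 m/s under g = 9.80 m/s^2,
# with timing levels 0.2 m and 0.6 m above the launch point.
g_true, v0 = 9.80, 4.0

def crossing_interval(h: float) -> float:
    # Time spent above height h: 2*sqrt(v0**2 - 2*g*h)/g
    return 2.0 * math.sqrt(v0**2 - 2.0 * g_true * h) / g_true

H = 0.6 - 0.2
g = g_from_symmetric_flight(H, crossing_interval(0.2), crossing_interval(0.6))
print(f"{g:.4f} m/s^2")  # recovers 9.8000
```

Timing errors at launch and catch cancel in this scheme, which is why the up-and-down method was preferred over simple dropping at the time.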

The mirror consisted of three silica optical flats held in a metal frame so that they formed a 'cube-corner reflector', like the floor and two walls of a room meeting in a corner. This delicate device was launched (at several metres per second) by an air-driven catapult in an evacuated chamber, and the catapult was then driven in reverse at just the right time to gently catch the reflector in a resilient mounting. Numerous precautions in design and construction were taken to ensure that the reflector did not rotate during its flight, thereby misaligning the interferometer or, worse, leading to a disastrous landing. The passages of the reflector through two defined levels were detected by a white-light interferometer. The separation of the levels was set by an accurately measured length bar. In this way the measurements were tied to the laboratory standards of length and time.

The result, published in 1972, indicated that the Potsdam value was in error by 138 µm/s², in excellent agreement with the correction that had been adopted in 1968 (140 µm/s²). This experiment was of particular importance because it was the first in the Southern Hemisphere, and at a location remote from all other absolute sites, thereby strengthening the World Gravity Network and arguing against the suggested variation of the discrepancy with location. In recent years there have been developments overseas in the one-way (dropping) technique, based on direct fringe counting with laser sources and high-speed computing; the accuracy and portability of the instruments have been greatly improved.

Search for an Atomic Kilogram

All of the basic SI units are now defined in terms of atomic quantities except for one, the kilogram, which is still defined as the mass of a particular artefact. There is an obvious resolution of this unsatisfying situation: define the unit of mass in terms of the mass of an atom. All atoms of a given isotope of a given element have the same mass, so there is no danger of losing or damaging the defining quantity, as there is with the kilogram artefact. The problem is that to relate an atomic mass to the present kilogram (so as to ensure continuity in the size of the unit) one must count out a very large number of atoms (about 2 × 10²⁵).

NMI is part of an international collaboration that is measuring the number of silicon atoms in nearly perfect crystals whose mass is close to 1 kg. The crystals are cut and polished into extremely round and smooth spheres by techniques developed in CSIRO for manufacturing precise optical components. The small errors in the shape of each sphere are measured by very sensitive mechanical instruments and the mean diameter is found by measuring many diameters with a laser interferometer. (An early version of the interferometer, developed by George Bell, Ed Morris and Jim Patterson, was used to measure silica spheres that were used to measure the density of water.) Other laboratories determine the isotopic composition of the crystals and the mean spacing between the silicon atoms. From all this information the number of atoms in each sphere is calculated. The sphere is then weighed against a kilogram standard to establish the mass of one silicon atom. Every step in this process challenges the limits of present technology; for instance, the mean diameter of the sphere must ultimately be measured to about the diameter of an atom.
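The atom counting rests on the crystal structure of silicon: the diamond cubic unit cell contains 8 atoms, so the sphere's volume and the lattice spacing give the count directly. A sketch with round illustrative figures (the lattice parameter and diameter below are approximate textbook values, not the project's measured ones):

```python
import math

# Counting atoms in a silicon sphere. Silicon crystallises in the diamond
# cubic structure with 8 atoms per cubic unit cell of edge a.
a = 5.431e-10         # m, lattice parameter of silicon (approximate)
diameter = 93.6e-3    # m, roughly the diameter of a 1 kg silicon sphere
mass = 1.0            # kg, nominal

volume = (4.0 / 3.0) * math.pi * (diameter / 2.0) ** 3
n_atoms = 8.0 * volume / a**3   # about 2.1e25 atoms
m_atom = mass / n_atoms         # about 4.7e-26 kg per silicon atom
print(f"{n_atoms:.3e} atoms, m_atom = {m_atom:.3e} kg")
```

The sensitivity quoted in the text follows from this relation: the count scales as the cube of the diameter, so a relative diameter error of one part in 10⁸ (about an atomic diameter) already shifts the count by three parts in 10⁸.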

The current status of the NMI project is that the spheres can be made round to within about 50 nm and the mean diameter of the NMI sphere has been measured to about 2 nm. (If the sphere were as big as the Earth its diameter would be known to about 200 mm, and the highest mountain would be less than 5 m high.) The best published result is that the mass of the silicon atom is known in terms of the SI kilogram to 1.5 parts in 10⁷. Before a change in the definition can be made this must be improved by at least a factor of 10, and work is continuing in many laboratories on all aspects of the problem.

Within NMI the major contributors to this project have been Mike Kenny, Ed Morris and Brad Ward (mass measurement), Chris Walsh, Esa Jaatinen and Philip Ciddor (interferometry) and Walter Giardini (shape measurement). The NMI sphere and others being used in related projects overseas were manufactured by Achim Leistner (CSIRO) in an outstanding achievement. The very thin, but significant, oxide layer that builds up on the silicon surface is extremely difficult to identify and measure, but suitable ion-beam and ellipsometric techniques were developed by Leszec Wielunski (NMI) and Roger Netterfield (CSIRO) respectively.

Absolute Determination of the Volt

The SI base unit for electrical quantities is the ampere, a measure of current, but in practice it is desirable to have an independent measure of electromotive force (commonly called voltage). Anyone who has played with charging a plastic comb by rubbing it with cloth and then used it to pick up scraps of paper will appreciate that this provides a means of measuring the force between charged objects. As long ago as 1834 this technique was developed into the attracted disk electrometer, and gradual improvements increased the accuracy to 0.01% by 1938. In 1965 Keith Clothier of NMI proposed that modern optical measurement techniques could greatly improve the accuracy. He proposed, in effect, to measure the height of an 'electrostatic tide' induced in a sea of mercury by an applied voltage. The idea was that the upwards attraction on the mercury would be balanced by the weight of the elevated column of mercury.
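The balance condition equates the electrostatic pressure pulling up on the mercury surface with the hydrostatic pressure of the raised column: ε₀V²/(2d²) = ρgh. A sketch with illustrative dimensions (the gap, voltage and local g below are invented round numbers, not the NMI apparatus values, and the change of gap as the surface rises is ignored):

```python
EPS0 = 8.8541878128e-12  # F/m, vacuum permittivity
RHO_HG = 13546.0         # kg/m^3, density of mercury near 20 degC (approx.)
G = 9.797                # m/s^2, local gravity (illustrative value)

def tide_height(voltage: float, gap: float) -> float:
    """Rise of a mercury surface under a voltage across a gap, from the
    balance eps0*V**2/(2*d**2) = rho*g*h (gap change neglected)."""
    return EPS0 * voltage**2 / (2.0 * gap**2 * RHO_HG * G)

h = tide_height(5000.0, 1.0e-3)   # 5 kV across a 1 mm gap
print(f"tide ~ {h * 1e3:.3f} mm")  # under a millimetre
```

The quadratic dependence on V is why kilovolts are needed to raise the surface by even a millimetre, and why the density of mercury and local g enter the final uncertainty directly.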

As in the case of the calculable capacitor the implementation of this simple but clever idea involved a vast amount of experimentation, new technology and theory, and a final result was not reported until 1989. This experiment was of considerable importance, because it had become clear that the current value of the practical volt was significantly in error. The NMI results supported other recommendations (based on entirely different techniques) for the proposed new value, which was embodied in the new working definition of the volt as maintained by the Josephson effect.

This experiment required the measurement of the height (less than a millimetre) of the mercury 'tide' to very high accuracy, and also accurate knowledge of the density of mercury and the local value of gravity. The density of mercury had been precisely measured at the National Physical Laboratory in England, and the local value of gravity had been measured at NMI.

The optical system developed by Clothier and Graeme Sloggett used a group of Fabry-Perot (FP) laser interferometers and white light interferometers. These were used to measure the gaps between a semi-transparent upper electrode and three interconnected pools of mercury, one below the upper electrode and the others on either side to provide reference surfaces that are not affected by the electrical force. The particular combination of FP interferometers in transmission and white light interferometers in reflection was novel, and raised many optical problems. The mechanical construction of the FP interferometers was adapted from Clothier's original design for the calculable capacitor.

Anyone who has seen mercury will appreciate the extreme difficulty of locating its very mobile surface to within 3 × 10⁻⁸ mm in the presence of vibration, and indeed extraordinary efforts by Clothier and Hugh Bairnsfather were required to isolate the mercury pools from vibration. The entire apparatus was mounted on vibration isolators. Specially shaped edges to the mercury pools minimised the curvature of the meniscus surface. In addition, a major investigation by Don Benjamin was needed to establish a way to produce and maintain a uniform oxide coating on the mercury surface. The voltage required to lift the mercury by a millimetre or so is several kilovolts, and a comparison chain was developed by Malcolm Currey to transfer the result to the 1-volt level. This experiment achieved the desired accuracy of a few parts in 10 million, and confirmed the proposed change in the accepted value of the volt.

The story of this experiment was used by Bob Frenkel of NMI to illustrate an article on the history and importance of absolute, universal and accessible standards of measurement and the dedication of those who develop them.

WK Clothier, GJ Sloggett, H Bairnsfather, MF Currey and DJ Benjamin (1989) Metrologia 26, 9–46
R Frenkel (2000) Metrology, metrication and anti-science Quadrant, July–August, 42–51

Redefinition of the Candela and the Stefan-Boltzmann Constant

There are two criteria for deciding when a measurement is 'good'. The first, which motivates the constant drive for improved techniques, is the scatter of the results. Obviously, a small scatter suggests that the technique is reliable and sets tighter statistical limits on the result. The second, and often the more difficult criterion to meet, is that anything that might shift the result away from the 'true' value has been allowed for. Many of these 'systematic errors', as they are often called, are obvious — Is the thermometer reading correctly? Have we allowed for the effects of atmospheric pressure? Are there any stray magnetic fields that might affect the instrument?

Unfortunately, there is no way of knowing that we have considered all possible systematic errors. There are, however, some very powerful ways to detect that some unidentified cause is affecting the results — make a measurement of the same quantity with differently designed apparatus of the same general type, or in a different location, or, even better, by a totally different method. This is why important physical quantities are always measured in one, or preferably all, of these ways. An example is the current work on the redefinition of the kilogram.

In some cases it is possible to calculate a theoretical value for the quantity of interest from several reliably measured quantities and well-established theory. If this differs from the experimental value, then either the experiment or the theory (or both) must be defective. Finding the source of such a discrepancy is another matter: often only the free-ranging, sometimes wild, intuitions of a skilled metrologist can produce a possibility worth investigating. NMI was able to resolve such a discrepancy in the value of a very fundamental quantity, the Stefan-Boltzmann Constant (sigma), which relates the amount of radiation emitted by a hot body to its temperature. This relationship is the basis of a vast amount of theoretical physics and engineering, for example the measurement of high temperatures in metallurgical processing, or the design of lighting.

The definitions of units for the 'quantity' of light are complicated because of their dependence on geometry and on the properties of human vision. Early units were based on the light emitted by a standard flame. The first modern definition was in terms of the total amount of light emitted by a standard candle of specified composition. The original Système International defined a 'new candle', later renamed the 'candela', in terms of the luminance (the amount of light per unit area per unit solid angle) of a blackbody at a specified temperature.

There was a theoretical relation between the value of sigma and several very accurately known quantities, including the speed of light 'c' and the Planck constant 'h'. There were also several accurate measurements that agreed well with each other, but which all fell well above the theoretical value. Naturally, corrections had been made for various plausible systematic errors, and other errors were proposed, but these all increased the discrepancy. This unacceptable situation was finally resolved by Bill Blevin and Bill Brown in 1970.
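The theoretical relation in question is the blackbody-theory expression sigma = 2π⁵k⁴/(15h³c²). With modern constant values (k and h are now fixed exactly in the SI; at the time of this work they carried experimental uncertainties) the computation is a one-liner:

```python
import math

# Theoretical Stefan-Boltzmann constant from blackbody theory:
#   sigma = 2 * pi^5 * k^4 / (15 * h^3 * c^2)
K = 1.380649e-23     # J/K, Boltzmann constant (exact in the 2019 SI)
H = 6.62607015e-34   # J*s, Planck constant (exact in the 2019 SI)
C = 299_792_458.0    # m/s, speed of light (exact)

sigma = 2.0 * math.pi**5 * K**4 / (15.0 * H**3 * C**2)
print(f"sigma = {sigma:.4e} W m^-2 K^-4")  # about 5.670e-08
```

The fourth-power dependence on k and the inverse cube on h make the theoretical value very sensitive to those constants, which is partly why the experimental discrepancy described below was taken so seriously.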

Measurements of sigma involve the use of screens to select radiation from a small area of a hot body (usually a pool of gold or platinum at its melting point, contained in a carbon furnace). Further screens allow only the radiation emitted in a known small range of angles to fall on an electrical detector.

Precise radiometry is very difficult, and the accuracies attainable (about 0.1%) may seem poor compared with other fields, such as measurements of mass, length or time. Nevertheless, Blevin had pointed out in 1969 that improvements in measuring the radiation from hot bodies had reached the stage where small corrections for the diffraction of light at the apertures were becoming significant, and he proceeded to evaluate the corrections for the apparatus that he was developing for a new measurement. His final result differed from the theoretical value by less than his experimental error and was, unlike all previous results, actually less than the theoretical value.

Since sigma was at the time the basis of all measurements of visible light and thermal radiation, the removal of this discrepancy marked a major advance in both photometry and radiometry. The definition of the candela was later refined by specifying the use of a specific frequency (colour) of light, and is used in conjunction with well-known curves relating the strength of the visual sensation of a 'standard' (i.e. typical) human observer to the colour. Nevertheless, the Stefan-Boltzmann Constant remains a very important physical quantity, and the experimental confirmation of its theoretical value provides a secure basis for much theoretical work.