**The Parisian kilogram prototype should move to a museum in 2018**

Yesterday, Phys.org published an interesting news report,

*Important milestone reached on road to a redefined kilogram*,

that explains an experiment which helps to realize my 2012 call to fix the numerical value of Planck's constant.

The experiment is described in the AIP *Review of Scientific Instruments* article

*Invited Article: A precise instrument to determine the Planck constant, and the future kilogram*

by Haddad and 7 co-authors.

They use the NIST-4 watt balance. It compares mechanical and electromagnetic forces and also uses the quantum Hall effect and the Josephson effect in superconductivity to convert electromagnetic forces to Ampere-independent units.

In effect, they measure Planck's constant as

\[
h = 6.626\,069\,83(22)\times 10^{-34}\,{\rm J}\cdot {\rm s}
\]

The relative standard uncertainty above is 34 parts per billion. There exists a similar Canadian experiment whose standard uncertainty is 19 parts per billion. The plan is to have three independent measurements that push the error below 20 parts per billion and then switch to a new definition of the kilogram – an event that should take place in 2018. So aside from the Canadians, NIST will have to improve the measurement we're just discussing, and one more team will have to become as good as the Canadians.
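As a sanity check on the parenthesis notation above, a few lines of arithmetic (just a sketch using the numbers quoted above) convert the standard uncertainty into parts per billion:

```python
# Parenthesis notation: 6.626 069 83(22) x 10^-34 means the standard
# uncertainty (22) applies to the last two quoted digits.
h_value = 6.62606983e-34         # J.s, the NIST-4 value quoted above
h_uncertainty = 0.00000022e-34   # J.s

ppb = h_uncertainty / h_value * 1e9
print(f"{ppb:.1f} parts per billion")  # about 33 ppb, i.e. the quoted figure up to rounding
```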

Right now, the kilogram is still defined – and has been defined for 127 years – as the mass of the platinum-iridium prototype in Paris. Replicas of this prototype exist and their mass is believed to have drifted by some 30 parts per billion (i.e. about 30 micrograms) on average. In practice, this drift can't be known reliably at all times and is often neglected, so these 30 parts per billion, or slightly less, represent the estimated unavoidable error of any quantity whose SI units include the kilogram or a nonzero power of it (the relative error gets worse when the exponent of the kilogram is higher).

Wikipedia's page for Planck's constant says that the value \(6.626\dots\times 10^{-34}\) joule-seconds has a standard uncertainty of \(8.1 \times 10^{-42}\) joule-seconds. That's about 12 parts per billion. You rarely need to know the constant at this precision, so if the inaccuracy increased to 20 parts per billion, it wouldn't be a big deal.

One of the investigators from NIST talked about their job:

> All the redefining should go on with little impact on the outside world. "It's the frustrating part about being a metrologist," Schlamminger said. "If you do your job right, nobody should notice."

Right. If and when they switch the definitions, they struggle to make the transition as smooth as possible. It's similar to the central banks' work when currencies are being merged (e.g. when a country joins the Eurozone).

Well, to some extent, one could say that the fundamental physicists' job is often "frustrating" for the same reason: when you do your job really well, you discover laws that agree with everything that is known perfectly, so people may complain that you're not saying anything new about the doable observations at all. And indeed, you are not. It's the key virtue of a theory that it agrees with the observations – it's right when it doesn't contradict them at all. But it's obviously far from simple to actually construct/discover and understand a *single* theory that may explain *all* the known observations.

OK, so in 2018, the definition of the kilogram could change so that the numerical values of \(h,\hbar\) will be known absolutely precisely, much like \(c\) is already known to be \(299,792,458\,{\rm m/s}\) precisely. Many people misunderstand how that's possible at all. For example, the first Phys.org commenter, named Humy, complains:

> Hang on! Surely it is inevitable that technology will keep on improving with time so that eventually, at some time in the future, say, 40 years from now, a new way is found to measure Planck's constant with an uncertainty of much less than that 19 parts per billion, let's say, just 0.1 parts per billion, but then what if that new measurement doesn't agree with that \(h\) value that was "fixed for all time" by that earlier computer program!? Surely it would then be idiotic to keep it "fixed for all time," thus it shouldn't ever be regarded or said to be "fixed for all time"?

He clearly doesn't get it at all.

The definition of "one kilogram" will be the same as the current definition of "one kilogram" up to the current error margin, comparable to 10 parts per billion. But in principle, if you're able to achieve more precise measurements, the new definition of one kilogram will be really different. The kilogram will be *defined* so that \(h\) is a very particular and completely specified number of joule-seconds, just like one meter is defined so that the speed of light in the vacuum is \(299,792,458\,{\rm m/s}\). Because both \(c\) and \(h,\hbar\) are dimensionful, i.e. dependent on units, there can't be any measurement of their numerical values that would be independent of our choice of the units. And because the kilogram will be defined (and one meter already is defined) so that it may be "adjusted", it will be automatically adjusted so that the numerical value of \(c\) or \(h,\hbar\) remains exactly what it is supposed to be.

It's obvious that you can't ever run into any contradiction here. After all, the new definition will be analogous to the definition of one kilogram as the "mass of the Parisian prototype". You could also say that you may measure the mass of this prototype more precisely than ever before, and prove that its mass wasn't \(1.0000000000000000\) kilograms, after all. But the point is that this can never happen because the prototype's mass and one kilogram are precisely equal by the definition of the latter (one kilogram). There is one unknown, the "kilogram", and there is one equation, either "the mass of the Parisian cylinder is one kilogram" (the current, 127-year-old definition of the kilogram) or "\(h,\hbar\) are equal to the particular numbers of jouleseconds" (future definition of the kilogram). And one equation for one unknown simply always has a solution!
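The "one equation for one unknown" can be sketched in a few lines with hypothetical numbers (this is only an illustration, not any official metrological procedure): since \(h\) carries one power of the kilogram (its unit is \({\rm kg}\cdot{\rm m}^2/{\rm s}\)), its numerical value scales inversely with the size of the kilogram unit, so fixing that numerical value pins down the new kilogram, and the old prototype's mass can be re-expressed in it:

```python
# Sketch with made-up numbers. h has units kg.m^2/s, so its numerical
# value scales as 1/(size of the kilogram unit); fixing the value
# determines the new unit.
h_fixed = 6.62606983e-34    # J.s, the value written into the new definition

# A hypothetical later, sharper measurement of h expressed in the OLD
# (prototype-based) units:
h_later = 6.62606983e-34 * (1 - 1.46e-12)

# The prototype's mass, re-expressed in NEW (fixed-h) kilograms:
prototype_mass = h_fixed / h_later
print(prototype_mass)   # about 1.00000000000146 kg
```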

At the same time, if the accuracy of the measurements of \(\hbar\) can be made as good as the precision with which you need to measure masses – or anything whose SI unit contains a power of the kilogram – then it's clear that redefining one kilogram so that \(h,\hbar\) is fixed to a number compatible with the current measurements of \(h,\hbar\) (performed under the current, old definition of the kilogram) won't change anything about any value we have now, up to the error margins that we admit anyway.

But if and when the precision of the measurements grows, the interpretation of the errors will change. The constants \(h,\hbar\) will be known precisely. On the other hand, future precise measurements may find out that the Parisian prototype actually has a mass of \(1.00000000000146\) instead of \(1.00000000000000\) kilograms. That's possible because the Parisian piece of metal will no longer have anything fundamental to do with the precise one kilogram. There will be a historical explanation why the mass of the object is almost precisely one kilogram but not quite: at some moment, this stupid piece of metal was used to *define* the unit of mass, and the same unit was kept as constant as possible while its fundamental definition switched to a different methodology. From that moment on, the number will have deviated from one.

We've faced exactly the same situation many times in the past. One second used to be defined as \(1/86,400\) of the solar day. But we use atomic clocks for one second now. And that's why we're able to show that the solar days aren't quite regular and their durations aren't quite constant. Sometimes we need to insert a leap second. That's because the solar days no longer dictate the time *fundamentally*. We have more precise methods to define and measure seconds (and durations) than dividing solar days. But we also have an explanation why the atomic clocks show that one solar day is approximately 86,400 seconds long: it used to be true by definition (but it's no longer so because the current definition of one second is different).
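The leap-second bookkeeping is simple arithmetic; here is a sketch with an illustrative (not measured) excess of the solar day over 86,400 SI seconds:

```python
# If the mean solar day exceeds 86,400 SI seconds by about 2 ms
# (an assumed, order-of-magnitude figure), the excess accumulates
# until a whole leap second is needed:
excess_per_day = 0.002                      # seconds per day, assumed
days_per_leap_second = 1.0 / excess_per_day
print(days_per_leap_second)                 # 500 days, i.e. roughly every 1.4 years
```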

Similarly, one meter used to be defined so that the meridian had a circumference of exactly 40,000 kilometers. However, more precise and lab-based definitions of one meter – and more precise and usable methods to measure the lengths of other objects – have appeared. First, the meter was defined by a stick, then by a certain wavelength of radiation, and now as \(1/299,792,458\) of a light second. The accuracy has increased and, because the "prototype" of one meter has changed a few times, we now know that the circumference of the meridian is no longer precisely 40,000 kilometers. However, we still have a historical explanation why the circumference is very close to 40,000 kilometers: the precise version of the statement used to serve as a definition of one meter!
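How far off was the 40,000 km figure? A quick estimate, taking roughly 40,007.9 km as the modern value of the Earth's meridional circumference (an approximate number supplied here for illustration, not taken from the article):

```python
defined_km = 40_000.0   # circumference implied by the original definition of the meter
modern_km  = 40_007.9   # approximate modern meridional circumference (assumed value)

relative_deviation = (modern_km - defined_km) / defined_km
print(f"{relative_deviation:.1e}")   # about 2e-4, i.e. 0.02 percent
```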

If you think that there is or there will be any real problem with the transition to new units with a fixed \(h,\hbar\), you are making a rudimentary error in your reasoning.

I obviously think that a fixed known value of \(h,\hbar\) is as good and useful as the fixed known value of \(c\), the speed of light in the vacuum. SI units with these properties basically differ from the adult particle physicists' units, where \(c=\hbar=1\), only by some precisely known numerical rescaling. It's better to define one kilogram by the condition that "if you measure \(h,\hbar\), you get a particular number of joule-seconds" because this definition may be realized independently in every lab in the world and you don't need to depend on the French aristocrats, socialists, postmodernists, and the French fries.

Right now, the situation is that if you're on a diet and you want to know really precisely how successful you have been, you have to call François Hollande (a socialist!) and ask him to lend you the piece of platinum-iridium. Telephone calls aren't free and you're not even guaranteed that he will kindly send you the platinum-iridium toy. From 2018 on, you will be able to build your own watt balance at home for $10 million and measure the success rate of your diet without the socialists. It's obvious why the whole civilized world outside France wants this change. ;-)

In the future, it could be a good idea to switch to \(c=\hbar=1\) units including the numerical values. We could express everything in units of a power of the "gev", which could be a rationalized new name for \(1\,{\rm GeV}\). One gev would be the unit of mass, momentum, and energy. One veg, the inverse, would be the unit of distances and times, among other things. We could also use the names gegev, gegegev, gegegegev and veg, vegeg, vegegeg, vegegegeg for the integral powers of a gev and a veg, to simplify the language. ;-) Some new, more extreme prefixes would have to spread to describe the values from everyday life, however.
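To get a feeling for the sizes of a gev and a veg, one can convert them to SI using currently tabulated constants (a sketch; "gev" and "veg" are of course just the hypothetical names proposed above):

```python
e    = 1.602176634e-19   # J per eV (tabulated conversion factor)
c    = 299792458.0       # m/s (exact by definition)
hbar = 1.054571817e-34   # J.s (approximate tabulated value)

gev_in_joules = 1e9 * e                   # one gev expressed as an energy
gev_in_kg     = gev_in_joules / c**2      # via E = m c^2, about 1.8e-27 kg
veg_in_m      = hbar * c / gev_in_joules  # about 2e-16 m, i.e. 0.2 femtometers

print(gev_in_kg, veg_in_m)
```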

We could also switch to the Planck units by fixing something like Newton's constant, too. That's unrealistic now because Newton's constant has only been measured (in a 2014 paper) as

\[
G = 6.67191(99) \times 10^{-11}\, {\rm m}^3\,{\rm kg}^{-1}\,{\rm s}^{-2}
\]

which amounts to a relative error of 150 parts per million – too high an uncertainty. (Gravity is very weak and therefore rather hard to measure accurately.) You surely don't want to introduce a relative uncertainty of 150 parts per million into almost all dimensionful quantities that we ever measure (that could mean the substantial loss of 0.015% of your money whenever you sell gold or other commodities, among other bad things). But \(h,\hbar\) are measured with a relative accuracy that pretty much matches the errors we have in *all* quantities whose units include the kilogram, so it's simply not a problem to fix \(h,\hbar\), and it may be officially done in 2 years as planned.
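The contrast between the two constants can be made explicit with the numbers quoted above (just the arithmetic, nothing more):

```python
# Relative uncertainties of the 2014 G value and the NIST-4 h value:
G, u_G = 6.67191e-11, 0.00099e-11          # m^3 kg^-1 s^-2
h, u_h = 6.62606983e-34, 0.00000022e-34    # J.s

rel_G = u_G / G   # ~1.5e-4, i.e. ~150 parts per million
rel_h = u_h / h   # ~3.3e-8, i.e. ~33 parts per billion
print(f"G: {rel_G*1e6:.0f} ppm   h: {rel_h*1e9:.0f} ppb")
```

The roughly four-orders-of-magnitude gap is why fixing \(G\) would degrade everyday measurements while fixing \(h\) does not.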
