Don't worry, our vital measurement standards are not going away, but they are changing radically. A fascinating article in the July 2014 issue of Physics Today reviews the history of the definitions of our basic units -- meter, kilogram, second -- and how those definitions have evolved as part of the present International System of Units (SI, from the French Système International d'Unités). It then clearly explains the significant, dramatic, and non-intuitive changes that have been officially approved and are being made to the basic units of the metrology world.
For many years, for example, the "master" meter was a platinum bar with scratch marks, kept in a vault in Paris at the Bureau International des Poids et Mesures (BIPM). Just think about the logistics of comparing a secondary standard to that primary one. The meter was subsequently redefined in terms of wavelengths of a specific spectral emission. Similarly, the standard for time was changed from a fraction of the Earth's year (which actually does vary) to vibrations of an atomic clock. These new standards are not only more accurate; they are reproducible and don't rely on a single physical artifact.
But the kilogram has been a problem and remained an artifact-based standard, defying many attempts to design and build a reproducible standard with sufficient precision and consistency (remember, this is the world of ppm performance). Until very recently, the primary kilogram was a master cylinder known as the International Prototype of the Kilogram (IPK), Figure 1, protected like the old meter standard. In addition to the logistics issues, it seems the master was losing mass when compared to multiple secondary kilogram standards (or maybe they were gaining?). Seriously, when you are looking for this level of perfection, a few molecules rubbing off the surface despite careful handling can make a difference. The interesting news is that the new kilogram will be defined by a reproducible system called a "watt balance," an incredibly sophisticated embodiment of a conceptually "simple" idea.
Figure 1. The world standard for the kilogram is the last SI unit to be an artifact, but that's changing.
The changes go far beyond the basic units. The ampere, so vital to our electronics work, will no longer be defined in terms of current through parallel wires and the attractive force between them -- always a tricky one to explain and assess. Instead, it will be defined by fundamental atomic constants and quantum physics phenomena, while the volt will be defined by an array of superconducting Josephson junctions.
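To see why a Josephson junction makes such a good voltage standard, note that a junction driven by microwaves at frequency f develops quantized voltage steps at V = n·f/K_J, where the Josephson constant K_J = 2e/h depends only on fundamental constants. The sketch below uses the exact values of e and h fixed in the 2019 SI revision (which postdates the article); the frequency and step number are purely illustrative.

```python
# Sketch of the Josephson voltage relation V = n * f / K_J, with K_J = 2e/h.
# e and h use the exact values fixed by the 2019 SI revision.
e = 1.602176634e-19   # elementary charge, C (exact in the revised SI)
h = 6.62607015e-34    # Planck constant, J*s (exact in the revised SI)
K_J = 2 * e / h       # Josephson constant, ~483.6 THz/V

def josephson_voltage(n_step: int, frequency_hz: float) -> float:
    """Voltage of a junction biased on quantized step n at drive frequency f."""
    return n_step * frequency_hz / K_J

# One junction on the first step, driven at an (illustrative) 75 GHz:
v = josephson_voltage(1, 75e9)
print(f"{v * 1e6:.1f} uV per junction")  # ~155.1 uV
```

Because a single junction yields only about 155 µV at typical microwave frequencies, practical standards chain thousands of junctions in series to reach the 1 V or 10 V level -- hence the "array" the new definition relies on.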
You might think that this is all well and good, and that we'll still have our basic building blocks of time, distance, mass, and four other so-called "base units," but that's not the case. The really dramatic fundamental change of the new SI system is that the existing seven base units themselves are changed to seven new ones. These new base units are very different from the ones we're used to, which seemed, for better and worse, somewhat more intuitive to most of us. But progress is progress, so say goodbye kilogram and electric current, and hello to electric charge, Avogadro's number, and luminous intensity (to cite a few of the old and new ones). (An enjoyable book on the history of metrology, its many problems and solutions, as well as its evolution and transformations, is World in the Balance: The Historic Quest for an Absolute System of Measurement by Robert P. Crease, Figure 2.)
Figure 2. World in the Balance: The Historic Quest for an Absolute System of Measurement by Robert P. Crease
The Physics Today article is worth your time both for the explanatory background it provides and to bring you up to date on these very significant changes to the fundamental standards upon which we rely so intensively (often without needing to think about any deeper meaning). While accuracy to the level that the new definitions and standards provide is not critical to most of us and our work, the reality is that in any test and measurement situation, there is always the issue of "how do you know that measurement is accurate?"
A general guideline is that any equipment measuring a device under test (DUT) must be at least four times better than the desired accuracy of the result. That's called the test uncertainty ratio (TUR). But how do you verify the measurement uncertainty of the test equipment? Going a step further, how do you know the uncertainty of the system used to check the test equipment itself? Pretty soon, you are in deep philosophical territory about the meaning of reality and perfection.
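The 4:1 guideline reduces to simple arithmetic: divide the DUT's tolerance by the measuring instrument's uncertainty and check the ratio. A minimal sketch, with purely illustrative function names and numbers:

```python
# Minimal sketch of the 4:1 test uncertainty ratio (TUR) guideline.
# Function names and example figures are illustrative, not from any standard.

def tur(dut_tolerance: float, instrument_uncertainty: float) -> float:
    """Ratio of the DUT's spec tolerance to the instrument's uncertainty
    (both in the same units, e.g. millivolts)."""
    return dut_tolerance / instrument_uncertainty

def meets_guideline(dut_tolerance: float, instrument_uncertainty: float,
                    minimum_ratio: float = 4.0) -> bool:
    """True if the instrument is at least `minimum_ratio` times better
    than the tolerance being verified."""
    return tur(dut_tolerance, instrument_uncertainty) >= minimum_ratio

# Example: a DUT spec of +/-1.0 mV measured with an instrument good to +/-0.2 mV
print(tur(1.0, 0.2))              # 5.0 -- a 5:1 ratio
print(meets_guideline(1.0, 0.2))  # True
print(meets_guideline(1.0, 0.5))  # False -- only 2:1, not good enough
```

The same check applies one level up, which is exactly the regress the paragraph above describes: the calibration system that verifies the instrument needs its own (even better) TUR against the instrument's uncertainty.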
What's your view on the major changes we are undergoing in the basic units of the SI system, and how they are defined? Will it affect you, or is it a "don't care" change?