As far as comparing NIST tables to NIST curves quantitatively, any engineer able to evaluate a polynomial can do so. But that is not the same thing as validation. You are absolutely right that validation is vastly harder. And here, it is mostly beside the point.
Maybe the average American is 51.3 percent female and lives in Ohio, but that does not mean any individual conforms to this exactly. In the same way, I do not put too much trust in any individual thermocouple device tracking the NIST standard curves exactly. Why? Partly because I trust what NIST says about the variability of these devices, and partly because I trust device manufacturers when they state that their devices can deviate 1 to 2 degrees from the standard curves.
The point is, no matter how good the NIST curves are at representing typical devices, and no matter how well a chip set implements them, they do not and cannot correct for individual device variations, or for any other process factors the curves do not represent. When accuracy matters, calibration rarely requires the rigor of a NIST laboratory, but you do need a certified reference instrument of sufficient accuracy. Calibration can be expensive, it demands care, and even its results will be slightly wrong. Those costs must be weighed against the risk that an uncalibrated device yields measurement errors larger than 2 degrees, rather than the 0.1 degrees presumed without checking.
I can't believe any engineer--other than one in a super-sophisticated lab with extremely complex instrumentation--would try to replicate or validate the NIST tables against actual measurements. How do you know what the "correct" temperature is? How do you know the accuracy of your test setup? It's easy to be "slightly" wrong--and that's all it takes.
If you ever perform the exercise of comparing the NIST curves to the NIST tabulated data, you will find that the worst-case fitting errors are roughly on the order of 1/100 degree C, while the errors the article talks about are on the order of 1/10 degree C, and the errors of typical individual thermocouple devices are on the order of 1 degree C. Engineers interested in the "highest possible accuracy and linearization" will not trust any reference idealization too much; they will use an ice-bath cold-junction reference to recreate repeatable conditions, and work only from a calibration curve (or table) that reflects their sensor's individual characteristics. And it's hard work.
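That comparison is easy to script. Here is a minimal sketch using Horner's method; the coefficients below are made up for illustration only (the real ITS-90 reference-function coefficients for each thermocouple type are published by NIST, and you would substitute those along with the matching table values):

```python
def horner(coeffs, x):
    """Evaluate a polynomial c0 + c1*x + c2*x^2 + ... at x (Horner's method)."""
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

# HYPOTHETICAL coefficients, for illustration only -- not NIST values.
# Shape mimics an inverse function: temperature (deg C) from emf (uV).
coeffs = [0.0, 2.5e-2, -1.0e-7, 3.0e-12]

# Compare the polynomial against a (likewise hypothetical) tabulated point.
emf_uv = 1000.0
t_table = 24.9          # stand-in for a table entry at this emf
fit_error = horner(coeffs, emf_uv) - t_table
```

With the genuine coefficients and table entries, `fit_error` computed across the table is what comes out at roughly 1/100 degree C worst case.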
A typical engineering wannabe mathematician.
The tables are the most exact measured data of thermocouple performance. The equations are only mathematical approximations and are less accurate than the tables. Engineers interested in the highest possible accuracy and linearization should work only from the tables.
One other comment. The article concentrates on chip performance -- fair enough -- but there are more factors that affect measurement accuracy.
Noise is a major consideration. To put this in perspective: to resolve 1/10 degree C you need approximately 16 good bits of resolution. (Having more mostly helps for spanning a wide range.) On this scale, what you will actually observe from your thermocouple is perhaps 10 stable bits, with the rest chattering. Aggressive lowpass filtering, severely restricting the noise bandwidth, will let you see the bits you need, but at the cost of tracking error produced by the filter lag.
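The bit-count arithmetic and the filtering trade-off can both be sketched in a few lines. The span and noise figures below are assumptions chosen to mirror the discussion (a type-K-like range and about 1 degree C of raw noise), not measurements from any particular chip:

```python
import math
import random

# Bits needed to cover a span at a given resolution.
span_c = 1640.0   # assumed full range, roughly -270..1370 deg C
res_c = 0.1
bits = math.ceil(math.log2(span_c / res_c))  # 15; noise margin pushes this toward ~16

# Single-pole IIR lowpass: y += alpha * (x - y).
# Smaller alpha -> less noise through, but more lag behind a changing input.
def lowpass(samples, alpha):
    y, out = samples[0], []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def std(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

random.seed(0)
true_t = 100.0
noisy = [true_t + random.gauss(0.0, 1.0) for _ in range(5000)]  # ~1 deg C of chatter
filtered = lowpass(noisy, 0.01)  # settling time is on the order of 1/alpha samples
```

For white noise, this filter shrinks the noise standard deviation by roughly a factor of sqrt(2/alpha), which is how 10 stable bits can be stretched toward 16 -- as long as the measured temperature changes slowly compared with the filter's lag.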
Any error in the CJ temperature will, in effect, add directly into the total measurement error. This is not the fault of the chip set, just the consequence of having TWO measurement errors. As the article states: "... allows the cold junction temperature to be measured with +-0.30 degree C accuracy, or better." Well, to get 0.1 accuracy, really really better would be good.
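The arithmetic behind "two measurement errors" is worth making explicit. Taking the article's 0.30 degree CJ figure and an assumed (illustrative) 0.10 degree conversion-path error:

```python
import math

cj_err = 0.30    # deg C: CJ accuracy quoted in the article
conv_err = 0.10  # deg C: assumed hot-junction conversion error, for illustration

# If the two errors can conspire, they add directly (worst case);
# if they are independent random errors, they combine root-sum-square.
worst_case = cj_err + conv_err           # 0.40 deg C
rss = math.hypot(cj_err, conv_err)       # ~0.32 deg C
```

Either way, the 0.30 degree CJ term alone already dominates a 0.1 degree accuracy target, which is the point: the CJ measurement needs to be much better than the goal, not comparable to it.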
It is typically not practical to implement the NIST tables on-chip in polynomial form, so chip sets typically provide emf correction through clever combinations of nonlinear elements and resistors, forming a sort of "analog computer" that approximates the polynomials. The article does not say much about this approximation. Is it better than 0.1 degree C? That would be quite impressive.
When you combine these other practical uncertainties with the considerations of chip performance covered by the article, the net result is probably along the lines of what the NIST labs have reported from their practical experience. With typical thermocouples and appropriate diligence, you can probably get to +- 1 degree C accuracy. Reducing that error to under about 0.5 degrees is challenging. +- 0.1 degrees accuracy? Show me.
This article seems a little obsessive about the NIST thermocouple tables and "recent advances." That's somewhat strange, since the tables date from the 1990s. The materials have not changed their physical properties since then, though perhaps manufacturing is better today. If NIST reduced the bias by 0.001 degree C or so, I'm happy for them, but realistically this doesn't affect practical measurements much.
You have to keep in mind what exactly the NIST curves are. The curves are a best "consensus" approximation of the composite results over many tests with many representative test samples. The polynomials do not perfectly match physical reality for any given real device. What matters is which part of the statistical cloud your particular thermocouple device operates in. This can be influenced by age, contaminants, physical handling...
If you get spools of high-quality thermocouple wire from a high-quality supplier, they will provide correction terms over various temperature ranges, accounting for bias relative to the NIST standard curves. The resolution of these corrections is typically about 0.1 degree C, with a range around +- 0.3 degree C. Compared to the target of 0.1 degree C accuracy, this has some important implications that the article hinted at with the words "recalibration of the system." Do the chips have a "recalibrate" pin? Things aren't always so easy.
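Applying such supplier corrections amounts to a small lookup-and-interpolate step between the chip's reading and the reported temperature. A minimal sketch, with an entirely hypothetical correction table of the kind a wire supplier might provide:

```python
# Hypothetical supplier data: (temperature deg C, correction deg C) pairs,
# giving this spool's bias relative to the NIST standard curve.
corrections = [(0.0, -0.1), (200.0, 0.0), (400.0, 0.2), (600.0, 0.3)]

def corrected(t_raw):
    """Linearly interpolate the spool's correction and apply it to a raw reading."""
    pts = corrections
    if t_raw <= pts[0][0]:
        c = pts[0][1]                      # clamp below the table
    elif t_raw >= pts[-1][0]:
        c = pts[-1][1]                     # clamp above the table
    else:
        for (t0, c0), (t1, c1) in zip(pts, pts[1:]):
            if t0 <= t_raw <= t1:
                c = c0 + (c1 - c0) * (t_raw - t0) / (t1 - t0)
                break
    return t_raw + c
```

The catch the comment raises is exactly this: the correction lives in your firmware or host software, per spool of wire, not in the linearization chip -- so "recalibration of the system" means updating this table, not flipping a pin.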