You mentioned calibration. Do these devices require runtime calibration, or a one-time calibration at the factory before shipping?
We would generally calibrate as part of the test process during manufacture. I don't think they need runtime calibration; if one needed that sort of accuracy, I would imagine one would use some different form of external current measurement. I would add that I have only worked with the IR device; the others may be different.
Without calibration, is the deviation significant, or could the tolerance be accommodated in firmware if high accuracy is not needed?
I am sure you can use software to handle the variations. The current mirror ratio can vary from 4800 to 6000 with a nominal value of 5300, a variation of about +/-10%. There is a further variation of +/-5% over the temperature range, and an offset of about 150 mA at a 2 A output. You make the choice based on your application, I guess.
If you look at the earlier post in this thread from Max (about five posts ago), he refers to an Infineon application note that actually discusses calibration of this sort of device.
Do these devices require runtime calibration or one-time calibration in the factory before shipping?
The thought occurs to me that if you are reading the signal from the drivers, you would be using an ADC and some additional circuitry, at the very least probably a resistor to convert the mirror current to a voltage. The ADC may also need some calibration, and it would be better if the whole thing were calibrated as a system. If you go to the second post before yours, you will find a pointer to part 1 of a blog I did on system calibration.
It just so happens I was looking for current transducers the other day, under Sensors -> Current Sensors on Digikey (notably NOT under ICs -> Linear "anything"), and I have to admit I was immediately impressed by the array of application-specific ICs available. You can get prices down around 51 cents in quantities of a few thousand for some configurations; notable vendors at the low end include Allegro Microsystems and Silicon Laboratories. I'd emphasize this is for a range up to tens of amps. If you go higher, into industrial motors and such, you'd likely look at something like the LEM units, which are a space- and cost-saving Hall-effect replacement for a current transformer, but in a higher price class. I'll admit that until I "stumbled upon" the former class of chips I wasn't really aware of them; I thought perhaps you folks might not be either, and might want to go there and crack open a data sheet or two.
notable vendors in the low end include Allegro Microsystems and Silicon Laboratories. I'd emphasize this is for a range up to tens of amps, if you go higher into industrial motors and such you'd likely look at something like the LEM units
I have used the Allegro Hall-effect parts as well as current transformers, and both form a part of my arsenal. I haven't looked at LEM for a while now; I will follow your recommendation.
True, I agree with you that it had better be calibrated along with the ADC as a system. One could use the ADCs often built into modern microcontrollers. I will check out your and Max's blogs. Thanks for the information!
This looks like a very interesting device. I only seem to find a product brief; do you know if there is a data sheet available? Perhaps it is still preliminary. I did find a YouTube video, but not much else. There is also supposed to be an evaluation kit, but I can't find stock of it, or of the part, anywhere.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data, and some of them need to be smart enough to act on data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.