I've got several handheld meters here. One is 24 years old.
I've got 3 Fluke 8060As of that age. I far prefer them to my Agilent U1251A. Unfortunately, they are beginning to show their age: they have turned yellow and the on/off switch is going. I have had to replace (and void the calibration on) the flying battery connector on each one.
When I used to work at Keithley I saw quite a few null meters or old electrometers with analog needles on them still being used. Pretty sure the guys using them weren't still keeping the calibration up to date for the last 40 years =)
eafpres, many companies use handheld DMMs in production. Often the same meter might be used in more than one setup, say to measure voltage and then current. I suppose there is some exposure, but if you're using a handheld meter for production, then the uncertainty of the measurements isn't a big concern.
@Martin--it is interesting to think through this switch thing. All handheld DVMs/DMMs have at least one big rotary switch. When we used those in production, they had to be calibrated/certified. There were stickers on them showing the dates, and when they were due back for re-cert. But I don't ever recall anybody checking the meter after the rotary switch was replaced. There is some exposure there that you could be making bad measurements.
@eafpres, it depends on the application. I wouldn't do that if you needed to maintain calibration. You wouldn't want to ship out-of-spec product or hold back good product. But if you're just experimenting in the lab, well.
Chris, as we now know, some scope companies offer additional features that are software enabled. They use encrypted keys now because apparently some people had figured out the key codes and were even selling copied keys.
@Martin--so if you had a measurement system and you changed the gain via the DIP switch, did you think it necessary to recalibrate or at least measure something you considered a known? Switches can fail, you can set them wrong, they can add noise, etc.
I used to work on panel meters that use DIP switches to change resolution (think add a digit). If anyone went inside, they could figure it out. All you had to do was remember the original setting in case it didn't work. Today, you take a picture with your phone before changing things.
I agree completely! But not everyone offers BW upgrades at all, and if you are cracking into the code or making small changes on the board, you'd have to know a lot about the design to be sure you were getting correct cal constants and the like.
Many old analog boards used DIP switches for setting amplifier gains.
I don't know that anyone does it this way, but you could envision a manufacturing system where they test the scopes on the line and, in essence, bin them for BW. So the best performing units get the high BW model numbers and so on. So, I don't think it's completely unrealistic to think that there could be some issues with hacking in a BW upgrade.
There was a time when you could change some equipment with DIP switches either legit from the outside, and sometimes less legit if you got inside and found the DIP that set configuration. I once had a setup to measure velocity of projectiles using two optical sensors spaced a known distance apart. You could change the range of the time and the units it reported in via a DIP switch.
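A setup like that boils down to v = d/t, with the DIP switch just changing the scale factor on the result. A minimal sketch (the function names and numbers are illustrative, not from the original rig):

```python
def velocity(distance_m: float, dt_s: float) -> float:
    """Projectile speed from two optical gates a known distance apart."""
    if dt_s <= 0:
        raise ValueError("gate interval must be positive")
    return distance_m / dt_s

# A DIP switch that changes the timer range or reported units is just a
# different scale factor applied to the same raw measurement:
FPS_PER_MPS = 3.28084  # feet per second per (meter per second)

v = velocity(1.0, 0.002)   # gates 1 m apart, 2 ms between triggers
print(v)                   # 500.0 m/s
print(v * FPS_PER_MPS)     # same speed reported in ft/s
```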
antedeluvian - I guess that depends on how you did the upgrade. If you got your soldering iron out to do it, then you may or may not be able to put it back. Often it's users who don't care all that much about the cal who do this sort of thing. If I didn't care about the precise cal or warranty, I'd just measure a few test points, remember whether it reads a little high or a little low at high frequency, and go from there.
If the instrument were calibrated for its full BW at the factory, then in theory it's in cal when the higher BW is enabled. Most instruments have a digital filter that cuts off the BW at some point until unlocked.
I guess you could call this a hardware hack. When I was just out of college, there were over-the-air subscription TV channels. Every electronics company in the Boston area had a schematic for a channel 68 decoder. They were based on either an LM1800 or LM1310 stereo decoder. I even made a PCB, as I had worked in a PCB factory in HS and college.
I'd assume, if done legitimately, that the instrument contains cal points across all the bandwidths at which it could be used and the cal doesn't need to be redone. On our Rigol scopes we don't offer a legitimate after-sale BW upgrade. On units like that you may be taking your chances a bit more. If I were going to upgrade the BW on the sly, I'd definitely run whatever auto-cal sequence I could, as well as a few tests.
antedeluvian--we had similar needs for using HP/Agilent network analyzers on production lines. The HP boxes were easy to control over IEEE 488 (GPIB) using an interface card in a standard PC. The PC then controlled when to make a measurement, through a GUI. The data all came over to the PC and were analyzed for pass/fail criteria against stored return loss profiles for reference devices. So in effect the instrument becomes virtual, like Martin says, and it is up to the test engineers to make sure their code is right.
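The PC-side pass/fail step in a setup like that can be sketched in a few lines. The GPIB read itself would be done with an instrument library such as PyVISA (e.g. inst.query(...)); here the trace data and the -10 dB limit are purely illustrative:

```python
def passes_return_loss(trace_db, limit_db=-10.0):
    """Pass if return loss meets the limit at every frequency point.

    trace_db: measured S11 magnitude in dB (more negative = better match),
              as it would come back from the network analyzer over GPIB.
    """
    return all(point <= limit_db for point in trace_db)

measured = [-18.2, -15.7, -12.4, -11.0]        # hypothetical S11 trace, dB
print(passes_return_loss(measured))            # True: all points in spec
print(passes_return_loss(measured + [-8.3]))   # False: one point out of spec
```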
Many instrument hardware platforms today, such as scopes, support an entire series of models based on available firmware options. Those interested may be able to change their instrument's capabilities through firmware. That is where the risk to proper calibration and warranty can come into play.
I once did some work with the Stanford Research Systems Model DS345 Synthesized Function Generator. I didn't like their user interface, but they provided the details on controlling the device via an RS-232 port. I set up a UI (at least a portion of the UI) in Excel.
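The DS345 takes plain ASCII commands over the serial port, so a wrapper like that mostly builds strings and sends them. A sketch of the string-building part (the FREQ/AMPL mnemonics are from the DS345 command set as I recall it; verify against the manual, and the sending would use a serial library such as pyserial, e.g. ser.write(cmd.encode())):

```python
def ds345_setup(freq_hz: float, ampl_vpp: float) -> list:
    """Build the ASCII command strings to set frequency and amplitude.

    Mnemonics assumed from the DS345 manual: FREQ sets frequency in Hz,
    AMPL with a VP suffix sets peak-to-peak amplitude in volts.
    """
    return [f"FREQ {freq_hz:.3f}\n", f"AMPL {ampl_vpp:.2f}VP\n"]

for cmd in ds345_setup(1000.0, 2.0):
    print(repr(cmd))
```

A front end in Excel or anywhere else then reduces to formatting these strings from cell values and pushing them out the COM port.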
As far as modifying equipment and providing it to others: probably not if it was off-the-shelf equipment, and especially not if it had one of those stickers that says warranty voided if the sticker is torn, missing, etc.
When I was in the antenna business, the largest "piece" of test equipment was the RF test chamber. It had a source horn and a network analyzer. Everything else was custom, and was modified frequently. There has been an issue for a long time in the antenna business that the gain and pattern tests are not standard, and often published specs / test results are not believable. We ran a comparison across 5 or 6 global sites, sending around a set of carefully designed and fabricated test antennas. We found 1 dB from the lowest reported gains to the highest. 1 dB is a large error for most antennas.
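To put that 1 dB site-to-site spread in perspective, converting dB to a linear power ratio shows it is roughly a 26% disagreement in measured power:

```python
def db_to_power_ratio(db: float) -> float:
    """Convert a dB value to the corresponding linear power ratio."""
    return 10 ** (db / 10)

spread = db_to_power_ratio(1.0)
print(round(spread, 3))   # ~1.259, i.e. about a 26% spread in power
```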
A lot of test equipment has embedded software, as well as software you can load to do certain things. What if someone writes their own code or modifies the original code? Is that a hack or a modification?
My unlocked iPhones (3 in my house) tell me it's almost 1:00 here in the still-frozen northeast. Today, we're discussing modifying test equipment, yes or no? I make a distinction between modifying and hacking: electrical engineers modify, code warriors hack. Do you agree?
I have the small benchtop Rigol DSA815 spectrum analyzer that I'd love to be able to run off 12V DC. I know I can strap a gel cell and a 12V-to-120VAC inverter on the back, but it would be so much more elegant to install a DC power jack on the back and figure out how to tap into the internal power supply.