
Calibration: Needless Or A Necessity?

Jan. 30, 2013
Accurate calibration is not a needless luxury. It ensures the reliability of test instruments, the safety of personnel, and the quality of the final product. 

Calibration always seems to be scheduled at the most critical time in a project. For example, assume that a piece of test equipment is on a six-month calibration cycle. At month five, the design engineer starts a test sequence that lasts two months. If the instrument is recalibrated in the middle of the test, the drift or errors that built up in the first five months can be significant enough to require the tests to be redone.

Wouldn’t it be better to calibrate instruments before starting on a big project requiring that equipment? And shouldn’t the calibration schedule be posted so there are no surprises? One should schedule calibration in the project planning software, just like any other event, as it may be on a critical path. If it is ignored, calibration can delay the project.

What Is Calibration?

Some people check two instruments, such as a scope and a meter, and consider them “calibrated” if they provide the same reading. This approach has several problems, not the least of which is that it is completely unscientific. By analogy, no one would allow bookkeepers to audit their own work.

When performing calibration like this, there are three obvious conditions that are impossible to decipher. First, when one instrument is right and one is wrong, which is which? Second, if both instruments are wrong in opposite directions, how can an engineer tell that neither result is correct? And finally, when both instruments are wrong in the same way, an engineer has a bad result and won’t even know. One cannot tell when an instrument is correct without a true traceable external standard.

The calibration standard must always be significantly more accurate than the instrument under test. Don’t forget that the standard also has a tolerance. If the tolerance band of the device under test (DUT) and the tolerance band of the standard overlap, there isn’t a clear calibration. That’s why calibration typically requires a standard that has at least 10 times the accuracy of the DUT.

Having a standard with clear, small tolerances allows the DUT to be adjusted, preventing the DUT from providing an out-of-specification reading as it experiences normal drift between calibrations. When calibrating state-of-the-art instruments, it is impossible for the standard to be 10 times better. As a practical matter, a standard that is four times more accurate can be used, but only in conjunction with more complex procedures that include cross checks with other standards.
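The accuracy-ratio rule above can be sketched as a simple check. This is an illustrative example only; the function names and the accuracy figures are hypothetical, not drawn from any instrument datasheet.

```python
# Hypothetical sketch of the test accuracy ratio between a calibration
# standard and a device under test (DUT). All numbers are illustrative.

def accuracy_ratio(dut_accuracy_pct: float, standard_accuracy_pct: float) -> float:
    """Ratio of the DUT's tolerance to the standard's tolerance."""
    return dut_accuracy_pct / standard_accuracy_pct

def standard_is_adequate(dut_accuracy_pct: float,
                         standard_accuracy_pct: float,
                         required_ratio: float = 10.0) -> bool:
    """True if the standard is at least `required_ratio` times more accurate."""
    return accuracy_ratio(dut_accuracy_pct, standard_accuracy_pct) >= required_ratio

# A 0.05%-accurate DMM against a 0.003%-accurate standard meets the 10:1 rule:
print(standard_is_adequate(0.05, 0.003))        # True (ratio is about 16.7)

# A state-of-the-art 0.01% DUT against a 0.002% standard fails 10:1
# but passes the relaxed 4:1 rule used with cross-checking procedures:
print(standard_is_adequate(0.01, 0.002))        # False (ratio is 5)
print(standard_is_adequate(0.01, 0.002, 4.0))   # True
```

The relaxed 4:1 case corresponds to the situation described above, where a 10x-better standard simply does not exist for a top-of-the-line instrument.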

Metrology Labs

Serious engineers buy top instrument brands and expect those electronic test instruments to be accurate. Things drift over time, though, so these engineers send their instruments to a metrology lab for calibration. But what really happens at the metrology lab?

Most metrology labs are good and effective. Unfortunately, there are exceptions. Some companies simply pick the lowest bidder without examining other qualifications. It is up to engineers to vet the lab. A great lab will be happy to show you around and will be proud of the traceability of its standards. Take the calibration manual for your instrument and sit with the lab personnel. Verify that the lab has the instruments necessary to perform an accurate calibration.

You must also ensure that a lab’s instruments are properly calibrated with traceability. Most countries have their own national standards laboratory. The U.S. has the National Institute of Standards and Technology (NIST).1 A good metrology lab has records showing that its instruments compare correctly to a chain of standards that go back to a master standard maintained by NIST or another national standard.

The internal components of test instruments such as voltage references, input dividers, amplifier gain, and offset can drift over time. A good calibration schedule ensures that typically minor drifts don’t impair the measurement. Calibration finds and corrects this drifting. But accidents do happen. Instruments can be dropped or one can slip and probe a high voltage.

For example, a digital multimeter (DMM) can be overloaded, causing a large error. Because the inputs are fused or breaker-protected, some mistakenly think that the overloading doesn’t produce an out-of-calibration result. However, a high voltage can jump across the input-protection device, or a transient can destroy the circuit before the protective device has time to react. 

When we send an instrument for calibration, we expect the metrology lab to bring the instrument back into calibration. Engineers should also receive a report showing how far out of calibration the instrument was before, and how far it is after adjustment. If the report shows significant calibration errors, it may be necessary to redo a project’s work that was completed with the instrument and take new measurements.
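The decision described above can be reduced to a comparison of the “as-found” error (the error measured before adjustment) against the instrument’s specification. This is a hypothetical sketch; the function name and the numbers are illustrative, not part of any standard report format.

```python
# Hypothetical sketch: deciding from a calibration report whether
# measurements taken since the last calibration may need to be redone.
# The "as-found" error is the error measured before adjustment.

def measurements_suspect(as_found_error: float, dut_spec_limit: float) -> bool:
    """True if the as-found error exceeded the instrument's specified
    tolerance, meaning data taken with it may be unreliable."""
    return abs(as_found_error) > abs(dut_spec_limit)

# A DMM specified to +/-0.05 V came back with an as-found error of 0.12 V,
# so earlier readings are suspect; an as-found error of 0.02 V is in spec.
print(measurements_suspect(0.12, 0.05))   # True
print(measurements_suspect(0.02, 0.05))   # False
```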

How Often Is Calibration Required?

There is no one answer because instruments, environments, and applications vary. Test instrument manufacturers recommend calibration intervals for typical conditions. Extreme conditions and very critical measurements may require more frequent calibration. Here are some general categories of calibration intervals:

• Routine calibration as required by customer contract, quality standard organization, military specification, or other industry requirements: Review the applicable requirements before the test to ensure that the test equipment’s calibration or certification requirements are met.

• Before and after a key measuring project: For example, when a new product pilot run is complete, a design engineer will characterize the product to ensure that it meets specifications and will optimize the test procedure. Final test adjustments made here can substantially decrease test time and affect profitability. Complete and reliable testing requires the state of the instruments to be verified both before and after the test period.

• When you suspect a measurement is faulty or when the instrument has been overloaded or dropped: It is important to check both calibration and safety integrity. A dropped instrument, for example, may have an internal wire shorted to the case.

Accurate calibration is not a needless luxury. It ensures the reliability of test instruments and even the safety of personnel. For example, before working on a piece of equipment, one might use a meter to verify that the voltage is at a safe level. If the meter is broken or provides inaccurate readings, the result could be injury or death. Additionally, calibration ensures quality. It guarantees the accurate test results necessary to verify that products in final test, and products shipped to customers, indeed meet specifications.

Reference

1. The National Institute of Standards and Technology (NIST) is an agency of the U.S. Department of Commerce: www.nist.gov/index.html

Bill Laumeister is an engineer in strategic applications with the Precision Control Group at Maxim Integrated. He works with customers who use DACs, digital potentiometers, and voltage references. He has more than 30 years of experience and holds several patents. 
