Minimizing Temperature Drift in Your Current Measurement
As more systems become electrified, thermal management has turned into one of the “hottest” issues facing designers. For thermal management, current measurement serves as a leading indicator of system performance and faults, whereas monitoring temperature alone is potentially a lagging indicator. Accurately monitoring the current consumed, especially over temperature, has become vital as designers pack more functionality into tighter spaces.
While room-temperature calibration tends to be relatively straightforward, multi-temperature calibration is time-consuming and costly. Identifying ways to minimize the effects of temperature on current measurements can improve system performance, reduce design margins, and potentially lower the total cost of ownership (TCO).
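To make the distinction concrete, the minimal sketch below (not from the article; all values and names are hypothetical) shows what a single-point room-temperature calibration looks like, and why its correction stops holding once offset and gain drift with temperature.

```python
# Illustrative sketch: two-point, room-temperature calibration of a
# shunt-based current measurement. All numbers are hypothetical examples.

def calibrate(adc_zero_current, adc_known_current, known_current_a):
    """Derive offset and gain corrections from two points taken at 25 degrees C."""
    offset = adc_zero_current                                   # ADC reading at 0 A
    gain = known_current_a / (adc_known_current - adc_zero_current)
    return offset, gain

def corrected_current(adc_reading, offset, gain):
    """Apply the room-temperature correction to a raw ADC reading."""
    return (adc_reading - offset) * gain

# Calibration points captured on the bench at room temperature
offset, gain = calibrate(adc_zero_current=12, adc_known_current=20492,
                         known_current_a=5.0)
print(corrected_current(10250, offset, gain))   # about 2.5 A at 25 degrees C

# The catch: offset and gain both drift with temperature, so this single
# room-temperature correction degrades at -40 or +125 degrees C unless the
# drift terms are small or are characterized at each temperature
# (the costly multi-temperature calibration).
```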
Sources of Error in Current Measurements
As I stated in my September 2015 article, “Mitigate Error Sources to Maximize Current-Measurement Accuracy,” there are multiple contributing sources of error in current-measurement applications. In that article, I listed the following error sources (a rough error-budget sketch combining them follows the lists):
Amplifier-related errors:
- Input offset voltage (VOS) and VOS drift
- Common-mode rejection ratio (CMRR)
- Power-supply rejection ratio (PSRR)
- Gain error and gain drift
System errors:
- Gain-setting network tolerance, matching, and drift
- Printed-circuit-board (PCB) layout
- Shunt-resistor tolerance and drift
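To see how these terms stack up over temperature, the sketch below (with hypothetical, datasheet-style numbers that are not from the article) converts the drift-related sources into percent-of-reading errors for a given temperature excursion and combines them both worst-case and root-sum-square (RSS), a common way to budget uncorrelated error terms.

```python
# Illustrative error budget for the drift-related sources listed above.
# All operating-point and drift values are assumed for illustration only.
import math

# Operating point (assumed)
v_sense_mv = 10.0     # shunt voltage at the minimum current of interest, mV
delta_t    = 100.0    # excursion from the 25 degC calibration point, degC

# Amplifier-related drift terms (assumed)
vos_drift_uv_per_c   = 0.5    # input offset voltage drift, uV/degC
gain_drift_ppm_per_c = 10.0   # amplifier gain drift, ppm/degC

# System drift terms (assumed)
shunt_drift_ppm_per_c        = 50.0   # shunt-resistor temperature coefficient, ppm/degC
gain_network_drift_ppm_per_c = 25.0   # gain-setting network drift, ppm/degC

errors_pct = [
    (vos_drift_uv_per_c * delta_t) / (v_sense_mv * 1000.0) * 100.0,  # offset drift
    gain_drift_ppm_per_c * delta_t / 1e6 * 100.0,                    # amplifier gain drift
    shunt_drift_ppm_per_c * delta_t / 1e6 * 100.0,                   # shunt drift
    gain_network_drift_ppm_per_c * delta_t / 1e6 * 100.0,            # network drift
]

worst_case = sum(errors_pct)                            # all terms add in one direction
rss = math.sqrt(sum(e**2 for e in errors_pct))          # statistical combination
print(f"worst-case: {worst_case:.2f}%   RSS: {rss:.2f}%")
# With these example numbers: worst-case ~1.35%, RSS ~0.76% added error
# over a 100 degC excursion, before any of the room-temperature errors.
```

Note that the offset-drift term dominates at low shunt voltages, which is why it grows as the measured current (and thus V_SENSE) shrinks, while the gain- and resistor-drift terms scale with the reading itself.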