Hi, I'm using an Allen-Bradley 1769-IF4 analog input card with a CompactLogix L33ER. It's 14-bit (0–16384 counts), and I'm taking raw readings. I am applying, and measuring with a multimeter, a precise voltage of 1.50 V across the analog input terminals. The channel is configured for a single-ended input signal, 0 to 10 V.
The input value I am getting is 2350 counts, which works out to 10 × 2350/16384 ≈ 1.43 V.
The card specs say it is internally calibrated, yet this error is approximately 4.4% (1.50 V applied vs. 1.43 V read), way outside the card's stated 0.2% accuracy. I am not observing any steady-state offset or noise on the signal.
I know I can apply a fudge calibration factor in PLC code, but I'm curious whether anyone else has used this card and achieved better accuracy, and what procedure they followed.
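For what it's worth, here is a minimal sketch (in Python, but trivially portable to structured text or ladder math instructions) of the kind of two-point linear correction I mean. The 1.50 V / 2350-count pair is from my measurements above; the second reference point (9.00 V / 14100 counts) is purely hypothetical and would come from a second trusted measurement near the top of the range:

```python
RAW_FULL_SCALE = 16384  # 14-bit counts
V_FULL_SCALE = 10.0     # 0-10 V single-ended range

def raw_to_volts(raw):
    """Naive conversion assuming an ideal, perfectly calibrated card."""
    return V_FULL_SCALE * raw / RAW_FULL_SCALE

def make_two_point_corrector(raw_lo, v_lo, raw_hi, v_hi):
    """Build a raw-counts -> volts function from two reference points
    measured with a trusted meter (low and high ends of the range)."""
    gain = (v_hi - v_lo) / (raw_hi - raw_lo)
    offset = v_lo - gain * raw_lo
    return lambda raw: gain * raw + offset

# Low point from the post; high point is a made-up example value.
corrected = make_two_point_corrector(2350, 1.50, 14100, 9.00)
print(raw_to_volts(2350))   # ~1.43 V, the uncorrected reading
print(corrected(2350))      # 1.50 V, matching the meter at the low point
```

In PLC code this reduces to one multiply and one add per channel, so it costs essentially nothing, but I'd still rather understand why the card is this far off to begin with.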
Thank you