Correcting Analog Sensor Offsets in PLC Program

Jieve (Member since Feb 2012, USA, 274 posts)
Curious, is it common to program in offsets to calibrate/correct analog sensor readings in certain applications? As an example, I have a couple flowmeters on a simple system that continuously read slightly below zero when the system is switched off (no flow), and their readings differ from each other. Is it desirable to zero these values in the program, or is it better not to do this?
 
I prefer to use a "low flow cut-off" in your case.

Anything below "X" value you force the output to zero.

Applying an offset shifts the whole scale.
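In code terms, the difference between the two approaches might look like this (a language-agnostic sketch in Python; the function names and the cutoff value are assumptions, not from any specific PLC program):

```python
LOW_FLOW_CUTOFF = 0.5   # engineering units; the "X" value above is hypothetical

def apply_cutoff(scaled_flow):
    """Force anything below the cutoff to zero; readings above it are untouched."""
    return 0.0 if scaled_flow < LOW_FLOW_CUTOFF else scaled_flow

def apply_offset(scaled_flow, zero_reading=-0.3):
    """Shift the whole scale so the at-rest reading becomes zero.
    Note this moves every reading, including full scale."""
    return scaled_flow - zero_reading

print(apply_cutoff(-0.3))   # 0.0  -- negative at-rest reading suppressed
print(apply_cutoff(10.0))   # 10.0 -- unchanged
print(apply_offset(10.0))   # 10.3 -- full-scale reading is shifted too
```

The cutoff only touches readings near zero; the offset silently biases the entire range, which is why it should only be used after an actual calibration check.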
 
+1



Error (noise) is usually most noticeable near 4 mA.

Is the error coming from the meter's analog output or from the cabling?

(And how can you know the real error at 20 mA if you don't know the meter's error at max flow? You would need a second flow meter or a flow calibrator.)

The real error can also be different at the 4 mA and 20 mA ends, so to calibrate you need different offsets for min and max.

Have you checked how far off the AI reads if you send 4 mA and 20 mA directly to it with a mA calibrator?

If you really need to recalibrate the 4 and 20 mA points (and not just cut off a noisy signal), you should adjust both ends individually and linearize with the new values.
(Some PLCs even have AI calibration built into the input cards.)
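The idea of separate offsets at the two ends can be sketched as a linear interpolation between the errors measured at the 4 mA and 20 mA points (Python for illustration only; all names and values are assumptions, and for small errors interpolating on the measured value is a good approximation):

```python
def correct(reading, err_min, err_max, eng_min, eng_max):
    """Subtract an error interpolated linearly between the error measured
    at the 4 mA point (err_min) and at the 20 mA point (err_max)."""
    frac = (reading - eng_min) / (eng_max - eng_min)
    return reading - (err_min + frac * (err_max - err_min))

# Example: a 0-150 range that reads -2.0 at true zero and 148.0 at true 150
print(correct(-2.0, -2.0, -2.0, 0.0, 150.0))    # 0.0
```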
 
I always allow for an analog calibration adjustment in the HMI, and if the scaled value is < 0, I force it to 0...

I get it, but I think you open yourself up to another issue.

Several sensors go OVER 20mA to signal an error/fault/overload, and on the flip side some go under 4mA for an error/fault.

If the system is off, I'd probably have the calibration routine confirm it's calibrated (or do it yourself), and have it read zero when the flow path is blocked by a valve. (This isn't the best option either, because if the valve didn't actually close, the flow should give you an alarm. So maybe set up an alarm condition, but have the HMI just show zero to keep operators happy.)
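That last suggestion, alarm internally but show zero on the HMI, might be sketched as follows (hypothetical names and deadband value; Python for illustration):

```python
DEADBAND = 0.5  # engineering units; assumed value

def flow_display_and_alarm(flow, valve_closed):
    """Return (hmi_value, alarm). A closed valve zeroes the display,
    but real flow above the deadband still raises the alarm
    (e.g. the valve failed to close)."""
    alarm = valve_closed and flow > DEADBAND
    hmi_value = 0.0 if valve_closed else flow
    return hmi_value, alarm

print(flow_display_and_alarm(-0.2, valve_closed=True))   # (0.0, False)
print(flow_display_and_alarm(3.0, valve_closed=True))    # (0.0, True)
```

The key design point is that the alarm logic always sees the raw flow; only the operator-facing value is masked.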
 
Error bits are generated from the raw values inside the AI block, but outside the block the measurement is shown as a limited (clamped) value.
 
What PLC are you using?
For example, in an AB SLC 5/03 or higher, when calibrating any signal you apply the max input to the PLC and record the raw data, then apply the min value and record its raw value, and use the Scale with Parameters (SCP) command.
Before that command was available, I used an offset as you stated to get zero. I have also used a Fluke calibration meter to scale analog inputs / verify calibration.
One thing you might do is look at each wire connection. Do the flowmeters act like thermocouples, per se? By that I mean that for thermocouple connections, regular terminal blocks won't work; they have to be thermocouple-type terminal blocks to work correctly. Small things like that will introduce error into your calibration and can create the error you are talking about.
James
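The scale-with-parameters math mentioned above is just a linear map from raw counts to engineering units. A minimal sketch (the raw counts 3277..16384 are an assumption for a generic SLC 4-20 mA input; substitute the values captured during your own calibration):

```python
def scp(input_raw, raw_min, raw_max, scaled_min, scaled_max):
    """Linear scaling in the style of the SLC SCP instruction:
    rate   = (scaled_max - scaled_min) / (raw_max - raw_min)
    output = (input - raw_min) * rate + scaled_min"""
    rate = (scaled_max - scaled_min) / (raw_max - raw_min)
    return (input_raw - raw_min) * rate + scaled_min

print(scp(3277, 3277, 16384, 0.0, 150.0))   # 0.0
```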
 
Regardless of PLC I have a programmed function that reads the analog inputs and detects under/overrange and under/overflow conditions. For sensors I disable these warnings/faults if the sensor isn't currently active (i.e. pump off).



However, I have this small system with 3 pressure sensors and 2 flowmeters controlled by a Siemens S7-1200 w/ 16-bit AI, and all sensors read negative when the system is switched off. All sensors are 4-20mA. Without thinking too deeply about it, I initially thought the entire scale might be off, or the sensors not properly calibrated, and that shifting the entire scale slightly based on each sensor's zero point would make sense. But I haven't tried to check the inputs with a meter or calibrator yet. Maybe the measurement reference is shifted slightly relative to the internal ADC reference somehow? I'd have to think about this some more.



Interesting idea regarding the calibration (applying max and min values to PLC and scaling based on that). You would have to have full control over the process variable then, and have to know that accurately, no? Or I could, for example, apply 4.000mA via an external circuit to the module and read the input. I suppose that would help eliminate module error, but not sensor error.
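The under/over-range check with an enable flag described at the start of this post might be sketched like this (thresholds loosely follow the NAMUR NE 43 convention of roughly 3.8 and 20.5 mA; all names and values here are assumptions, not from the original program):

```python
def check_range(ma, active, under=3.8, over=20.5):
    """Return under/over-range flags for a 4-20 mA input.
    Flags are suppressed while the associated equipment (e.g. pump) is off."""
    if not active:
        return {"under": False, "over": False}
    return {"under": ma < under, "over": ma > over}

print(check_range(3.5, active=True))    # {'under': True, 'over': False}
print(check_range(3.5, active=False))   # {'under': False, 'over': False}
```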
 
Most instruments have zero & span settings, so it is best to calibrate the sensor before doing it in the PLC. I used to calibrate the sensor's analogue output first, then do the same for the PLC if required. This reduces the errors through the system, or at least reduces the offsets; there seems little point in having an instrument that should give 4-20mA over a range of 0-150 if it is actually giving -5 to 148 and then compensating in the PLC. If the instrument has zero & span (low/high) settings, get these right first, then calibrate the PLC zero & span with the sensor in the loop; this will reduce errors in both the sensor and the analogue card.
 
Recheck the analog configuration. The 1200 had a weird issue at one point where selecting 0 or 4mA as the minimum didn't actually affect the raw data values.


Also, if your flow meters are powered from AC, then your 0VDC reference is usually slightly different at the PLC than at the flow meter output, since two different power supplies are involved, if the mA loop's 0VDC is connected to ground somewhere in the PLC cabinet wiring.
I assume the 1200 AI cards aren't isolated between AI channels.
 