1756-IF16 difference in mA reading

surajchem

Hi All,
I have 4 PDIT (HART) transmitters attached to a 1756-IF16 AI card, wired single-ended with floating point data format.
The PDITs (Rosemount 3051CD) are ranged -1 to 1 in. H2O (re-scaled from -3 to 3 in. H2O) and generally read around 0.0023 in. H2O under operating conditions, and the customer requires the reading to 3 decimal places.

Here is what is happening:
I have my Fluke meter in series with the loop. I scaled the channel to read 4-20 engineering units to mimic the 4-20 mA reading.

Using the HART communicator I can see that the PDIT is reading 11.546 mA (-0.0566 in. H2O), my meter reads 11.548 mA, and the PLC reads 11.987 mA (-0.055 in. H2O). There is a similar trend on all 4 PDITs.
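
For reference, the in. H2O values above follow from the -1 to 1 over 4-20 mA ranging. Here is that scaling as a minimal Python sketch (the function name is just for illustration, it is not part of any channel configuration):

# 4 mA -> -1 in. H2O, 20 mA -> +1 in. H2O, per the ranging described above
def ma_to_inh2o(ma, lo_ma=4.0, hi_ma=20.0, lo_eu=-1.0, hi_eu=1.0):
    return lo_eu + (ma - lo_ma) * (hi_eu - lo_eu) / (hi_ma - lo_ma)

print(ma_to_inh2o(11.546))   # approx -0.0568 in. H2O, close to the HART value
print(ma_to_inh2o(11.548))   # approx -0.0565 in. H2O (the Fluke reading)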

However, when I disconnect the transmitter and use my meter as a simulator, if I force 11.548 mA, I read 11.548 mA in my PLC.

So I am not sure why I am seeing the difference in the mA signal when the transmitter is connected.

I would appreciate any input and a solution to get past it.
Thank you.
 
Your meter and the HART seem to agree to about 2 decimal places FOR MILLIAMPS, which is pretty good. The PLC's 1756-IF16 analog input module probably doesn't have, and will never have, the resolution to get 3-decimal-place accuracy in milliamperes.

So with what you have, you can give the customer a 3-decimal-place data display, but you can't give them accuracy to 3 decimal places reading in milliamperes. However, reading in inches of water, you may just have it already, because the typical 0.002 reading is 3 decimal places. Cut it off at 3 places so you don't display the error at 4 decimal places in inches of water.
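
To put rough numbers on that (plain arithmetic from the ranging in the first post, no module specs assumed):

span_ma = 20.0 - 4.0              # 16 mA of current span
span_inh2o = 1.0 - (-1.0)         # 2 in. H2O of engineering span

ua_per_inh2o = span_ma / span_inh2o * 1000.0   # 8000 uA per in. H2O
print(0.001 * ua_per_inh2o)       # 8.0 uA corresponds to the 3rd decimal place in in. H2O
print(0.001 * 1000.0)             # 1.0 uA corresponds to the 3rd decimal place in mA

So a display good to 3 decimals in inches of water only asks the input for about 8 uA of resolution, while 3 decimals in milliamps asks for 1 uA.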
 
Hi There,
Thanks for your reply.
But I am puzzled by the fact that if I use my Fluke meter to simulate a 4-20 mA signal with precision up to 3 decimal places, I am able to read it in my PLC.
 
It could be noise on the line that the input module is sensitive to and the Fluke isn't. Put a scope on the signal.

There is also a specified gain drift on the module of +/- 0.41 uA/deg C. Is the value stable, or does it change? It could just be caused by the ambient temperature of the module.

How do you have the module connected? 16 channels? Differential? High speed?
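
As a rough feel for the size of that drift term (assuming, purely for illustration, a 10 deg C swing in ambient at the module):

gain_drift_ua_per_degc = 0.41     # the figure quoted above
delta_t_degc = 10.0               # assumed ambient change, for illustration only

drift_ua = gain_drift_ua_per_degc * delta_t_degc
print(drift_ua)                   # 4.1 uA
print(drift_ua / 8000.0)          # ~0.0005 in. H2O (8000 uA per in. H2O on the -1 to 1 range)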
 
Your meter and the HART seem to agree to about 2 decimal places FOR MILLIAMPS, which is pretty good. The PLC's 1756-IF16 analog input module probably doesn't have, and will never have, the resolution to get 3-decimal-place accuracy in milliamperes.

I thought the same thing, so I looked up the specs. That input card is 16-bit over a range of 0-21 mA. That's 0.00032 mA per bit. Not too shabby.
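
That per-bit figure checks out, and in the units the customer cares about it is tiny:

ma_per_count = 21.0 / 2**16
print(ma_per_count)               # ~0.00032 mA per count
print(ma_per_count / 8.0)         # ~0.00004 in. H2O per count (8 mA per in. H2O on this range)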

If the HART device is supplying power to the loop, maybe it does not have enough to drive both the meter and the analog input.
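
One quick way to sanity-check the loop-power idea is a compliance-voltage estimate. The numbers below are assumptions for illustration only (a 24 V supply, roughly 250 ohms for a current-mode analog input, small burdens for the meter and wiring, and a nominal minimum lift-off voltage for the transmitter); the real values should come from the spec sheets and the actual installation:

supply_v = 24.0        # assumed loop supply
input_ohms = 250.0     # assumed analog input burden resistance
meter_ohms = 15.0      # assumed Fluke mA-mode burden
wire_ohms = 10.0       # assumed field wiring resistance
tx_min_v = 10.5        # assumed minimum transmitter lift-off voltage

i_max = 0.020          # worst case, 20 mA
v_at_transmitter = supply_v - i_max * (input_ohms + meter_ohms + wire_ohms)
print(v_at_transmitter, "V available,", tx_min_v, "V needed:", v_at_transmitter >= tx_min_v)

With typical numbers like these there is plenty of margin, so if supply voltage really is the problem, the loop would have to be set up quite differently from the usual 24 V arrangement.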
 
