How much resolution can a mA loop realistically have?

Hi all,

Have a client with a wastewater plant; I was asked to change the scaling for a CL2 analyzer that was replaced. I was a bit shocked when they said the new analyzer is scaled at 0-2000 mg/L. They typically see ~20 mg/L, so we're talking about using only about 1% of the measurement range.

I asked why in the world they'd need the scaling to be that extreme, and I got some vague story about how that was the only way they could get it to tune in to their grab samples. That obviously makes no sense, but I need to explain to the manager why the local readout isn't matching the screen.

I don't know the brand of the analyzer (I remoted in to make the changes via TeamViewer, so I haven't been onsite recently, but I can find out). While watching the raw input value, though, it isn't changing at all even though the local readout is. With such a small slice of the range in use, I'm not surprised the loop doesn't have microampere resolution, but before I try to explain that to the manager I just wanted a sanity check: does it make sense that the mA loop from the analyzer just isn't going to have that level of resolution?
 
Every type of analyzer I have worked with allows the mA scaling to be adjusted to what is reasonable for the application. In water applications that is typically 0-5 ppm or 0-10 ppm; I think we have one wastewater customer that uses 0-50 ppm. Most A/D converters are 14-bit or 16-bit, and some are still 12-bit, so (assuming 14-bit) the span is divided into 16,384 parts, theoretically. With a range of 2000, the best you could hope for is a resolution of about 0.122 ppm, and that's only if both A/D converters involved are at least 14-bit. That is theoretical, assuming perfect accuracy, which is not reality. Note there's a difference between resolution and accuracy, and both factors will have an impact at both ends of the wire where the conversion takes place.
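
To put numbers on that, here's a quick back-of-the-envelope sketch (Python, nothing analyzer-specific, just the theoretical count math from the paragraph above):

```python
# Theoretical engineering-unit resolution of a 0-2000 mg/L span for
# common converter bit depths. This ignores accuracy, noise, and the
# fact that only part of the counts may cover the 4-20 mA live span.
span = 2000.0  # mg/L

for bits in (12, 14, 16):
    counts = 2 ** bits
    print(f"{bits}-bit: {counts:>5} counts -> {span / counts:.3f} mg/L per count")

# 12-bit:  4096 counts -> 0.488 mg/L per count
# 14-bit: 16384 counts -> 0.122 mg/L per count
# 16-bit: 65536 counts -> 0.031 mg/L per count
```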

If they can calibrate the analog signals as precisely as possible, that can help. Most analyzers have a simulate mode where you tell the analyzer to put out 0 or 4 mA while you monitor the raw number in the PLC, then do the same at 20 mA. That way, even without being on site with a handheld instrument, you can at least match up the two A/D converters and tweak the scaling math in the PLC.
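
A sketch of that two-point procedure as PLC-side scaling math (Python for readability; the raw count values are made up for illustration and would come from the simulate test):

```python
# Two-point scaling: capture the PLC's raw count while the analyzer
# simulates 4 mA, then again at 20 mA, and interpolate between them.
RAW_AT_4MA = 3277     # hypothetical raw count observed at 4 mA
RAW_AT_20MA = 16384   # hypothetical raw count observed at 20 mA
EU_MIN, EU_MAX = 0.0, 2000.0  # analyzer span, mg/L

def raw_to_eu(raw: int) -> float:
    """Linearly interpolate between the two captured calibration points."""
    frac = (raw - RAW_AT_4MA) / (RAW_AT_20MA - RAW_AT_4MA)
    return EU_MIN + frac * (EU_MAX - EU_MIN)

print(round(raw_to_eu(3408), 1))  # ~20.0 mg/L with the counts above
```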
 
Thanks for the info, OkiePC. Unfortunately the operator I was working with has left for the day, so I'm gonna have to wait until Monday to get the info on the analyzer. I'll see what I can find.
 
An important issue here may be some sort of signal problem. How long is the cable between the sensor and the analog input card in the DCS? I'd advise someone to look into this just to rule out communication issues.
 
I think your concern is appropriate.

It can be a real challenge to discover what the AO resolution actually is, because a lot of vendors do not publish the spec; more often they'll publish an accuracy spec. AO resolution is typically easy to find for PLC AO cards, but in the analyzer world that's not the case.

I've had this come up when the spec is not published and have gotten two different answers from two different people at the same vendor, which tells me they're just guessing, giving me an answer to get me off the phone.

Honeywell's UDA 2182 pH/ORP/conductivity process analyzer spec states an accuracy of ±0.01 mA.

If that were the spec for the analyzer in question, then ±0.01 mA over the 16 mA live span (the bottom 20% of the 0-20 mA range is given up to the 4 mA live zero) works out to roughly ±1.25 mg/L on your 0-2000 span; the accuracy alone eats the low-order ones digit.

But again, it's an accuracy spec, not a resolution spec.
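
For what it's worth, here's what that ±0.01 mA would mean in engineering units on this thread's span (a sketch; the 0-2000 span comes from the OP, not from any spec sheet):

```python
# Convert a transmitter accuracy spec in mA into engineering units.
accuracy_ma = 0.01   # +/- mA, from the Honeywell UDA 2182 example
live_span_ma = 16.0  # the 4-20 mA live span
eu_span = 2000.0     # mg/L

error_eu = accuracy_ma / live_span_ma * eu_span
print(f"+/-{error_eu:.2f} mg/L")  # +/-1.25 mg/L, and that's accuracy,
                                  # before resolution even enters into it
```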

I think you're right: you're dealing with a distinct lack of resolution in the expanded scale that's been chosen for whatever oddball reason.

Within the past week, some forum had a thread dealing with a 10-bit AO, with an effective resolution of only about 1:1000 (2^10 = 1024 counts), so 10-bit AOs are out there.
 
If the reasons for it needing to be scaled like that are valid, would doing it over Modbus or other communications be an option? Have your reading come from the Modbus value, and use the analog only for sensor-fail or shutdown detection.
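
As a rough sketch of what that could look like (assuming pymodbus 3.x; the IP address, register address, slave ID, and x10 scaling below are placeholders, so check the analyzer's actual register map):

```python
# Sketch: read the chlorine value over Modbus TCP instead of the 4-20 mA
# loop, so the analyzer's full digital resolution comes across.
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("192.168.1.50")  # hypothetical analyzer IP
client.connect()

# Hypothetical register map: one holding register, value in 0.1 mg/L.
rr = client.read_holding_registers(0, count=1, slave=1)
if not rr.isError():
    cl2_mg_per_l = rr.registers[0] / 10.0
    print(f"CL2: {cl2_mg_per_l} mg/L")

client.close()
```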
 
A 0-2000 ppm range for a CL2 analyzer? To the wastewater experts here: I'm with the OP on this one. Does that sound right? Is there a bigger issue here than scaling?
 
I hate to say it, but it looks like whoever ordered the replacement CL2 analyzer ordered the wrong one. The output scaling is defined in the analyzer, and in most cases you can't change it; I guess there may be some out there that would allow that. The best you can expect to achieve is to scale your input to match what they give you, but your bit value will be off by a factor of 10 (200 x 10 = 2000), so the accuracy of the input will be off on the low end.
If your customer can live with that, then go for it; if not, they ordered the wrong analyzer and they need to fix it.
 
There may be a difference between what the INSTRUMENT is capable of reading, and what the TRANSMITTER is capable of sending.

The instrument/probe may be quite capable of discerning 0.1 mg/l of chlorine in its full range of operation: 0-2000 mg/l.

But you may only be interested in, perhaps, its 0-100 mg/l range. Often the transmitter/AO to the PLC's AI can be set up ("calibrated") so that 4 mA is 0 mg/l and 20 mA is 100 mg/l. That would mean that if the probe were to detect more than ~105 mg/l, the value into the PLC would be "pegged" (to use an old analog meter phrase).

But this would allow the PLC to comfortably read 20 mg/l at 7.2 mA, with each 1.0 mg/l worth 0.16 mA.

If the transmitter is set up to send the full range of the instrument (0-2000) at 4-20, then yes, 20 mg/l is only 4.16 mA, with each mg/l only 0.008 mA, which will be more subject to transmission loss (4.16 at the instrument output may only read 4.08 at the PLC input), noise, and lack of discernment between small readings.
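
The difference is easy to put in numbers (a quick sketch of the arithmetic in the two paragraphs above):

```python
# mA output for a given reading under the two candidate spans.
def eu_to_ma(value: float, eu_span: float) -> float:
    """Map an engineering-unit reading onto a 4-20 mA loop."""
    return 4.0 + (value / eu_span) * 16.0

reading = 20.0  # mg/l, typical for this plant
for span in (100.0, 2000.0):
    ua_per_unit = 16.0 / span * 1000.0
    print(f"0-{span:g} span: {reading:g} mg/l -> "
          f"{eu_to_ma(reading, span):.2f} mA ({ua_per_unit:.0f} uA per mg/l)")

# 0-100 span:  20 mg/l -> 7.20 mA (160 uA per mg/l)
# 0-2000 span: 20 mg/l -> 4.16 mA (8 uA per mg/l)
```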

This is basically what OkiePC said; I'm just changing the language so that perhaps a manager can understand it.
 
I find as a general rule that analog inputs have greater resolution than analog outputs; analog outputs are where the vendors cheap out. But it all depends on the specifics, because there are tens of thousands of field devices and thousands of devices with AIs.

It is not unusual to find 12-bit AOs, which have an effective resolution of only about 1:4000 (2^12 = 4096).

A resolution of 1:4000 across 20.00 mA resolves only 5.0 uA, or 0.005 mA.

Most decent analog inputs are 14-bit; the better ones are 16-bit.

The problem is more likely the output resolution of the selected span than the ability of the input to handle the signal.
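
Tying those numbers together, here's what one AO count would be worth on the OP's 0-2000 span under the 12-bit worst case named above (a sketch; it assumes the converter spreads its counts across 0-20 mA):

```python
# Value of one count of a 12-bit AO, in uA and in mg/L on a 0-2000 span.
counts = 2 ** 12                             # 4096
ma_per_count = 20.0 / counts                 # ~0.0049 mA, i.e. ~4.9 uA
eu_per_count = ma_per_count / 16.0 * 2000.0  # on the 16 mA live span
print(f"{ma_per_count * 1000:.1f} uA/count, {eu_per_count:.2f} mg/L per count")
# 4.9 uA/count, 0.61 mg/L per count
```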
 
With the flow meters I've worked with, I can scale the analog output independent of the meter's own range of capability. But you can't really get away with making 2% of the meter's range your 4-20 scale and still get much resolution from it.

I figured this out the hard way once: I had a differential pressure sensor that I scaled 10-20 inches WC. Problem is, someone goofed and ordered a sensor with a range of capability up to 3000" WC, and there wasn't any usable resolution off of that.

So you gotta have one close enough to your process range to get usable resolution on your analog.
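
That DP story in numbers (a sketch, assuming the transmitter spread 0-3000" WC across 4-20 mA and a 14-bit input read it):

```python
# How many input counts survive when the window of interest (10-20 "WC)
# is a sliver of the span the transmitter spreads over 4-20 mA.
total_counts = 2 ** 14   # 14-bit input across the 4-20 mA span
sensor_span = 3000.0     # "WC over the full 4-20 mA
window = 20.0 - 10.0     # "WC actually of interest
counts_in_window = total_counts * window / sensor_span
print(f"~{counts_in_window:.0f} counts")  # ~55 counts for the whole band
```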
 
