OkiePC:
I realize the scaling is linear across the card's range, but that's on the engineering-units side. I was talking about the *negative* side. I was thinking it *wasn't* linear there, and that it jumps directly to -25 below 4 mA: the 4-20 mA range is a span of 16 mA, 4 mA is 25% of that span, so a signal 4 mA below the live zero (i.e. 0 mA) would be -25%, and anything further below -25.xx% would pull the number down from there.
Is this not correct? I thought that's how people were coming up with the 3.96 mA figure. It *does* calculate out that way...
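The arithmetic above can be checked with plain straight-line scaling. This is only a sketch: the 0-10,000 ppm engineering range comes from later in the thread, and the function name is made up. It shows that no jump is needed; linear extrapolation below 4 mA gives both the -25 "percent" figure (for a 0-100 range) and the -25 counts that correspond to 3.96 mA (for a 0-10,000 range).

```python
def ma_to_eng(ma, lo_ma=4.0, hi_ma=20.0, lo_eng=0.0, hi_eng=10000.0):
    """Straight-line 4-20 mA scaling; readings below 4 mA simply
    extrapolate linearly into negative engineering units."""
    return (ma - lo_ma) / (hi_ma - lo_ma) * (hi_eng - lo_eng) + lo_eng

print(round(ma_to_eng(4.0), 3))                          # 0.0
print(round(ma_to_eng(20.0), 3))                         # 10000.0
print(round(ma_to_eng(3.96), 3))                         # -25.0  (a hair below 4 mA)
print(round(ma_to_eng(0.0, lo_eng=0, hi_eng=100), 3))    # -25.0  (percent scaling: 0 mA = -25%)
```

So with a 0-10,000 range, an indicated -25 works out to exactly 3.96 mA on the same straight line, no discontinuity required.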
ASF:
But the tank won't overfill as long as you have alarm limits set to shut the valves, which you *should* have anyway. Now, if the instrument is just plain *dead*, then you should either have a secondary protection instrument (i.e. a high-level switch) to tell you, OR some kind of "No Weight/Level Change" logic to detect the level not moving while the valve is open.
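That "No Weight/Level Change" idea could be sketched roughly as below. This is a hedged illustration, not PLC code: the class name, the 60 s window, and the 0.5-unit deadband are all placeholder values you'd tune for the real process.

```python
class StaleLevelDetector:
    """Flags a suspect instrument when the level stops moving
    while the fill valve is open (placeholder tuning values)."""

    def __init__(self, window_s=60.0, deadband=0.5):
        self.window_s = window_s      # how long "no change" must persist
        self.deadband = deadband      # smallest change counted as movement
        self._ref_level = None
        self._elapsed = 0.0

    def update(self, valve_open, level, dt_s):
        """Call once per scan; returns True when the level has stayed
        within the deadband for window_s seconds with the valve open."""
        if not valve_open:
            self._ref_level = None
            self._elapsed = 0.0
            return False
        if self._ref_level is None or abs(level - self._ref_level) > self.deadband:
            self._ref_level = level   # level moved: restart the timer
            self._elapsed = 0.0
            return False
        self._elapsed += dt_s
        return self._elapsed >= self.window_s
```

In ladder terms this is just a TON whose timer is reset whenever the level moves outside a deadband or the valve closes.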
But anyway - for this particular card (which has a 0-20 or 0-21 mA raw range), this program currently has the scaling set to 4-20 mA with an engineering range of 0 to 10,000 (ppm). Are you saying I should change that to something like 3.5 mA to 20.5 mA with the same engineering range? And if so, wouldn't that conflict with what the instrument is supposedly being calibrated to (4-20 mA)? Wouldn't it be cleaner to just leave it at 4-20 mA with 0-10,000, and then write code to interpret the over/under-range number to see if it's far enough out to justify maintenance taking a look at it? I mean, -25.xxx *looks* pretty bad, but it's only a 'hair' below 4 mA, so that's probably not going to be a maintenance concern, right?
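That "interpret the under-range number" approach could look something like the sketch below. To be clear, the 3.8 mA and 20.5 mA maintenance limits are made-up placeholders (they happen to match the NAMUR NE 43 measuring-range limits, but nothing in this thread mandates them), and the scaling keeps the existing 4-20 mA / 0-10,000 setup untouched.

```python
def eng_to_ma(eng, lo_ma=4.0, hi_ma=20.0, lo_eng=0.0, hi_eng=10000.0):
    """Invert the card's linear scaling to recover the signal in mA."""
    return lo_ma + (eng - lo_eng) / (hi_eng - lo_eng) * (hi_ma - lo_ma)

def classify(eng, fault_lo_ma=3.8, fault_hi_ma=20.5):
    """Decide whether an out-of-range reading is worth a maintenance call.
    Threshold values are illustrative placeholders only."""
    ma = eng_to_ma(eng)
    if ma < fault_lo_ma:
        return "FAULT_UNDER"     # far enough below 4 mA to flag maintenance
    if ma > fault_hi_ma:
        return "FAULT_OVER"
    if ma < 4.0:
        return "SLIGHT_UNDER"    # e.g. -25 counts = 3.96 mA: log it, don't alarm
    return "OK"

print(classify(-25))      # SLIGHT_UNDER  (3.96 mA, a hair below zero)
print(classify(-200))     # FAULT_UNDER   (3.68 mA)
print(classify(5000))     # OK            (12 mA)
```

That way the indicated -25 gets logged as "slightly under-range" rather than tripping the same response as a genuinely dead loop.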