SoftwareJanitor
OK, the "Janitor" here with more dumb questions:
I've got an analog I/O point (AT-13) on a 1756-IF8 module, configured as a 4-20 mA input signal scaled to 0-10,000 engineering units (ppm, by the way).
However, when I watch this AT-13 tag online (aliased to the I/O point R3S3:4:I.Ch2Data), I see -25.55127 as the value, which I presume is the raw percent-of-span number for 0 mA.
1st question: WHY am I seeing this? Shouldn't I be seeing the converted/scaled value in engineering units? Shouldn't the tag be showing ZERO?
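For what it's worth, a -25-ish reading at 0 mA is consistent with the channel reporting percent of the 4-20 mA span rather than the 0-10,000 EU scaling. A minimal sketch of that arithmetic, assuming a simple linear percent-of-span mapping (the actual channel configuration isn't shown here):

```python
def percent_of_span(ma, lo=4.0, hi=20.0):
    """Map a measured current (mA) onto 0-100% of the 4-20 mA span.

    Below 4 mA the result goes negative, which is why a dead
    (0 mA) loop shows up as roughly -25%.
    """
    return (ma - lo) / (hi - lo) * 100.0

print(percent_of_span(0.0))   # -25.0, close to the -25.55127 observed
print(percent_of_span(4.0))   # 0.0  (bottom of span)
print(percent_of_span(20.0))  # 100.0 (top of span)
```

The observed -25.55127 (rather than exactly -25.0) would then be the module reading a small real-world current slightly below 0 mA equivalent, or under-range clamping; either way it points at percent scaling being in effect, not engineering-unit scaling.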
2nd question: This value is being MOVed into another tag (a holding area for the "sample" from this transmitter), and THAT value shows up as 239.6343. So just via a MOV instruction, -25 becomes 239.6. How does THIS magically occur?
Both the sample tag and the AT-13 tag are defined as REAL, BTW.
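On the second question: a MOV between two REAL tags is a straight bit-for-bit copy and cannot rescale anything, so the 239.6343 almost certainly comes from some other instruction also writing the destination tag (a cross-reference on the sample tag would show it). A small sketch of why MOV alone can't do it, emulating REALs as IEEE-754 single-precision values (an assumption about the emulation, not actual Logix code):

```python
import struct

def mov_real(src):
    """Emulate a MOV between two REAL tags: the 32-bit pattern is
    copied unchanged, so the destination value equals the source."""
    bits = struct.pack('<f', src)        # source REAL's 32 bits
    return struct.unpack('<f', bits)[0]  # destination REAL

# Round the literal to single precision first, as a REAL tag would hold it.
src = struct.unpack('<f', struct.pack('<f', -25.55127))[0]
assert mov_real(src) == src  # a MOV can't turn -25.55 into 239.63
```

So the mystery isn't in the MOV; it's in whatever else has write access to that sample tag (another rung, an HMI write, or a different MOV executing later in the scan).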