I appreciate your willingness to help, I really do. What you don't seem to be understanding is that this is an input address: its value is dictated by the hardware's response to the applied input signal, not by anything the software does. It is an analog input. 30009 is referenced in only two places in the program, both shown here.
The analog input is scaled using the logic found here:
It functions exactly as desired, multiplying the value of the analog input by a scaling constant and placing the result in registers 40500 and 40501, then dividing the contents of 40500 and 40501 by the constant 4095 to complete the scaling.
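For anyone following along, the multiply-then-divide order matters: doing the multiply first into a 32-bit double register keeps full precision before the divide. A minimal sketch in Python, where SPAN_CONST is a hypothetical stand-in for the actual scaling constant (not given in the post) and 4095 is the 12-bit full-scale count:

```python
RAW_FULL_SCALE = 4095   # 12-bit analog input full-scale count (from the post)
SPAN_CONST = 10000      # hypothetical scaling constant, e.g. 100.00 bar as 10000

def scale_input(raw_count: int) -> int:
    """Multiply first (the 32-bit intermediate lives in 40500/40501),
    then integer-divide by 4095 to complete the scaling."""
    intermediate = raw_count * SPAN_CONST    # 32-bit product, no precision lost
    return intermediate // RAW_FULL_SCALE    # divide completes the scaling

print(scale_input(0))     # zero input -> 0
print(scale_input(4095))  # full-scale input -> SPAN_CONST
```

Dividing first (raw_count // 4095 * SPAN_CONST) would truncate everything below full scale to zero, which is why the ladder logic stages the product through the 40500/40501 pair.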
If there is a transducer failure, the signal at the input will drop to 0 mA. That is by design, and it will produce the desired range error. This is the logic that detects that condition. Nothing complex there.
The problem is that the hardware does not deliver a value into register 30009 that I can use to differentiate between 0 mA (bad) and 3.999 mA (still good). Since no pressure transducer is perfectly stable, it is not reasonable to expect it to always deliver exactly 4.0000 mA at zero pressure. No analog input is perfectly stable either; the LSB is always changing, so even with an exactly 4.0 mA input (one supplied by a Fluke process calibrator, for example) the reading still bounces between out-of-range and in-range. The same problem exists on three different systems (and apparently others, judging by the thread responses), so it's systemic to the analog input design.
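To put a number on the ambiguity: assuming (as described above) the input card maps 4-20 mA linearly onto 0-4095 counts and clamps anything at or below 4.0 mA to a count of 0, a dead transducer and a healthy one sitting a hair below zero pressure produce the identical register value. A hypothetical model of that clamping behavior:

```python
RAW_FULL_SCALE = 4095   # 12-bit count at 20 mA

def raw_count(current_ma: float) -> int:
    """Hypothetical 4-20 mA input card: linear 4..20 mA -> 0..4095,
    clamped at the ends (the assumed behavior causing the problem)."""
    count = round((current_ma - 4.0) / 16.0 * RAW_FULL_SCALE)
    return max(0, min(RAW_FULL_SCALE, count))

print(raw_count(0.0))    # failed transducer
print(raw_count(3.999))  # healthy transducer, slight negative drift at zero
```

Both print 0, so no threshold test on register 30009 can tell the two cases apart; a card that reported under-range counts (or a separate under-range status bit) would not have this problem.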
The workaround is to adjust the zero offset of the analog instrument by tweaking the zero pot on the transducer, so that even at no pressure the transducer still reports a small pressure of 1-2 bar. But that throws off the accuracy at the low end of the process, which is bad. It's not a solution; it's a kludge.