So I'm using a P&F optical read head for positioning. The position data comes into the PLC as two SINT bytes. Monitoring those SINTs in my 1769 controller, they range in value from -127 to 127. When the higher-resolution SINT rolls over from -1 to 0, the lower-resolution SINT increments by one.
The manual states:
The X position is output in the two's complement.
4 byte consistent Input data 32-bit X data
LSB first
LSB = least significant byte
Resolution: 0.1 mm, 1 mm, 10 mm, binary coded
At a resolution of 1 mm and 10 mm: Lmax = 10.00 km = 10,000,000 mm
Any help in converting this to a decimal position?
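Here's my current understanding as a rough C sketch (untested). The function name, the four-byte LSB-first layout, and the 0.1 mm resolution are my assumptions from the manual text above, so substitute whatever your actual I/O tags and configured resolution are:

```c
#include <stdint.h>
#include <stdio.h>

/* Combine four input bytes (LSB first, two's complement) into a
   signed 32-bit position count, then scale by the resolution.
   Casting each SINT to its raw unsigned bit pattern before shifting
   keeps sign extension from corrupting the upper bytes. */
int32_t combine_position(int8_t b0, int8_t b1, int8_t b2, int8_t b3)
{
    uint32_t raw = (uint32_t)(uint8_t)b0
                 | (uint32_t)(uint8_t)b1 << 8
                 | (uint32_t)(uint8_t)b2 << 16
                 | (uint32_t)(uint8_t)b3 << 24;
    return (int32_t)raw;  /* reinterpret as two's complement */
}

int main(void)
{
    /* Example: the low byte has just rolled over from -1 (0xFF) to 0,
       so the next byte incremented -- raw count 256, which at an
       assumed 0.1 mm resolution is 25.6 mm. */
    int32_t counts = combine_position(0, 1, 0, 0);
    double mm = counts * 0.1;  /* assumed 0.1 mm resolution */
    printf("position = %ld counts = %.1f mm\n", (long)counts, mm);
    return 0;
}
```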
Thanks!