Make sure you understand the implications of using INT instead of SINT. For example, if your data is coming over 2 SINTs (or 4 SINTs) and is byte-swapped, then you are going to get the wrong value by trying to process the single INT. Another good example is a device that has mixed data of 8-bit, 16-bit and 32-bit types, where using INT may cross the boundary between different words of data (again resulting in the wrong value). A third example is a device that presents ASCII characters as SINTs, like a barcode scanner or a vision system result. ASCII is better to process as a SINT array, as each character is represented individually.
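Here's a quick sketch in Python of the byte-swap case. It assumes a hypothetical device that transmits a 16-bit value in big-endian byte order while the controller maps the two SINTs into a little-endian INT; the raw byte values are made up for illustration:

```python
import struct

# Two raw SINTs from a hypothetical device that sends 16-bit
# values big-endian. The device means 0x1234 = 4660.
raw = bytes([0x12, 0x34])

# Mapping the two SINTs straight into a little-endian INT
# reads the bytes in the wrong order: 0x3412 = 13330.
wrong = struct.unpack('<h', raw)[0]

# Swapping the bytes first (reading big-endian) recovers
# the intended value: 0x1234 = 4660.
right = struct.unpack('>h', raw)[0]

print(wrong, right)  # 13330 4660

# The ASCII case: a barcode result arriving as SINTs is just
# one character per byte, so process it as a byte/SINT array.
scan = bytes([0x41, 0x42, 0x43])
print(scan.decode('ascii'))  # ABC
```

With byte-swapped data you either swap before combining or keep the SINTs separate; the single INT read only works when the device's byte order matches the controller's.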
If your device is providing all data as 16-bit words then INT shouldn't produce these issues.