Kim, explain to the technicians that just because you can do something, it doesn't mean you should. You might also make a reference to the futility of sow's-ear-into-silk-purse transformations.
A typical specification for an RTD is ±0.5%. For extra money you can get ±0.1% accuracy. That is roughly equivalent to 10-bit resolution, since one part in 1024 is about 0.1% (although nobody makes a 10-bit A/D that I know of). This means that whether you are using a 128-bit or a 12-bit I/O card, the actual accuracy of the measurement will be the same, no matter how many decimal places you display. Furthermore, I have trouble imagining an application where the difference between 355.000 °F and 355.355 °F is truly significant. If you have such an application, you should start at the sensor to get better accuracy, not at the I/O card. Only after you have eliminated or minimized all the other sources of error is it time to see whether ±0.0015% (one count on a 16-bit card) is significant relative to the errors that remain.
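To put numbers on the bits-versus-percent argument above, here is a quick sketch (the function name is mine, not from any I/O card vendor) that converts A/D resolution into one LSB as a percent of full scale:

```python
def lsb_percent(bits: int) -> float:
    """One LSB of an n-bit A/D as a percent of full-scale range: 100 / 2**bits."""
    return 100.0 / (1 << bits)

print(f"10-bit: {lsb_percent(10):.4f}% of span")  # ~0.098%, about the +/-0.1% premium RTD spec
print(f"12-bit: {lsb_percent(12):.4f}% of span")
print(f"16-bit: {lsb_percent(16):.4f}% of span")  # ~0.0015%, far below the sensor's error
```

The extra bits past 10 just subdivide the sensor's error band; they don't make the reading any more correct.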
One other point is the difference between resolution and accuracy. For example, I have used A/D cards from one manufacturer that have 16-bit resolution, but the actual accuracy of the measurement is only ±1%.
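You can run the same conversion in reverse to see how few of those 16 bits are actually trustworthy; this is a rough illustration, not a vendor formula:

```python
import math

def effective_bits(accuracy_percent: float) -> float:
    """Bits whose LSB equals the stated accuracy: log2(100 / accuracy%)."""
    return math.log2(100.0 / accuracy_percent)

print(f"+/-1% card:  {effective_bits(1.0):.1f} meaningful bits")  # ~6.6 of the 16 bits
print(f"+/-0.1% RTD: {effective_bits(0.1):.1f} meaningful bits")  # ~10 bits
```

So a "16-bit" card with ±1% accuracy is really giving you six or seven bits of information and nine bits of noise.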