Hi all,
Probably a bit out of my depth here as I'm not an instro, but I'm wondering if someone could provide some clarity on calibrating/scaling the thermocouples a customer has on some equipment they've purchased overseas. It's a heating oven that typically operates in the 150-300°C range, and it has 4 Type-K thermocouples (-270 to 1370°C) wired back to a 1769-IT6 thermocouple module.
Currently, each input is just scaled across its full raw range: -32,767…32,767 maps to -270…1370°C. Putting a Fluke dry-well calibrator on each probe at 0°C and 300°C shows a pretty significant amount of error, and a different amount on each probe. I assume this is because thermocouple readings aren't linear, although the IT6 card might compensate for this and have formula-based linearisation built in?
It looks like a previous tech tried to correct the reading errors by making slight adjustments to the whole span, e.g. he changed the RawMin to -32500 to try to offset an inaccuracy they were getting when testing at 250°C, but I can't see how this was ever going to work properly.
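For what it's worth, a quick sketch of why that RawMin tweak can't work cleanly: moving one end of the span changes both the offset and the slope of the scaling line, so a correction dialled in at 250°C puts a different error everywhere else. The raw count below is a hypothetical value, not from the actual rig.

```python
# Why nudging RawMin moves every point by a different amount:
# it changes both the offset AND the slope of the scaling line.
# Ranges are from the post; the raw reading is hypothetical.

def scale(raw, raw_min, raw_max, eu_min=-270.0, eu_max=1370.0):
    """Linear raw-count to engineering-unit scaling."""
    return eu_min + (raw - raw_min) * (eu_max - eu_min) / (raw_max - raw_min)

raw = -11988  # hypothetical counts for a probe sitting near 250 degC

print(scale(raw, -32767, 32767))  # ~250.0 with the original span
print(scale(raw, -32500, 32767))  # ~245.4 -- shifted AND re-sloped
print(scale(0, -32767, 32767))    # mid-span raw 0 maps to 550.0 degC
print(scale(0, -32500, 32767))    # ~546.7 -- same raw, different shift
```

The two "raw 0" lines show the problem: the tweak shifts that point by about 3.3°C, but shifts the 250°C point by about 4.6°C, so you can never true up more than one temperature at a time this way.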
Ultimately, I've got a few queries:
1) Should we just be scaling it within our expected range? This seems to make more sense to me, considering we can't test the readings at the real absolute values of -270 and 1370. Should I just put the probe in the dry-well, find the raw reading at say 0°C and set that point as the RawMin, then check the probe at 300°C and set that reading as the RawMax? That would give us a basis to work with and an easy way to make minor adjustments if the scaling drifts out.
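The two-point approach in (1) can be sketched like this: capture the raw counts at two dry-well setpoints inside the operating band and fit a straight line through them. The counts below are hypothetical, just to show the arithmetic.

```python
# A minimal sketch of two-point scaling, assuming the module reports raw
# counts and we captured counts at two dry-well setpoints. The counts
# used below (-5430 and 6520) are hypothetical, not real readings.

def make_scaler(raw_lo, temp_lo, raw_hi, temp_hi):
    """Return a function mapping raw counts to temperature via a
    straight line through the two calibration points."""
    gain = (temp_hi - temp_lo) / (raw_hi - raw_lo)
    def to_temp(raw):
        return temp_lo + (raw - raw_lo) * gain
    return to_temp

# Hypothetical captured counts at 0 degC and 300 degC:
to_temp = make_scaler(raw_lo=-5430, temp_lo=0.0, raw_hi=6520, temp_hi=300.0)
print(to_temp(-5430))  # 0.0
print(to_temp(6520))   # 300.0
print(to_temp(545))    # 150.0 -- midway between the two points
```

One caveat: if the IT6 is already linearising and reporting in temperature counts (I believe it does CJC and linearisation internally, but check the module config), this two-point trim is only correcting gain/offset error, which is usually the right thing within the operating band. Readings outside the two tested points are unverified.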
2) How do you calibrate the full span of a thermocouple without equipment that can actually test the full range? If you have a freezer at -40°C, is it actually necessary to be able to test down at that temperature? Our dry-well only covers -15 to 350°C, so I'm interested to see how it's done. Does it rely on taking multiple test points within the range you can reach and then fitting coefficients to extrapolate beyond it?
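On the multi-point idea in (2), the usual shape of it is a low-order correction curve fitted through several dry-well points, assuming the error varies smoothly with temperature. The indicated readings below are hypothetical, just to show the mechanics.

```python
# A sketch of a multi-point correction: fit a low-order polynomial
# mapping indicated readings back to reference temperatures.
# The "indicated" values are hypothetical example errors.
import numpy as np

setpoints = np.array([0.0, 100.0, 200.0, 300.0])  # dry-well reference temps
indicated = np.array([1.8, 102.1, 203.0, 304.2])  # hypothetical PLC readings

# Least-squares fit: indicated -> true, 2nd-order polynomial.
coeffs = np.polyfit(indicated, setpoints, deg=2)
correct = np.poly1d(coeffs)

print(correct(203.0))  # close to 200.0 at a calibration point
```

Note that extrapolating such a curve far outside the tested points (e.g. out to -40°C from a -15°C minimum) is exactly where it becomes unreliable, which is why a proper cal lab with wider-range sources gets involved for full-span work.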
3) Should I just be looking at installing probes/thermocouples that are more closely specced to the actual operating range of the oven?
Appreciate any insight people can provide. Just to note: I acknowledge that if we really want to complete the calibration properly, I probably need to engage a specialist. But for the basic setup and testing, just to get it moving in the right direction, I'm hoping somebody here can help out.
Thanks