... probably more than you wanted to know ...
Greetings, djshorty,
first of all, Eric (and everyone else) has it completely right ... so don’t let me confuse you ...
but when you said:
So when using an analog input you have to program the PLC to convert this signal into something useful?
you indicated that (and please forgive me if I’m wrong) perhaps you’re trying to make this a little more difficult than it needs to be ...
let’s use temperature as a specific example ...
let’s say that your temperature signal arrives at the PLC as an input that ranges between the values of 3277 and 16384 ... let’s further suppose that the temperature transmitter that gives you this signal has a range of 0 to 500 degrees F ...
in this case, when the value 3277 comes in, you would MORE THAN LIKELY want to convert this raw input number into a “meaningful” (Eric’s word) value of 0 degrees F ... then display this new number ... or use it to control a process ...
and also, when the value 16384 comes in, you would MORE THAN LIKELY want to convert this raw input number into a “meaningful” value of 500 degrees F ... then display this new number ... or use it to control a process ...
now this “conversion” process is usually called “scaling” ... NOT “linearization” ... and scaling is a VERY common programming procedure ... something that you’ll see used quite often ...
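just to show the arithmetic behind that “scaling” step ... here’s a rough sketch in plain Python (NOT actual PLC code ... most processors have a built-in scaling or compute instruction that does this for you) ... the numbers are just the example values from above ...

```python
# rough sketch of the "scaling" math - NOT actual PLC code ...
# most processors give you a scaling or compute instruction that does this
# the constants below are just the example values from the post above

RAW_MIN = 3277     # raw count from the analog card at 0 degrees F
RAW_MAX = 16384    # raw count from the analog card at 500 degrees F
EU_MIN  = 0.0      # engineering units at the low end (degrees F)
EU_MAX  = 500.0    # engineering units at the high end (degrees F)

def scale(raw):
    """convert a raw analog input count into degrees F (simple linear scaling)"""
    return (raw - RAW_MIN) * (EU_MAX - EU_MIN) / (RAW_MAX - RAW_MIN) + EU_MIN

print(scale(3277))    # -> 0.0 degrees F
print(scale(16384))   # -> 500.0 degrees F
print(scale(9830.5))  # -> 250.0 degrees F (halfway up the raw range)
```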
on the other hand, “linearization” is not all that common ... in fact, it’s hardly ever used ... (and I’m already anticipating the arguments about that statement) ...
that’s because MOST analog signals ... in MOST applications ... are going to be so close to linear that you’re NOT going to have to worry about the small amount of error between (a) what the actual temperature is ... and (b) what the temperature sensor is reporting ...
in the RARE cases that you need to be as precise as possible THROUGHOUT the ENTIRE range of the sensor’s output, then by all means linearize the signal using the ideas that have already been brought up in this thread ...
now looking at Figure A below, we see that the actual temperature (red trace) is shown as a straight line ... and in a perfect world, the signal from the sensor (black trace) would track right on top of the actual temperature ... specifically, the two lines would be perfectly superimposed ... but unfortunately the sensor in this figure is NOT “linear” ... and so the black trace is curved ... and sadly it’s only exactly and precisely accurate at two specific points ... these are the “calibration” points at 0 and 500 degrees ...
[attachment: Figure A and Figure B - actual temperature (red trace) vs. sensor signal (black trace)]
keep in mind that the sensor signals in the figures above are GREATLY exaggerated to show the details involved ... a real sensor’s line would be much closer to straight ... in fact, if you ever ran into a real sensor that gave a signal THIS far from linear, you’d junk it immediately and replace it with something better ...
now on to linearization ...
the process of “linearizing” the signal would involve programming a series of steps to mathematically generate a NEW signal (based on the raw sensor’s input signal) which would more accurately track the actual temperature ... in other words, you’d take the raw curved line in ... and calculate a new straighter line to use for controlling or monitoring the process ... the other responders have already covered the ideas behind that procedure, so I’m not going to rehash it ...
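but just so the idea isn’t completely abstract, here’s a rough sketch (again plain Python, NOT PLC code) of ONE common way to do it ... a little lookup table of calibration points with straight-line interpolation between them ... and be warned: the (raw, degrees) pairs in the table are strictly made-up numbers for illustration ...

```python
# rough sketch of one way to "linearize" - a lookup table of calibration points
# with straight-line interpolation between them ... NOT actual PLC code, and
# the (raw count, degrees F) pairs below are made-up numbers for illustration

CAL_POINTS = [
    (3277,    0.0),
    (6400,  100.0),
    (9600,  250.0),
    (13100, 400.0),
    (16384, 500.0),
]

def linearize(raw):
    """interpolate between the two nearest calibration points"""
    # clamp to the ends of the table
    if raw <= CAL_POINTS[0][0]:
        return CAL_POINTS[0][1]
    if raw >= CAL_POINTS[-1][0]:
        return CAL_POINTS[-1][1]
    # find the segment the raw value falls in, then interpolate along it
    for (x0, y0), (x1, y1) in zip(CAL_POINTS, CAL_POINTS[1:]):
        if raw <= x1:
            return y0 + (raw - x0) * (y1 - y0) / (x1 - x0)

print(linearize(8000))   # lands between the 100 and 250 degree points (175.0 here)
```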
but there is one other possibility that hasn’t been covered yet ... take a look at Figure B and consider this:
suppose that the only range in which we’ll EVER be concerned with the accuracy of our temperature signal is within the range of (let’s say) 100 to 300 degrees ... if that’s the case, why are we forcing ourselves to “calibrate” the temperature sensor at the extreme ends of its scale? ... in other words, why make it “perfect” at 0 degrees ... and “perfect” at 500 degrees? ... as long as we’re NEVER going to be interested in accuracy at those extreme temperatures anyway, who cares how accurate the instrument would be IF AND WHEN we ever got there? ... and yet this is exactly how many (most?) instrumentation technicians will calibrate their instruments ...
so let’s consider this instead ... let’s readjust the little screws and “calibrate” the temperature sensor to be “perfect” at 100 degrees ... and “perfect” at 300 degrees ... that’s what’s shown in Figure B ... notice that the shape of the sensor’s raw signal (the black trace) hasn’t been changed at all ... but now we’ve just shifted it over a little bit to bring it closer to the actual temperature JUST IN THE RANGE THAT WE CARE ABOUT ...
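and for what it’s worth, the same “calibrate where you care” idea could also be done in the PLC’s scaling math instead of with the little screws ... scale between the raw counts you ACTUALLY observe at 100 and 300 degrees, rather than the theoretical 0 and 500 degree counts ... here’s a rough sketch of that notion ... the two raw counts are invented numbers, just for illustration ...

```python
# rough sketch of the "calibrate where you care" idea done in the scaling math ...
# scale between the raw counts ACTUALLY observed at 100 and 300 degrees F
# (the two raw counts below are invented numbers, just for illustration)

RAW_AT_100F = 6150    # raw count observed when a reference thermometer reads 100 F
RAW_AT_300F = 11300   # raw count observed when a reference thermometer reads 300 F

def scale_100_to_300(raw):
    """exact at 100 and 300 F ... less accurate outside that window"""
    return 100.0 + (raw - RAW_AT_100F) * (300.0 - 100.0) / (RAW_AT_300F - RAW_AT_100F)

print(scale_100_to_300(6150))    # -> 100.0 degrees F
print(scale_100_to_300(11300))   # -> 300.0 degrees F
```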
now this is NOT the way most people do it ... and I know that I’d be asking for a lot of harsh debate if I actually recommended it ... so I won’t ... all I’m doing is throwing it out there for what it’s worth as something to be considered ...
words of advice: you’ll definitely want to learn how to “scale” the PLC’s raw analog input signals ... that’s something you’re absolutely sure to need ... but save the “linearizing” ideas in your little notebook of “things that I might need someday” ... for example when one of BobB’s “curved end - horizontal tanks” shows up ... then get on with your life ... most analog signals are already quite linear enough for most applications ... you’ll have plenty of other things to worry about without looking for trouble up that particular tree ...
finally, notice how many posts Eric has made so far (1029 as of this writing) ... I’d say that he’s been around the block more than once or twice ... and look what he said about linearizing in an earlier post:
I've never used (or needed) something like this, but I'm sure they must exist...
let that wise statement be an indication of just how much effort you should put into mastering this “linearization” problem ... then party on ...
PS: for those who might care, personally I’d actually shift the calibration shown in Figure B even a little bit further over to the left ... just to sort of “split the difference” of the usable range between 100 and 300 degrees ... but regardless, the next “instrumentation” guy to come along behind me would undoubtedly go right back to calibrating at 0 and 500 degrees and mess everything up ...