[email protected]
Here I am again. This time the question is much simpler, and I'm sure I'll get a lot more creative answers.

Last fall I built a width tester that checks the thickness of grinding wheels we produce. The company standard requires these testers to be accurate to plus or minus one thousandth of an inch. When I built them it seemed very simple, and everything worked fine. The testers consisted of nothing more than a small cylinder on which I had mounted an LVDT. I wired the LVDT to an analog input and used a scale-with-parameters function to calibrate the input from 0 to 1 inch. This worked quite well with fair repeatability until we discovered that they needed to be recalibrated about once a week. I took care of that by adding a separate calibration function so the operators could do it themselves.

Now one of them has gotten to the point where the LVDT is no longer giving a truly linear signal from 0 to 1 inch. These LVDTs are very expensive, and rather than replace them every year, I'm wondering if anyone makes one that is specified to one thousandth of an inch, or if anyone has an idea how I can cure the problem of a nonlinear input. Also, would it help to place the LVDT in a lightly pressurized enclosure to keep the atmosphere around it clean? I'm not really clear on how touchy they are as far as environment goes. I've also thought about shrinking the range and going with four modes, each covering 1/4 inch, but I'm not sure that's the best way to go. I'm open to any and all suggestions.
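For what it's worth, the four-mode idea is essentially a piecewise-linear calibration: instead of one two-point scale over the full inch, you store several calibration points and interpolate between the pair that brackets the reading. A minimal sketch of that approach (Python here just to show the math; the counts-to-inches pairs are hypothetical and would in practice be captured by gauging blocks of known thickness on the actual unit):

```python
# Piecewise-linear calibration: map raw LVDT counts to inches using
# several calibration points instead of a single two-point scale.
# The (counts, inches) pairs below are made up for illustration; each
# point would really be captured with a gauge block of known thickness.

CAL_TABLE = [
    (0,     0.000),
    (4100,  0.250),
    (8150,  0.500),
    (12300, 0.750),
    (16500, 1.000),
]

def counts_to_inches(counts):
    """Interpolate linearly between the two calibration points
    that bracket the raw LVDT reading."""
    pts = CAL_TABLE
    if counts <= pts[0][0]:          # clamp below the table
        return pts[0][1]
    if counts >= pts[-1][0]:         # clamp above the table
        return pts[-1][1]
    for (c0, d0), (c1, d1) in zip(pts, pts[1:]):
        if c0 <= counts <= c1:
            # linear interpolation within this segment
            return d0 + (d1 - d0) * (counts - c0) / (c1 - c0)
```

With more points the table also absorbs gentle nonlinearity as the LVDT ages, and the operators' weekly calibration routine could re-capture the table points rather than adjusting a single zero and span.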