tester ideas

Here I am again. This time the question is much simpler, and I'm sure I'll get a lot of creative answers. Last fall I built a width tester that checks the thickness of the grinding wheels we produce. The company standard requires these testers to be accurate to plus or minus one thousandth of an inch. When I built them it seemed very simple, and everything worked fine. The testers really consisted of nothing more than a small cylinder, on which I had mounted an LVDT. I wired the LVDT to an analog input and used a scale-with-parameters function to calibrate the input from 0 to 1 inch.

This worked quite well, with fair repeatability, until we discovered that the testers needed to be recalibrated somewhere around once a week. I took care of that by adding a separate calibration function so the operators could do it themselves. Now one of the LVDTs has gotten to the point that it no longer gives a truly linear signal from 0 to 1 inch.

These LVDTs are very expensive, and rather than replace them every year, I was wondering if someone makes a unit that is specified to one thousandth, or if anyone has an idea how I can cure the problem of a nonlinear input. Also, would it help to place the LVDT in a lightly pressurized enclosure to keep the atmosphere around it clean? I'm not really clear on how touchy they are as far as environment goes. I also thought about shrinking the range and going with four modes, each covering 1/4 inch, but I'm not sure that is the best way to go. I'm open to any and all suggestions.
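For readers unfamiliar with the scale-with-parameters approach, the math behind it is just a two-point linear map from raw analog counts to engineering units. This is a minimal sketch in Python; the function name and the raw count values are illustrative assumptions, not from the original setup.

```python
# Hypothetical sketch of the two-point "scale with parameters" math:
# map a raw analog-input count to inches, assuming the LVDT response
# is linear between the two calibration points.

def make_scale(raw_lo, raw_hi, eng_lo=0.0, eng_hi=1.0):
    """Return a function that converts raw counts to inches.

    raw_lo / raw_hi are the counts captured at the 0" and 1" gauge
    positions during calibration (values here are illustrative).
    """
    span = (eng_hi - eng_lo) / (raw_hi - raw_lo)

    def scale(raw):
        return eng_lo + (raw - raw_lo) * span

    return scale

scale = make_scale(raw_lo=3277, raw_hi=16384)  # example 4-20 mA counts
print(round(scale(9830), 4))                   # roughly mid-range, ~0.5"
```

Recalibrating, as described above, amounts to recapturing `raw_lo` and `raw_hi`; it cannot correct curvature between the two points, which is why nonlinearity in the LVDT itself shows up as error even right after a calibration.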
 
ABB makes an ultrasonic eye (0-10 V or 4-20 mA) that might be what you're looking for; they seem to be holding up well in various environments. If it's hooked to a 5/03 or higher processor, you could program a runtime average and see if that doesn't solve your calibration needs.
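The runtime-average idea above is just a moving average over the last N readings to smooth sensor noise before scaling. A minimal sketch, assuming an arbitrary window size and illustrative readings:

```python
from collections import deque

# Sketch of the runtime-averaging idea: average the last N analog
# readings to smooth out noise. The window size and sample values
# are assumptions for illustration.

class RunningAverage:
    def __init__(self, window=10):
        self.samples = deque(maxlen=window)  # old samples drop off automatically

    def update(self, reading):
        self.samples.append(reading)
        return sum(self.samples) / len(self.samples)

avg = RunningAverage(window=4)
for raw in (100, 102, 98, 100):
    smoothed = avg.update(raw)
print(smoothed)  # 100.0
```

Note that averaging helps with repeatability (noise), not with the linearity drift described in the original post; the two problems need separate fixes.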
 
Russ,

Try looking at either a laser sensor or a displacement measurement sensor. Most major sensor manufacturers make these types of sensors, such as Omron, Keyence or Sick Optics.
 
Is an optical solution out of the question? We had some success with a laser micrometer setup from Keyence. We built a fixture that closes on the test part, and the micrometer measures the travel of a flag that moves with the clamp. An indirect method, very repeatable, and it should never need calibration.

Good luck!

TM
 
russmartin,

Please don't take this the wrong way, but have you discussed your calibration procedure with the manufacturer of the LVDT? It sounds to me like you may have been adjusting the span when you should have been adjusting the null, or vice versa.
 
We've used an A/B vision system to measure the width of a web traveling at very high speeds. It's basically a digital camera so the resolution is down to a pixel. We would verify the calibration with a gauge block and I don't remember it ever being out of calibration. It was kind of pricey but that was 5 or 6 years ago. Ideally you would hook it up to control your process rather than measure the results afterwards.
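The "resolution down to a pixel" point above can be made concrete with a back-of-the-envelope calculation; the field of view and sensor width here are hypothetical numbers, not from the system described.

```python
# Rough pixel-resolution estimate for a vision-based width gauge,
# assuming a hypothetical 1" field of view imaged onto a 1024-pixel
# sensor. Both numbers are illustrative assumptions.

field_of_view_in = 1.0
pixels = 1024

resolution_per_pixel = field_of_view_in / pixels
print(round(resolution_per_pixel, 5))  # ~0.00098 in/pixel
```

Under those assumptions a single pixel is just under a thousandth of an inch, which is why a vision gauge can plausibly meet a +/-0.001" spec; subpixel edge detection, which many systems offer, improves on this further.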
 
tomneth,

AB sold their vision division to Cognex a few years ago, so they would not be an option, although there are others who sell low-end vision systems, such as Omron, Keyence and Cognex. Any of these would work, but lighting changes and set-up are a problem; this could be difficult if you have dirt or dust in your application.

Mike
 
The unspoken part of Steve's post is...

This is unusual performance for something as solid as an LVDT.

There is something wrong in the way that you are using it. This could be any one, or more, of the following...
  • Having the right tool for the job (Range & Resolution),
  • Power,
  • Programming,
  • Environmental Conditions,...
It seems to me (and I think, to Steve as well), that this system should be a hell of a lot more solid than you indicate. This particular design has been proven many times to be solid in concept and implementation. That is, the concept is sound and the implementation is readily accomplishable! You should NOT need to pursue a different concept!

So, the first question that occurs to me is...

What is the "Nominal" Range & Resolution?

It is important that your normal reading occur near the mid-point of the range. The device should be linear for the longest time at/near the mid-point. While it's true that all things do degrade over time - there's no reason to give the device a jump-start on degrading out-of-range!

If you find you are losing linearity, it's usually the case that the problem occurs at one extreme or the other. In general, the slope through the mid-point is maintained most accurate over time.

If your normal reading is at, or near, one of those extreme points, then you will find your device degrading prematurely.

I don't think the question of "How are you implementing your compensation?" really matters. At least, not at this point.

As I said, the concept is sound! It should be working!
 
thanks boys

Thank you all for your input. The last response, from Terry Woods, was what I was hoping to hear. This is my first application with LVDTs, and I was afraid they were not meant for measurements this precise. I will adjust the LVDT to use the very center of the range and see if that shows any improvement.

I'm also still wondering about what environment they can tolerate. Are they fragile in a normal plant environment, meaning dust, tiny plastic particles, etc.?

As for Mr. Steve Bailey: the documentation on the LVDT that was passed down to me was very minimal. When I talk about the calibration procedure, it really has nothing to do with adjusting anything on the LVDT. I simply take a snapshot of the analog input to the PLC at the high and low fringes and place those values in my scale instruction, calibrating it to whatever the input currently reads at those points. I would not rule out that the LVDT itself is actually out of calibration, but to date I am not aware of any such procedure for the one I have. I will look into it. Thanks again for all the suggestions. I'll be checking back.

Russ
 
LVDTs (Linear Variable Differential Transformers) are by design very robust devices. They are non-contacting and therefore can't wear out. Typically they are damaged through mechanical abuse, i.e. a maintenance man uses one as a step ladder, a fork lift truck operator bashes into the machine, etc. We use LVDTs in most of our industrial hydraulic equipment (we are an OEM), where they are subjected to high temperature, vibration, and mechanical shock, and they perform very well. First question: are you using a true LVDT, in which a separate card or module provides the oscillator and demodulator, or are you using a DCDT, which is an LVDT with the oscillator/demodulator built in, meaning you only have to worry about supply voltage in and signal out?

If you are using a true LVDT with a separate signal conditioning card, are you sure it is compatible with the type of LVDT (3-wire, 4-wire, 5-wire, 6-wire, etc.) that you are using? If it is a DCDT, you don't need to worry about this.

LVDT's are most linear around the middle of their stroke and can become non-linear at the outer limits of their stroke.

Make sure that the connecting rod is compatible with the core. Typically a non-ferrous or stainless steel rod is used to connect the core to the moving object being detected. The devices are usually very rugged and immune to the effects of dust, dirt, and oil (we operate them immersed in oil by design). I suppose you could have problems if the dust is ferrous in nature and somehow builds up on the core.

The biggest problem we run into is outside noise influences, i.e. our customer doesn't follow the recommended cabling practices, uses poor-quality shielded cable, runs the cable near high-voltage AC devices, etc.
 
What are they talkin about

I did some checking on the specs of the particular LVDT I am using. This unit shows a linearity of +/-0.25% FRO. When the manufacturer states this, I assume they mean that for every 1", or 1.000, I should have an error of 0.0025". The LVDT I was trying to use had a 2" range, so over the whole range I will have a linearity error of about 0.005". Am I reading this correctly? As I understand it, this is not accurate enough to ever read reliably to 0.001".
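The arithmetic can be checked directly. Assuming "FRO" means full-range output (a common reading of that abbreviation, though the datasheet should confirm it), the error band is a percentage of the whole 2" range:

```python
# Worked check of the linearity arithmetic above, assuming "FRO"
# means full-range output: the error band is a fraction of the
# whole range, not of each inch separately.

full_range = 2.0         # inches of LVDT stroke
linearity = 0.25 / 100   # +/-0.25% FRO from the datasheet

worst_case_error = full_range * linearity
print(worst_case_error)  # 0.005
```

Either reading of the spec lands on the same +/-0.005" worst case over a 2" stroke, which is indeed five times the +/-0.001" target.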
 
I don't know what "FRO" is supposed to mean.

But... it sounds like maybe it's a case of not having the right device for the job (range & resolution).

I assume you are reading "directly". Have you considered doing something like this? The "pivot" is critical. It can't have much slop. You have to do the numbers to see if indeed you do get the precision you need.

I haven't got time to do the numbers right now.

(attached image: 3d510c587109bcc6.gif)
 
good idea

Terry,

That is an excellent idea. I'm sure we would be able to get what we need out of a setup like that. Even if the lever on the LVDT side had to be three times what the object side is, we could still make that work. I will look into it; it's a very interesting idea. Thanks a lot.
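A rough check of the numbers behind the lever idea, using the 3:1 arm ratio mentioned above and the 0.005" worst-case figure from the datasheet discussion (both carried over from earlier posts; the pivot is assumed ideal, with no slop):

```python
# Rough check of the lever idea: with the LVDT-side arm 3x the
# object-side arm, the core travels 3x as far as the part, so a
# fixed error at the LVDT maps to 1/3 of that error at the part.

arm_ratio = 3.0      # LVDT arm length / object arm length
lvdt_error = 0.005   # worst-case linearity error at the LVDT, inches

effective_error = lvdt_error / arm_ratio
print(round(effective_error, 5))  # 0.00167
```

The trade-off is range: the same lever multiplies the core travel, so a 1" part range would demand 3" of LVDT stroke, and any play in the pivot is likewise amplified. Those are the numbers Terry suggests running before committing to the design.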

Russ
 
