Curious if anyone has ever calibrated a PLC5 analog input module

TheWaterboy

Has anyone ever calibrated an analog Input module for a PLC5?
Was it worth the effort?
I have a Rev. C 8-channel input module in a new project (yes... a NEW project). The module channel is ranged 0-5000; a 20 mA input gives 4200 and a 0 mA input gives -17. "Calibrate" is highlighted on the module dialog.

I can insist that the vendor fix this, but realistically it's close enough for this purpose, so I'm simply curious whether it's worth the effort.
 
Why spend the time messing with it? What will it give you? I always scale my cards anyway, because a lot of the time my field devices have some offset as well.
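If it helps, here's that scaling math as a quick sketch (Python purely for illustration; the numbers are hypothetical). It's the same two-point calculation an SCP (scale-with-parameters) instruction performs:

```python
def scale(raw, raw_min, raw_max, eu_min, eu_max):
    """Two-point linear scaling - the same math as an SCP instruction."""
    return (raw - raw_min) * (eu_max - eu_min) / (raw_max - raw_min) + eu_min

# Hypothetical example: absorb the card's as-found zero/span error in the
# scaling instead of recalibrating, mapping the as-found readings to 0-100%.
print(scale(-17, -17, 4200, 0.0, 100.0))   # 0.0
print(scale(4200, -17, 4200, 0.0, 100.0))  # 100.0
```

Any card offset just gets folded into the raw min/max when you scale from as-found readings, which is why I never bother trimming the card itself.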
 
Yes, 12 bits is 4096 counts.

When you say the module is ranged 0-5000 is that in engineering units or raw counts?

Accuracy? Depends on the board model and its spec. But it might be assumed that the board comes factory-calibrated to that spec.

If the output of the field device can easily be forced and has easily accessible adjustments for zero and span, then I recommend a 'loop calibration' where the AI is left as is and the field instrument is jiggered so that zero = zero and span = span.

This is routine with magnetic flow meters, for example. They've got a keypad and a menu for analog output trim.

If that's not the situation, then you have to evaluate whether the error can affect anything. Rule of thumb: repeatability is critical in essentially 100% of processes, but 'accuracy' (agreement with a known standard) is critical in maybe 1%, chiefly custody transfer (billing).
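To put rough numbers on that evaluation, using the as-found readings quoted above and assuming the 0-5000 range really is meant to correspond to 0-20 mA:

```python
# Express the as-found errors as a percentage of the 0-5000 span.
SPAN = 5000
zero_err = -17 - 0        # counts read at 0 mA vs. expected 0
span_err = 4200 - 5000    # counts read at 20 mA vs. expected 5000
print(f"zero error: {100 * zero_err / SPAN:+.2f}% of span")  # -0.34%
print(f"span error: {100 * span_err / SPAN:+.2f}% of span")  # -16.00%
```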

From your tone, I suspect that if you leave it as-is, you'll be fine.
 
Raw min = 0
Raw max = 5000

I'm sure I'll be OK too; it's only used as an indication. This is mostly an academic discussion.
I gather from the general run of answers that nobody ever calibrates these modules to correct this.
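Purely for the academic record, those two as-found readings would be enough to correct the card in logic if anyone ever cared to. A sketch (Python, just to show the math):

```python
# Correct readings in software using the as-found points reported above:
# 0 mA -> -17 and 20 mA -> 4200, on a channel nominally ranged 0-5000.
OBS_ZERO, OBS_SPAN = -17, 4200
NOM_ZERO, NOM_SPAN = 0, 5000

def correct(observed):
    """Map an observed reading onto the nominal 0-5000 range."""
    return (observed - OBS_ZERO) * (NOM_SPAN - NOM_ZERO) / (OBS_SPAN - OBS_ZERO) + NOM_ZERO

print(round(correct(-17)))   # 0
print(round(correct(4200)))  # 5000
```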
 
You would also need some very accurate, traceable test equipment to calibrate the module. How accurate is your milliamp meter? Has it been calibrated? Is it traceable?
 
I've installed/replaced dozens of these, probably 100 by now, and I have never calibrated one.

Having said that, none of the control systems I have worked on have been involved in a moon shot either.

I have always been curious, when reading the instructions, about the need to calibrate. Wouldn't/shouldn't that have been done at the factory? Would shipping these things throw them out of calibration?

Curious too.
 
I have only seen one instance of calibrating analog input cards, and it was a government-mandated operation. The analogs were measuring multiple pressure readings on a combustible gas system. The reason for the calibration was to confirm that all readings were linear. We used a tagged 4-20 mA variable power supply.

Since we were measuring pressure in a duct filled with very explosive gases, I think the government was right and it should be checked on a regular schedule.

I will say this, though: I worked for that company for several years, and the only time I ever had to make adjustments to an analog card was when I installed a new card.

I think it is a good thing to know how to do, and if you are comparing multiple inputs, having them all scaled the same could save a lot of aggravation.
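For anyone curious what that check looks like on paper, here's a rough sketch (Python, with invented readings and a made-up 5-count tolerance): step the traceable source through known currents, record the counts, and flag any point that falls too far from the best-fit straight line.

```python
# Linearity check: (source mA, card counts) pairs - these readings are invented.
points = [(4.0, 2), (8.0, 1245), (12.0, 2502), (16.0, 3748), (20.0, 5001)]

# Least-squares straight line through the points.
n = len(points)
sx = sum(ma for ma, _ in points)
sy = sum(c for _, c in points)
sxx = sum(ma * ma for ma, _ in points)
sxy = sum(ma * c for ma, c in points)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
offset = (sy - slope * sx) / n

TOL = 5  # counts; a hypothetical acceptance limit
for ma, counts in points:
    dev = counts - (slope * ma + offset)
    flag = "  <-- out of tolerance" if abs(dev) > TOL else ""
    print(f"{ma:5.1f} mA: {counts:5d} counts, deviation {dev:+6.1f}{flag}")
```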
 
This setup has been sitting for 5 years and is only now beginning to become important, so it does qualify for the initial-startup category.
...but we aren't pushing explosive materials.

When I get bored perhaps I'll have a lash at it, just for fun.
 
The most important thing to remember with analog instrumentation is that people will assume the input module is reasonably well calibrated - after all, modules come from the factory pre-calibrated, and short of component failure, aging will not have much effect on their accuracy.

Calibration is most likely provided for regulatory traceability (documentary proof that the module performs to a specification) rather than to overcome some degradation of the A/D converter circuitry.

I would, as a matter of course, calibrate that module so that I have a baseline for the whole system. I would also question why its existing calibration is so far adrift from the manufacturer's spec. Maybe the module is faulty and needs replacement, or someone else messed up its calibration just to get a "whole system" in line - perhaps you'll never know.

Perhaps I'm "old school", but in my opinion you shouldn't overcome a problem by fixing something else. At some point the other thing you have trimmed to make the system work as a whole will need replacement; it will come pre-calibrated, and you will have to "destroy" its calibration to make it work with your suspect card. The field device should be able to be swapped for an identically scaled one any time you wish.

Imagine your car's rev counter reads 1,000 rpm too high at all times - would you "fix" the system by bending the needle?
 
1) I'm not sure it's the AI that might be out of calibration. What source was used?

2) I'm not clear on how a 12-bit board, whose A/D produces 4096 counts, is supposed to cover a 0-5,000 count span.
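For reference, the arithmetic behind question 2 (a throwaway sketch):

```python
# A 12-bit A/D produces 2**12 = 4096 raw counts. Spreading a 0-5000
# engineering span across them gives roughly 1.22 units per raw count,
# so the 0-5000 numbers have to come from scaling, not directly from the A/D.
raw_counts = 2 ** 12
eng_span = 5000
print(eng_span / raw_counts)  # 1.220703125
```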
 
