Mark Buskell
Member
I have an encoder (1024 pulses per revolution) going into a ControlLogix. I basically capture the accumulated count every second and subtract the last recorded count from the new count to get pulses per second. Converting that to feet (pulses divided by 1024, times the roll circumference) and multiplying by 60 gives me the FPM for the line speed.
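For reference, here is a minimal sketch of that pulse-differencing calculation in Python (just to show the arithmetic, not the actual PLC logic). The roll circumference and the one-second sample period are assumptions; plug in your own values:

```python
PULSES_PER_REV = 1024      # encoder resolution (from the post)
CIRCUMFERENCE_FT = 1.0     # assumed roll circumference in feet; set to your roll

def fpm_from_counts(new_count: int, last_count: int, period_s: float = 1.0) -> float:
    """Line speed in feet per minute from two consecutive encoder count samples."""
    delta = new_count - last_count                 # pulses accumulated this sample
    # Note: a real HSC accumulator rolls over, so production code also has
    # to handle the wrap-around case, which is omitted here.
    revs_per_sec = delta / PULSES_PER_REV / period_s
    feet_per_sec = revs_per_sec * CIRCUMFERENCE_FT
    return feet_per_sec * 60.0                     # feet/sec -> feet/min

# Example: 768 pulses in one second on a 1 ft circumference roll -> 45 FPM
print(fpm_from_counts(new_count=768, last_count=0))  # 45.0
```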
Another programmer uses the ramp method. They have a 1024-pulse encoder too, but they are basically scaling a 4-20 mA signal (raw counts 3277 to 16383) to 0 to 160 FPM.
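Their scaling is just a linear interpolation from the raw analog counts to engineering units. A sketch, assuming the 3277-16383 raw span maps straight to 0-160 FPM, with clamping at the ends:

```python
RAW_MIN, RAW_MAX = 3277, 16383   # raw counts at 4 mA and 20 mA (from the post)
EU_MIN, EU_MAX = 0.0, 160.0      # engineering units (FPM)

def fpm_from_raw(raw: int) -> float:
    """Linearly scale a 4-20 mA raw count to 0-160 FPM."""
    raw = max(RAW_MIN, min(RAW_MAX, raw))    # clamp out-of-range signals
    return EU_MIN + (raw - RAW_MIN) * (EU_MAX - EU_MIN) / (RAW_MAX - RAW_MIN)

print(fpm_from_raw(9830))   # mid-scale raw count, about 80 FPM
```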
I am using an HSC card and they are using an analog input module.
I have noticed that at lower speeds my display shows the line speed 1 FPM faster than his, for example 45 vs. 46. At higher speeds the displays match. I suspect they have a zero-offset error that is causing this, but then again it could be me and how I do it.
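For a sense of scale, here's a quick back-of-envelope check (my own numbers, not from either program) of how small a raw-count offset it would take to shift the analog reading by 1 FPM:

```python
RAW_SPAN = 16383 - 3277          # 13106 counts across the 4-20 mA range
FPM_SPAN = 160.0

fpm_per_count = FPM_SPAN / RAW_SPAN
print(fpm_per_count)             # ~0.0122 FPM per raw count
print(1.0 / fpm_per_count)       # ~82 counts of offset = 1 FPM of error
```

Worth noting: a pure zero offset shifts the reading by the same amount across the whole range, so if the displays truly converge at high speed, display rounding or a small span error could also be in play.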
The difference in the displays is no big deal; the line works perfectly.
My question is: how do others compute line speed, and do you have any opinions about their method versus mine?