Velocity calculation

rocksalt

I tried to set up a few rungs to snapshot the position, wait a second, take another snapshot, get the difference, and use math to get IPM.

Somehow I don't think the timer is accurate for this purpose as I get varying results. Are the timers in the program relative to time or scan times?
 
It depends on the PLC you are using. Another factor is the PLC scan time itself. The actual time will be the timer setting plus some portion of the PLC scan time which depends on where the PLC is in the scan when the timer goes true.
Another way to do it (if your PLC supports this) is to trigger an interrupt routine from the system clock to capture your data.
 
You have to use the time between the scans.

newtime = now()
newpos = position
speed = (newpos - oldpos) / (newtime - oldtime)
oldpos = newpos
oldtime = newtime

However, the resolution is important, for both position and time.
Whenever either difference is below about 10 counts the result is not very stable,
so you will need some filtering or averaging.
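The scan-to-scan calculation above, combined with a simple averaging window, can be sketched in ordinary Python rather than PLC code. The class name and the window size are illustrative assumptions, not anything from an actual controller:

```python
import time
from collections import deque

class VelocityEstimator:
    """Estimate speed from (time, position) snapshots, averaged over a window."""

    def __init__(self, window=5):
        # Keep window+1 samples so the oldest and newest span `window` intervals.
        self.samples = deque(maxlen=window + 1)

    def update(self, position, now=None):
        """Record a sample; return the averaged speed, or None until enough data."""
        now = time.monotonic() if now is None else now
        self.samples.append((now, position))
        if len(self.samples) < 2:
            return None
        (t0, p0), (t1, p1) = self.samples[0], self.samples[-1]
        # Differencing over the whole window smooths out per-sample noise.
        return (p1 - p0) / (t1 - t0)
```

Averaging over the whole window rather than a single interval gives the smoothing suggested above without keeping a separate running average.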
 
It is a CompactLogix v19 (1769-L32E).
Sounds like the time should be accurate enough for the purpose I am using it for. The program is only 209.6k. I may need to sample over a longer period and average to get a more stable result.
 
I recommend using the Coordinated System Time (CST) timestamp to determine the precise time between logic executions. It's in microseconds, so it's the best precision you have in the CompactLogix.

Use a GSV instruction to read that value (two DINTs, but you'll only need the lower one) in subsequent executions of your distance measuring logic and subtract to get the difference.
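As a sketch of that subtraction step, assuming you latch the lower CST DINT into a tag on each execution (the function name here is made up): the lower DINT wraps roughly every 71.6 minutes, so masking the difference to 32 bits keeps the delta correct across a wrap.

```python
def cst_delta_us(prev_lower, curr_lower):
    """Microseconds between two readings of the CST lower DINT.

    The lower DINT counts microseconds and wraps every 2**32 us
    (about 71.6 minutes). Masking the difference to 32 bits gives
    the correct delta for any interval shorter than one wrap.
    """
    return (curr_lower - prev_lower) & 0xFFFFFFFF
```

In controller logic the same effect falls out of ordinary DINT (two's-complement) subtraction, provided the interval stays well under one wrap period.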

What type of device are you using to get the distance measurements? There may be some measurement delay or other sampling interval issue you need to account for.
 
There are still a lot of time delays in a GSV.
Getting back to basics.
Why do you need the speed? What is the application?
What is the resolution of the encoder and of the timer? Ken said the timer resolution can be as fine as 1 microsecond.
At what speed is the system moving?
What kind of resolution is required?
 
I have a flow control that needs to be adjusted the same for both sides of the valve. Pretty simple scenario used to aid maintenance in setting the motion speed. The speed of the object is roughly 1" every 3 seconds.

The feedback is 4-20 mA from a linear position transducer (Temposonics 24"). The RPI for the analog input card is 80 ms.

I was sampling the position every second, using a timer .DN bit to trigger a ONS into a counter. Used math to calculate the IPM and reset the average value every 5 seconds.

When I set it up to sample IPM every second I do not get consistent values. I should set up a trend to see what the position looks like, in case there is some stiction between the magnet and the slide or some other jerking I cannot perceive.
 
Measuring slow motion is the most difficult motion to measure accurately.
First, NEVER USE TIMERS.
Use the system clock.

When you get a reading it may be only a millisecond old, or it could be 80 milliseconds old. At 1/3 of an inch per second that is an uncertainty in the position measurement of at least 0.026 inches, and that doesn't take other delays into account. 0.026 in over 5 seconds is an error of about 0.005 inches per second, which isn't bad. The resolution of the feedback is about 0.00146 inches per count, assuming 16384 counts over 24 inches, so the quantizing error is 0.00146 inches over 5 seconds, or about 0.0003 inches per second. The bigger problem is whether the time interval is really and consistently 5 seconds; it probably is not, and that too will generate noise. If the scan time is significant, you should redo the calculations above, only dividing by 5 seconds plus one scan time. Now you have an idea of how much the speed calculation can change, and that doesn't take analog noise into account.
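The error-budget arithmetic above can be checked in a few lines. The inputs (1 inch per 3 seconds, 80 ms RPI, 16384 counts over 24 inches, a 5-second interval) are the figures from this thread:

```python
# Back-of-the-envelope error budget for the slow-speed measurement.
speed = 1.0 / 3.0          # in/s: 1 inch every 3 seconds
rpi = 0.080                # s: analog input RPI (sample can be this old)
window = 5.0               # s: averaging interval
counts = 16384             # assumed full-scale counts
stroke = 24.0              # in: transducer stroke

sample_age_error = speed * rpi           # position uncertainty, inches
speed_error = sample_age_error / window  # resulting speed error, in/s

resolution = stroke / counts             # inches per count
quantizing_error = resolution / window   # in/s over the 5 s window

print(round(sample_age_error, 4))    # 0.0267 in
print(round(speed_error, 4))         # 0.0053 in/s
print(round(quantizing_error, 6))    # 0.000293 in/s
```

The sample-age term dominates the quantizing term by more than an order of magnitude, which is why the timing source matters more here than the transducer resolution.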

Always use SSI Temposonics transducers. The resolution can be as fine as 1 micron, which really helps when calculating slow speeds. There are Temposonics rods that will return both the position in counts and the velocity in counts per second. The time base of the FPGA or DSP inside the Temposonics rod is much better than what is available on a PLC.

I am not sure you really need the speed if you simply want two actuators to follow each other.
 
I don't need the axis to follow one another. The problem is the system has bang bang valves and I attempted to position the axis using them. If the flow controls get changed I need a way to get them back to where they need to be.

This is a crude setup, to say the least. Without proportional control it is impossible to reach a desired position exactly each time. However, the machine is run manually to a target position by the operator.

I wrote some logic that will get within .040" of position using bang-bang valves. The operator will just jog in the desired direction to fine tune the position. The problem comes when the flow control is set too high and a control bump overshoots the target from either side of the current position.

The proper way to do this will cost 15K and not really make the company more money short term.

So, hence the question about getting a consistent velocity reading: the flow controls could then be set within a range through a procedure of moving the axis manually, reading the value, and adjusting the flow control until it matches.
 
Why not just turn the valve off when it gets close to the set point? When I used bang-bang valves 30 years ago we simply turned off the valves when they got close to the set point.

I sell hydraulic motion controllers. Computing the actual speed is very difficult using a motion controller and much more difficult using a PLC.
Why not simply compute the speed over the whole stroke? The shorter the time period you try to calculate speeds over, the more "noise" you will get. Over the whole stroke the quantizing, jitter and resolution problems will be less significant.

If you want to get fancy you can try a low-pass filter on the speed. It may give you a better value, but since you aren't using the actual speed for control it would probably be overkill.
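For reference, a single-pole low-pass (exponential) filter of the kind mentioned is only a couple of lines. The smoothing factor below is an illustrative choice, not a recommendation:

```python
def make_lowpass(alpha):
    """Single-pole low-pass filter: y += alpha * (x - y).

    alpha is between 0 and 1; smaller values filter harder.
    """
    state = {"y": None}

    def step(x):
        if state["y"] is None:
            state["y"] = x  # seed the filter with the first sample
        else:
            state["y"] += alpha * (state["y"] * -1 + x) if False else alpha * (x - state["y"])
        return state["y"]

    return step

# Feed each raw speed sample through the filter once per scan, e.g.:
filt = make_lowpass(0.1)  # alpha = 0.1 is an arbitrary example value
```

In ladder or structured text the same filter is a single line per scan: FilteredSpeed := FilteredSpeed + Alpha * (RawSpeed - FilteredSpeed).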
 
I do turn off the valve when it reaches set point. Even better I calculate the overshoot and stop short so it hits position. The time it takes the spool to shift, the temp of the fluid, the.. the.. the.. all have an effect on the final position.

In order for the machine to work properly the position needs to be within .003 (from what the operator tells me).

The reason I wanted to see the velocity was so that my offset to stop early was more consistent. I have added another function that calculates the offset needed based on the previous auto move so the next is close to spot on.
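That offset-from-the-previous-move idea can be sketched as a one-line correction. The names and the 0.5 gain are assumptions for illustration, not the actual logic:

```python
def next_offset(prev_offset, target, actual, gain=0.5):
    """Adjust the stop-early offset from the last move's result.

    If the axis stopped past the target, stop earlier next time by a
    fraction (gain) of the error; a gain below 1 avoids chasing noise
    from move to move. Signs assume a move in the positive direction.
    """
    overshoot = actual - target
    return prev_offset + gain * overshoot
```

Keeping a separate offset for each direction of travel is just two instances of this, since the two directions can behave quite differently.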

I will play with it tomorrow and see what the trend looks like, reduce the RPI, and use the system clock to see what I can come up with. It's not a do-or-die situation, just something to make it better than what we currently have.


Again, the only reason I wanted the velocity was so that if the speed needed to be set between a range maintenance could look at the panelview, jog the axis and adjust the speed to match a range of operation.
 
Look at the execution time of the task in which your logic resides. You may want to use a periodic instead of continuous task so you can predict a repeatable time for the task.

"I have added another function that calculates the offset needed based on the previous auto move so the next is close to spot on."

This should help you for two reasons. First, it should tell you how repeatable your control is from end to end. If you don't apply the offset and just look at the results over several operations and find that the calculated offset is inconsistent, you will need to nail down why before trying to improve it. If you apply the calculated offset after every operation without this foreknowledge you may end up chasing your tail. Also, it will probably be useful to have that logic duplicated for each direction if you have not already done so. There are a large number of reasons why one direction may behave differently and many of them will be beyond your control, but can be measured and adjusted separately.
 
Just FYI, TON timers on Compact and ControlLogix accumulate time by looking at the time now and the time last scan and rounding the difference to the nearest ms. That means that every single scan of a TON on a Rockwell system can accumulate error of up to half a ms. The accumulated time on the TON can get really far off from the actual time passed very quickly if you are executing the TON quickly.

On Rockwell, for anything that runs over a long period of time or needs good precision, you should be reading the clock values yourself. Most other platforms implement the TON function by storing a start time and constantly checking the current time against it; a much more accurate way of doing things, and essentially what you need to do manually to get good timing on a CompactLogix.
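A toy model of the difference, assuming the per-scan rounding described above (this models only the rounding behavior, not the controller's actual implementation):

```python
def ton_vs_clock(scan_us, scans):
    """Model a round-each-scan accumulator versus subtracting a stored start time.

    scan_us: simulated scan time in microseconds (constant, for clarity).
    Returns (accumulated_ms, true_elapsed_ms).
    """
    now_us = 0
    start_us = now_us  # clock-based method: just remember the start
    ton_ms = 0
    for _ in range(scans):
        now_us += scan_us
        ton_ms += round(scan_us / 1000)  # accumulator: round this scan's delta to ms
    clock_ms = (now_us - start_us) / 1000  # clock method: exact to the microsecond
    return ton_ms, clock_ms
```

With a steady 400 µs scan the rounded accumulator never advances at all while 400 ms of real time passes; with a steady 600 µs scan it reads 1000 ms when only 600 ms have elapsed. The clock-subtraction method is exact in both cases.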
 
