chantecler
Member
Hi,
I modified the structure of a program (let's call it "app") in a CompactLogix 1769-L30ER because of high variation in the output timing. The eight outputs should be alternately on for 250 ms periods, but I was seeing variations of up to 20%.
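To put that in absolute terms, here is the arithmetic as a simple scratchpad (the 20% figure is the worst case I observed):

```python
# Rough arithmetic for the output-timing variation described above.
period_ms = 250.0   # intended on-time per output
variation = 0.20    # observed worst-case variation (20%)
jitter_ms = period_ms * variation
print(f"worst-case deviation: {jitter_ms:.0f} ms")  # 50 ms
```

So an output that should hold for 250 ms could be off by roughly 50 ms.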
Originally, the app consisted of a single continuous task with one main program containing two routines.
To make a long story short, I split the app into two periodic tasks, each with one program containing one routine. The resulting behavior is very satisfactory.
These are the readings I'm getting from Logix Designer:
Task 1
Period: 1 ms
Priority: 4
Watchdog: 500 ms
Max. measured scan time: 0.520 ms
"Std." measured scan time: 0.120 ms
Interval times: 0.630 to 1.370 ms
Program's max scan time: 90 µs
Program's "Std" scan time: approx. 55 µs

Task 2
Period: 40 ms
Priority: 10
Watchdog: 500 ms
Max. measured scan time: 0.740 ms
"Std." measured scan time: 0.400 ms
Interval times: 39.700 to 40.300 ms
Program's max scan time: 90 µs
Program's "Std" scan time: approx. 70 µs
My question is: why does Task 2 use 0.400 ms when the only program in it takes only 70 µs?
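To put numbers on what puzzles me, here is the gap as plain arithmetic (using the task and program figures from the readings above; calling the remainder "overhead" is my own interpretation, not something Logix Designer reports):

```python
# Difference between the task-level and program-level average scan times (Task 2).
task2_avg_ms = 0.400   # "Std." measured task scan time
prog2_avg_ms = 0.070   # program's "Std" scan time (approx. 70 us)
gap_ms = task2_avg_ms - prog2_avg_ms
print(f"unaccounted per-scan time: {gap_ms * 1000:.0f} us")  # 330 us
```

That is about 330 µs per scan that the program itself doesn't explain.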
I also have these processor settings:
System Overhead Time Slice: 10%
Unused System Overhead Time Slice: reserved for system tasks
I also reduced the RPIs of the three I/O modules from 20.0, 20.0, and 5.0 ms to 5.0, 0.5, and 0.5 ms.
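As a rough sanity check on what that RPI change does to the I/O traffic (assuming each connection exchanges packets at about 1/RPI, which is my simplification, not an exact figure for these modules):

```python
# Approximate packets-per-second implied by the RPI settings, per direction.
old_rpis_ms = [20.0, 20.0, 5.0]
new_rpis_ms = [5.0, 0.5, 0.5]

def pps(rpis_ms):
    # Each connection produces one packet roughly every RPI milliseconds.
    return sum(1000.0 / r for r in rpis_ms)

print(f"old: ~{pps(old_rpis_ms):.0f} pps, new: ~{pps(new_rpis_ms):.0f} pps")
# old: ~300 pps, new: ~4200 pps
```

So the change increased the I/O packet rate by more than an order of magnitude, which I assume adds to the controller's communication load.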
I’m not quite clear about this.
Thank you