Off topic: Cycle time improvements....

Borte

Hello guys!

Just curious on your input on this issue....

Why does it always seem to cause instability and problems on a machine when you start decreasing the cycle time...

I have worked on optimizing a lot of machines, and I have yet to see one that did not develop some kind of problem when the cycle time goes down. I get timing problems, synchronisation problems, statuses not updating the way they are supposed to... Everything, basically...
To me this tells a tale of bad design or "quick" programming to save some hours during production of the machine. It's as if the attitude was: let's not care about this issue right now, someone else can have that problem...

So I'm just curious whether any of you have experienced the same issues, or if it's just me who has been "unlucky"...

Regards
Borte
 
I think you're unlucky.

It really boils down to the way the machine was programmed. If you have a good program, timers don't matter. I rarely use timers except when they are genuinely needed.

I see a lot of guys use timers to fix "scan time" problems or "interlock" problems. Those are the kinds of programs that will do what you have experienced.

I am going to side with the rest: you're just unlucky, and I bet the programs you're working in could use some help.
 
Borte said:
Why does it always seem to cause instability and problems on a machine when you start decreasing the cycle time...

Borte,
I agree with Chakorules... you are awfully unlucky... like many of us. I also dislike seeing timers in programs to solve problems, but sometimes we aren't lucky enough to be the ones developing the program. Depending on the type of power source used for the machine, i.e., air, oil, mechanical... all kinds of timing issues can arise due to fit-up of machine members, tension of chains and so on and so on. Most times that I've seen these problems were when a project was handed down AFTER the design of the machine mechanics was completed and the programmer suddenly had his turn to 'make it right'. So, as they say... been there, done that. So IMO, chalk it up to being unlucky... unlucky enough to have to straighten out someone else's mess. Don't feel alone on this.

Bob
 
I agree with Chako. If someone is using timers to ensure proper machine sequence, or if speeding up the machine causes sequence problems, then something is wrong with the code. Typically, when I program a machine, I will also program a simulator that "runs" the machine at much higher speeds using virtual I/O just so I can "harden" my code and verify the sequences. In no case should the code not work on the real machine from a logical perspective.
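To make that concrete, here is a rough sketch in Python rather than any PLC dialect (the cylinder, names and timings are all made up for illustration, not taken from an actual machine): a two-step sequence scanned against a virtual cylinder whose stroke time can be shrunk, so the logic can be exercised at much higher speed than the real machine before it ever sees real I/O.

# Minimal sketch: a sequence state machine hardened against a virtual machine
# whose actuation delays can be scaled down. All names are hypothetical.

class VirtualCylinder:
    """Simulated cylinder: position follows the command after a delay."""
    def __init__(self, stroke_time):
        self.stroke_time = stroke_time      # seconds to complete a stroke
        self.cmd_extend = False
        self._pos = 0.0                     # 0.0 = retracted, 1.0 = extended

    def update(self, dt):
        direction = 1.0 if self.cmd_extend else -1.0
        self._pos = min(1.0, max(0.0, self._pos + direction * dt / self.stroke_time))

    @property
    def extended(self):  return self._pos >= 1.0
    @property
    def retracted(self): return self._pos <= 0.0


def scan(state, cyl):
    """One PLC-style scan of a two-step sequence; returns the next state."""
    if state == "EXTEND":
        cyl.cmd_extend = True
        return "RETRACT" if cyl.extended else "EXTEND"
    if state == "RETRACT":
        cyl.cmd_extend = False
        return "DONE" if cyl.retracted else "RETRACT"
    return state


def run_cycle(stroke_time, scan_time=0.01):
    cyl, state, t = VirtualCylinder(stroke_time), "EXTEND", 0.0
    while state != "DONE" and t < 10.0:
        state = scan(state, cyl)            # logic scan
        cyl.update(scan_time)               # virtual machine moves
        t += scan_time
    return state, t


# The sequence must finish correctly whether the virtual machine is slow or fast.
for stroke in (0.5, 0.1, 0.02):
    final, t = run_cycle(stroke)
    print(f"stroke={stroke:>5}s -> {final} after {t:.2f}s simulated")

If the sequence only completes at some speeds and not others, the problem is in the logic, not the machine, and it gets caught before commissioning.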

That isn't to say that timers are never necessary. For instance, I might use a timer to allow a pneumatic cylinder to come to a full stop if there is a hydraulic slowdown damper in place, or something like that. But even then, the timer itself (or speeding up or slowing down of the cylinder) will not affect the code. "Tweaking" the machine setup to decrease the machine cycle time may still affect quality, but it won't affect the code in the machines that I program.

Hey Borte, by the way, I'm still trying to figure out the problem you had with SFC15. I have some downtime on a machine this weekend, and I'll give it a shot.
 
I have to add one caveat to what I said above: there is a limit to how much the machine cycle time can be reduced, even with a good program. I recently wrote the program for a machine that had 87 Profibus nodes (including 52 AC drives), and the Profibus process image updates took 17 ms. There were a couple of inputs that were on for only 20 ms, and if the scan time and I/O update were out of sync just right, I would miss random input transitions (I found a solution to this using a different sensor). So, if someone had a machine with, for instance, a slowdown reed switch mounted on a cylinder, it's very possible that if someone sped up the cylinder too much, the PLC would miss the input entirely. Even then, a good programmer would sense this condition and throw up a warning or fault that the input was not detected.
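To illustrate that caveat with made-up numbers (a 17 ms update against a 20 ms pulse, and then a shorter one as the cylinder is sped up; nothing here is from the actual machine):

# Sketch: whether a short pulse is seen depends on where the I/O update lands.

def pulse_seen(pulse_start, pulse_len=0.020, update_period=0.017, phase=0.0):
    """True if at least one I/O update falls inside the pulse window."""
    t = phase
    while t < pulse_start + pulse_len:
        if pulse_start <= t:
            return True
        t += update_period
    return False

# Sweep the phase between the I/O update and the pulse: every offset catches
# the 20 ms pulse, but a 10 ms pulse (sped-up cylinder) starts getting missed.
for length in (0.020, 0.010):
    misses = sum(not pulse_seen(1.0, length, 0.017, p * 0.001) for p in range(17))
    print(f"pulse {length*1000:.0f} ms: missed in {misses}/17 tested phases")

That is why, as said above, the code should supervise the expected input and raise a warning or fault when it never arrives, instead of hanging silently.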
 
As far as the timer adjustment thing goes... When I first started touching the "computer type thing" in an attempt to make our machines work better, I just changed the values that the previous programmer had put into the program. Since I had lots of time on my hands, and no one else was interested in trying to change things, I paid careful attention to how the machine responded (in the long term) with different timer values. The end result was a set of timer values for all 10 machines that gave the best results for a variety of conditions.
NOT THE BEST WAY TO FIX THINGS.......
But the only option available to me at the time....
Now that I have a little bit more experience.... I try to fix the basic mechanical issues that pop up, instead of adjusting for them... but the same values are still being used for 10 different machines.
 
Borte said:
To me this tells a tale of bad design or "quick" programming to save some hours during production of the machine. It's as if the attitude was: let's not care about this issue right now, someone else can have that problem...

I think you hit the nail on the head there. I think there's a reason those who have already responded to your post don't see the problems you've described: they've already engineered them out of the system. So what if that limit switch makes sooner? That just means my permissive circuit is fired up a bit earlier.

Unfortunately, my experience has been more like yours. The equipment was really designed on the fly (there is a very important difference between a "designer" and an "engineer"). So, programming fixes were made to compensate for mechanical issues. That WILL cause problems when you start to speed things up! If the machine is not mechanically sound, actions/reactions will change as you push it. PLC code with timers that "simulate" sensors will definitely give you heartburn!

AK
 
I have seen cases where "improving the scan cycle" meant turning subroutines on and off at different times, and unless the program was written carefully for such use, it can cause many unexpected results, with subroutines left sitting in limbo.
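A toy example of what that limbo looks like (a generic pseudo-scan in Python, not any vendor's subroutine or MCR semantics, with made-up tag names):

# When a routine is skipped to "save scan time", its outputs simply keep
# whatever value they had on the last scan in which the routine actually ran.

io = {"part_present": True, "clamp": False}

def clamp_routine():
    # Output follows the input only while this routine is actually scanned.
    io["clamp"] = io["part_present"]

def scan(routine_enabled):
    if routine_enabled:
        clamp_routine()
    # If it is disabled and nothing clears io["clamp"], the clamp output is
    # frozen even though part_present may have changed.

scan(routine_enabled=True)
io["part_present"] = False          # the part is removed...
scan(routine_enabled=False)         # ...but the routine is no longer scanned
print(io["clamp"])                  # still True: a stale, unexpected output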
 
It's interesting to see that everybody has had some kind of experience with this issue...

What I usually end up doing is taking whatever is in the program, throwing it out and restarting... This is usually a lot of work, but at least you're getting the wanted decrease in cycle time, and that's what it's all about.

I don't think the problem is usually with timers (at least not in the programs that I have seen); I think it's more of a scan problem. But if it were programmed properly this wouldn't happen, as pointed out earlier. There's no reason the machine should behave "funny" when going faster... What happens when the mechanics get worn... or what could happen if a sensor breaks down... or what happens if... Scary thought...

If I do find a program that's being controlled by timers... I would either call the company that made the program and have them reprogram it properly, or reprogram it myself. This of course depends on warranty issues and so on, but you usually have fewer warranty issues with a program that's behaving properly than with one that's "just" behaving.

Regards
Borte
 
Being the Devil's advocate...

Are you talking about decreasing scan time purely for the sake of a decreased scan time, or are you talking about increasing a machine's throughput? If you are talking about increasing a machine's throughput, are you talking about going past the original design throughput or simply going faster inside the design envelope?

I would agree that simply decreasing scan time should not cause odd things to happen. I also agree that simply increasing speed inside the machine's designed speed envelope should not cause any issues. The machine was spec'ed and purchased at some max speed; it should run at that speed.

However, IMHO, if you are trying to operate the machine outside its design envelope, I think all bets are off. How much additional capacity should an OEM build into a machine? It was sold at 500 cycles/minute. Should it be designed to do 750 cycles/minute? I don't think so. The system was designed around a spec. This means sensors, input cards, processors and program throughput were designed around the same spec. Faster sensors, higher-response input cards, faster processors, etc. are not free.

It's like the 1.8 GHz P4 you want to overclock to 2.4 GHz. Sure, some of them will do it. But there are no guarantees. And since you bought the 1.8 GHz processor, you shouldn't be too upset if it can't run at 2.4 GHz.

Keith
 
Hello kamenges!

We are talking about reducing the cycle time of the machine, not the PLC (i.e., the throughput of the machine).

I'm aware that the machine is specified for a certain throughput. What I'm trying to achieve is a faster machine within the limits of a program change or the tweaking of air pressures, movement dampers and so on... It's more a question of optimizing the throughput of the machine, not changing the machine to achieve this.

Usually when you receive a machine it has built-in margins to be "on the safe side of operation" (which usually limits the cycle time). When the machine has been used for a while, the requirements for sequences and processes may change, allowing the machine to be "tweaked". In other words, you analyze the sequences and processes to discover unnecessary steps and remove them. This saves cycle time on a machine without going outside the "physical" design envelope of the machine.
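A rough illustration of the kind of saving that re-sequencing alone can give, with made-up step times and a hypothetical finding that two movements may overlap, without pushing any single movement past its design speed:

# Cycle time saved purely by re-sequencing, not by speeding up any one step.

steps = {"index_in": 0.8, "clamp": 0.3, "drill": 1.5, "unclamp": 0.3, "index_out": 0.8}

serial = sum(steps.values())

# If analysis shows "index_out" of the finished part can overlap "index_in"
# of the next one (a hypothetical finding), only the longer of the two counts:
overlapped = (steps["clamp"] + steps["drill"] + steps["unclamp"]
              + max(steps["index_in"], steps["index_out"]))

print(f"serial cycle:     {serial:.1f} s")      # 3.7 s
print(f"overlapped cycle: {overlapped:.1f} s")  # 2.9 s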

Borte
 
By increasing machine cycle speeds above the design, you have to consider everything about the machine. And you can seriously magnify problems that don't show their faces at normal speeds.

Simple ones usually overlooked include stroke time of pneumatic/hydraulic cylinders, reaction times of proximity switches, possibly exceeding maximum count rates on encoder inputs, inability to move inertial loads as fast as required, developing mechanical resonances... lots of things.
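As a quick back-of-the-envelope example of the encoder point, with purely hypothetical numbers for the resolution and the input card limit:

# Does speeding up the machine push an encoder past the card's max count rate?

pulses_per_rev = 1024          # hypothetical encoder resolution
card_max_hz    = 40_000        # hypothetical high-speed-counter input limit

def encoder_hz(rpm):
    return rpm / 60.0 * pulses_per_rev

for rpm in (1000, 2000, 3000):
    f = encoder_hz(rpm)
    status = "OK" if f <= card_max_hz else "EXCEEDS card limit -> missed counts"
    print(f"{rpm:>4} rpm -> {f/1000:.1f} kHz  {status}")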

Yes, there is often room for tweaking and optimizing, but it's almost never a matter of just increasing the prime mover speed.
 
Borte-

I think I see your point, although I only partially agree with it.
You are correct. There are ways to design software that make it inherently more modifiable (is that a word??). And it sounds like you run into trouble when you try to modify a process or sequence and you break another section of the code as a result. This is most likely caused by one section of code having an undocumented, unseen or inadvertent dependence on the section of code you changed. That is, you are trying to logically modify spaghetti code.

Let me preface this by saying I work for an OEM and am therefore a bit biased. I agree with you that it would be great if we could build all our programs starting with an object model (or some other accepted analysis method) and only make modifications if the model is changed first. That would make your life much easier as you tweak your process to get what you want. You would have well-defined sequence and process interfaces with known (or at least predictable) reactions to changes. The only problem is that the general machinery market has been pushed to be pretty lean in terms of machinery pricing and, consequently, manpower. And in an alarming number of cases the controls/software development function is considered the red-headed stepchild of the OEM engineering group (I hope that translates). It's looked at as a necessary evil, not a revenue source like the design group at Microsoft. So the controls group gets very little time to conceive and implement a detailed design using a solid analysis method. As a result you may get some pretty twisted dependencies in software, born of the need to do too much with too few resources in too little time.

I think you probably take the best route possible in rewriting the sections you need to. Granted, it takes a long time. But it lays a foundation that you can use for future modifications. And if your employer will support this financially, it has a lot of future payoff potential.

Keith
 
I too have worked for an OEM; in fact, I used to work for the OEM that designed the machine I'm currently working on. I'm aware of the mechanical issues and the design limits of this machine. I'm just trying to tweak the last bit of efficiency out of it.

I totally agree that OEMs usually don't have enough time on their hands (at least not the automation department, which is usually the last group of people to work on the machine and therefore has to "catch up" for all the time spent by other departments) to create the design/program that is preferred. But I don't think taking shortcuts to save time makes the process of finishing the machine any faster. Quite the opposite: the debugging time would be longer, since the code would be harder to troubleshoot and debug.
I personally prefer to spend some extra hours on the machine when programming it rather than spending the hours later on troubleshooting.

But as you mentioned, kamenges, this is always a question of money. So if the employer accepts the extra hours, go for it. If not, it's all about creating the best result within the given timeframe.

Cheers
Borte
 
