Hi Guys,
I have a good enigma for you. I am having a brain **** at the moment and can't get my head around this logic. Let me give some background.
I have a chain oiler that sprays on every chain link, using an absolute encoder for position (the encoder is there for other things, but it was handy for this application as well). I give the oiler logic a start position, and it calculates the next 10 spray positions from that. The problem comes when the chain speed is changed: even though the encoder follows the speed, the mechanical side (solenoids, etc.) still takes the same amount of time to actuate, so the spray lands in a different spot, and if no one checks, the chain can run for days without proper oiling.
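To make it concrete, here is roughly what that calculation does. It's just a Python sketch of the math (the real thing is in the PLC), and every name and number in it is a placeholder I made up:

```python
# Minimal sketch of how the spray positions come from the start position.
# chain_pitch_mm, chain_length_mm, etc. are made-up placeholders, not the
# real tags or values in the PLC.

def spray_positions(start_pos_mm, chain_pitch_mm, chain_length_mm, count=10):
    """Return the next `count` spray positions, one per chain link,
    wrapped around the chain length the way the absolute encoder rolls over."""
    return [(start_pos_mm + i * chain_pitch_mm) % chain_length_mm
            for i in range(count)]

# Example with made-up numbers: links every 50 mm on a 2000 mm chain,
# start position near the rollover point
print(spray_positions(start_pos_mm=1980.0, chain_pitch_mm=50.0,
                      chain_length_mm=2000.0))
# -> [1980.0, 30.0, 80.0, 130.0, 180.0, 230.0, 280.0, 330.0, 380.0, 430.0]
```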
My idea was to measure how much the spray position shifts from the slowest speed to the fastest (about 10 mm) and feed that into a scale-with-parameters instruction to get an amount to add or subtract from the start position, so it adjusts itself automatically. What I'm stuck on is working out when to add and when to subtract. I tried a data-transition instruction that goes true when the speed value changes, but how to make the logic subtract when the chain slows down and add when it speeds up is slipping by me. I'm probably thinking too deep into it, and having a lot of other startup work going on at the moment isn't helping either.
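Roughly what I'm picturing is something like this (again just a Python sketch with placeholder names and numbers; the sign on the offset is exactly the part I can't settle):

```python
# Sketch of the adjustment idea: scale the chain speed into a 0..10 mm
# offset and apply it to the base start position. All names and numbers
# are placeholders, not real PLC tags.

def scale(value, in_min, in_max, out_min, out_max):
    """Linear scale-with-parameters, like an SCP instruction."""
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

def adjusted_start(base_start_mm, speed, min_speed, max_speed,
                   max_shift_mm=10.0):
    """Scale the current chain speed into a shift of 0..max_shift_mm and
    apply it to the base start position. Whether it should be added or
    subtracted (lead vs lag) is the part I haven't worked out."""
    offset_mm = scale(speed, min_speed, max_speed, 0.0, max_shift_mm)
    return base_start_mm + offset_mm   # or - offset_mm, depending on direction

# Example with made-up numbers: base start at 250 mm, speed range 10..100
print(adjusted_start(250.0, speed=75.0, min_speed=10.0, max_speed=100.0))
```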