JLand
Member
Hi folks,
I have a situation where I need to read a digital input and output a digital signal based on this input plus some delay.
For example, the input will go high for just 1 second. 20 seconds after it first went high, my output will go high for just 1 second.
The delay will be an HMI setpoint, and the digital input may be on for any amount of time (it could be longer or shorter than the delay). The input may also cycle on and off multiple times while the delay timer is running.
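To make the timing I'm after explicit, here's a rough sketch (illustrative Python, not PLC code; on the L72 this would be ladder or structured text, and all names here are just placeholders). Each rising edge of the input gets queued, and the output pulses once per edge after the delay:

```python
from collections import deque

class DelayedPulse:
    """Each rising edge of the input schedules a one-shot output pulse
    `delay` seconds later. Edges that occur while the delay is still
    running are queued (FIFO), so none are lost."""

    def __init__(self, delay=20.0, pulse_width=1.0):
        self.delay = delay              # HMI setpoint, seconds
        self.pulse_width = pulse_width  # output on-time, seconds
        self.prev_input = False
        self.pending = deque()          # timestamps of rising edges
        self.pulse_end = None

    def update(self, input_state, now):
        # Detect a rising edge and remember when it happened
        if input_state and not self.prev_input:
            self.pending.append(now)
        self.prev_input = input_state

        # When the oldest edge's delay has elapsed, start a pulse
        if self.pending and now - self.pending[0] >= self.delay:
            self.pending.popleft()
            self.pulse_end = now + self.pulse_width

        # Output is high while a pulse is active
        return self.pulse_end is not None and now < self.pulse_end
```

Called once per scan with the input state and current time, this keeps the output independent of how long the input stays on or how many times it cycles.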
What's the best way to do this?
For reference, I am running on an AB L72.
Thanks!