Here is my application. I have a conveying system on load cells with a variable-position knifegate (4-20 mA). Using a loss-in-weight calculation that updates every 10 seconds, I find the rate in LBS/min.
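For anyone following along, the rate calculation described above can be sketched like this. This is just an illustration of the math, not actual PLC code; the names and the 10-second period are taken from the description above.

```python
# Sketch of the loss-in-weight rate calculation: sample the hopper
# weight every 10 s and convert the weight drop to LBS/min.
# All names here are illustrative.

UPDATE_PERIOD_S = 10.0  # seconds between load-cell samples

def rate_lbs_per_min(prev_weight_lbs: float, curr_weight_lbs: float,
                     period_s: float = UPDATE_PERIOD_S) -> float:
    """Discharge rate from two weight samples taken period_s apart."""
    lost = prev_weight_lbs - curr_weight_lbs  # material conveyed out
    return lost / period_s * 60.0             # lbs/s -> lbs/min

# e.g. a drop from 1005 lb to 1000 lb over 10 s is 30 LBS/min
```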
Most times I would make my own "PID" by finding the error between the setpoint and the actual every 10 seconds or so, then adding or subtracting an arbitrary number to the knifegate analog output to increase or decrease the rate. Obviously this is not a very accurate way to do this.
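The fixed-step scheme described above amounts to integral-only control with a constant step size. A minimal sketch, with assumed names and assumed step/deadband values (the knifegate command is expressed as a percent rather than raw counts):

```python
# "Poor man's PID": nudge the knifegate output by a fixed amount
# toward the setpoint every update. STEP and DEADBAND are arbitrary
# illustrative values, not from the original post.

STEP = 0.5      # fixed increment, percent of knifegate output
DEADBAND = 2.0  # LBS/min error inside which no change is made

def adjust_output(cv_pct: float, setpoint: float, actual: float) -> float:
    """Return the new knifegate command (0-100 %) after one update."""
    error = setpoint - actual
    if abs(error) <= DEADBAND:
        return cv_pct                        # close enough, hold
    cv_pct += STEP if error > 0 else -STEP   # nudge toward setpoint
    return min(max(cv_pct, 0.0), 100.0)      # clamp to valid range
```

Because the step is constant regardless of how large the error is, the loop is slow to recover from big upsets and tends to limit-cycle around the setpoint, which is why a real PID is the better tool here.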
Thus, I have tried to set up a PID in Studio 5000 to control my knifegate. This is the first time I have tried to use the actual PID instruction (my company does not do a lot of closed-loop control). I have the scale, load cell simulator, and PLC on my desk and am trying to test it. However, it is very hard to tell whether the PID is adjusting appropriately as I slowly turn the weight down.
Is the 10-second delay between process variable updates too long for a PID to work? What is the best way to test this?
The PID has default settings except for:
P=5
I=0.0001
D=0
Deadband=4.0 w/ no deadband crossing
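To get a feel for how those gains behave at a 10-second sample period, here is a small positional PID sketch that can be run offline. This is not the Logix PID instruction (whose gain units depend on the independent/dependent equation setting and the loop update time), just an independent-gains approximation, CV = Kp*E + Ki*integral(E) + Kd*dE/dt, for experimenting with the values listed above:

```python
# Minimal positional PID for offline experimentation. Gains and the
# 10 s sample period are taken from the settings above; units are an
# assumption and will NOT match the Logix instruction exactly.

def make_pid(kp: float, ki: float, kd: float, dt: float,
             cv_min: float = 0.0, cv_max: float = 100.0):
    """Return a step(setpoint, pv) -> cv function with internal state."""
    state = {"integral": 0.0, "prev_err": 0.0}

    def step(setpoint: float, pv: float) -> float:
        err = setpoint - pv
        state["integral"] += err * dt                 # accumulate error
        deriv = (err - state["prev_err"]) / dt        # rate of change
        state["prev_err"] = err
        cv = kp * err + ki * state["integral"] + kd * deriv
        return min(max(cv, cv_min), cv_max)           # clamp output

    return step

pid = make_pid(kp=5.0, ki=0.0001, kd=0.0, dt=10.0)
```

Stepping this with a fixed error shows why the loop looks sluggish: with I = 0.0001, the integral contribution grows by only about 0.01 % of output per 10-second update for a 10 LBS/min error, so almost all of the movement comes from the proportional term. It also suggests a test method: log CV and PV every update, step the setpoint, and watch whether CV trends the right way over several minutes rather than judging by eye.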
Thanks