RatherBeFishing
Member
Hi guys, long-time lurker, first-time poster.
As a little background, I've worked in industrial controls and electronics for over 5 years now, but I'm starting to move to the engineering side of the room after my last 4 years of mostly breakdown work. So I'll likely post a lot, since I like to learn. I'd rather become good at this and have a less stressful life than struggle and live with the anxiety of my job.
Anyhow, I have an elevator-type application where I cannot auto-tune the ASR (automatic speed regulation) of a Yaskawa A1000 drive. It is in Closed Loop Vector control. The application was set up for the drive to hold a center position (if the process moves off the 0V reference, the drive runs it back toward 0V; while at 0V, it simply holds position at whatever current that takes).
Well, I ended up settling on two different sets of ASR constants (P gain and integral time): one for high speed (over 10 Hz) and one for low speed (under 10 Hz). The sets are close, but with a little more gain and a little shorter integral time for the over-10 Hz regulation.
ASR P gain 1 = 53
ASR I time 1 = 0.421 seconds
ASR P gain 2 = 48.65
ASR I time 2 = 0.495 seconds
How do I know when it's tuned for the best performance? The process is doing well, but I'm getting occasional oscillations. If I turn down the gain, the oscillations settle out, but the response time suffers too much; same if I turn the integral time up instead of down. I try to keep the current to a minimum, but it spikes during the oscillations.
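To picture the trade-off I'm describing, here's a toy PI speed-loop simulation (this is NOT the A1000's actual control law; the plant values — inertia, friction, a 20 ms feedback delay — are all made up for illustration). With conservative constants the step response is smooth but slower; with aggressive gain and a short integral time it reacts fast but starts to ring, which is roughly what I'm seeing:

```python
def simulate(kp, ti, steps=2000, dt=0.001, delay_steps=20):
    """Discrete PI speed loop on a crude inertia load with feedback delay.

    kp: proportional gain; ti: integral time (s). All plant values assumed.
    Returns the per-unit speed trace for a unit step in the reference.
    """
    J, b = 0.05, 0.1                 # assumed inertia and viscous friction
    ref = 1.0                        # step speed reference (per unit)
    speed, integral = 0.0, 0.0
    history = [0.0] * delay_steps    # models encoder/drive reporting delay
    trace = []
    for _ in range(steps):
        err = ref - history[0]       # controller sees delayed feedback
        integral += err * dt
        torque = kp * err + (kp / ti) * integral   # series-form PI
        speed += (torque - b * speed) / J * dt     # Euler step of J*dw/dt = T - b*w
        history = history[1:] + [speed]
        trace.append(speed)
    return trace

soft = simulate(kp=1.0, ti=0.5)    # conservative: slower but smooth
hot  = simulate(kp=5.0, ti=0.1)    # aggressive: fast but rings around the setpoint
```

The `soft` trace settles cleanly at the reference, while the `hot` trace overshoots and oscillates — the same behavior as turning my ASR gain up and integral time down too far.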
Also, I can hear the drive making a clicking noise, which appears to happen when it switches between forward and reverse near the zero reference — sometimes several clicks per second. How bad is this?
I simply don't want to shorten the life of the motor or let the process sag. What are my options?