Steve Bailey
Lifetime Supporting Member + Moderator
I ran into an interesting situation today, and its resolution is puzzling me.
My client has a 5 HP, 460 VAC motor with an Omron 3G3JV drive. That's a basic volts/Hz (scalar) drive. We've known for a long time that the motor size is marginal for the application. The motor nameplate is 6.5 FLA and it's been drawing around 6.2 - 6.4 amps in operation. In fact, there have been times when the maintenance person has had to bump up the parameter in the drive that sets the max allowed current in order to get a complete production run. A typical production run is about 1.5 - 2 hours long and they generally only make one run per day, occasionally two. Motor speed ranges from 650 to 1000 RPM. It's constant for each production run. Different products are produced at different speeds.
Due to some changes in the product they're producing, they need to reduce the line speed. They found that the current remains reasonably constant as they reduce the speed down to around 20 Hz, but below that it starts to rise. The lower the speed, the higher the current. There doesn't appear to be any reason to believe that the required torque increases at lower speed.
As a result of the increased current draw, they couldn't slow down the line speed as much as they wanted without having the drive trip out on a timed overcurrent fault. The drive was set for max allowed current equal to nameplate current.
In an attempt to keep the current constant over a wider range, we decided to try to play with the slope of the Volts/Hz curve. On this drive it's done by setting two pairs of parameters. The first pair sets the high frequency and corresponding voltage. This was set at 60 Hz, 460 volts, which was the default setting. The second pair sets a low frequency and corresponding voltage. These were also at the default settings of 1.5 Hz, 24 volts.
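To make the curve concrete, here's a minimal Python sketch of how a two-point V/Hz curve works, assuming a simple linear ramp between the two user-set points. I don't know exactly how the 3G3JV interpolates internally, so the function and parameter names here are purely illustrative, not actual drive registers:

```python
# Minimal sketch of a two-point V/Hz curve, assuming the drive
# interpolates linearly between the low and high points.
# Names and defaults are illustrative, not 3G3JV registers.

def commanded_voltage(freq_hz, f_low=1.5, v_low=24.0, f_high=60.0, v_high=460.0):
    """Output voltage the drive would command at freq_hz."""
    if freq_hz <= f_low:
        return v_low
    if freq_hz >= f_high:
        return v_high
    slope = (v_high - v_low) / (f_high - f_low)  # volts per Hz
    return v_low + slope * (freq_hz - f_low)

for f in (5, 10, 15, 20, 25, 30):
    print(f"{f:>2} Hz: {commanded_voltage(f):6.1f} V (24 V floor)   "
          f"{commanded_voltage(f, v_low=6.0):6.1f} V (6 V floor)")
```

Under that assumption, lowering the 1.5 Hz setting from 24 V to 6 V drops the entire low-frequency end of the curve, with the effect shrinking as you approach 60 Hz.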
My initial thought was to increase the voltage setting at the low frequency. My reasoning was that increasing the voltage at any given frequency would produce a corresponding decrease in current at that frequency. It didn't work out that way. Increasing the voltage setting at 1.5 Hz resulted in higher current. Decreasing it resulted in lower current. We reduced the setting to 6 volts at 1.5 Hz and got much less variation in current all the way down to 5 Hz, which is far slower than we need to go.
Current draw before the change:
30 Hz, 6.0 A
25 Hz, 6.1 A
20 Hz, 6.3 A
15 Hz, 7.1 A
10 Hz, 8.1 A
After the change:
30 Hz, 5.6 A
25 Hz, 5.5 A
20 Hz, 5.3 A
15 Hz, 5.6 A
10 Hz, 5.4 A
5 Hz, 5.1 A
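Just to put numbers on the two curves: motor flux is roughly proportional to V/f, and the rated ratio for this motor is 460 V / 60 Hz, or about 7.67 V/Hz. Here's a rough back-of-envelope comparison, again assuming the same linear interpolation as the sketch above:

```python
# Rough check: compare each curve's V/f ratio against the rated
# 460 V / 60 Hz ratio. Assumes linear interpolation between the
# two user-set points, same as the earlier sketch.

def v_out(f, v_low, f_low=1.5, v_high=460.0, f_high=60.0):
    return v_low + (v_high - v_low) / (f_high - f_low) * (f - f_low)

RATED_V_PER_HZ = 460.0 / 60.0  # about 7.67 V/Hz

for f in (5, 10, 15, 20):
    print(f"{f:>2} Hz: {v_out(f, 24.0) / f:5.2f} V/Hz with the 24 V setting, "
          f"{v_out(f, 6.0) / f:5.2f} V/Hz with the 6 V setting "
          f"(rated {RATED_V_PER_HZ:.2f})")
```

If that assumption holds, the default curve runs above the rated V/f ratio below roughly 20 Hz, and the gap widens as frequency drops, which lines up with where the current started climbing.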
The change that worked seems counter-intuitive to me. Does anybody have an explanation?
More important, have we put the motor or drive at risk as a result of the change?
One additional change we will be making soon: the person who originally installed the machine never pulled the wires from the motor's thermal switch back to the drive cabinet. We will be doing that and wiring the thermal switch in series with the run contact so that the drive shuts down if the motor actually overheats.