I have a question for any boffins out there who can possibly help me.

I am running two PLC-controlled motors via Mitsubishi FR520 inverters. Both motors are 0.5 HP and identical in make (ABB), specification, and inverter settings. Each is attached to a 10:1 gearbox driving a belt between two identically sized pulleys, so the two systems are physically separate but mechanically identical.

As it is a fairly low-torque application, I wired both motors in star. Both inverters showed a running current of 0.3 A. All fine, except that on testing the torque of each motor (by pressing my fingers against the belts), one could not be stalled by hand while the other could be stalled quite easily. On measuring the voltage of each phase at each motor, the readings were all identical. What am I missing that could cause this difference in output torque?

(To work around the problem I changed one motor to delta configuration, but this brings the current up to 0.7 A and runs the motor hotter than I would like, at about 40 °C.)

Any ideas, or have I missed something obvious?
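For anyone following along, the star-to-delta change I mention can be sketched with a quick back-of-the-envelope calculation. This is only illustrative, and the 400 V line-to-line supply is an assumption of mine, not a figure from my setup; substitute your actual supply voltage:

```python
import math

V_LINE = 400.0  # assumed line-to-line supply voltage (V); not from the actual installation

# In star (wye), each winding sees the line voltage divided by sqrt(3);
# in delta, each winding sees the full line voltage.
v_phase_star = V_LINE / math.sqrt(3)
v_phase_delta = V_LINE

# At a given slip, induction-motor torque scales roughly with the square of
# the winding voltage, so delta gives about 3x the torque of star.
torque_ratio = (v_phase_delta / v_phase_star) ** 2

print(f"Winding voltage in star:  {v_phase_star:.1f} V")
print(f"Winding voltage in delta: {v_phase_delta:.1f} V")
print(f"Approximate torque ratio delta/star: {torque_ratio:.2f}")
```

The same square-law relationship explains the higher running current (0.3 A to 0.7 A) and the extra heat I saw after rewiring to delta.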