I am currently working on a servo application in which I made a big screwup when specifying the motor. It apparently comes down to inertia mismatch, which is something I can't quite grasp, and that is what I was hoping to get some more information on.
Here are the details...
The application required a small servo motor to spin a dial. The dial is used to flip over some parts that are coming in upside down. The reason for the servo motor is that we need accurate positioning and speed control. The initial design was to direct drive the dial with this motor.
Based on the inertia of the load (dial) and the speed requirements, I figured this was not a problem at all. I worked out the torque requirements for the motor from the total system inertia and the acceleration requirements. That allowed me to pick out a small Y series motor from AB. The rest of the system was already on an AB SERCOS network, so integrating this inexpensive motor and its corresponding drive was a sure thing. Or so I thought...
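For what it's worth, the sizing math I did was essentially the following. The numbers here are placeholders rather than my actual dial figures, and friction is ignored, so treat it as a rough sketch of the approach rather than my real calculation:

# Rough version of the sizing calculation (placeholder numbers, not my actual dial)
import math

J_load  = 0.002      # dial inertia at the motor shaft, kg*m^2 (placeholder)
J_motor = 0.00004    # motor rotor inertia from the datasheet, kg*m^2 (placeholder)

move_angle = math.pi     # 180 degree flip, rad
move_time  = 0.25        # required move time, s

# Triangular move profile: accelerate for half the move, decelerate for the rest
accel = move_angle / (0.5 * move_time) ** 2    # rad/s^2

# Torque needed to accelerate the total inertia (friction ignored)
T_required = (J_load + J_motor) * accel        # N*m

print(f"accel = {accel:.1f} rad/s^2, torque = {T_required:.3f} N*m")
print(f"load-to-motor inertia ratio = {J_load / J_motor:.0f}:1")

With numbers like these the torque looks fine on paper, but the load-to-motor inertia ratio comes out enormous, which is exactly the situation I ended up in.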
It appeared as though the motor had plenty of torque to move the load at the required rate. At least that is what I thought (based on my calculations). Then I started working on the application and realized that the motor just can't do it. So naturally I asked myself why...
So I started doing some research and got my local AB motion control guy to help. He explained the inertia mismatch thing to me but did not really explain the theory behind it. He just kind of said it was a rule of thumb and reconfirmed that my current configuration was not going to work. That was not quite the answer I was looking for.
So I am currently in the process of adding a planetary gearbox to reduce the inertia reflected back to the motor. Not because the motor cannot handle it, but because the load-to-motor inertia ratio is way too high.
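As I understand it, the gearbox helps because the load inertia reflects back to the motor divided by the square of the gear ratio (at the cost of the motor turning that much faster). A quick sketch with the same placeholder numbers as above, ignoring the gearbox's own inertia and efficiency:

# Reflected inertia through a gearbox of ratio N (placeholder numbers again)
J_load  = 0.002      # dial inertia, kg*m^2
J_motor = 0.00004    # motor rotor inertia, kg*m^2

print(f"direct drive ratio = {J_load / J_motor:.0f}:1")

for N in (5, 10, 25):                  # candidate planetary ratios
    J_reflected = J_load / N**2        # load inertia the motor actually "sees"
    ratio = J_reflected / J_motor
    print(f"N = {N:>2}: reflected = {J_reflected:.6f} kg*m^2, ratio = {ratio:.1f}:1")

So even a modest ratio brings the mismatch down by a large factor, which is apparently the whole point of the gearbox here.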
I can't quite get the theory of this down. If the motor has the torque to do the job, why does the mismatch override this?
Can someone please explain the theory and importance behind inertia mismatch when dealing with servo applications?