In today's world of AC drives, you can roughly divide the technology into three groups: Volts per Hertz (or scalar) drives, sensorless vector drives, and flux vector drives. These are arranged in order of increasing precision.
A Volts per Hertz drive is a simple open loop device. By that I mean that the inverter sends a preset level of frequency and voltage to the motor as a command to run at a particular speed. The drive has almost no way of knowing to what extent the motor is complying with its command. About the only feedback this type of drive gets is motor current, and based on that information it can cut motor slip roughly in half through a technique called slip compensation. This feature adds a little frequency to the output as the motor current goes up; halving the slip is about all you can expect from it.
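The slip compensation idea can be sketched in a few lines. This is a minimal, hypothetical illustration, not any vendor's algorithm: the rated slip, rated current, and gain values are assumed numbers for a typical 4-pole motor, and real drives filter the current signal before using it.

```python
# Hypothetical sketch of slip compensation in a V/Hz drive: add output
# frequency in proportion to load current, canceling roughly half the slip.

RATED_SLIP_HZ = 1.8      # assumed: ~55 rpm slip on a 4-pole motor ~= 1.8 Hz
RATED_CURRENT_A = 10.0   # assumed motor nameplate current
COMP_GAIN = 0.5          # compensate about half the slip

def output_frequency(commanded_hz: float, measured_current_a: float) -> float:
    """Inverter output frequency with slip compensation applied."""
    load_fraction = measured_current_a / RATED_CURRENT_A
    return commanded_hz + COMP_GAIN * RATED_SLIP_HZ * load_fraction

print(output_frequency(60.0, 10.0))  # full load: 60.9 Hz
print(output_frequency(60.0, 0.0))   # no load: 60.0 Hz
```

At full load the drive quietly commands 60.9 Hz so the loaded motor actually turns closer to the speed a 60 Hz command implies.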
A flux vector drive is at the opposite end of the spectrum. It requires an encoder on the motor shaft to feed the shaft speed and position back to the drive. Through this additional information, the drive can calculate motor shaft torque very precisely and, of course, can read shaft speed directly from the encoder. This system can achieve speed regulation down to about one-tenth of motor slip. In addition, because of the encoder, the error is non-cumulative, which is very important in web tensioning and similar applications. The major disadvantage of this system is that the motor must have an encoder mounting provision, making it a non-NEMA motor and therefore much more expensive. The cost of the encoder and the receiver module in the drive is also significant.
A sensorless vector drive has capabilities somewhere between Volts per Hertz and flux vector. The tricky thing about sensorless vector is that for one manufacturer the performance is barely any better than Volts per Hertz, while for another the performance can approach flux vector in some applications. You have to read the specs carefully or do your own testing, because you can't necessarily believe everything you are told or read.
Sensorless vector does not use a motor shaft encoder or any other feedback device. As a result, a standard NEMA motor can often be used. The drive is built with the intelligence to take the motor current, which is the vector sum of the magnetizing and torque-producing currents, and separate those two components, solving the vector equation often enough to provide a reasonably accurate torque calculation. With this information, a sensorless vector drive can act as a torque regulator or a rather accurate speed regulator. Depending on the manufacturer, sensorless vector drives can regulate speed in the range of one-half to one-fifth of motor slip. For many, even most, drive applications this is good enough, and it can be done without an encoder. That's what is making sensorless vector so popular.
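The "unwinding" of the current vector can be illustrated with simple geometry. This is a deliberately simplified sketch, assuming the drive already knows the magnetizing current from its motor model or auto-tune, and treating the two components as orthogonal; real drives solve this continuously with a full motor model.

```python
import math

# Hypothetical sketch: given total motor current and the magnetizing
# current (assumed known from the motor model), recover the
# torque-producing component as the other leg of the vector triangle.

def torque_current(total_a: float, magnetizing_a: float) -> float:
    """Torque-producing current, with the two components orthogonal."""
    return math.sqrt(total_a**2 - magnetizing_a**2)

# e.g. 10 A total with 4 A magnetizing leaves ~9.17 A producing torque
print(round(torque_current(10.0, 4.0), 2))
```

Torque is then roughly proportional to that 9.17 A component, which is what lets the drive regulate torque with no shaft sensor at all.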
Notice that the whole issue of speed regulation is a matter of how the drive manages the induction motor's tendency to slip as its load increases. I can't emphasize enough that the choice of motor makes more difference in speed regulation than the choice of drive. For example, if you buy a standard efficiency four pole motor (1800 rpm sync speed) and the nameplate says 1745 rpm, that means that from no load to full rated load, the motor slips down 55 rpm, or about 3%. This is the speed error that the drive has to find a way to eliminate. On the other hand, if you spend a little more and buy a premium efficient motor with a nameplate speed of 1782 rpm, that's only 18 rpm of slip, or 1%. Suddenly, with the same drive, your system has only one-third the speed error as before. My point is that the motor is a critically important part of a tightly regulated system and a poor choice can destroy what would otherwise be a good performer.
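The arithmetic above is worth making explicit, since it applies to any motor nameplate you're comparing:

```python
# Worked numbers from the two 4-pole motors above (1800 rpm sync speed).

def slip_percent(sync_rpm: float, nameplate_rpm: float) -> float:
    """Full-load slip as a percentage of synchronous speed."""
    return 100.0 * (sync_rpm - nameplate_rpm) / sync_rpm

print(round(slip_percent(1800, 1745), 1))  # standard motor: 3.1 %
print(round(slip_percent(1800, 1782), 1))  # premium motor:  1.0 %
```

Run the same calculation on the motors in your own catalog before you start comparing drive specs.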
Notice also that speed regulation is almost entirely a matter of speed changes due to CHANGES IN LOAD ON THE MOTOR. If you have a system where the load never changes, you can take the worst motor with the dumbest drive and have a very good system. Just don't change the load and the speed will sit right where you put it!
Incidentally, there are two other sources of speed error, the first being drive drift and the second being speed reference error. Drive drift occurs when a constant drive speed input signal exists and the drive output drifts off the proper speed due to heat, line voltage, or any other internal cause. Modern drives have such small drift that it is never significant and can be ignored.
Speed reference error can take a number of forms, one simply being induced noise on the desired signal. Shielding of the speed input cables normally takes care of this. Most modern drives take the analog speed input signal and convert it to digital through an analog/digital converter. These have a resolution based on the number of digital output bits, with 8-bit resolution being rather poor and 12-bit being very good. This resolution error is normally too small to be significant and is ignored. One outrageous source of speed reference noise is the placing of a speed pot on a vibrating machine surface. This is particularly bad if the pot is dirty, is a cermet or carbon type, or is mechanically sloppy or worn out. The vibrating contact in the pot sends a signal to the drive that varies all over the place, and the drive faithfully tries to vary its output all over the place too. Usually the drive is cursed for this behavior when, in fact, it is innocent. This is a problem that deserves a good look if you are seeing erratic speed.
Sorry this got so long. But, there, I've spilled about everything I know on this subject. If you don't agree, just say so!