The difference is really about precision. A servo motor works essentially like a synchronous motor: the frequency applied to the stator is the speed the rotor will spin at, because the rotor has permanent magnets, so its magnetic fields are always present. If you tell a servo to run at 300 RPM, it runs at 300 RPM immediately (within the torque capacity of the design).
An induction motor, by definition, needs slip: the difference between the synchronous speed of the rotating magnetic field in the stator and the induced magnetic fields in the rotor. Without slip, you have no torque. So if I want a motor to spin at 300 RPM and it has 3% slip, for example, the VFD must determine that the slip is 3%, compensate for it, and make the stator frequency equivalent to about 309.28 RPM so that the rotor speed ends up at 300. This is of course subject to all kinds of internal and external influences and variations, so the performance envelope is very dynamic and requires quite a bit of number crunching. That is what a "Vector" drive gets you.

But not all vector drive control algorithms are the same, especially when it comes to torque. To perform a torque controlled application, you need a really good VFD that in essence anticipates the slip in an even tighter envelope, called a Torque Regulator loop, which usually comes with Flux Vector Control (FVC) or Field Oriented Control (FOC) algorithms. The cheaper the drive, the less capable it is of doing this. So in the AB world, "Architecture" class drives like the 7 series are capable of this, but the lower cost "Component" class drives like the 4, 40 and 520 series are not. The 525 can do what is referred to as "Velocity Vector" control, with encoder feedback, but the microprocessor inside of it is meant to be lower cost, so it does not possess the internal kahunas to accomplish the number crunching you need for Torque Regulation. Velocity Vector means it will adjust the output to maintain a speed much more precisely than a basic SVC drive can, which does improve the torque capability, but the precision with which it can maintain this is nowhere near what an FVC/FOC or servo drive achieves.
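The slip compensation described above is just the rearrangement rotor_rpm = sync_rpm × (1 − slip). A minimal sketch of that arithmetic (the function name and numbers are illustrative, not from any particular drive):

```python
def stator_speed_for(target_rpm: float, slip_fraction: float) -> float:
    """Synchronous (stator field) speed needed so the rotor lands on target_rpm.

    rotor_rpm = sync_rpm * (1 - slip), so sync_rpm = rotor_rpm / (1 - slip).
    """
    return target_rpm / (1.0 - slip_fraction)

# Want 300 RPM at the shaft with 3% slip:
sync_rpm = stator_speed_for(300.0, 0.03)
print(round(sync_rpm, 2))  # → 309.28
```

A real vector drive re-estimates the slip continuously as load, temperature, and rotor resistance change, which is where the number crunching comes in.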
So how much better are the technologies? It's not easy to compare, but let's try by looking first at how fast the various technologies can respond to a change in the load, referred to as "Step Change Response," measured in radians per second. Radians are parts of a circle; 2pi radians = a full circle (FWIW). So the higher the response rate in radians per second (rad/sec), the faster the motor control algorithm can react to a change in load conditions, such as when accelerating or re-accelerating as something changes in the load.
With a standard V/Hz drive it's impossible to discuss the response in radians, because the response is in seconds. In other words, you tell the V/Hz drive to run the motor at a given frequency, a few seconds later the motor gets there, and from that point on the drive has no idea if the motor is at that speed or not, until maybe the current goes so high that it trips on overload.
With a simplistic Sensorless Vector Control (SVC) algorithm, as found in really "hard to believe" low cost drives, you might get a response of maybe 0.5 rad/sec, better than a non-SVC drive. Better quality SVC drives get to about 5 rad/sec, a 10x increase in response rate. Upping that to a Velocity Vector Control algorithm, such as the 525 with encoder feedback, can take that to 10 rad/sec; in other words the response is twice as fast again, so the precision is a big leap better as far as SPEED regulation goes. But going to FVC/FOC or an entry level servo takes it to a quantum leap of 100-120 rad/sec response rate as far as speed control.
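To put those rad/sec figures in time terms: treating the control loop roughly as a first-order system, its time constant is about 1/omega. A quick sketch using the approximate bandwidths quoted above (the dictionary labels and the 110 rad/sec midpoint for FVC/FOC are my own shorthand):

```python
# Rough translation of control-loop bandwidth (rad/sec) into a first-order
# time constant (tau = 1/omega): roughly the time to cover ~63% of a step change.
bandwidths = {
    "low-cost SVC": 0.5,
    "good SVC": 5.0,
    "Velocity Vector (525 + encoder)": 10.0,
    "FVC/FOC or entry servo": 110.0,
}

for tech, omega in bandwidths.items():
    tau_ms = 1000.0 / omega  # milliseconds
    print(f"{tech}: ~{tau_ms:.0f} ms time constant")
```

So a cheap SVC drive needs on the order of two seconds to digest a load change, while an FVC/FOC drive or servo reacts in roughly ten milliseconds.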
What does that look like in the real world though? V/Hz, we say, is capable of 1-2% speed regulation, but in at best a 6:1 turn down range, meaning the slower it gets, the less accurate that becomes. Low end SVC becomes 1% more consistently and at a 20:1 turn down. The better SVC algorithms, like in the PF40 or the 525 without the encoder feedback, take it to 0.5% speed regulation and a 100:1 ratio, then Velocity Vector takes it to 0.1%. But FVC/FOC and servos become 0.001% regulation and a 1000:1 turn down ratio.
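Those regulation percentages are easier to feel as RPM error bands. A small sketch, assuming a 1800 RPM base speed (my assumption for illustration) and using the figures above, with the worse end of the V/Hz range:

```python
# Speed error band (+/- RPM) each technology can hold, at an assumed
# 1800 RPM base speed. Regulation figures are the ones quoted above.
base_rpm = 1800.0
regulation = {
    "V/Hz (2%)": 0.02,
    "low-end SVC (1%)": 0.01,
    "better SVC (0.5%)": 0.005,
    "Velocity Vector (0.1%)": 0.001,
    "FVC/FOC or servo (0.001%)": 0.00001,
}

for tech, frac in regulation.items():
    print(f"{tech}: +/- {base_rpm * frac:.3f} RPM")
```

That's the difference between a shaft wandering by tens of RPM and one holding within a few hundredths of an RPM.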
From a torque standpoint, an induction motor is capable of 200-220% of full load torque (FLT) peak for a few seconds, while a servo motor is more like 250-300% peak torque, which is part of why the response and accuracy can be so much better.
Now, do you NEED that kind of accuracy and response for a feed into a saw? I seriously doubt it. Many people tried VFDs years and years ago when they were only V/Hz, were unsatisfied with the performance, and wrote them off for any future consideration, deciding that servo was the only way to go. Unfortunately for them, they have now missed out on a lot of very significant improvements in performance that come with standard off-the-shelf drives. I've done a few saw feeder drive applications over the years, mostly to optimize the feed rate by using a closed loop feedback of the saw loading (kW) to the VFD running the feeder. I can attest to the fact that older V/Hz drives were indeed unsatisfactory in that regard, but I cannot think of a sawing operation that would need any better than a good quality SVC drive now, unless, as was mentioned, it's a matter of precise positioning. Were it my money, I would use a PF755 without the external encoder, in SVC mode at first, and see how it performs. If your boss doesn't like it, you can change to Encoderless FVC mode (that drive is capable of that now); then if that's not good enough, you can add an encoder to the motor and change again to full blown FOC control. With that, you will get servo-like performance with an induction motor, but take it there in steps of ever increasing complexity so that if the simpler way works, you are done.