I have no experience with the hardware you are using, but the general concept is that you have to scale the signal somehow. Scaling usually depends on the sensor: the 20 mA value occurs at the top of the sensor's measuring range, and the 4 mA value at the bottom (your zero).
For example, let's say you have a sensor that converts 0 - 100 PSI to a 4 - 20 mA signal. According to the sensor's spec, it will "send out" 4 mA when there is no pressure on it. If I hook a 100 PSI line up to the sensor, it will send out 20 mA. Now, if you put this on a machine where the pressure only ranges from 0 - 50 PSI, you still have to take into account that the 20 mA signal corresponds to 100 PSI (so you will only be using half of the sensor's effective range).
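Just to make the sensor side of that concrete, here is a minimal sketch of the math, assuming an ideal linear transmitter (the function name and span value are just for this example):

```python
def pressure_to_current(psi, span_psi=100.0):
    """Ideal linear 4-20 mA transmitter: 0 PSI -> 4 mA, full span -> 20 mA."""
    return 4.0 + (psi / span_psi) * 16.0

pressure_to_current(0)    # 4.0 mA  (no pressure)
pressure_to_current(50)   # 12.0 mA (only halfway up a 0-100 PSI sensor's range)
pressure_to_current(100)  # 20.0 mA (full scale)
```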
With all that being said, when you wire the sensor to your PLC, the input card assigns a number to the analog value it reads; these are your counts. So if your PLC input card's range is 0 to 20,000 counts for a 0 - 20 mA signal, then a 4 mA signal from your sensor gives you a count of 4,000, and a 20 mA signal gives you 20,000. If you ever read a count of 0, you know a wire on your sensor is broken, because you are receiving no signal at all (fault and alert the operator). You then go into your program and use a scale function (or derive the formula yourself and use math instructions) to convert the count range of 4,000 - 20,000 into the PSI range of 0 - 100.
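Here is a rough sketch of what that scale function is doing under the hood, written as plain Python rather than ladder/structured text. The constants and the broken-wire threshold are assumptions from the example above, not anything specific to your hardware:

```python
# Hypothetical values for the example above: a 0-100 PSI sensor on a
# 0-20 mA input card that reports 0-20,000 counts (so 4 mA -> 4,000 counts).
RAW_MIN, RAW_MAX = 4000, 20000        # counts at 4 mA and 20 mA
ENG_MIN, ENG_MAX = 0.0, 100.0         # PSI at 4 mA and 20 mA
BROKEN_WIRE_THRESHOLD = 2000          # counts well below 4 mA -> assume open loop

def counts_to_psi(counts):
    """Linear scale from raw counts to PSI, with a broken-wire check."""
    if counts < BROKEN_WIRE_THRESHOLD:
        raise ValueError("Signal lost - check sensor wiring")
    # Same linear interpolation a typical SCALE instruction performs
    return ENG_MIN + (counts - RAW_MIN) * (ENG_MAX - ENG_MIN) / (RAW_MAX - RAW_MIN)

counts_to_psi(4000)   # 0.0 PSI
counts_to_psi(12000)  # 50.0 PSI
counts_to_psi(20000)  # 100.0 PSI
```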
I'm not sure if this is the method you have to use for your hardware, but this is how it works on most platforms I've had experience with.