I have an extremely dumb math question.
An analogue input gives me 0 to 27648 when operating in 0 to 20mA
In the program I want to scale it into a mA value.
So INPUT / 27648 gives me a range of 0 to 1; after that I multiply by 20, and then I have the INPUT value in mA in the program.
But if you do INPUT / 1382.4 instead, you get the same result.
27648 / 20 mA gives me a usable factor.
Can anyone explain where this 27648 / 20 mA factor comes from?
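To make my question concrete, here is a minimal Python sketch (not PLC code) of the two scalings. The constants 27648 and 20 are the raw full-scale value and full-scale current from above; the function names are just mine for illustration:

```python
RAW_MAX = 27648.0   # full-scale raw value of the analog input
MA_MAX = 20.0       # full-scale current in mA

def scale_two_step(raw: float) -> float:
    """Normalize to 0..1, then multiply by 20 mA."""
    return raw / RAW_MAX * MA_MAX

def scale_one_step(raw: float) -> float:
    """Divide by the precomputed factor 27648 / 20 = 1382.4."""
    return raw / (RAW_MAX / MA_MAX)

# Both are algebraically the same: raw * (20 / 27648) == raw / (27648 / 20)
for raw in (0.0, 13824.0, 27648.0):
    assert abs(scale_two_step(raw) - scale_one_step(raw)) < 1e-9
```

So dividing by 1382.4 just folds the two operations (divide by 27648, multiply by 20) into one constant.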
Tnx