We would like to measure the speed of a vehicle (a normal car). Two photoelectric sensors, mounted 1 meter apart, are connected to two digital inputs of the PLC. The PLC measures the time the car needs to travel from sensor 1 to sensor 2.
For instance, the car takes 100 milliseconds between sensor 1 and sensor 2. The PLC calculates the distance (m) divided by the time (s): 1 meter divided by 0.1 s, times 3.6, gives 36 km/h. But the speedometer inside the car showed only about 12 km/h!
Another example: when I drive so that the car takes 50 ms between the two sensors, the PLC calculates 72 km/h, but the speedometer inside the car shows only 30 km/h.
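To make the calculation explicit, here is a minimal sketch of what the PLC logic computes, assuming the 1 m sensor distance from above (the function name and constant are illustrative, not from any actual PLC program):

```python
# Sensor distance as described in the post (assumed fixed at 1 m).
SENSOR_DISTANCE_M = 1.0

def speed_kmh(travel_time_s: float) -> float:
    """Convert the measured travel time between the two sensors to km/h.

    m/s * 3.6 = km/h, since 1 m/s = 3600 m/h = 3.6 km/h.
    """
    return SENSOR_DISTANCE_M / travel_time_s * 3.6

print(speed_kmh(0.100))  # 100 ms -> 36.0 km/h
print(speed_kmh(0.050))  #  50 ms -> 72.0 km/h
```

So the arithmetic itself checks out for both measured times; the question is why the result disagrees so strongly with the car's speedometer.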
I do not need high accuracy, but these differences between what I read inside the car and what the PLC calculates are far too big. I think there is a fundamental problem.
Does anyone have an idea?
Thanks and best regards.