I purchased a 4-20 mA AC current transducer that can be set to 0-100 A, 0-150 A, or 0-200 A ranges. It is not True-RMS.
It is connected to a panel meter, and I want to know why the measurement from this setup is different from my True-RMS clamp meter's reading.
I tested both on a 120 V AC branch circuit. The current transducer and panel meter showed 6.8 A, but the True-RMS clamp meter showed 7.2 A.
To me a difference of 0.4 A (about 5.6% of the clamp meter's reading) seems excessive, and that was with the transducer on its most sensitive 0-100 A range, so it should have been as accurate as it could be.
The scaling was set correctly on the panel meter. On the feeder supplying the breaker panel, L1 carried about 50 A, and there the True-RMS meter and the transducer/panel-meter combination differed by about 1 A (roughly 2%).
Is there really that big of a difference between a True-RMS measurement and a non-True-RMS one? Or could it be dirty power in our building? Or is the AccuAmp current transducer from Automation Direct just that inaccurate? It's made in the USA :/
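To sanity-check whether waveform distortion alone could explain the gap, I put together a quick numeric sketch in Python/NumPy. The waveform here is purely hypothetical: a 60 Hz fundamental with 15% third harmonic, phased to make a peaky wave like a typical electronic (switch-mode) load draws. I don't actually know what my load current looks like, but under that assumption an average-responding meter (rectify, average, multiply by the sine-wave form factor of about 1.11) reads roughly 6.76 A against a true 7.2 A RMS, which is about the size of the gap I saw:

```python
import numpy as np

# Hypothetical distorted load current: 60 Hz fundamental plus 15% third
# harmonic, phased so the peaks stack up (peaky, like a switch-mode load).
# The actual waveform on my circuit is unknown; this is just illustrative.
f = 60.0
t = np.linspace(0.0, 1.0 / f, 100_000, endpoint=False)  # one full cycle
wave = np.sin(2 * np.pi * f * t) - 0.15 * np.sin(2 * np.pi * 3 * f * t)

# Scale the waveform so its true RMS value is exactly 7.2 A.
i = wave * 7.2 * np.sqrt(2) / np.sqrt(1 + 0.15**2)

true_rms = np.sqrt(np.mean(i**2))

# An average-responding meter rectifies and averages the signal, then
# multiplies by the sine-wave form factor pi/(2*sqrt(2)) ~ 1.1107 to
# display an "RMS" value. That calibration is only exact for a pure sine;
# on a peaky waveform the rectified average drops and the meter reads low.
avg_responding = np.mean(np.abs(i)) * np.pi / (2 * np.sqrt(2))

print(f"True-RMS reading:           {true_rms:.2f} A")
print(f"Average-responding reading: {avg_responding:.2f} A")
print(f"Reads low by {100 * (true_rms - avg_responding) / true_rms:.1f} %")
```

So if our building's loads are distorting the current waveform, that alone could plausibly account for the 6.8 A vs 7.2 A discrepancy without either meter being out of spec.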
Thanks