The answer to your question depends on what your requirement means by 'calibration'.
25 years ago, before 'smart' transmitters, it was routine practice to calibrate a pressure transmitter as often as quarterly (every 3 months) because the transmitters drifted, sometimes as much as 3-4%. Temperature compensation was attempted, but the difference between winter and summer readings, or even day and night readings, could be attributed to temperature drift as much as anything else.
Calibration involved:
- shutting off (or disconnecting) the process from the transmitter,
- applying an actual wet 'zero' pressure with air or hydraulic fluid (zero is sometimes zero, other times it is an offset value),
- making an adjustment so the 4 mA output read 4.000 mA and not, say, 3.933 mA,
- applying a span pressure (air, gas or hydraulic fluid)
- adjusting the electrical span (20 mA) so it reads 20.000 mA,
- going back and forth between zero and span and iteratively adjusting the zero and span until both hold their settings.
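For anyone who hasn't done the back-and-forth by hand, it can be sketched as a loop. This is a toy simulation, not any real transmitter's electronics: the interaction model (the span pot also scaling the zero setting) and the as-found error values are made up purely to show why iteration was needed.

```python
# Toy simulation of the iterative zero/span adjustment on an analog
# transmitter whose zero and span adjustments interact (hypothetical
# interaction model, for illustration only).

def output_mA(pct_span, zero_pot, span_pot):
    """Output current for an applied pressure given as % of span."""
    return span_pot * (zero_pot + 16.0 * pct_span / 100.0)

zero_pot, span_pot = 3.933, 1.02   # made-up as-found errors
passes = 0
while True:
    passes += 1
    # apply zero pressure, turn the zero pot until the meter reads 4.000 mA
    zero_pot = 4.000 / span_pot
    # apply full span pressure, turn the span pot until it reads 20.000 mA
    span_pot = 20.000 / (zero_pot + 16.0)
    # re-check both points; stop when both hold their settings
    if (abs(output_mA(0, zero_pot, span_pot) - 4.000) < 0.001
            and abs(output_mA(100, zero_pot, span_pot) - 20.000) < 0.001):
        break
print(f"settled after {passes} passes")
```

Because adjusting span disturbs zero (and vice versa), it takes several passes before both points hold, which is exactly the tedium of the old procedure.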
And it still didn't take out the drift due to temperature change.
Sometimes the cal was done in the field, with five-valve manifolds installed (on DP transmitters) for applying the cal pressures; other times it was done on the bench.
The Wally box was one of several tools that most large instrumentation shops had for applying air or gas pressure to a transmitter during calibration.
Smart transmitters were introduced in 1983 and have been in wide use since the early 1990s. Smart transmitters typically use the HART protocol for configuration.
The tag name, engineering units (psi or kPa), and zero and span values can be set up using a HART communicator.
Modern smart transmitters from the major reputable industrial manufacturers have negligible drift. There is no detectable difference between a pressure reading on the coldest winter day and the hottest summer day when reading the output of a primary source calibrator like a deadweight tester.
Technically speaking, setting the zero and span with a HART communicator is not "calibration", it is "configuration". Most HART transmitters have device-specific commands (the DD for that device must be loaded in the HART communicator) that allow one to 'force an output', whereby the milliamp current output can be checked and adjusted. Force zero output, check for 4.000 mA, adjust if necessary. That does qualify as "calibration".
I am not aware of a major manufacturer whose calibration goes beyond tweaking the 4-20 mA output. In other words, the transmitter 'reads' the applied pressure, converts it to a value, and that's the value it reports. There are methods of 'live zeroing' where whatever pressure is applied is considered zero: the transmitter still sees that applied pressure, but it treats the applied pressure as 'zero' and that becomes 4 mA.
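To put numbers on the ranging and live-zero idea: the standard 4-20 mA scaling maps the configured lower range value (LRV, the 4 mA point, which need not be zero pressure) and upper range value (URV, the 20 mA point) onto the current output. A minimal sketch (the 50-250 kPa range is just an example, not from the original post):

```python
def pressure_to_mA(p, lrv, urv):
    """Map an applied pressure to 4-20 mA for a configured range."""
    return 4.0 + 16.0 * (p - lrv) / (urv - lrv)

# 'Elevated zero' example: transmitter re-ranged so 50 kPa reads as zero
print(pressure_to_mA(50.0, 50.0, 250.0))    # 4.0 mA at the live zero
print(pressure_to_mA(150.0, 50.0, 250.0))   # 12.0 mA at mid-range
print(pressure_to_mA(250.0, 50.0, 250.0))   # 20.0 mA at full span
```

Re-ranging with a HART communicator simply rewrites lrv and urv; the transmitter's pressure measurement itself is untouched.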
If your transmitters have HART, it pays to have a HART configuration tool, either a handheld or a PC with a HART app like Siemens' PDM. Since smart transmitters can be field configured for different ranges, it pays to be able to range the transmitter for each situation. No pressure need be applied and no adjustment need be made, just keyboard entries on the HART communicator.
Smart transmitters can't be internally calibrated on the input side; they can only be calibrated on the output side (4-20 mA). This becomes apparent when reading the manufacturer's documentation for the pressure transmitter: there are procedures for adjusting (tweaking) the 4.000 and 20.000 mA outputs, but no means of altering what value the transmitter converts the applied pressure to.
The elimination of drift has significantly cut down on the necessity for calibration.
That being said, there are regulatory requirements (whether company instituted or government instituted) where a 'calibration' is necessary.
What constitutes "calibration" in your case?
Configuring zero and span ranges (probably not, though the intended role of the Wally box here isn't clear)?
Would forcing an output through HART and checking the 4-20mA output suffice?
Do you have to apply a zero and span pressure, or apply additional pressure points and confirm the transmitter's response?
If you can live with forcing zero/span and checking/adjusting the output, then a HART communicator with a good digital meter (milliamp) scale will do the job.
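If you go that route, the pass/fail check itself is simple arithmetic. A sketch of the comparison logic, where the tolerance value is an assumption (use whatever your site spec or regulator requires) and the example readings are made up:

```python
# Pass/fail check for a forced-output (loop test) point: compare the
# commanded current against what the reference milliammeter shows.

TOLERANCE_mA = 0.016  # assumed: 0.1% of the 16 mA span; substitute your spec

def check_point(setpoint_mA, measured_mA):
    """Return (pass/fail, error in mA) for one forced-output test point."""
    error = measured_mA - setpoint_mA
    return abs(error) <= TOLERANCE_mA, error

# Example readings (hypothetical): forced 4 mA, meter shows 4.006 mA
print(check_point(4.000, 4.006))    # within tolerance
print(check_point(20.000, 20.050))  # out of tolerance, needs output trim
```

Recording the as-found error at each forced point is usually enough to satisfy a calibration record, provided your requirement accepts an output-only check.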
If you need to apply pressure, then you probably need the HART communicator to make the adjustments (if the transmitters are HART enabled) and a pressure source, which could be a Wally box or any number of other pressure source calibrators.
Dan