Yosi - My multiplier is 6.8
There are no hard-and-fast rules that we've ever found here. Too many things affect programming time.
Yes, I/O count is one, but for digital I/Os that is mostly mechanical time (that is, assigning labels/descriptions/symbols).
Analogs and math cost more time-wise.
Communications cost yet more.
Specialty modules cost more.
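To put rough numbers on that, here is a quick sketch in Python; every hours-per-point weight below is a made-up illustration, not a recommendation, so substitute whatever your own job history says.

    # Rough programming-hours estimate from point counts.
    # All weights are illustrative guesses - plug in your own shop's history.
    HOURS_PER_POINT = {
        "digital_io": 0.1,        # mostly mechanical: labels/descriptions/symbols
        "analog_io": 0.5,         # scaling, alarming, filtering
        "math_points": 1.0,       # calculations, PID, totalizers
        "comms_links": 2.0,       # drivers, register mapping, fault handling
        "specialty_modules": 3.0, # motion, weigh scales, etc.
    }

    def base_hours(counts):
        """counts: dict like {'digital_io': 120, 'analog_io': 16, ...}"""
        return sum(HOURS_PER_POINT[k] * qty for k, qty in counts.items())

    print(base_hours({"digital_io": 200, "analog_io": 24, "comms_links": 3}))  # 38.0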
But the biggest factor is the quality of the specification documentation. If you can split a problem into several smaller, mostly independent modules, programming time drops.
If you can actually generate, or are handed, a flow diagram of exactly how things should work, programming is almost a linear translation, time-wise. If the documentation is poor, you start approaching bubble-sort programming times. I generally figure an average programming task runs in O(log n) time, but a poor understanding of the problem before writing code can easily push that to O(n^2) or O(n^3).
The times in the 'Big O' notation above are basically per I/O point; the 'n' is the point count, counting BOTH physical and HMI/Datalogging I/O.
Whatever the programming time turns out to be, debugging will take at least 3 times that. /sigh
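Just to make the 'Big O' hand-waving concrete, here is a toy model in Python; the base constant and the 200-point example are arbitrary, and only the shape of the curves (plus the 3x debug rule) is the point.

    import math

    # Toy model of the per-I/O-point growth described above.
    # n = total point count, physical + HMI/datalogging.
    # The 0.05 h base figure is an arbitrary guess - only the relative
    # growth between the three cases matters.
    def hours_per_point(n, spec_quality):
        if spec_quality == "good":       # clear flow diagram: near-constant translation
            return 0.05
        elif spec_quality == "average":  # typical job
            return 0.05 * math.log2(n)
        else:                            # poor spec: bubble-sort territory
            return 0.05 * (n / 10) ** 2

    n = 200
    for quality in ("good", "average", "poor"):
        prog = n * hours_per_point(n, quality)
        print(f"{quality:8s} spec: {prog:6.0f} h programming, "
              f"{3 * prog:6.0f} h debugging (3x rule)")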