"Quite a few old-school programmers still don't fully trust the new asynchronous scan cycle used by the 'Logix' platforms."
Is it a matter of trust, or is it wanting (or needing) a consistent set of inputs to work with throughout the scan?
"Left to right, and down" means the abscissa is signal* and the ordinate is time, to use a plot analogy (e.g. in sheet music the abscissa is time and the ordinate is frequency).
* plus time, maybe
So if I [Go look for a 1] at a particular input on [N] rungs and/or branches, then an asynchronous I/O scanning system gives me a minimum of [2N] cases to think about (assuming the input changes at most once per scan), and theoretically as many as 2^N if it can change between any two examinations, instead of just one.
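To make the counting concrete, here is a quick brute-force check in Python (purely illustrative; the function name is mine): it enumerates every bit pattern the N examinations of one input could observe during a single scan, optionally limited by how many times the input is allowed to change mid-scan.

    from itertools import product

    def observable_cases(n_rungs, max_changes=None):
        # Every bit pattern the N examinations of one input could see
        # during a single scan; optionally keep only patterns where the
        # input changes at most max_changes times mid-scan.
        cases = []
        for seq in product((0, 1), repeat=n_rungs):
            changes = sum(a != b for a, b in zip(seq, seq[1:]))
            if max_changes is None or changes <= max_changes:
                cases.append(seq)
        return cases

    for n in range(1, 6):
        worst = len(observable_cases(n))                # can change anywhere: 2^N
        mild = len(observable_cases(n, max_changes=1))  # at most one change: 2N
        print(f"N={n}: {mild} cases with one change, up to {worst} worst case")

With a synchronous (buffered) scan there is exactly one case per scan, regardless of N.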
I suspect it would rarely matter for normal operation in most processes, but that level of indeterminacy would wreak havoc with any troubleshooting efforts, especially for a rare or transient problem.
So I may have to do the analysis to determine which critical subset of inputs to buffer, at which point it may be more productive to simply buffer them all and save the analysis time.
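For reference, a minimal sketch of the buffer-everything approach, written in Python rather than ladder logic (all names here are illustrative, not any Logix API): snapshot the inputs once at the top of the scan and let every rung read only the snapshot. In a Logix program the equivalent move would be a COP/CPS of the input tags into buffer tags at the start of the main routine, with all the logic referencing only the buffers; CPS exists precisely because the copy itself must not be interrupted by an I/O update.

    import copy

    def program_scan(live_inputs, outputs, rungs):
        # Snapshot ALL inputs once, so every rung works from the same
        # consistent image even if live_inputs updates asynchronously.
        # (In a real controller the copy itself must be uninterruptible.)
        snapshot = copy.deepcopy(live_inputs)
        for rung in rungs:
            rung(snapshot, outputs)  # logic reads the snapshot, never the live table

    # Example: two rungs examining the same input bit
    inputs = {"Start_PB": 1}
    coils = {}
    rungs = [
        lambda i, o: o.update(Motor=i["Start_PB"]),      # rung 1: [Go look for a 1]
        lambda i, o: o.update(Alarm=not i["Start_PB"]),  # rung 2: sees the SAME value
    ]
    program_scan(inputs, coils, rungs)
    print(coils)  # {'Motor': 1, 'Alarm': False}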
So it may not be only an issue of trust; I'm just sayin'.
"Old-school" sometimes means "Fool me once ..."