It is usually the time each type of input takes to switch on when presented with a 'true' or '1' signal.
"Normal" inputs may have an input switching time of 1mS or more. In many cases additional delays may be desired to eliminate 'contact bounce' so provided your process can accomodate it, delays of many milliseconds may be common.
If we have an input which takes 1 ms to switch on and, say, 1 ms to switch off, then one complete on/off cycle needs 2 ms, so the fastest signal you can detect is 1 every 2 ms, or 500 Hz. For a high-speed counter you may have to deal with signals arriving at 20 kHz, 30 kHz, 40 kHz or higher. Obviously the circuitry in that input must be able to switch on and off in very much shorter times - down to a few microseconds.
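Putting that arithmetic into a few lines of C (the 1 ms figures are just the example values above):

```c
#include <stdio.h>

int main(void)
{
    double t_on_s  = 0.001;  /* switch-on time, seconds (example) */
    double t_off_s = 0.001;  /* switch-off time, seconds (example) */

    /* One full pulse needs the input to switch on AND back off,
       so the fastest detectable rate is 1 / (t_on + t_off). */
    double max_hz = 1.0 / (t_on_s + t_off_s);
    printf("Max detectable pulse rate: %.0f Hz\n", max_hz);  /* 500 Hz */

    /* Run the same formula backwards for a 20 kHz counter input:
       the total on+off time must be 1/20000 s = 50 us or less. */
    return 0;
}
```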
How the hardware manages that goes much too deep into semiconductor design for me!
regards
Ken