Differences between MicroLogix clocks and SLC-5/03 clocks or timers probably reflect the differences in scan time and real-time-clock design between the controllers. I don't know all of the internal details, but in general a MicroLogix scans faster (not to mention that it has a 0.001 s timebase available) and has an RTC design with components that are 10 to 15 years newer.
Drift of 4 minutes per hour on an SLC-5/03 is very high and suggests a degraded oscillator or some other fault in the system.
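For scale, here's a quick back-of-the-envelope check. The ±20 ppm crystal figure is a typical value I'm assuming for a healthy 32.768 kHz RTC crystal, not anything from an Allen-Bradley spec:

```python
# Convert the observed drift to parts-per-million and compare it with
# a typical (assumed) +/-20 ppm RTC crystal tolerance.

observed_drift_s = 4 * 60          # 4 minutes lost per hour
interval_s = 3600                  # over one hour

drift_ppm = observed_drift_s / interval_s * 1e6
print(f"observed drift: {drift_ppm:,.0f} ppm")   # ~66,667 ppm

# A healthy crystal at +/-20 ppm would drift at most:
crystal_ppm = 20
worst_case_s_per_day = crystal_ppm / 1e6 * 86400
print(f"healthy worst case: {worst_case_s_per_day:.2f} s/day")  # ~1.73 s/day
```

That's three-plus orders of magnitude worse than a working crystal should manage, which is why I'd suspect hardware (or an outside actor writing to the clock) rather than normal drift.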
Some of the "my controller time is drifting excessively" problems I have seen occur when a user inadvertently configures both a read and a write connection to the system clock from their HMI. Every time the HMI reads the system clock, it also writes to it, introducing up to 1 second of error depending on exactly where within the second the transaction lands and how long it takes. On some systems I was seeing 1 second of error every 60 seconds, because that was the update rate of the HMI's system clock sync feature.
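A toy simulation of that mechanism (this is my own illustrative model, not any particular HMI's actual behavior): assume the HMI reads the controller clock in whole seconds and writes that value straight back, silently discarding the fractional second each time.

```python
import random

# Toy model: each sync cycle, the HMI's write-back sets the controller
# clock back by the fractional second it discarded (uniform 0..1 s,
# so ~0.5 s lost per sync on average).

random.seed(42)  # deterministic run for illustration

syncs_per_hour = 60      # HMI syncing once a minute, as in my case
total_error_s = 0.0
for _ in range(syncs_per_hour):
    lost_fraction = random.random()  # fractional second at the moment of write
    total_error_s += lost_fraction

print(f"accumulated error after one hour: {total_error_s:.1f} s")
```

With a once-a-minute sync you accumulate something on the order of 30 seconds of error per hour, which looks an awful lot like "my clock drifts like crazy" even though the oscillator itself is fine.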
I have worked on a system that requires very good time accuracy both for its various Windows servers and its S7-300 controllers, but it is isolated from the Internet and has no view of the sky. And no, it's not hiding from American bombers. Not since the spring of 1945, anyhow.
We installed a time reference device from EndRun Technologies that uses CDMA cell phone signals to derive a highly accurate time signal. It distributes that to all the networked PCs and servers via NTP, and uses its discrete outputs to send a sync pulse to all the S7 controllers once an hour.
PC time drift is not a big issue now that virtually all PCs are connected to the Internet and can access a network time server. But put a bunch of ordinary Dell PCs on a totally isolated system and come back in a few months, and you'll find their clocks scattered across a window of fifteen or twenty minutes.
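And that spread needs nothing exotic to produce. A rough calculation, assuming "a few months" means about three and that the fastest and slowest clocks drift apart symmetrically (both of those are my assumptions, just to put a number on it):

```python
# How much per-clock drift does a 15-minute spread in ~3 months imply?

spread_s = 15 * 60                 # 15-minute window between clocks
period_s = 90 * 86400              # ~3 months in seconds

# If the extremes drift in opposite directions, each clock only needs
# to account for half the spread:
per_clock_ppm = (spread_s / 2) / period_s * 1e6
print(f"{per_clock_ppm:.0f} ppm per clock")   # ~58 ppm
```

Around 58 ppm is well within what a cheap PC motherboard RTC will do, so that kind of scatter on an isolated network is completely ordinary, not a sign of bad hardware.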