I think the most important consideration now is just the programming implications, i.e. a delay in accepting values from anything until everything is fully powered and checked. It's not so much of a problem as it used to be, but it can still rear its head under the right circumstances.
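That power-up hold-off can be sketched in a few lines. This is a hypothetical illustration, not vendor code: the delay value, the `InputGate` class, and the self-check stub are all assumptions standing in for whatever your platform actually provides (first-scan flags, module status bits, etc.).

```python
import time

# Hypothetical settle time; tune per hardware.
POWER_UP_DELAY_S = 0.5

class InputGate:
    """Reject field inputs until the power-up delay has elapsed
    and self-checks have passed (illustrative sketch only)."""

    def __init__(self, delay_s=POWER_UP_DELAY_S):
        self.start = time.monotonic()
        self.delay_s = delay_s
        self.checks_ok = False

    def run_self_checks(self):
        # Placeholder for real diagnostics: PSU rails in range,
        # I/O module status words healthy, comms established.
        self.checks_ok = True

    def accept(self, value):
        """Return the value only once the system has settled; else None."""
        settled = (time.monotonic() - self.start) >= self.delay_s
        if settled and self.checks_ok:
            return value
        return None
```

The point is simply that the logic refuses data until both conditions are true, so an operator hitting Start during power-up can't push the machine out of sequence.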
The "urban legend" may stem from the early days of PLCs, when the power supplies were linear, not switch mode as they all are now. The linear power supplies tended to be more finicky about input power, hence the requirement for CVTs (constant-voltage transformers) ahead of them, and the output ramped up as you powered them on, which could cause errors if not accounted for. One nasty one was if someone turned on the main power, then immediately hit the Start button while the linear power supply was still ramping up, because things might start out of sequence. With an SMPS, there is no output until it is a good stable output, and it gets there very quickly as well.
However, with regard to EEPROMs, they all have a finite number of write cycles, so if you are storing to an EEPROM on each power down, you are technically consuming that capability. That was typically in the tens of thousands of write cycles, which may have been another contributor to the urban legend, but modern EEPROMs and flash memory are now rated in the millions of write cycles, so the likelihood of using that up over the life of the machine by powering it off once per day, or even once per shift, is so low as to be insignificant.
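A quick back-of-envelope calculation makes the point. The endurance figures below are the round numbers from the paragraph above (tens of thousands vs. millions of cycles), not any specific part's datasheet spec:

```python
def years_to_wear_out(rated_cycles, writes_per_day):
    """Years until the rated write endurance is consumed,
    assuming a constant number of power-down writes per day."""
    return rated_cycles / writes_per_day / 365.0

# Old-style part, ~10,000 cycle endurance, one write per shift (3/day):
old_part = years_to_wear_out(rated_cycles=10_000, writes_per_day=3)

# Modern part, ~1,000,000 cycle endurance, same usage:
modern_part = years_to_wear_out(rated_cycles=1_000_000, writes_per_day=3)
```

With the old endurance figure you'd wear the part out in under a decade of three-shift use, which is within a machine's service life; at a million cycles it works out to centuries, i.e. effectively never.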