Comparing floating-point values for exact equality is haphazard as-is; noise (even filtered) and ADC quantization error will only exacerbate the problem. You should instead check whether values fall within an expected range of error.
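For illustration, a minimal sketch in Python of that tolerance-band comparison. The function name and the 0.01 tolerance are assumptions for the example, not anything from the original logic:

```python
def approx_equal(a: float, b: float, tolerance: float = 0.01) -> bool:
    """Treat two readings as equal if they differ by less than `tolerance`.

    The 0.01 band here is arbitrary; in practice it should be sized to the
    expected noise plus quantization error of the sensor/ADC.
    """
    return abs(a - b) < tolerance

# A noisy ADC reading rarely matches a setpoint exactly:
print(approx_equal(4.999999, 5.0))   # within the band -> True
print(approx_equal(4.9, 5.0))        # outside the band -> False
```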
I agree, but in my case I'm only using integer values over a scale of 4 to 125.
In any case, after grinding away on it this morning, I've come up with something that works. There are possibly better ways, but this works and seems to be pretty efficient.
I've attached a screenshot of the logic. Basically, if the pressure falls in the range of "bad" pressures, it allows the Change of Values instruction to run. The CHG instruction samples my pressure every 300ms, giving me an integer value for the change over that sample time. If you go from 5psi to 7psi in 300ms, the change would be 2.
So, to start the timer, I'm comparing that value to zero (meaning no change in 300ms). If that is true, the timer starts.
I gave that value an initial value of 5 so that it doesn't start timing the moment the pressure goes "bad". That lets it take one sample and derive an accurate change value first, so as not to penalize the player due to bad code work on my end.
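The steps above can be sketched in Python. This is only an approximation of the ladder logic in the screenshot, assuming one call per 300ms tick; the class and method names are hypothetical, and the seed value of 5 comes from the post:

```python
class PressureWatch:
    """Sketch of the described logic: while pressure is 'bad', compute the
    change between 300ms samples; a delta of 0 (no change) starts the timer."""

    def __init__(self) -> None:
        self.last_sample: int | None = None
        # Seeded nonzero (5, per the post) so the timer can't start
        # before the first real delta has been computed.
        self.delta = 5
        self.timer_running = False

    def on_sample(self, pressure: int, bad: bool) -> None:
        """Call once per 300ms tick with the current integer pressure (4-125)."""
        if not bad:
            # Pressure is fine: reset everything, CHG doesn't run.
            self.last_sample = None
            self.delta = 5
            self.timer_running = False
            return
        if self.last_sample is not None:
            self.delta = pressure - self.last_sample
        self.last_sample = pressure
        # No change over the sample window -> start the timer.
        self.timer_running = (self.delta == 0)


w = PressureWatch()
w.on_sample(5, bad=True)   # first sample: delta still seeded, no timer
w.on_sample(7, bad=True)   # changed by 2: no timer
w.on_sample(7, bad=True)   # no change: timer starts
print(w.timer_running)     # -> True
```

The seed value matters: without it, the very first "bad" sample would see a delta of 0 and start the timer immediately, which is exactly the false penalty the post is avoiding.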