Agreed - why not trap whatever it is you're looking for? I can think of few cases where the data-logging goal, over a sustained period, isn't one of the following:
1. Overall trend to view on a graph
2. Relative extrema (minima, maxima), rapidly changing values, or specific "target" values
3. Boolean alerts/alarms, which could be based on complex conditions
4. Basic aggregate statistical information (averages, standard deviation, etc.)
5. Some sort of regulatory requirement
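For goals like #2 and #3, one common alternative to fixed-interval logging is deadband (report-by-exception) logging: record a sample only when it moves meaningfully away from the last value you recorded. As a rough sketch (the sample stream and the 0.5-unit deadband here are made-up illustrations, not anything from a real system):

```python
def should_log(value, last_logged, deadband=0.5):
    """Log only if the value moved more than the deadband since the last logged point."""
    return last_logged is None or abs(value - last_logged) > deadband

log = []
last = None
for sample in [10.0, 10.1, 10.2, 11.0, 11.1, 9.0, 9.05]:
    if should_log(sample, last):
        log.append(sample)
        last = sample  # remember the last *logged* value, not the last scanned one

print(log)  # the flat stretches are dropped; only the real moves are kept
```

The trend is still visible on a graph, but a signal that sits still for hours produces almost no rows.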
Every one of these could be accomplished without logging data every 100 ms. I can imagine two cases where you might actually have to log that fast. One is an extremely rapid process, like modeling explosions, in which case you probably wouldn't log the data for very long. The other is some sort of intense statistical analysis on all the data, in which case "running calculations" or short logging periods would likely be sufficient.
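To make the "running calculations" idea concrete: you can maintain averages and standard deviation in O(1) memory with a streaming update (Welford's algorithm is one standard choice), so nothing needs to be stored per sample at all. A minimal sketch, with made-up sample data:

```python
import math

class RunningStats:
    """Streaming mean/std via Welford's algorithm: constant memory, one pass."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def std(self):
        # population standard deviation
        return math.sqrt(self.m2 / self.n) if self.n else 0.0

stats = RunningStats()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.update(x)

print(stats.mean, stats.std)  # 5.0 2.0
```

You could run this at 100 ms and log only the summary once a minute - the statistics survive even though the raw points don't.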
Do you intend to look at 5 million points in a spreadsheet?
Why else would anyone actually need to log that much data so rapidly? (Not a rhetorical question).
seppoalanen said:
Why all 50, why not only changed data?