Description
Some elements (notably the debug graphs) appear to auto-scale the display after analyzing the data during parsing, maximizing the visible portion of the graph. While this is generally helpful, there are times when it would be useful to manually set and "lock" the scale to allow comparisons between different data sets. For example, a user examining noise in a debug trace will find that a trace with very little dynamic range (low noise) ends up scaled (magnified) more, making it appear worse than another log with more noise (which receives less scaling). I think of the auto-ranging like setting up a graph in Excel: you can either choose the axis min/max yourself, or have it auto-selected by scanning the data for its absolute min/max.
So, perhaps an option in the graph settings dialog could disable the auto-ranging and instead apply the scale % specified by the user directly to the raw data.
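A minimal sketch of the distinction, purely for illustration (the names `scaleSamples`, `ScaleOptions`, and the normalization details are assumptions, not the viewer's actual code):

```ts
interface ScaleOptions {
  autoRange: boolean; // current behavior: fit the data's own min/max
  userScale: number;  // manual scale %, e.g. 100 = raw values as-is
}

// Map raw samples onto a normalized plot range.
function scaleSamples(samples: number[], opts: ScaleOptions): number[] {
  if (opts.autoRange) {
    // Auto-ranging: magnify so the trace's own extremes fill the graph.
    // A low-noise trace is magnified more, exaggerating its noise.
    const peak = Math.max(...samples.map(Math.abs), 1e-9); // avoid divide-by-zero
    return samples.map(v => v / peak);
  }
  // Locked scale: apply the user's % directly to the raw values,
  // so two logs plotted at the same % are directly comparable.
  const factor = opts.userScale / 100;
  return samples.map(v => v * factor);
}
```

With `autoRange: false`, two logs rendered at the same `userScale` would share one fixed axis, which is exactly what the noise comparison above needs.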