Hi there! I've got a Landsat 8 OLI reflectance image that was distributed as a 16-bit integer with published multiplicative/additive scale factors. I added these factors to the SCP band set definition, and the pixel identification tool correctly shows values scaled between 0 and 1. However, when I create ROIs and plot their spectral signatures, the y-axis reverts to the unscaled 16-bit "raw" DN values. The image was saved as a virtual band set, if that matters. Any ideas? Here are some screenshots:
Multiplicative/additive settings:

Pixel ID tool with floating point values ranging from 0.0 to 1.0:
Spectral signature plot showing the raw DN on the y-axis:
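For reference, the conversion I expect both tools to apply is just a linear rescaling, reflectance = M × DN + A. A minimal sketch of that calculation (the gain/offset values below are illustrative, taken from the Landsat 8 Collection 1 reflectance factors; substitute the ones from your own MTL file):

```python
# Linear DN-to-reflectance scaling, as defined by the published
# multiplicative (M) and additive (A) factors in the Landsat MTL file.
# These example values are the Collection 1 reflectance factors
# (REFLECTANCE_MULT_BAND_x / REFLECTANCE_ADD_BAND_x); yours may differ.
MULT = 2.0e-5   # multiplicative factor M
ADD = -0.1      # additive factor A

def dn_to_reflectance(dn: int) -> float:
    """Convert a raw 16-bit DN to top-of-atmosphere reflectance."""
    return MULT * dn + ADD

# With these factors, a raw DN of 10000 maps to roughly 0.1 reflectance,
# which matches the 0.0-1.0 range the pixel ID tool reports.
print(dn_to_reflectance(10000))
```

So the pixel ID tool is clearly applying this, while the signature plot seems to read the raw band values instead.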
