I have a small question: when averaging a stack of HAADF images, are the absolute counts at each pixel position simply averaged, or are they also normalized to the range 0 to 65535? When I open a stack of frames with the plot_interactive() function, the min/max values are roughly 11000 to 13000, but after averaging, the min/max values run from 0 to 65535.
For quantitative STEM it is important that the absolute count numbers do not change; the frames should only be averaged over the whole stack to obtain a mean count per pixel.
Yes, the average function currently normalizes the result, stretching the minimum to 0 and the maximum to 65535. I will fix this in the next version. As a workaround, do:
data = stack.data.mean(axis=0)  # mean over the frame axis; raw counts are not rescaled
average = data_io.create_new_image(data, stack.pixelsize, stack.pixelunit, stack, "Averaged all frames in stack")
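For completeness, a minimal sketch of how you might verify that the workaround preserves the absolute counts. It assumes stack.data is a NumPy array of raw counts with shape (n_frames, height, width); the objects stack and data_io are taken from the package discussed in this issue, and the exact import path is not shown in the thread:

# 'stack' and 'data_io' are assumed to already be available from the package
# discussed in this issue (imports omitted, module path not shown in the thread).
data = stack.data.mean(axis=0)          # mean over frames; no 0-65535 stretching
average = data_io.create_new_image(data, stack.pixelsize, stack.pixelunit, stack,
                                   "Averaged all frames in stack")

# Sanity check: the averaged image should stay in the original count range
# (roughly 11000-13000 in the example above), not be stretched to 0-65535.
print("stack min/max:  ", stack.data.min(), stack.data.max())
print("average min/max:", data.min(), data.max())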