Caching mechanism for AcquisitionModel norm #1243
Ideally, this caching happens at the C++ level. However, the cache should be invalidated when the underlying model changes. Unfortunately, anything with caching gets complicated in a multi-threaded context. We don't seem to have any multi-threading in SIRF though, so we probably don't care yet. So, something like:

```cpp
bool already_setup = false;
// if negative, the norm hasn't been computed yet;
// mutable so that the const norm() method can update it
mutable double cached_norm = -1.;

double norm() const
{
    if (cached_norm < 0)
        cached_norm = norm_calculation();  // placeholder for the actual computation
    return cached_norm;
}
```
Do not like the idea, I am afraid - we will need to invalidate the cache in every non-const method.
On the C++ side, we know what the non-const methods do, so we know when we have to invalidate (and that would just be when the underlying model changes). A bit painful, but not too much, I think. On the Python side, …
For PET, there are maybe 8 methods that invalidate the cache (changing the additive/background term doesn't change the norm, but fine). So, the strategy would be to reset the cached value in each of those methods, as sketched below.
Not painless :-( but far safer for the user.
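A minimal Python sketch of that invalidate-on-mutation strategy (the real implementation would live in C++; the wrapped method names below are illustrative, not the actual SIRF API):

```python
class CachedNormModel:
    """Wraps a model-like object and caches its norm until a
    mutating call invalidates the cache."""

    def __init__(self, model):
        self._model = model
        self._cached_norm = None  # None means "not computed yet"

    def norm(self):
        # Compute once, then reuse until invalidated.
        if self._cached_norm is None:
            self._cached_norm = self._model.norm()
        return self._cached_norm

    def set_up(self, *args, **kwargs):
        # Every method that can change the underlying operator must
        # reset the cache; set_up stands in for the ~8 PET methods
        # mentioned above.
        self._cached_norm = None
        return self._model.set_up(*args, **kwargs)
```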
What if the user is after a more accurate computation of the norm, and just wants to re-run it with a larger number of iterations? (Or a different subset, or number of subsets.) I believe it is the responsibility of the user's code to decide whether to use the already computed norm or compute it again.
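One possible compromise (an assumption on my part, not something proposed in the thread) is to key the cache on the computation parameters, so asking for a more accurate estimate naturally triggers a fresh calculation. `num_iter` and `num_subsets` are hypothetical parameters, not the actual `norm` signature:

```python
class KeyedNormCache:
    """Caches one norm value per parameter combination."""

    def __init__(self, model):
        self._model = model
        self._cache = {}

    def norm(self, num_iter=10, num_subsets=1):
        key = (num_iter, num_subsets)
        if key not in self._cache:
            # Hypothetical call; the real signature may differ.
            self._cache[key] = self._model.norm(num_iter, num_subsets)
        return self._cache[key]
```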
When the `norm` of an `AcquisitionModel` is called, it gets calculated and returned. However, once called again, the norm is recalculated, as the value is not cached. This can lead to a steady increase in computation time. I made a thin wrapper for the SIRF MR `AcquisitionModel`, as below:
https://github.com/paskino/SIRF-Contribs/blob/b8e542d4f7af1ed1e47eaed8517d4be0517aca17/src/notebooks/utils.py#L147-L162
Python already provides a caching mechanism:
https://docs.python.org/3/library/functools.html#functools.lru_cache
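For example, a thin Python wrapper could lean on `functools.lru_cache` instead of hand-rolling the caching logic. This is only a sketch (it assumes `norm` takes no arguments; the linked utils.py wrapper is the actual reference):

```python
import functools

class NormCachingWrapper:
    def __init__(self, model):
        self._model = model
        # Cache the bound norm computation; maxsize=1 keeps just the
        # most recent result.
        self._cached_norm = functools.lru_cache(maxsize=1)(self._model.norm)

    def norm(self):
        return self._cached_norm()

    def invalidate(self):
        # lru_cache-wrapped callables expose cache_clear() for
        # explicit invalidation, e.g. after the model changes.
        self._cached_norm.cache_clear()
```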