
Proposal: Cache Impact Assessment items #103

Open
marc-vdm opened this issue Jul 17, 2024 · 0 comments

@marc-vdm
Contributor

Current situation
BW loads impact assessment methods (characterization, normalization and weighting) from disk every time they are used or switched between.

e.g. here, here and here

Proposal
I suggest we load this data from disk only once and then store it in a cache (e.g. self._lcia_cache = {}). The cache could store all relevant objects for a given item (e.g. self.characterization_mm and self.characterization_matrix for an impact category). The load functions (like def load_lcia_data) could first check the cache and, if the relevant item is not found, load it from disk.
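A minimal sketch of what this could look like. The names self._lcia_cache, load_lcia_data, self.characterization_mm and self.characterization_matrix come from the text above; the host class, the use_cache argument and the _load_lcia_data_from_disk helper are hypothetical stand-ins for the existing loading code:

```python
class LCACalculation:
    """Hypothetical host class; in practice this would be the existing calculation class."""

    def __init__(self):
        # Maps an impact category key to the objects needed to use it.
        self._lcia_cache = {}

    def load_lcia_data(self, method, use_cache=True):
        """Load LCIA data for `method`, reading from disk only on a cache miss."""
        if use_cache and method in self._lcia_cache:
            mm, matrix = self._lcia_cache[method]
        else:
            # Placeholder for the current disk-loading logic.
            mm, matrix = self._load_lcia_data_from_disk(method)
            if use_cache:
                self._lcia_cache[method] = (mm, matrix)
        self.characterization_mm = mm
        self.characterization_matrix = matrix
```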

An optional argument could be added so that impact assessments are only cached when it is True (the use_cache flag in the sketch above), though I would suggest making caching the default behaviour, as the RAM footprint of sparse matrices should not be too large.

Caching these items will make re-use of impact assessments substantially faster, as the user no longer needs to re-load them from disk every time.

When matrices are changed through scenarios or Monte Carlo, the cached version can be modified just as easily as one loaded from disk.
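For example, a scenario run could copy the cached matrix and overwrite characterization factors in the copy, leaving the cached version untouched. A sketch continuing the hypothetical class above; the scenario_cfs mapping of (row, col) -> amount is purely illustrative:

```python
    def load_lcia_data_for_scenario(self, method, scenario_cfs):
        """Apply scenario-specific characterization factors on top of the cached data."""
        self.load_lcia_data(method)                   # cache hit after the first call
        matrix = self.characterization_matrix.copy()  # keep the cached matrix pristine
        for (row, col), amount in scenario_cfs.items():
            matrix[row, col] = amount                 # overwrite factors for this scenario
        return matrix
```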

Why
While this effect is not substantial, especially for a single calculation, it can help speed up repeated calculations.
