Memory errors #45
Hi, thank you for your great work!
I'm currently testing this on larger datasets (5-10k images) and have noticed that a very large amount of (V)RAM is required to make it work. I've already generated a pairs file to reduce the number of pairs from 50M to 3M, but this still seems to be far too many. Do you have any pointers or suggestions I could try to make it scale better?
For what it's worth, I'm using cloud compute with 80 GB of VRAM and 220 GB of RAM, so hardware shouldn't be the issue.
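To put the pair counts above in perspective: a complete scene graph over n images yields n(n-1)/2 pairs, which is where the ~50M figure for 10k images comes from, while a sliding-window graph grows only linearly. A minimal sketch of that arithmetic (the helper names and the window size are illustrative, not from the repo):

```python
def complete_pair_count(n_images: int) -> int:
    # A complete scene graph matches every unordered image pair:
    # n * (n - 1) / 2 pairs, quadratic in the collection size.
    return n_images * (n_images - 1) // 2

def sliding_window_pairs(image_ids, window: int = 10):
    # A sliding-window scene graph only pairs each image with its
    # `window` successors in capture order, so the pair count grows
    # linearly with the collection instead of quadratically.
    pairs = []
    for i in range(len(image_ids)):
        for j in range(i + 1, min(i + 1 + window, len(image_ids))):
            pairs.append((image_ids[i], image_ids[j]))
    return pairs

print(complete_pair_count(10_000))                   # 49_995_000 (~50M, as above)
print(len(sliding_window_pairs(range(10_000), 10)))  # 99_945 (~100k)
```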
Comments

I have this problem as well. @yocabon, any suggestions?

Hi, the current global optimization strategy was not meant to be run on large collections of images. We are working on a better solution, but there is no guarantee at this point.

@yocabon Do you have any suggestions for parameters suited to relatively dense captures (e.g., more than 100 images per scene)? I've tried different prefilter and scene_graph settings to reduce the total number of pairs to roughly 1k, but none work as well as subsampling 32 images with a complete scene graph. Would longer alignment iterations or something else help? Thank you!
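For reference, subsampling 32 images with a complete scene graph gives 32 * 31 / 2 = 496 pairs, which stays under the ~1k budget mentioned above. A minimal sketch of uniform subsampling (the stride logic is illustrative, not from the repo):

```python
def subsample_uniform(image_paths, n_keep: int = 32):
    # Keep n_keep images spread evenly across the capture; a complete
    # scene graph over the subset then has n_keep * (n_keep - 1) / 2
    # pairs (496 for n_keep = 32), well under a ~1k pair budget.
    if len(image_paths) <= n_keep:
        return list(image_paths)
    stride = len(image_paths) / n_keep
    return [image_paths[int(i * stride)] for i in range(n_keep)]
```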