scGHOST is insightful and has proven valuable for single-cell analysis.
I am currently using scGHOST to analyze a dataset of 3,000-4,000 cells at 100 kb resolution. However, I have run into an issue where memory usage exceeds 370 GB. Your paper was based on 500 kb resolution, and I am unsure whether running scGHOST at 100 kb is feasible.
Are there any known memory issues when running scGHOST at 100 kb?
Do you have any suggestions for optimizing memory consumption when working at 100 kb?
Looking forward to any guidance you can provide. Thanks!
Hello, for whole-genome runs scGHOST was designed for 500 kb resolution. Memory usage scales quadratically with the number of bins, so moving from 500 kb to 100 kb increases memory roughly 25-fold, and 100 kb at whole-genome scale would easily exceed the memory capacity of many machines.
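As a rough back-of-the-envelope illustration (the chr1 length, dense float32 maps, and one map per cell are assumptions made for the estimate, not scGHOST internals):

```python
# Rough memory estimate for dense per-cell contact maps of chr1.
# Assumptions: hg38 chr1 ~249 Mb, float32 entries, one dense map per cell.
chr1_len = 249_000_000
n_cells = 3500
for res in (500_000, 100_000):
    bins = chr1_len // res
    per_map_mb = bins * bins * 4 / 1e6        # float32 = 4 bytes per entry
    total_gb = per_map_mb * n_cells / 1e3
    print(f"{res // 1000} kb: {bins} bins/side, {per_map_mb:.1f} MB per map, "
          f"~{total_gb:.0f} GB across {n_cells} cells")
```

For chr1 alone this comes out to roughly 3.5 GB at 500 kb versus about 87 GB at 100 kb, which helps explain why a whole-genome 100 kb run lands in the hundreds of gigabytes.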
To optimize memory at 100 kb, you could try running scGHOST one chromosome at a time (a driver sketch follows below), but without whole-genome clustering it might be difficult to ensure that, for example, scA1 in chromosome 3 matches scA1 in the other chromosomes without a bulk-level comparison.
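If you want to experiment with that route, a minimal sketch is below; running each chromosome as a separate process means memory is released between runs. The `chromosomes` config key is an assumption about the config schema, and the `scghost.py --config` invocation may differ in your version, so adapt both to your setup:

```python
# Hypothetical per-chromosome driver; the "chromosomes" config key is an
# assumption about the config schema, not a documented scGHOST option.
import json
import subprocess

with open("config.json") as f:
    base = json.load(f)

for chrom in [f"chr{i}" for i in range(1, 23)]:
    cfg = dict(base, chromosomes=[chrom])   # restrict this run to one chromosome
    cfg_path = f"config_{chrom}.json"
    with open(cfg_path, "w") as f:
        json.dump(cfg, f, indent=2)
    # A fresh process per chromosome guarantees memory is returned to the OS.
    subprocess.run(["python", "scghost.py", "--config", cfg_path], check=True)
```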
That said, a whole-genome solution may be to load one chromosome at a time from disk, save intermediate results to disk, and flush them from memory once the random walks are completed. This would greatly increase run time, and because clustering is still whole-genome, we would still need to verify that this strategy can actually complete a 100 kb run within a reasonable amount of memory.
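In sketch form, that strategy could look like the following; `load_chromosome` and `random_walk_embeddings` are hypothetical placeholders for the corresponding scGHOST steps, not functions from the actual codebase:

```python
# Disk-backed per-chromosome processing sketch: only one chromosome's data
# is held in memory at a time; embeddings are persisted and memory is flushed.
import gc
import numpy as np

def load_chromosome(chrom):
    """Placeholder for loading one chromosome's per-cell contact maps."""
    return np.random.rand(100, 64, 64).astype(np.float32)  # (cells, bins, bins)

def random_walk_embeddings(maps):
    """Placeholder for the memory-heavy random-walk embedding step."""
    return maps.mean(axis=2)  # toy (cells, bins) "embedding"

chroms = [f"chr{i}" for i in range(1, 23)]
for chrom in chroms:
    maps = load_chromosome(chrom)             # load only this chromosome
    emb = random_walk_embeddings(maps)
    np.save(f"embeddings_{chrom}.npy", emb)   # save intermediate result to disk
    del maps, emb                             # flush from memory
    gc.collect()

# Whole-genome clustering then reloads the much smaller embeddings.
all_emb = np.concatenate([np.load(f"embeddings_{c}.npy") for c in chroms], axis=1)
```

The embeddings are far smaller than the raw contact maps, so the whole-genome clustering step stays manageable even though its inputs span all chromosomes.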