This paper dives into sources of non-determinism in machine learning frameworks: https://arxiv.org/abs/2104.07651

It would be a great reference for our paragraph that starts:

"One specific reproducibility pitfall that is often missed in applying deep learning is the default use of non-deterministic algorithms by CUDA/CuDNN backends when using GPUs."

I could add it now, or wait until the next version if we're still considering this content frozen for submission.
I think this is indeed a nice paper discussing the fact that in deep learning it is important (1) not only to use fixed random seeds but (2) also to use deterministic algorithms.

The tool that comes with the paper seems a bit like overkill, though, since most DL frameworks now have deterministic algorithms built in, e.g., in PyTorch.
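For context, here is a minimal sketch of what that built-in switch looks like in PyTorch. The function name `make_deterministic` and the seed value are just illustrative; the exact settings needed (and whether the cuBLAS workspace variable is required) can vary with the PyTorch/CUDA version and the operators used.

```python
# Sketch: requesting deterministic behavior in PyTorch.
# The helper name and seed value are illustrative, not from the paper's tool.
import os
import random

import numpy as np
import torch


def make_deterministic(seed: int = 0) -> None:
    # Fix the seeds of all relevant RNGs.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)  # seeds CPU and CUDA generators

    # Ask CuDNN for deterministic kernels and disable the autotuner,
    # which can otherwise pick different algorithms on each run.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

    # Error out when an op has no deterministic implementation,
    # instead of silently falling back to a non-deterministic one.
    torch.use_deterministic_algorithms(True)

    # Some cuBLAS ops (CUDA >= 10.2) additionally need this set
    # for deterministic mode to work.
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"


make_deterministic(seed=42)
```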