Fix the function schema for split embedding backward (pytorch#2212)
Summary:
Pull Request resolved: pytorch#2212

The momentum1_host tensor is written back by the operator, so in the function schema it should be labeled as "Tensor(b!)" in order for the CPU fallback to correctly write data back to it. A more elegant solution is not apparent at this time.

Reviewed By: jspark1105

Differential Revision: D52082756

fbshipit-source-id: f19ad2332ec5a0f150ad37e7203cb8682fad26a6
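For context, a minimal sketch of what such a mutable-alias annotation looks like in a PyTorch operator schema. This is a hypothetical example (the operator name and arguments are invented, not the actual split embedding backward schema): the "(a!)" / "(b!)" markers declare that an argument is mutated in place, which is what lets fallback kernels such as the CPU fallback copy results back into the caller's tensor.

```cpp
#include <torch/library.h>

// Hypothetical registration, for illustration only.
// "Tensor(b!) momentum1_host" tells the dispatcher that this argument
// aliases its input and is written in place, so a boxed fallback
// (e.g. the CPU fallback) must write the result back into it.
TORCH_LIBRARY(example_ops, m) {
  m.def(
      "example_optimizer_step("
      "Tensor grad, Tensor(a!) weights, Tensor(b!) momentum1_host"
      ") -> ()");
}
```

Without the "(b!)" annotation, the schema claims momentum1_host is read-only, and a boxed fallback may operate on a copy and silently drop the update.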