Commit
Summary: Pull Request resolved: pytorch#1999

This commit improves the robustness of `fatmax` and `logsumexp` for inputs with infinities.

- In contrast to `torch.logsumexp`, `logsumexp` does not give rise to `NaN`s in its backward pass even if infinities are present.
- `fatmax` is updated to exhibit the same behavior in the presence of infinities, and now allows for the specification of an `alpha` parameter, which controls the asymptotic power decay of the fat-tailed approximation.

In addition, the commit introduces helper functions derived from `logsumexp` and `fatmax`, e.g. `logplusexp`, `fatminimum`, and `fatmaximum`, fixes a similar infinity issue in `logdiffexp`, and improves the associated test suite.

Reviewed By: Balandat
Differential Revision: D48878020
fbshipit-source-id: 46561efb10c921b77c1ed483ab383b30e8ac7e20
1 parent 748b46a · commit 9649b1c
Showing 2 changed files with 325 additions and 37 deletions.
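To see the failure mode behind the first bullet: `torch.logsumexp` computes its gradient via `exp(x - result)`, so any infinite entry produces an `inf - inf = NaN` term in the backward pass. Below is a minimal demonstration plus a sketch of the infinity-routing idea; `safe_logsumexp` is an illustrative name and a simplified version of the approach, not the actual implementation in `botorch.utils.safe_math`.

```python
import torch

# The problem: the forward pass is fine, but the backward pass of
# torch.logsumexp yields NaN when an infinite entry is present.
x = torch.tensor([0.0, float("inf")], requires_grad=True)
torch.logsumexp(x, dim=-1).backward()
print(x.grad)  # tensor([0., nan]) on recent PyTorch versions


def safe_logsumexp(x: torch.Tensor, dim: int = -1) -> torch.Tensor:
    """Sketch: route entries that attain an infinite max around exp/log, and
    keep the unselected branch finite so torch.where's backward never
    multiplies a gradient into a NaN."""
    m = x.amax(dim=dim, keepdim=True)
    inf_max = m.isinf() & (x == m)       # entries attaining an infinite max
    has_inf = inf_max.any(dim=dim, keepdim=True)
    # Branch 1: an infinite max dominates; summing the attaining entries
    # returns that max with a well-defined gradient of 1.
    y_inf = x.masked_fill(~inf_max, 0.0).sum(dim=dim, keepdim=True)
    # Branch 2: the usual max-shifted logsumexp, with all infinities masked
    # so that every intermediate stays finite.
    m_fin = m.masked_fill(m.isinf(), 0.0)
    x_fin = (x - m_fin).masked_fill(inf_max, 0.0)
    y_fin = m_fin + x_fin.exp().sum(dim=dim, keepdim=True).log()
    return torch.where(has_inf, y_inf, y_fin).squeeze(dim)
```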
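For the second bullet, the idea of a fat-tailed smooth max is to replace the exponential weights of `logsumexp`, whose gradients vanish exponentially fast away from the maximum, with weights that decay only polynomially. A minimal sketch assuming a Pareto-style kernel `(1 + z / alpha) ** (-alpha)`; the name, defaults, and kernel choice here are illustrative, and the real `fatmax` additionally applies the same infinity handling as above:

```python
def fatmax_sketch(
    x: torch.Tensor, dim: int = -1, tau: float = 1.0, alpha: float = 2.0
) -> torch.Tensor:
    # The kernel matches exp(-z) to first order at z = 0 but decays like
    # z ** (-alpha) for large z, so points far below the max still receive
    # non-negligible gradients. alpha sets the asymptotic power decay,
    # tau the temperature of the approximation.
    m = x.amax(dim=dim, keepdim=True)
    z = (m - x) / tau                    # non-negative distance from the max
    w = (1 + z / alpha) ** (-alpha)      # fat-tailed surrogate for exp(-z)
    return m.squeeze(dim) + tau * w.sum(dim=dim).log()
```

As `tau -> 0` this tightens to the hard maximum, and `alpha -> inf` recovers ordinary `logsumexp`, since `(1 + z / alpha) ** (-alpha) -> exp(-z)`.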
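The helper functions named at the end can plausibly be built by stacking both arguments along a new dimension and reducing; the following sketch shows that composition using the two functions above (illustrative signatures, not necessarily the ones in `botorch.utils.safe_math`):

```python
def logplusexp(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Elementwise log(exp(a) + exp(b)), inheriting safe_logsumexp's robustness.
    a, b = torch.broadcast_tensors(a, b)
    return safe_logsumexp(torch.stack([a, b], dim=-1), dim=-1)


def fatmaximum(
    a: torch.Tensor, b: torch.Tensor, tau: float = 1.0, alpha: float = 2.0
) -> torch.Tensor:
    # Smooth, fat-tailed approximation to torch.maximum(a, b).
    a, b = torch.broadcast_tensors(a, b)
    return fatmax_sketch(torch.stack([a, b], dim=-1), dim=-1, tau=tau, alpha=alpha)


def fatminimum(
    a: torch.Tensor, b: torch.Tensor, tau: float = 1.0, alpha: float = 2.0
) -> torch.Tensor:
    # min(a, b) == -max(-a, -b).
    return -fatmaximum(-a, -b, tau=tau, alpha=alpha)


a = torch.tensor([0.0, float("-inf")])
b = torch.tensor([1.0, 2.0])
print(logplusexp(a, b))  # tensor([1.3133, 2.0000]); the -inf entry is handled cleanly
```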