Reimplement cc_attention using pure pytorch (open-mmlab#1201)
* Reimplement cc_attention using pure pytorch
* fix: avoid BC-Breaking
* delete cc_attention related cpp and cuda files
* delete cc_attention related lines in pybind.cpp
* make out Tensor contiguous
* remove unneeded lines
* Update mmcv/ops/cc_attention.py
* Update TestCrissCrossAttention
* passing pre-commit
* Update docstring of CrissCrossAttention
* [Docs] Polish the docstring

Co-authored-by: Zaida Zhou <[email protected]>
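The change removes the custom C++/CUDA kernels and expresses criss-cross attention with standard PyTorch ops. The sketch below illustrates the general einsum-based approach; the class name `PureCrissCrossAttention`, the channel reduction factor of 8, and the choice of which branch gets the self-masking diagonal are illustrative assumptions, not mmcv's exact API.

```python
# Minimal sketch of criss-cross attention in pure PyTorch (einsum-based).
# Illustrative only; not the exact mmcv.ops.CrissCrossAttention implementation.
import torch
import torch.nn as nn


def neg_inf_diag(n, device):
    # n x n matrix with -inf on the diagonal and 0 elsewhere. Added to one
    # branch's energies so each position's own contribution is not counted twice.
    return torch.diag(torch.full((n,), float('-inf'), device=device))


class PureCrissCrossAttention(nn.Module):
    def __init__(self, in_channels):
        super().__init__()
        # 1x1 convs produce query/key in a reduced channel space (factor 8 assumed).
        self.query_conv = nn.Conv2d(in_channels, in_channels // 8, 1)
        self.key_conv = nn.Conv2d(in_channels, in_channels // 8, 1)
        self.value_conv = nn.Conv2d(in_channels, in_channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.size()
        query = self.query_conv(x)
        key = self.key_conv(x)
        value = self.value_conv(x)

        # Row-wise energies: for each row h, compare position w with position j.
        # Shape: (b, h, w, w); mask the diagonal to skip the self position here.
        energy_h = torch.einsum('bchw,bchj->bhwj', query, key)
        energy_h = energy_h + neg_inf_diag(w, x.device)

        # Column-wise energies: for each column w, compare row h with row i.
        # Shape: (b, h, w, h).
        energy_v = torch.einsum('bchw,bciw->bhwi', query, key)

        # Softmax over the concatenated row + column candidates.
        attn = torch.softmax(torch.cat([energy_h, energy_v], dim=-1), dim=-1)
        attn_h, attn_v = attn[..., :w], attn[..., w:]

        # Aggregate values along the row and the column of each position.
        out_h = torch.einsum('bchj,bhwj->bchw', value, attn_h)
        out_v = torch.einsum('bciw,bhwi->bchw', value, attn_v)

        return self.gamma * (out_h + out_v) + x


if __name__ == '__main__':
    # Smoke test: output shape matches input shape.
    module = PureCrissCrossAttention(16)
    out = module(torch.randn(2, 16, 7, 5))
    print(out.shape)  # torch.Size([2, 16, 7, 5])
```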
Showing 9 changed files with 53 additions and 695 deletions.