Spatial-temporal position embedding; tensor slicing utility (facebookresearch#69)

Summary:
Pull Request resolved: facebookresearch#69

- Spatial-temporal position embedding is the first layer of the multimodal GPT attention stack
- Refactor [`AddBroadcastEmbedding`](https://github.com/mugen-org/MUGEN_baseline/blob/main/lib/models/gpt/attention.py#L256) and the associated utility [`tensor_slice`](https://github.com/mugen-org/MUGEN_baseline/blob/main/lib/models/gpt/utils.py#L75)
- Use `pytest` to perform parametrized tests, test setup, etc., as the starting point for migrating our testing framework away from Python's `unittest`

Test Plan:
```
(torchmm) langong-mbp:gpt_attention langong$ python -m pytest --cov=torchmultimodal/modules/layers test/modules/layers/test_position_embedding.py
================================================================================ test session starts =================================================================================
platform darwin -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0
rootdir: /Users/langong/gpt_attention
plugins: cov-3.0.0
collected 7 items

test/modules/layers/test_position_embedding.py .......                                                                                                                        [100%]

---------- coverage: platform darwin, python 3.8.13-final-0 ----------
Name                                                   Stmts   Miss  Cover
--------------------------------------------------------------------------
torchmultimodal/modules/layers/attention.py               97     97     0%
torchmultimodal/modules/layers/codebook.py                81     81     0%
torchmultimodal/modules/layers/conv.py                    74     74     0%
torchmultimodal/modules/layers/mlp.py                     22     22     0%
torchmultimodal/modules/layers/normalizations.py           7      7     0%
torchmultimodal/modules/layers/position_embedding.py      38      0   100%
torchmultimodal/modules/layers/transformer.py            133    133     0%
--------------------------------------------------------------------------
TOTAL                                                    452    414     8%

====================== 7 passed in 2.38s ==================
```
```
(torchmm) langong-mbp:gpt_attention langong$ python -m pytest --cov=torchmultimodal/utils/ test/utils/test_common.py -vv
================================================================================ test session starts =================================================================================
platform darwin -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /Users/langong/local/miniconda3/envs/torchmm/bin/python
cachedir: .pytest_cache
rootdir: /Users/langong/gpt_attention
plugins: cov-3.0.0
collected 6 items

test/utils/test_common.py::test_shift_dim PASSED                                                                                                                              [ 16%]
test/utils/test_common.py::TestTensorSlice::test_default PASSED                                                                                                               [ 33%]
test/utils/test_common.py::TestTensorSlice::test_size_minus_one PASSED                                                                                                        [ 50%]
test/utils/test_common.py::TestTensorSlice::test_uneven_begin_size PASSED                                                                                                     [ 66%]
test/utils/test_common.py::TestTensorSlice::test_invalid_begin XFAIL (Invalid begin)                                                                                          [ 83%]
test/utils/test_common.py::TestTensorSlice::test_invalid_size XFAIL (Invalid size)                                                                                            [100%]

---------- coverage: platform darwin, python 3.8.13-final-0 ----------
Name                                Stmts   Miss  Cover
-------------------------------------------------------
torchmultimodal/utils/__init__.py       0      0   100%
torchmultimodal/utils/common.py        66     21    68%
torchmultimodal/utils/file_io.py       10      3    70%
-------------------------------------------------------
TOTAL                                  76     24    68%

===================== 4 passed, 2 xfailed in 1.41s ============================
```

Reviewed By: RdoubleA

Differential Revision: D37100730

Pulled By: langong347

fbshipit-source-id: 1b1d99ff924fe88078e4d7563fcf52d334185dca
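The summary describes a spatial-temporal position embedding built by broadcast-adding one learnable embedding per axis of the latent grid. A minimal sketch of that broadcast-add idea follows; the class name `BroadcastedPositionEmbedding3D` and all implementation details here are illustrative assumptions, not the refactored module's actual code.

```python
import torch
from torch import nn


class BroadcastedPositionEmbedding3D(nn.Module):
    """Hypothetical sketch: one learnable embedding table per axis of a
    (T, H, W) latent grid. Each table is broadcast along the other two
    axes and summed, so position (t, h, w) receives
    emb_t[t] + emb_h[h] + emb_w[w] without materializing a full
    T*H*W embedding table as parameters."""

    def __init__(self, shape, dim):
        super().__init__()
        self.shape = shape  # e.g. (T, H, W)
        self.embs = nn.ParameterList(
            [nn.Parameter(torch.randn(s, dim) * 0.02) for s in shape]
        )

    def forward(self, x):
        # x: (batch, T*H*W, dim) -- tokens in raster order over the grid
        pos = 0
        for i, emb in enumerate(self.embs):
            # Reshape (s_i, dim) so it broadcasts over the other axes,
            # e.g. (T, 1, 1, dim) for the temporal table.
            view = [1] * len(self.shape) + [-1]
            view[i] = self.shape[i]
            pos = pos + emb.view(*view)
        # Flatten (T, H, W, dim) -> (1, T*H*W, dim) and add to the tokens.
        return x + pos.reshape(1, -1, pos.shape[-1])
```

For a (2, 3, 4) grid with dim 8 this stores 2+3+4 = 9 embedding rows instead of 24, which is the usual motivation for the broadcast formulation.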
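The `TestTensorSlice` cases in the test plan above (a `size_minus_one` case plus xfails for invalid `begin` and `size`) suggest begin/size slicing semantics. A minimal sketch of such a utility, assuming TF-style semantics where a size of -1 means "through the end of that dimension" -- the name and exact behavior here are assumptions, not the library's actual `tensor_slice` implementation:

```python
import torch


def tensor_slice(x, begin, size):
    """Slice x starting at per-dimension offsets `begin`, taking
    `size` elements per dimension; size -1 extends to the end of
    that dimension (sketch of assumed TF-style semantics)."""
    if not all(b >= 0 for b in begin):
        raise ValueError("begin indices must be non-negative")
    # Resolve -1 sizes against the actual tensor shape.
    size = [l - b if s == -1 else s for s, b, l in zip(size, begin, x.shape)]
    if not all(s > 0 for s in size):
        raise ValueError("sizes must be positive or -1")
    slices = tuple(slice(b, b + s) for b, s in zip(begin, size))
    return x[slices]


x = torch.arange(12).reshape(3, 4)
y = tensor_slice(x, begin=[1, 0], size=[-1, 2])  # rows 1..end, cols 0..1
```

Invalid inputs (a negative `begin`, or a non-positive `size` other than -1) raise `ValueError`, mirroring the two xfail cases in the test plan.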
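The test plan leans on `pytest` parametrization and `xfail` markers, the features motivating the migration away from `unittest`. A minimal sketch of that style, using a trivial stand-in function rather than any of the project's actual code:

```python
import pytest


def clamp(x, lo, hi):
    """Stand-in function under test."""
    if lo > hi:
        raise ValueError("lo must not exceed hi")
    return max(lo, min(x, hi))


# One test function expands into three collected test cases.
@pytest.mark.parametrize(
    "x,lo,hi,expected",
    [(5, 0, 10, 5), (-3, 0, 10, 0), (42, 0, 10, 10)],
)
def test_clamp(x, lo, hi, expected):
    assert clamp(x, lo, hi) == expected


# Expected failure, reported as XFAIL rather than an error --
# analogous to the "Invalid begin"/"Invalid size" cases above.
@pytest.mark.xfail(raises=ValueError, reason="Invalid bounds")
def test_invalid_bounds():
    clamp(1, 10, 0)
```

Compared with `unittest.TestCase` subclasses, each parametrized case is collected and reported individually (as seen in the `-vv` output above), and failures in one case do not mask the others.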