Insights: huggingface/optimum-habana
Overview
- 6 Merged pull requests
- 9 Open pull requests
- 0 Closed issues
- 1 New issue
6 Pull requests merged by 6 people
- Temporary WA for get_type error (#1806), merged Feb 28, 2025
- [1.20.0] Temporary workaround to avoid segmentation fault (#1798), merged Feb 26, 2025
- Add device auto-discovery and cli option (#1787), merged Feb 26, 2025
- exp flags for acc issues (#1795), merged Feb 26, 2025
- Fix the restart issue for Sentence Transformer STS example in validation (#1799), merged Feb 26, 2025
- Add trust_remote_code (#1786), merged Feb 24, 2025
9 Pull requests opened by 7 people
- Fixes pytest runtime error - Incompatible input shapes, broadcast not possible (#1796), opened Feb 25, 2025
- fea(): Skipped the torch_fx tests (#1797), opened Feb 25, 2025
- Move model to device before wrapping with FSDP (#1801), opened Feb 26, 2025
- Fix race condition when downloading nltk tokenizer (#1802), opened Feb 27, 2025
- Separate slow tests by required number of cards (#1803), opened Feb 27, 2025
- Slow test updates (#1804), opened Feb 27, 2025
- Final prep for G3 device context support (#1807), opened Feb 27, 2025
- Fix dataset_version for ST example requirement.txt (#1809), opened Feb 28, 2025
- Upgrade to Transformers v4.49 (#1810), opened Feb 28, 2025
1 Issue opened by 1 person
- datasets version (#1808), opened Feb 28, 2025
10 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- [Transformers future] Loss Computation for Compatibility with Transformers 4.48.3 (#1794), commented on Feb 28, 2025 • 15 new comments
- Optimized DeepSeek-V2 attention prefill with MHA. (#1791), commented on Mar 3, 2025 • 6 new comments
- Upstream HPU (#1741), commented on Mar 3, 2025 • 3 new comments
- Fail to do Qwen2-VL-72B inference with StaticCache (#1790), commented on Feb 26, 2025 • 0 new comments
- add cogvideox support for gaudi. (#1600), commented on Feb 27, 2025 • 0 new comments
- Add GLM4V (#1668), commented on Feb 26, 2025 • 0 new comments
- Upgrade to Transformers v4.48 (#1698), commented on Feb 28, 2025 • 0 new comments
- Enabling Snowflake Arctic on Gaudi 3 (#1719), commented on Feb 27, 2025 • 0 new comments
- Extend lm_eval functionality (#1729), commented on Feb 28, 2025 • 0 new comments
- Ig/mixtral pytests contextmanager (#1793), commented on Feb 24, 2025 • 0 new comments