Flash attention 2.1
yekta committed Feb 18, 2024
1 parent 102694f commit 855b4fc
Showing 2 changed files with 2 additions and 3 deletions.
3 changes: 1 addition & 2 deletions Dockerfile
@@ -1,5 +1,4 @@
-FROM nvcr.io/nvidia/pytorch:23.07-py3
-# FROM stablecog/cuda-torch:12.1.0-2.1.0-cudnn8-devel-ubuntu22.04
+FROM stablecog/cuda-torch:12.1.0-2.1.0-cudnn8-devel-ubuntu22.04

 RUN mkdir -p /app/data
 WORKDIR /app
2 changes: 1 addition & 1 deletion requirements.txt
@@ -38,4 +38,4 @@ git+https://github.com/suno-ai/bark.git
 git+https://github.com/ai-forever/Kandinsky-2
 git+https://github.com/openai/CLIP.git
 git+https://github.com/yekta/denoiser.git
-https://github.com/Dao-AILab/flash-attention/releases/download/v2.2.2/flash_attn-2.2.2+cu121torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.1.0/flash_attn-2.1.0+cu121torch2.1cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
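The pinned wheel filename encodes everything the pin has to match: flash-attn 2.1.0 built for CUDA 12.1, torch 2.1, the non-cxx11 ABI flag flipped to TRUE, CPython 3.10, and linux x86_64. A minimal sketch of pulling those tags out of the filename so a pin like this can be sanity-checked against the target environment (the `parse_wheel_name` helper is hypothetical, not part of this repo):

```python
def parse_wheel_name(filename: str) -> dict:
    # PEP 427 wheel naming: {dist}-{version}[+{local}]-{python}-{abi}-{platform}.whl
    stem = filename.removesuffix(".whl")
    dist, version_local, python_tag, abi_tag, platform_tag = stem.split("-")
    version, _, local = version_local.partition("+")
    return {
        "dist": dist,          # e.g. flash_attn
        "version": version,    # e.g. 2.1.0
        "local": local,        # build metadata: cu121torch2.1cxx11abiTRUE
        "python": python_tag,  # e.g. cp310
        "abi": abi_tag,        # e.g. cp310
        "platform": platform_tag,  # e.g. linux_x86_64
    }

name = "flash_attn-2.1.0+cu121torch2.1cxx11abiTRUE-cp310-cp310-linux_x86_64.whl"
info = parse_wheel_name(name)
print(info["version"], info["local"], info["python"])
```

Since pip only checks the python/abi/platform tags, the CUDA and torch versions baked into the local segment have to agree with the base image by construction; here that is the `cuda-torch:12.1.0-2.1.0` image selected in the Dockerfile.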
