Commit

Update README.md
ftgreat authored Jun 13, 2023
1 parent 4f07687 commit f1f3817
Showing 1 changed file with 2 additions and 3 deletions.
5 changes: 2 additions & 3 deletions examples/Aquila/Aquila-pretrain/README.md
@@ -11,10 +11,9 @@ The Aquila language model inherits the architectural design advantages of GPT-3
We also support [Huggingface](https://huggingface.co/BAAI). -->

- 运行Aquila-7B系列需要内存30G, 显存18G,生成最大长度200 token。
-
- To run the Aquila-7b series, you need at least 30GB of memory and 18GB of GPU memory, and the maximum length of text generated should be 200 tokens.
+ 最低硬件需求:运行Aquila-7B系列需要内存30G, 显存18G,生成最大长度 2048 tokens。
+
+ Minimum hardware requirements: running the Aquila-7B series requires at least 30GB of RAM and 18GB of GPU memory; the maximum generated length is 2048 tokens.

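The 18GB GPU-memory figure is roughly consistent with fp16 weights for a 7B-parameter model. A back-of-the-envelope sketch, assuming a typical LLaMA-style 7B configuration (32 layers, hidden size 4096 — these are illustrative assumptions, not confirmed Aquila-7B internals):

```python
# Rough GPU-memory estimate for a 7B-parameter model in fp16.
# LAYERS and HIDDEN are assumed values for a generic 7B architecture,
# used here only to sanity-check the README's 18GB figure.
PARAMS = 7e9          # parameter count
BYTES_FP16 = 2        # bytes per fp16 value

weights_gib = PARAMS * BYTES_FP16 / 2**30  # model weights in GiB

# KV cache for one 2048-token sequence: 2 tensors (K and V) per layer,
# each of shape [seq_len, hidden], stored in fp16.
LAYERS, HIDDEN, SEQ_LEN = 32, 4096, 2048
kv_gib = 2 * LAYERS * SEQ_LEN * HIDDEN * BYTES_FP16 / 2**30

print(f"weights ~{weights_gib:.1f} GiB, KV cache ~{kv_gib:.1f} GiB")
```

Under these assumptions, the total (roughly 14 GiB) leaves headroom below the stated 18GB, with the remainder covering activations and framework overhead.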
## 模型细节/Model details

