Update README.md
BlinkDL authored Dec 26, 2024
1 parent 4c712cc commit 64000d4
Showing 1 changed file with 3 additions and 3 deletions.
@@ -6,13 +6,13 @@
RWKV twitter: https://twitter.com/BlinkDL_AI (latest news)

RWKV discord: https://discord.gg/bDSBUMeFpc (9k+ members)

-RWKV-7 "Goose" is the best **linear-time & constant-space (no kv-cache) & attention-free** architecture on this planet at this moment, suitable for both LLM and multimodal applications, and more (check [rwkv.com](https://rwkv.com) for examples).
+RWKV-7 "Goose" is the best **linear-time & constant-space (no kv-cache) & attention-free** architecture on this planet at this moment, suitable for both LLM and multimodal applications, and more (see [rwkv.com](https://rwkv.com)).

-It is a [meta-in-context learner](https://raw.githubusercontent.com/BlinkDL/RWKV-LM/main/RWKV-v7.png), test-time-training its state on the context via in-context gradient descent at every token, and 100% RNN.
+RWKV-7 is a [meta-in-context learner](https://raw.githubusercontent.com/BlinkDL/RWKV-LM/main/RWKV-v7.png), test-time-training its state on the context via in-context gradient descent at every token, and 100% RNN.

RWKV is a [Linux Foundation AI project](https://lfaidata.foundation/projects/rwkv/), so totally free. RWKV runtime is [already in Windows & Office](https://x.com/BlinkDL_AI/status/1831012419508019550).

-You are welcome to ask the RWKV community (such as [RWKV discord](https://discord.gg/bDSBUMeFpc) for advice on upgrading your attention-based models to rwkv7-based models :)
+You are welcome to ask the RWKV community (such as the [RWKV discord](https://discord.gg/bDSBUMeFpc)) for advice on upgrading your attention/ssm models to rwkv7 models :)

<img src="RWKV-v7-niah.png">

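The "in-context gradient descent at every token" mentioned in the diff can be sketched with a toy delta-rule state update: the recurrent state takes one gradient step per token toward storing that token's key-value association. This is an illustrative assumption-laden sketch (hypothetical `state_update` helper, made-up dimensions and learning rate), not RWKV-7's actual learned, generalized formulation:

```python
import numpy as np

def state_update(S, k, v, lr=0.5, decay=1.0):
    # One gradient-descent step on L(S) = 0.5 * ||S @ k - v||^2,
    # applied to the state matrix S at each token:
    #   S <- decay * S - lr * (S @ k - v) k^T
    # RWKV-7 uses a generalized, fully learned variant of this family.
    return decay * S + lr * np.outer(v - S @ k, k)

# The state "memorizes" a key -> value pair seen in context:
S = np.zeros((4, 3))                 # hypothetical state: value_dim x key_dim
k = np.array([1.0, 0.0, 0.0])        # hypothetical unit-norm key
v = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical value
for _ in range(20):
    S = state_update(S, k, v)
print(np.allclose(S @ k, v, atol=1e-4))  # True: S now maps k to (approximately) v
```

Because the update touches only a fixed-size matrix `S`, memory stays constant regardless of context length, which is the "constant-space (no kv-cache)" property the README claims.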
