📝 docs: Add Groq usage docs (lobehub#1598)
* docs: Add Groq usage docs (lobehub#1596)

* 📝 docs: update docs

---------

Co-authored-by: TC <[email protected]>
arvinxx and tcmonster authored Mar 16, 2024
1 parent 05aa79f commit f8be0f4
Showing 3 changed files with 88 additions and 0 deletions.
2 changes: 2 additions & 0 deletions docs/usage/features/agent-market.mdx
@@ -40,3 +40,5 @@ In LobeChat's Assistant Market, creators can discover a vibrant and innovative c
| [Product Description](https://chat-preview.lobehub.com/market?agent=product-description)<br /><sup>By **[pllz7](https://github.com/pllz7)** on **2024-02-14**</sup> | Create captivating product descriptions to improve e-commerce sales performance<br />`E-commerce` |

> 📊 Total agents: [<kbd>**177**</kbd> ](https://github.com/lobehub/lobe-chat-agents)
[submit-agents-link]: https://github.com/lobehub/lobe-chat-agents
44 changes: 44 additions & 0 deletions docs/usage/providers/groq.mdx
@@ -0,0 +1,44 @@
---
title: Using Groq in LobeChat
image: https://github.com/lobehub/lobe-chat/assets/34400653/cf368841-f8e6-4cc5-b81d-09f82ef0afd9
---

import { Callout, Steps } from 'nextra/components';

# Using Groq in LobeChat

<Image alt={'Using Groq in LobeChat'} cover src={'https://github.com/lobehub/lobe-chat/assets/34400653/d0d08d98-a8d2-4b97-97c0-24a4f01d7eac'} />

Groq's [LPU Inference Engine](https://wow.groq.com/news_press/groq-lpu-inference-engine-leads-in-first-independent-llm-benchmark/) has excelled in the latest independent Large Language Model (LLM) benchmark, redefining the standard for AI solutions with its remarkable speed and efficiency. By integrating LobeChat with Groq Cloud, you can now easily leverage Groq's technology to accelerate the operation of large language models in LobeChat.

<Callout type={'info'}>
Groq's LPU Inference Engine sustained 300 tokens per second in internal benchmarks, and in benchmarks by ArtificialAnalysis.ai it outperformed other providers in throughput (241 tokens per second) and in total time to receive 100 output tokens (0.8 seconds).
</Callout>

This document will guide you through using Groq in LobeChat:

<Steps>
### Obtain a GroqCloud API Key

First, you need to obtain an API Key from the [GroqCloud Console](https://console.groq.com/).

<Image alt={'Get GroqCloud API Key'} height={274} inStep src={'https://github.com/lobehub/lobe-chat/assets/34400653/6942287e-fbb1-4a10-a1ce-caaa6663da1e'} />

Create an API Key in the `API Keys` menu of the console.

<Image alt={'Save GroqCloud API Key'} height={274} inStep src={'https://github.com/lobehub/lobe-chat/assets/34400653/eb57ca57-4f45-4409-91ce-9fa9c7c626d6'} />

<Callout type={'warning'}>
Store the key shown in the pop-up in a safe place; it is displayed only once. If you lose it, you will need to create a new one.
</Callout>
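
If you want to confirm that the key is valid before entering it into LobeChat, you can query Groq's OpenAI-compatible REST API directly. The sketch below is a minimal, illustrative check rather than an official LobeChat or Groq snippet; the `https://api.groq.com/openai/v1/models` endpoint and the `GROQ_API_KEY` environment variable are assumptions about your setup.

```ts
// Minimal sketch: verify a GroqCloud API Key by listing the models it can access.
// Requires Node.js 18+ (built-in fetch). The endpoint path is an assumption
// based on Groq's OpenAI-compatible API and may change.
const apiKey = process.env.GROQ_API_KEY ?? '<your-groq-api-key>';

async function checkGroqKey(): Promise<void> {
  const res = await fetch('https://api.groq.com/openai/v1/models', {
    headers: { Authorization: `Bearer ${apiKey}` },
  });

  if (!res.ok) {
    // A 401 usually means the key was copied incorrectly or has been revoked.
    throw new Error(`Groq API key check failed: ${res.status} ${res.statusText}`);
  }

  const { data } = await res.json();
  // Prints the model IDs your key can access.
  console.log(data.map((model: { id: string }) => model.id));
}

checkGroqKey();
```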

### Configure Groq in LobeChat

You can find the Groq configuration option in `Settings` -> `Language Model`, where you can input the API Key you just obtained.

<Image alt={'Groq service provider settings'} height={274} inStep src={'https://github.com/lobehub/lobe-chat/assets/34400653/88948a3a-6681-4a8d-9734-a464e09e4957'} />
</Steps>

Next, select a Groq-supported model in the assistant's model options to experience Groq's performance in LobeChat.

<Video alt={'Select and use Groq model'} src="https://github.com/lobehub/lobe-chat/assets/28616219/b6b8226b-183f-4249-8255-663a5e9f5af4" />
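
For reference, a request to a Groq-hosted model goes through Groq's OpenAI-compatible chat completions endpoint, which is what makes the integration straightforward. The sketch below is an illustrative, hand-written call rather than LobeChat's actual implementation; the endpoint URL and the `mixtral-8x7b-32768` model name are assumptions and may differ from the models available to your account.

```ts
// Illustrative sketch: a direct chat completion request against Groq's
// OpenAI-compatible API, similar in shape to what a client such as LobeChat issues.
async function askGroq(prompt: string): Promise<string> {
  const res = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.GROQ_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'mixtral-8x7b-32768', // assumed model name; pick one your key can access
      messages: [{ role: 'user', content: prompt }],
    }),
  });

  if (!res.ok) throw new Error(`Groq request failed: ${res.status}`);

  const json = await res.json();
  // The response follows the OpenAI schema: choices[0].message.content.
  return json.choices[0].message.content;
}

askGroq('Explain what an LPU is in one sentence.').then(console.log);
```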
42 changes: 42 additions & 0 deletions docs/usage/providers/groq.zh-CN.mdx
@@ -0,0 +1,42 @@
---
title: Using Groq in LobeChat
image: https://github.com/lobehub/lobe-chat/assets/34400653/cf368841-f8e6-4cc5-b81d-09f82ef0afd9
---

# Using Groq in LobeChat

<Image alt={'Using Groq in LobeChat'} cover src={'https://github.com/lobehub/lobe-chat/assets/34400653/d0d08d98-a8d2-4b97-97c0-24a4f01d7eac'} />

Groq's [LPU Inference Engine](https://wow.groq.com/news_press/groq-lpu-inference-engine-leads-in-first-independent-llm-benchmark/) has excelled in the latest independent Large Language Model (LLM) benchmark, redefining the standard for AI solutions with its remarkable speed and efficiency. By integrating LobeChat with Groq Cloud, you can now easily leverage Groq's technology to accelerate the operation of large language models in LobeChat.

<Callout type={'info'}>
Groq's LPU Inference Engine sustained 300 tokens per second in internal benchmarks, and benchmarks by ArtificialAnalysis.ai confirm that Groq outperforms other providers in throughput (241 tokens per second) and in total time to receive 100 output tokens (0.8 seconds).
</Callout>

This document will guide you through using Groq in LobeChat:

<Steps>
### Obtain a GroqCloud API Key

First, you need to obtain an API Key from the [GroqCloud Console](https://console.groq.com/).

<Image alt={'Get GroqCloud API Key'} height={274} inStep src={'https://github.com/lobehub/lobe-chat/assets/34400653/6942287e-fbb1-4a10-a1ce-caaa6663da1e'} />

Create an API Key in the `API Keys` menu of the console.

<Image alt={'Save GroqCloud API Key'} height={274} inStep src={'https://github.com/lobehub/lobe-chat/assets/34400653/eb57ca57-4f45-4409-91ce-9fa9c7c626d6'} />

<Callout type={'warning'}>
Store the key shown in the pop-up in a safe place; it is displayed only once. If you lose it, you will need to create a new one.
</Callout>

### Configure Groq in LobeChat

You can find the Groq configuration option in `Settings` -> `Language Model`, where you can enter the API Key you just obtained.

<Image alt={'Groq provider settings'} height={274} inStep src={'https://github.com/lobehub/lobe-chat/assets/34400653/88948a3a-6681-4a8d-9734-a464e09e4957'} />
</Steps>

Next, select a Groq-supported model in the assistant's model options to experience Groq's performance in LobeChat.

<Video alt={'Select a Groq model'} src="https://github.com/lobehub/lobe-chat/assets/28616219/b6b8226b-183f-4249-8255-663a5e9f5af4" />
