diff --git a/docs/docs/guides/05-using-server/01-start-server.md b/docs/docs/guides/05-using-server/01-start-server.md
index 649eab4234..03973a9d1b 100644
--- a/docs/docs/guides/05-using-server/01-start-server.md
+++ b/docs/docs/guides/05-using-server/01-start-server.md
@@ -12,6 +12,8 @@ keywords:
     conversational AI,
     no-subscription fee,
     large language model,
+    local server,
+    api server,
   ]
 ---
 
@@ -23,9 +25,9 @@ Navigate by clicking the `Local API Server` icon on the left side of your screen
 
-![01-local-api-view](./assets/01-local-api-view.png)
+![01-local-api-view](./assets/01-local-api-view.gif)
 
-## Choose your model
+## Choosing a Model
 
 On the top right of your screen under `Model Settings`, set the LLM that your local server will be running. You can choose from any of the models already installed, or pick a new model by clicking `Explore the Hub`.
 
@@ -33,9 +35,9 @@ On the top right of your screen under `Model Settings`, set the LLM that your lo
 
 ![01-choose-model](./assets/01-choose-model.png)
 
-## Set your Server Options
+## Server Options
 
-On the left side of your screen you can set custom server options.
+On the left side of your screen, you can set custom server options.
 
@@ -49,9 +51,9 @@ You can make the local server more accessible by clicking on the address and cho
 
 ### Port
 
-Jan runs on port `1337` by default, but this can be changed.
+Jan runs on port `1337` by default. You can change the port to any other port number if needed.
 
-### CORS
+### Cross-Origin Resource Sharing (CORS)
 
 Cross-Origin Resource Sharing (CORS) manages resource access on the local server from external domains. Enabled for security by default, it can be disabled if needed.
 
diff --git a/docs/docs/guides/05-using-server/02-using-server.md b/docs/docs/guides/05-using-server/02-using-server.md
index e01fd1af2d..62fdd50a53 100644
--- a/docs/docs/guides/05-using-server/02-using-server.md
+++ b/docs/docs/guides/05-using-server/02-using-server.md
@@ -11,6 +11,8 @@ keywords:
     conversational AI,
     no-subscription fee,
     large language model,
+    local server,
+    api server,
   ]
 ---
 
@@ -40,7 +42,7 @@ With your local server running, you can click the `Try it out` button on the top
 
 Use the API endpoints, request and response body examples as models for your own application.
 
-### Curl request example
+### cURL Request Example
 
 Here's an example curl request with a local server running `tinyllama-1.1b`:
 
diff --git a/docs/docs/guides/05-using-server/assets/01-local-api-view.gif b/docs/docs/guides/05-using-server/assets/01-local-api-view.gif
new file mode 100644
index 0000000000..cb221fce45
Binary files /dev/null and b/docs/docs/guides/05-using-server/assets/01-local-api-view.gif differ
diff --git a/docs/docs/guides/05-using-server/assets/01-local-api-view.png b/docs/docs/guides/05-using-server/assets/01-local-api-view.png
deleted file mode 100644
index 6d5b13e6f5..0000000000
Binary files a/docs/docs/guides/05-using-server/assets/01-local-api-view.png and /dev/null differ
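For context on the `cURL Request Example` heading renamed above, a request to the local server looks roughly like the following. This is a minimal sketch, not the exact snippet from the docs: it assumes the server is left on the default port `1337` and exposes Jan's OpenAI-compatible `/v1/chat/completions` route, and it reuses the `tinyllama-1.1b` model id mentioned in the docs text; verify the route and model id against the `API Reference` page in your own Jan install.

```bash
# Hedged sketch: assumes localhost:1337 (the default) and an
# OpenAI-compatible chat completions route on Jan's local server.
curl -X POST http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "tinyllama-1.1b",
    "messages": [
      { "role": "user", "content": "Hello, what can you do?" }
    ],
    "max_tokens": 128,
    "stream": false
  }'
```

If you change the port under `Server Options`, update the URL accordingly; browser-based clients served from other origins will also need CORS to remain enabled.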