Update proxy url in inference providers blog (huggingface#2653)
* Update proxy url in inference providers blog

* mapped model
Wauplin authored Feb 5, 2025
1 parent 42c24f2 commit 54cd010
Showing 1 changed file with 3 additions and 3 deletions.
inference-providers.md (3 additions, 3 deletions)
@@ -152,16 +152,16 @@ console.log(chatCompletion.choices[0].message);

### From HTTP calls

- We expose the Routing proxy directly under the huggingface.co domain so you can call it directly, it's very useful for OpenAI-compatible APIs for instance. You can just swap the URL as a base URL: `https://huggingface.co/api/inference-proxy/{:provider}`.
+ We expose the Routing proxy directly under the huggingface.co domain so you can call it directly, it's very useful for OpenAI-compatible APIs for instance. You can just swap the URL as a base URL: `https://router.huggingface.co/{:provider}`.

Here's how you can call Llama-3.3-70B-Instruct using Sambanova as the inference provider via cURL.

```bash
-curl 'https://huggingface.co/api/inference-proxy/sambanova/v1/chat/completions' \
+curl 'https://router.huggingface.co/sambanova/v1/chat/completions' \
  -H 'Authorization: Bearer xxxxxxxxxxxxxxxxxxxxxxxx' \
  -H 'Content-Type: application/json' \
  --data '{
-    "model": "meta-llama/Llama-3.3-70B-Instruct",
+    "model": "Llama-3.3-70B-Instruct",
     "messages": [
         {
             "role": "user",
             ...
```
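Since the router endpoint in this diff is OpenAI-compatible, the same request can be assembled from any HTTP client by swapping in the per-provider base URL. Here is a minimal Python sketch of that URL and payload construction; the helper names `router_chat_url` and `chat_payload` are illustrative, not part of any library:

```python
# Sketch of the per-provider router URL scheme and OpenAI-style request body
# shown in the diff above. Helper names are hypothetical, not a real API.

def router_chat_url(provider: str) -> str:
    # Chat completions endpoint under the new router.huggingface.co domain.
    return f"https://router.huggingface.co/{provider}/v1/chat/completions"

def chat_payload(model: str, prompt: str) -> dict:
    # Minimal OpenAI-compatible chat completions request body.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

url = router_chat_url("sambanova")
body = chat_payload("Llama-3.3-70B-Instruct", "Hello!")
# POST `body` to `url` with an `Authorization: Bearer <HF token>` header,
# e.g. requests.post(url, json=body, headers=...).
print(url)
```

Note that, per the second change in the commit, the `model` field uses the provider-mapped id (`Llama-3.3-70B-Instruct`) rather than the Hub repo id (`meta-llama/Llama-3.3-70B-Instruct`).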
