
Plans to support DeepSeek? #923

Open
bamzi opened this issue Jan 21, 2025 · 6 comments
Labels
enhancement New feature or request

Comments

@bamzi

bamzi commented Jan 21, 2025

Reviewing the DeepSeek documentation, the example code uses the Python OpenAI library, which indicates API parity with OpenAI.

Is there a plan to support DeepSeek models and the DeepSeek endpoint?

Example: DeepSeek Structured output

from openai import OpenAI

client = OpenAI(
    api_key="<your api key>",
    base_url="https://api.deepseek.com", # <--- deepseek api endpoint
)
@bamzi bamzi added the enhancement New feature or request label Jan 21, 2025
@desprit

desprit commented Feb 2, 2025

// Point the go-openai client at the DeepSeek endpoint.
// The key passed in is your DeepSeek API key, read from whichever env var you use.
config := openai.DefaultConfig(os.Getenv("OPENAI_API_KEY"))
config.BaseURL = "https://api.deepseek.com"
client := openai.NewClientWithConfig(config)

// MakeSystemPrompt() and prompt are defined elsewhere in the caller's code.
resp, err := client.CreateChatCompletion(
	context.Background(),
	openai.ChatCompletionRequest{
		Model: "deepseek-chat",
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleSystem,
				Content: MakeSystemPrompt(),
			},
			{
				Role:    openai.ChatMessageRoleUser,
				Content: prompt,
			},
		},
		ResponseFormat: &openai.ChatCompletionResponseFormat{
			Type: openai.ChatCompletionResponseFormatTypeJSONObject,
		},
	},
)
if err != nil {
	// handle the error
}
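With JSON mode the model's reply arrives as a JSON string in resp.Choices[0].Message.Content; a minimal sketch of decoding it, assuming encoding/json and a generic map target:

var parsed map[string]any
if err := json.Unmarshal([]byte(resp.Choices[0].Message.Content), &parsed); err != nil {
	// handle a malformed reply here
	log.Printf("decode JSON-mode reply: %v", err)
}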

@feng626

feng626 commented Feb 6, 2025

What about deepseek-reasoner?

@feng626

feng626 commented Feb 6, 2025

status code: 400, status: 400 Bad Request, message: %!s(), body: {"code":20024,"message":"Json mode is not supported for this model.","data":null}

@desprit

desprit commented Feb 6, 2025

status code: 400, status: 400 Bad Request, message: %!s(), body: {"code":20024,"message":"Json mode is not supported for this model.","data":null}

Why are you asking this here? You are getting an error from DeepSeek, not from the go-openai lib.
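That 400 is DeepSeek saying JSON mode is not supported for deepseek-reasoner, so the ResponseFormat block has to be dropped for that model. A minimal sketch, reusing the client and prompt from the earlier comment:

// deepseek-reasoner rejects JSON mode, so no ResponseFormat is set here.
resp, err := client.CreateChatCompletion(
	context.Background(),
	openai.ChatCompletionRequest{
		Model: "deepseek-reasoner",
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: prompt},
		},
	},
)
if err != nil {
	// handle the error
}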

@feng626

feng626 commented Feb 17, 2025

// Embed the go-openai stream types and add the extra reasoning_content field
// that DeepSeek returns for reasoning models.
type ChatCompletionStreamChoiceDelta struct {
	openai.ChatCompletionStreamChoiceDelta
	ReasoningContent string `json:"reasoning_content,omitempty"`
}

type ChatCompletionStreamChoice struct {
	openai.ChatCompletionStreamChoice
	Delta ChatCompletionStreamChoiceDelta `json:"delta"`
}

type ChatCompletionStreamResponse struct {
	openai.ChatCompletionStreamResponse
	Choices []ChatCompletionStreamChoice `json:"choices"`
}

response := ChatCompletionStreamResponse{}
rawLine, streamErr := stream.RecvRaw()

I rewrote it like this and it works.
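For context, a minimal sketch of how those structs could be used end to end: open a stream against deepseek-reasoner, read raw chunks with RecvRaw, and unmarshal each one into the extended response type. The imports, client, and prompt are assumptions here, not from the thread:

stream, err := client.CreateChatCompletionStream(
	context.Background(),
	openai.ChatCompletionRequest{
		Model:  "deepseek-reasoner",
		Stream: true,
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: prompt},
		},
	},
)
if err != nil {
	log.Fatal(err)
}
defer stream.Close()

for {
	rawLine, streamErr := stream.RecvRaw()
	if errors.Is(streamErr, io.EOF) {
		// end of stream
		break
	}
	if streamErr != nil {
		log.Fatal(streamErr)
	}
	response := ChatCompletionStreamResponse{}
	if err := json.Unmarshal(rawLine, &response); err != nil {
		log.Fatal(err)
	}
	for _, choice := range response.Choices {
		// ReasoningContent carries the reasoning tokens; Content carries the answer.
		fmt.Print(choice.Delta.ReasoningContent)
		fmt.Print(choice.Delta.Content)
	}
}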

@panzhongxian

👍 @feng626 Adding the single reasoning_content field is enough to capture the deepseek-r1 model's streaming response.
