Chat Completions

Creates a model response for the given chat conversation.

Body Params
model
string
enum
required
Defaults to deepseek-ai/DeepSeek-V2.5

The name of the model to query.

messages
array of objects
required
length between 1 and 10

A list of messages comprising the conversation so far.

role
string
enum
required
Defaults to user

The role of the message author. One of: system, user, or assistant.

content
string
required
Defaults to "SiliconCloud launches a tiered rate-limit plan and a 10x RPM increase for free models; what changes will this bring to the LLM application landscape?"

The contents of the message.

stream
boolean
Defaults to false

If set, tokens are returned as Server-Sent Events as they become available. The stream terminates with data: [DONE].
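When stream is enabled, the client reads "data: ..." lines until the data: [DONE] sentinel described above. A minimal sketch of consuming such a stream follows; the chunk shape (choices / delta / content) follows the usual OpenAI-compatible convention and is an assumption here, not confirmed by this page.

```python
import json

def collect_stream(lines):
    """Concatenate the streamed text from an iterable of SSE lines.

    Each event is expected as a "data: <json>" line; the stream ends
    with "data: [DONE]". Lines without a data prefix (comments,
    keep-alive blanks) are skipped.
    """
    text = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):].strip()
        if data == "[DONE]":
            break  # end-of-stream sentinel documented above
        chunk = json.loads(data)
        # Assumed OpenAI-compatible chunk shape; the first chunk may
        # carry only a role, so content defaults to the empty string.
        delta = chunk["choices"][0]["delta"].get("content", "")
        text.append(delta)
    return "".join(text)
```

In practice the lines would come from an HTTP response iterated line by line (e.g. requests' iter_lines with the bytes decoded first).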

max_tokens
integer
1 to 4096
Defaults to 512

The maximum number of tokens to generate.

stop
array of strings

A list of string sequences that will truncate (stop) inference text output.

temperature
float
Defaults to 0.7

Determines the degree of randomness in the response.

top_p
float
Defaults to 0.7

The top_p (nucleus) parameter dynamically adjusts the number of candidate tokens considered for each predicted token, based on their cumulative probability.

top_k
float
Defaults to 50

Limits sampling to the k most likely next tokens.

frequency_penalty
float
Defaults to 0.5

Penalizes tokens according to how often they have already appeared, reducing repetition.

n
integer
Defaults to 1

Number of generations to return.

response_format
object

An object specifying the format that the model must output.
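The body parameters above can be assembled into a single JSON request body. The sketch below mirrors the documented names and defaults; actually sending it (an HTTP POST with a Bearer token, per the Credentials section below) is left out, and the helper name is hypothetical.

```python
import json

def build_chat_request(model="deepseek-ai/DeepSeek-V2.5",
                       messages=None,
                       stream=False,
                       max_tokens=512,
                       stop=None,
                       temperature=0.7,
                       top_p=0.7,
                       top_k=50,
                       frequency_penalty=0.5,
                       n=1):
    """Assemble a Chat Completions request body.

    Defaults mirror the ones documented above. `stop` is the only
    optional array parameter and is omitted when not given.
    """
    if messages is None:
        messages = [{"role": "user", "content": "Hello"}]
    # The docs require between 1 and 10 messages.
    assert 1 <= len(messages) <= 10, "messages length must be between 1 and 10"
    body = {
        "model": model,
        "messages": messages,
        "stream": stream,
        "max_tokens": max_tokens,
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "frequency_penalty": frequency_penalty,
        "n": n,
    }
    if stop:
        body["stop"] = stop
    return json.dumps(body)
```

For example, build_chat_request(messages=[{"role": "user", "content": "Hi"}], stream=True) produces a body suitable for the streaming case described under the stream parameter.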

Headers
Accept
string
enum
Defaults to application/json

Generated from the available response content types.

Allowed: application/json, text/event-stream
Responses

application/json
text/event-stream
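A non-streaming (application/json) response can be decoded directly; with n greater than 1, each generation appears as its own entry in choices. The response shape (choices / message / content) follows the usual OpenAI-compatible convention and is an assumption here.

```python
import json

def extract_reply(raw_body):
    """Return the generated text of each choice in a JSON response body.

    With n > 1 the list has one entry per generation. The
    choices/message/content shape is the assumed OpenAI-compatible
    layout, not confirmed by this page.
    """
    resp = json.loads(raw_body)
    return [choice["message"]["content"] for choice in resp["choices"]]
```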