A single chat choice from the list of choices returned with a chat completion.

interface ChatsChoice {
    content_filter_results?: ChatsContentFilterResults;
    finish_reason: string;
    index: number;
    logprobs?: ChatsLogProbs;
    message: ChatsMessageResponse;
}
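
For orientation, a minimal sketch of consuming a value of this shape; it assumes the ChatsChoice interface above is in scope, and the helper name is illustrative rather than part of the SDK.

function summarizeChoice(choice: ChatsChoice): string {
    // Uses only the fields declared on ChatsChoice above.
    const filtered = choice.content_filter_results !== undefined;
    return `choice #${choice.index} finished with "${choice.finish_reason}"` +
        (filtered ? " (content filter results attached)" : "");
}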

Properties

content_filter_results?: ChatsContentFilterResults

Information about the results of content filtering.

finish_reason: string

The reason the model stopped generating tokens.

This will be one of:

  • "stop" if the model hit a natural stop point or a provided stop sequence
  • "length" if the maximum number of tokens specified in the request was reached
  • "content_filter" if content was omitted due to a flag from our content filters
  • "tool_calls" if the model called a tool
  • "function_call" if the model called a function (deprecated in favor of "tool_calls").

index: number

Index of the choice in the list of choices.

logprobs?: ChatsLogProbs

The log probability information for the choice.

message: ChatsMessageResponse

A chat completion message generated by the model.
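
Putting the pieces together, a minimal sketch of selecting the message of the lowest-indexed choice; it relies only on the fields documented above, and the containing response type that would supply the array of choices is not shown here.

function firstMessage(choices: ChatsChoice[]): ChatsMessageResponse | undefined {
    // Choices are ordered by their index property; pick the lowest one.
    const sorted = [...choices].sort((a, b) => a.index - b.index);
    return sorted[0]?.message;
}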