The reason the model stopped generating tokens. This will be one of:
- `"stop"` if the model hit a natural stop point or a provided stop sequence
- `"length"` if the maximum number of tokens specified in the request was reached
- `"content_filter"` if content was omitted due to a flag from our content filters
- `"tool_calls"` if the model called a tool
- `"function_call"` if the model called a function (deprecated in favor of `"tool_calls"`)
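Client code typically branches on this value after each response. The sketch below is a minimal, hypothetical helper (the function name and the returned action strings are illustrative, not part of the API) showing one way to map each documented `finish_reason` value to a next step:

```python
def handle_finish_reason(finish_reason: str) -> str:
    """Map a finish_reason value to a suggested next step (hypothetical helper)."""
    if finish_reason == "stop":
        # Natural stop point or a provided stop sequence: output is complete.
        return "complete"
    if finish_reason == "length":
        # Generation was cut off at the request's token limit.
        return "retry_with_higher_max_tokens"
    if finish_reason == "content_filter":
        # Content was omitted by the content filters.
        return "review_prompt"
    if finish_reason in ("tool_calls", "function_call"):
        # The model requested a tool call ("function_call" is the deprecated form).
        return "execute_tools"
    raise ValueError(f"unexpected finish_reason: {finish_reason!r}")
```

Treating an unrecognized value as an error (rather than silently ignoring it) surfaces new or changed values early.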