interface ChatOpenAICallOptions {
    options?: OpenAICoreRequestOptions<Record<string, unknown>>;
    parallel_tool_calls?: boolean;
    promptIndex?: number;
    response_format?: {
        type: "json_object";
    };
    seed?: number;
    stream_options?: {
        include_usage: boolean;
    };
    strict?: boolean;
    tool_choice?: OpenAIToolChoice;
    tools?: any[];
}
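
For illustration, a minimal sketch of passing these call options at invocation time, assuming the @langchain/openai package and a "gpt-4o-mini" model name (both are assumptions for the example, not part of this interface):

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({ model: "gpt-4o-mini" });

// Call options are passed as the second argument to invoke/stream.
const response = await model.invoke("Tell me a joke about bears.", {
  seed: 42, // request best-effort deterministic sampling
});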

Properties

options?: OpenAICoreRequestOptions<Record<string, unknown>>

Additional options to pass to the underlying HTTP request.
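
A minimal sketch, reusing the model instance from the sketch above; the timeout and headers fields are assumptions based on the OpenAI SDK's request options, not guaranteed by this interface:

await model.invoke("Hello", {
  options: {
    timeout: 10_000, // per-request timeout in milliseconds (assumed field)
    headers: { "X-Request-Source": "docs-example" }, // extra request headers (assumed field)
  },
});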

parallel_tool_calls?: boolean

Whether to allow the model to call multiple tools in parallel within a single response. Set to false to restrict the model to at most one tool call per response.

promptIndex?: number
response_format?: {
    type: "json_object";
}
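
A minimal sketch of JSON mode, reusing the model instance from the first sketch; note that OpenAI expects the prompt itself to mention JSON when response_format is set to "json_object":

const jsonResult = await model.invoke(
  "List three colors as a JSON object with a `colors` array.",
  { response_format: { type: "json_object" } }
);
console.log(jsonResult.content); // e.g. '{"colors":["red","green","blue"]}'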
seed?: number
stream_options?: {
    include_usage: boolean;
}

Additional options to pass to streamed completions. If provided, this takes precedence over the "streamUsage" option set at initialization time.

Type declaration

  • include_usage: boolean

    Whether or not to include token usage in the stream. If set to true, an additional chunk containing the token usage is emitted at the end of the stream.
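
A minimal sketch of streaming with usage reporting, reusing the model instance from the first sketch; it assumes the final chunk exposes a usage_metadata field, as in recent @langchain/core versions:

const stream = await model.stream("Write a haiku about autumn.", {
  stream_options: { include_usage: true },
});

for await (const chunk of stream) {
  // With include_usage: true, the last chunk should carry token usage.
  if (chunk.usage_metadata) {
    console.log(chunk.usage_metadata); // { input_tokens, output_tokens, total_tokens }
  }
}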

strict?: boolean

If true, model output is guaranteed to exactly match the JSON Schema provided in the tool definition, and the input schema will also be validated according to https://platform.openai.com/docs/guides/structured-outputs/supported-schemas.

If false, the input schema will not be validated and the model output will not be validated.

If undefined, the strict argument will not be passed to the model.

Since 0.2.6
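
A minimal sketch of strict tool calling at invocation time, using a hypothetical get_weather tool; strict mode expects schemas to follow the restrictions in the linked guide (for example, additionalProperties: false):

const weatherTool = {
  type: "function" as const,
  function: {
    name: "get_weather",
    description: "Get the current weather for a city.",
    parameters: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
      additionalProperties: false, // required by strict mode's supported-schema rules
    },
  },
};

await model.invoke("What's the weather in Paris?", {
  tools: [weatherTool],
  strict: true, // enforce exact conformance of tool arguments to the schema
});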

tool_choice?: OpenAIToolChoice
tools?: any[]
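
A minimal sketch combining tools with tool_choice and parallel_tool_calls, reusing the hypothetical weatherTool from the previous sketch:

await model.invoke("Compare the weather in Paris and Tokyo.", {
  tools: [weatherTool],
  tool_choice: "auto",        // let the model decide whether to call the tool
  parallel_tool_calls: false, // restrict the response to at most one tool call
});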