CustomLLMModelProperties
Properties
This determines whether we detect the user's emotion while they speak and send it as additional information to the model.
Defaults to `false` because the model is usually good at understanding the user's emotion from the text alone.
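For illustration, toggling this flag might look like the following; the field name `emotionRecognitionEnabled` and the surrounding config shape are assumptions for this sketch, not taken from this reference.

```typescript
// Sketch only: field name and config shape are assumptions for illustration.
const model = {
  provider: "custom-llm",
  model: "my-fine-tuned-model",
  // Opt in only if your model benefits from an explicit emotion hint;
  // plain text is usually enough, hence the false default.
  emotionRecognitionEnabled: true,
};
```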
This determines whether metadata is sent in requests to the custom provider.

- `off` will not send any metadata. The payload will look like `{ messages }`.
- `variable` will send `assistant.metadata` as a variable on the payload. The payload will look like `{ messages, metadata }`.
- `destructured` will send `assistant.metadata` fields directly on the payload. The payload will look like `{ messages, ...metadata }`.

Further, `variable` and `destructured` will also send the `call`, `phoneNumber`, and `customer` objects in the payload (not shown in the sketch below).

Default is `variable`.
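The difference between the three modes is easiest to see in the payload the custom provider receives. A minimal sketch follows; the helper name and argument shapes are assumptions, and the `call`, `phoneNumber`, and `customer` objects mentioned above are omitted for brevity.

```typescript
// Sketch only: illustrates the payload shape the custom provider receives
// for each mode. The helper and argument names are assumptions.
type MetadataSendMode = "off" | "variable" | "destructured";

function buildPayload(
  mode: MetadataSendMode,
  messages: Array<{ role: string; content: string }>,
  metadata: Record<string, unknown>,
) {
  switch (mode) {
    case "off":
      return { messages }; // no metadata at all
    case "variable":
      return { messages, metadata }; // metadata nested under its own key
    case "destructured":
      return { messages, ...metadata }; // metadata fields spread onto the top level
  }
}
```

In practice this means that with `variable` your endpoint reads the fields from a nested `metadata` object, while with `destructured` the same fields arrive at the top level of the request body, so you can pick whichever shape your endpoint already expects.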
This sets how many turns at the start of the conversation use a smaller, faster model from the same provider before switching to the primary model. For example, `gpt-3.5-turbo` if the provider is `openai`. Defaults to `0`.
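For example, assuming a field named along the lines of `numFastTurns` (the name, provider, and model names below are assumptions), a value of `2` means the first two turns come from the smaller sibling model:

```typescript
// Sketch only: field name, provider, and model names are assumptions.
const model = {
  provider: "openai",
  model: "gpt-4o",
  // The first 2 turns use a smaller, faster model from the same provider
  // (e.g. gpt-3.5-turbo for openai); turn 3 onward uses gpt-4o.
  numFastTurns: 2,
};
```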
This is the temperature that will be used for calls.
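A brief sketch, with the config shape assumed as in the earlier examples:

```typescript
// Sketch only: config shape is an assumption for illustration.
const model = {
  provider: "custom-llm",
  model: "my-fine-tuned-model",
  // Lower values give more deterministic replies; higher values, more varied ones.
  temperature: 0.7,
};
```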
These are the tools that the assistant can use during the call. To use transient tools, use `tools`. Both `tools` and `toolIds` can be used together.
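As a rough sketch (the config shape and the tool definition below are assumptions for illustration): transient tools are defined inline under `tools`, previously created tools are referenced by ID via `toolIds`, and the two can be mixed.

```typescript
// Sketch only: field names and the tool definition shape are assumptions.
const model = {
  provider: "custom-llm",
  model: "my-fine-tuned-model",
  // Transient tool, defined inline for this assistant only.
  tools: [
    {
      type: "function",
      function: {
        name: "lookupOrder",
        description: "Look up an order by its ID",
        parameters: {
          type: "object",
          properties: { orderId: { type: "string" } },
          required: ["orderId"],
        },
      },
    },
  ],
  // Reusable tools created earlier, referenced by their IDs.
  toolIds: ["tool_abc123"],
};
```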