Conversational Tool Selection
A conversational tool selection step uses the full power of the LLM to either choose an action or send a reply to the user, depending on the situation and the prompts provided. This step type will:
- Use the conversation history
- Use the content template as the system prompt fed to the LLM
- Make tools available to the model based on the combination of the built-in actions and any custom actions defined in the options
- Translate a regular response completion (rather than a tool call) into the user_interactor.send_message action
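As a rough illustration, the sketch below shows how this dispatch could work conceptually. The names used here (run_tool_selection, StepResult, the llm_client interface, and the "module.action" tool-naming scheme) are hypothetical and are not the platform's actual API; it assumes a chat-completion client that returns either tool calls or plain text.

```python
# Hypothetical sketch of the tool-selection dispatch described above.
# Names and the client interface are illustrative, not the platform's API.
from dataclasses import dataclass


@dataclass
class StepResult:
    module_id: str | None
    action_id: str | None
    action_parameters: dict | None
    response_text: str | None


def run_tool_selection(llm_client, content_template: str,
                       history: list[dict], tools: list[dict]) -> StepResult:
    # The rendered content template becomes the system prompt,
    # followed by the recent conversation history.
    messages = [{"role": "system", "content": content_template}, *history]

    # Built-in actions plus custom actions are offered to the model as tools.
    completion = llm_client.chat(messages=messages, tools=tools)

    if completion.tool_calls:
        # The model selected an action; assume tool names look like "module.action".
        call = completion.tool_calls[0]
        module_id, action_id = call.name.split(".", 1)
        return StepResult(module_id, action_id, call.arguments, None)

    # A plain completion is treated as a user_interactor.send_message action.
    return StepResult("user_interactor", "send_message",
                      {"text": completion.text}, completion.text)
```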
Options
| The name of the model that will be used to process the prompt. |
| Temperature to use with the model. The exact mathematical definition of temperature can vary depending on the model provider. |
| The maximum number of tokens to be included in the completion from the LLM provider. |
| The number of different variations to keep in the cache for this prompt. When the input data to the LLM is exactly the same, the prompt can be ‘cached’. By default only |
| The maximum number of conversation events to include in the conversation history data fed to the bot. |
| Custom actions to include in the tool selection. These must be provided with a smart_chain_binding_name that indicates which smart chain to execute for the action. The 'text' field on the output from the smart chain will be used as the action result. |
| Actions to exclude from the tool selection, referenced by their action_ids. Can include actions that are defined in the custom_actions section. |
| Agent Modules to exclude from the tool selection, referenced by their module_ids. Can include module_ids that are only defined in the custom_actions section. |
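The exact option keys depend on the platform's schema; the snippet below is only a hypothetical sketch of a configuration that adds one custom action, and every key name and value in it is assumed rather than taken from the source (only smart_chain_binding_name appears in the documentation above).

```python
# Hypothetical options sketch; key names here are assumed, not confirmed.
options = {
    "model": "gpt-4o",                  # model used to process the prompt
    "temperature": 0.2,                 # provider-specific temperature
    "max_tokens": 512,                  # cap on completion tokens
    "max_conversation_events": 20,      # how much history to feed the bot
    "custom_actions": [
        {
            "module_id": "orders",
            "action_id": "lookup_order",
            "description": "Look up an order by its ID.",
            "parameters": {             # schema for the action parameters
                "type": "object",
                "properties": {"order_id": {"type": "string"}},
                "required": ["order_id"],
            },
            # Which smart chain to run when the model selects this action;
            # the chain's 'text' output becomes the action result.
            "smart_chain_binding_name": "lookup_order_chain",
        }
    ],
    "excluded_actions": ["calendar.create_event"],  # action_ids to hide
    "excluded_modules": ["email"],                  # module_ids to hide
}
```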
Output
| The module ID of the action that was selected. |
| The action ID of the action that was selected. |
| The parameters that the LLM generated for the action. This will match the schema provided for the action parameters. |
| The raw text of the prompt that was sent to the LLM provider. |
| If the selected action includes text, such as a message send, then this is the text of the response to be said back to the user. Otherwise, blank. |
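A downstream step or handler can branch on this output to either relay the reply or execute the selected action. The sketch below assumes output field names (module_id, action_id, action_parameters, response_text) and a (module_id, action_id)-keyed action registry, none of which are confirmed by the source.

```python
# Hypothetical sketch of consuming the step output; field names and the
# registry shape are assumed for illustration only.
def handle_step_output(output: dict, action_registry: dict, user_interactor):
    if output["module_id"] == "user_interactor" and output["action_id"] == "send_message":
        # The model replied directly rather than selecting a tool,
        # so relay the generated text to the user.
        user_interactor.send_message(output["response_text"])
        return None

    # Otherwise invoke the selected action with the parameters the LLM
    # generated; these match the schema declared for the action.
    action = action_registry[(output["module_id"], output["action_id"])]
    return action(**output["action_parameters"])
```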
Properties
| type | LLM |
| needs conversation | true |
| uses content template | true |
| uses options template | true |
| customizable output schema | false |