A plain text user prompt is the most basic form of prompt. It is equivalent to opening a brand-new chat in ChatGPT, pasting in a single message, and taking the model's reply as the output. This prompt type is compatible with even the oldest instruct models.
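As a sketch, a plain text prompt corresponds to a one-message request body. The payload below follows an OpenAI-style chat format purely for illustration; the model name and default parameter values are placeholder assumptions, not values prescribed by this document.

```python
import json

def build_plain_prompt_request(prompt_text, model="example-model",
                               temperature=0.7, max_tokens=256):
    """Build a minimal single-message request body (OpenAI-style chat format).

    The parameter names mirror the options described below; the defaults
    here are illustrative placeholders.
    """
    return {
        "model": model,
        "temperature": temperature,
        "max_tokens": max_tokens,
        # A brand-new conversation: exactly one user message, no history.
        "messages": [{"role": "user", "content": prompt_text}],
    }

request = build_plain_prompt_request("Summarize this article in one sentence.")
print(json.dumps(request, indent=2))
```

Because there is no prior conversation history, the `messages` list always contains exactly one user entry.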
Properties
| Property | Value |
| --- | --- |
| type | LLM |
| needs conversation | true |
| uses content template | true |
| uses options template | true |
Options
| Option | Description |
| --- | --- |
| | The name of the model that will be used to process the prompt. |
| | The temperature to use with the model. The exact mathematical definition of temperature can vary depending on the model provider. |
| | The maximum number of tokens to be included in the completion from the LLM provider. |
| | The number of different variations to keep in the cache for this prompt. When the input data to the LLM is exactly the same, the prompt can be ‘cached’. By default only |
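The caching behavior described above can be sketched as an exact-match memo cache: when the input text is identical, the stored completion is returned instead of calling the model again. The class name, the "most recently used" eviction policy, and the `call_llm` callback are all assumptions for illustration, not the tool's actual implementation.

```python
from collections import OrderedDict

class PromptCache:
    """Exact-match cache: identical input text reuses the stored completion."""

    def __init__(self, max_variations=1):
        # Illustrative assumption: keep up to N distinct input variations.
        self.max_variations = max_variations
        self._store = OrderedDict()

    def get_or_call(self, prompt_text, call_llm):
        if prompt_text in self._store:
            self._store.move_to_end(prompt_text)  # mark as most recently used
            return self._store[prompt_text]
        completion = call_llm(prompt_text)
        self._store[prompt_text] = completion
        if len(self._store) > self.max_variations:
            self._store.popitem(last=False)  # evict the oldest variation
        return completion

calls = []
def fake_llm(text):
    calls.append(text)  # record each real model call
    return f"completion for: {text}"

cache = PromptCache(max_variations=2)
first = cache.get_or_call("hello", fake_llm)
second = cache.get_or_call("hello", fake_llm)  # identical input: served from cache
print(len(calls))  # 1
```

Only exact string matches hit the cache; any change to the input text, however small, triggers a fresh call to the model.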
Output
| Output | Description |
| --- | --- |
| | The output completion text from the LLM. |
| | The raw text of the prompt that was sent up to the LLM provider. |