Break Apart Text With Prompt
A special type of prompt that breaks a given chunk of text into sections, according to the business logic in the provided prompt.
The content template becomes the system prompt. The input text to be broken apart is first preprocessed: it is wrapped and each line is prefixed with its line number, and the numbered text is then sent as a user message. The model returns structured output identifying the line numbers that delimit each section of the output.
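The numbering-and-slicing step can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the exact line-number format and the shape of the model's structured output are assumptions, and the hard-coded section starts stand in for what the model would return.

```python
def add_line_numbers(text: str) -> str:
    # Prefix each line with its 1-based line number before sending
    # the text to the model (format is illustrative).
    return "\n".join(f"{i}: {line}" for i, line in enumerate(text.splitlines(), start=1))

def split_by_sections(text: str, section_starts: list[int]) -> list[str]:
    # Slice the original text into sections, given the 1-based start
    # line of each section as chosen by the model.
    lines = text.splitlines()
    bounds = section_starts + [len(lines) + 1]
    return ["\n".join(lines[bounds[i] - 1 : bounds[i + 1] - 1]) for i in range(len(section_starts))]

doc = "Intro\nDetails one\nDetails two\nSummary"
numbered = add_line_numbers(doc)          # this is what the model sees
sections = split_by_sections(doc, [1, 2, 4])  # [1, 2, 4] stands in for the model's choice
# sections → ["Intro", "Details one\nDetails two", "Summary"]
```

The key point is that the model never rewrites the text: it only selects line numbers, and the original text is sliced along those boundaries.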
Options
| Text | The text that should be formatted and broken apart by the model. |
| Model | The name of the model that will be used to process the prompt. |
| Temperature | The temperature to use with the model. The exact mathematical definition of temperature can vary depending on the model provider. |
| Max tokens | The maximum number of tokens to be included in the completion from the LLM provider. |
| Cache variations | The number of different variations to keep in the cache for this prompt. When the input data to the LLM is exactly the same, the prompt result can be cached. By default only |
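Taken together, the options above might look like the following. The key names and values here are purely illustrative assumptions for this sketch; the actual keys depend on the tool's configuration format.

```python
# Hypothetical option values (names are illustrative, not the tool's real keys):
options = {
    "text": "Intro\nDetails one\nDetails two\nSummary",  # text to break apart
    "model": "gpt-4o",        # model that processes the prompt
    "temperature": 0.0,       # deterministic output is typical for structured splitting
    "max_tokens": 1024,       # cap on the completion length
    "cache_variations": 1,    # cached results kept for identical inputs
}
```

A low temperature is a reasonable default here, since the task is selecting line numbers rather than generating creative text.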
Output
| Sections | A list of strings containing the text for each section. |
| Line numbers | The raw line numbers that were selected by the model. |
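The two outputs are related: the sections are obtained by slicing the input text along the raw line numbers. A small sketch of that relationship, assuming the raw output takes the form of 1-based `[start, end]` pairs (the actual shape may differ):

```python
# Hypothetical raw model output: 1-based [start, end] line pairs per section.
raw_line_numbers = [[1, 1], [2, 3], [4, 4]]
lines = ["Intro", "Details one", "Details two", "Summary"]

# Each section is the verbatim text covered by its line range.
sections = ["\n".join(lines[start - 1 : end]) for start, end in raw_line_numbers]
```

Exposing the raw line numbers alongside the sections makes it possible to trace each section back to its exact location in the original input.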
Properties
| type | LLM |
| needs conversation | false |
| uses content template | true |
| uses options template | true |
| customizable output schema | false |