LLM Conditions#
This module provides LLM-based conditions.
- class LLMCondition(**data)[source]#
  Bases: BaseCondition

  LLM-based condition. Uses a prompt to obtain a result from the model and evaluates that result using the given method.
- prompt: AnnotatedAlias[Union[ConstResponse, BaseResponse]]# Condition prompt.
- history: int# Number of dialogue turns, aside from the current one, to keep in history. -1 for full history.
- filter_func: BaseHistoryFilter# Filter function used to filter messages in history.
- prompt_misc_filter: str# Regular expression used to find prompts by key names in the MISC dictionary.
- position_config: Optional[PositionConfig]# Config for the positions of prompts and messages in history.
- max_size: int# Maximum size of any chat message, in characters. If a message exceeds the limit, it will not be sent to the LLM and a warning will be produced.
- method: BaseMethod# Method that takes the model's output and returns a boolean.
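To illustrate how these pieces fit together, the following is a minimal, self-contained sketch of the pattern the class implements: send a prompt to a model, then reduce the reply to a boolean with a method. The names `llm_condition`, `contains`, and `stub_model` are hypothetical stand-ins, not the real chatsky API.

```python
from typing import Callable

def contains(substring: str) -> Callable[[str], bool]:
    """Hypothetical method: True if the model's reply contains `substring`
    (case-insensitive), mirroring the role of a BaseMethod."""
    return lambda reply: substring.lower() in reply.lower()

def llm_condition(
    model: Callable[[str], str],
    prompt: str,
    method: Callable[[str], bool],
) -> bool:
    """Query the model with the prompt and evaluate the reply with the method."""
    return method(model(prompt))

# Stub standing in for a real LLM call.
def stub_model(prompt: str) -> str:
    return "Yes, the user asked about refunds."

llm_condition(stub_model, "Did the user ask about refunds?", contains("yes"))
# → True
```

In the real class, the prompt is combined with the filtered dialogue history before being sent to the model; this sketch omits that step for brevity.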