LLM Conditions#

This module provides LLM-based conditions.

class LLMCondition(**data)[source]#

Bases: BaseCondition

LLM-based condition. Sends a prompt to the model and evaluates the model's response with the given method.

llm_model_name: str#

Key of the model in the Pipeline's models dictionary.
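
For orientation, this is a sketch of how a model could be registered under such a key; it assumes Chatsky's LLM_API wrapper, a langchain_openai chat model, and an illustrative key "my_model":

    from chatsky import Pipeline
    from chatsky.llm import LLM_API
    from langchain_openai import ChatOpenAI

    # Hypothetical registration: "my_model" is the value that
    # llm_model_name should be set to in the condition.
    pipeline = Pipeline(
        script=script,  # assumed to be defined elsewhere
        start_label=("greeting_flow", "start_node"),
        models={"my_model": LLM_API(ChatOpenAI(model="gpt-4o-mini"))},
    )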

prompt: Union[ConstResponse, BaseResponse]#

Condition prompt.

history: int#

Number of dialogue turns, besides the current one, to keep in history. Set to -1 to keep the full history.

filter_func: BaseHistoryFilter#

Function used to filter which messages from the history are passed to the model.

prompt_misc_filter: str#

Regular expression used to find prompts by key names in the MISC dictionary.
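
As an illustration, keys in the MISC dictionary that match this pattern are picked up as prompts, while other keys are ignored; the key names and the pattern r"prompt" below are assumptions for the sketch:

    from chatsky import GLOBAL, MISC

    script = {
        GLOBAL: {
            MISC: {
                # Matched by a prompt_misc_filter such as r"prompt".
                "prompt": "You are a helpful assistant.",
                # Not matched: the key does not fit the pattern.
                "notes": "internal bookkeeping, not sent to the model",
            }
        }
    }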

position_config: Optional[PositionConfig]#

Config for positions of prompts and messages in history.

max_size: int#

Maximum length of any message in the chat, in characters. If a message exceeds this limit, it is not sent to the LLM and a warning is logged.

method: BaseMethod#

Method that takes the model's output and returns a boolean.
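
For example, a simple substring check can turn the model's raw output into a boolean; this sketch assumes Chatsky's Contains method from chatsky.llm.methods:

    from chatsky.llm.methods import Contains

    # Evaluates to True when the model's response contains "TRUE".
    method = Contains(pattern="TRUE")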

async call(ctx)[source]#

Query the model with the condition prompt and the filtered dialogue history, then apply method to the model's response.

Return type:

bool
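
Putting the pieces together, a condition of this kind might gate a transition as in the following sketch; the model key, prompt text, and destination are illustrative:

    from chatsky import Transition as Tr
    from chatsky.conditions.llm import LLMCondition
    from chatsky.llm.methods import Contains

    # The transition fires only if the LLM answers TRUE.
    transition = Tr(
        dst=("order_flow", "order_node"),  # hypothetical destination
        cnd=LLMCondition(
            llm_model_name="my_model",  # key in the Pipeline's models dictionary
            prompt="Answer TRUE if the user wants to place an order, FALSE otherwise.",
            method=Contains(pattern="TRUE"),
            history=5,  # keep the last five turns besides the current one
        ),
    )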