LLM API#
Wrapper around langchain.
- class LLM_API(model, system_prompt='', position_config=None)[source]#
Bases: object
This class acts as a wrapper for all LLMs from langchain and handles message exchange between the remote model and chatsky classes.
- async respond(history, message_schema=None)[source]#
Process and structure the model’s response based on the provided schema.
- Parameters:
history (list[BaseMessage]) – List of previous messages in the conversation
message_schema (Union[None, Type[Message], Type[BaseModel]]) – Schema for structuring the output, defaults to None
- Return type:
- Returns:
Processed model response
- Raises:
ValueError – If message_schema is not None, Message, or BaseModel
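The dispatch on message_schema can be sketched in plain Python. This is an illustrative assumption, not chatsky's actual implementation: the Message stand-in is a dataclass rather than chatsky's real Message model, and structure_response is a hypothetical helper name.

```python
import json
from dataclasses import dataclass
from typing import Any, Optional, Type


@dataclass
class Message:
    """Minimal stand-in for chatsky's Message class (illustrative only)."""
    text: str = ""


def structure_response(raw_text: str, message_schema: Optional[Type[Any]]) -> Any:
    # Dispatch mirrors the documented contract: None, Message, or a model class.
    if message_schema is None:
        # No schema: wrap the raw reply text in a plain Message.
        return Message(text=raw_text)
    if isinstance(message_schema, type) and issubclass(message_schema, Message):
        # Message schema: treat the reply as JSON-encoded Message fields.
        return message_schema(**json.loads(raw_text))
    if isinstance(message_schema, type):
        # Other model classes (pydantic BaseModel in chatsky): parse JSON fields.
        return message_schema(**json.loads(raw_text))
    raise ValueError("message_schema must be None, Message, or a model class")
```

The ValueError branch matches the documented Raises clause: any value that is not None, Message, or a model class is rejected.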
- async condition(history, method)[source]#
Execute a conditional method on the conversation history.
- Parameters:
history (list[BaseMessage]) – List of previous messages in the conversation
method (BaseMethod) – Method to evaluate the condition
- Return type:
bool
- Returns:
Boolean result of the condition evaluation
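The overall flow can be sketched as follows, under the assumption that the model is invoked on the history and the method then judges the reply; evaluate_condition and call_model are hypothetical names, not chatsky's API.

```python
from typing import Callable, List


def evaluate_condition(
    history: List[str],
    method: Callable[[str], bool],
    call_model: Callable[[List[str]], str],
) -> bool:
    """Illustrative sketch: query the model, then let `method` judge the reply."""
    # call_model stands in for the langchain model invocation.
    reply = call_model(history)
    # The method decides whether the condition holds for this reply.
    return bool(method(reply))
```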
- class BaseLLMScriptFunction(**data)[source]#
Bases: BaseModel
Base class for script functions that use an LLM model.
-
history: int#
Number of dialogue turns aside from the current one to keep in history. -1 for full history.
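The semantics of the history attribute can be sketched like this; trim_history is a hypothetical helper illustrating the documented behavior, not chatsky's internal code.

```python
from typing import List


def trim_history(messages: List[str], history: int) -> List[str]:
    """Keep the last `history` turns aside from the current one;
    -1 keeps the full history (sketch of the documented semantics)."""
    if history == -1:
        # Full history requested.
        return messages
    # Keep only the most recent `history` messages; 0 keeps none.
    return messages[-history:] if history > 0 else []
```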
-
filter_func: BaseHistoryFilter#
Filter function to filter messages in history.
-
prompt_misc_filter: str#
Regular expression to find prompts by key names in the MISC dictionary.
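Applying such a regex to MISC keys can be sketched as below; find_prompts is a hypothetical helper, and whether the real implementation matches from the start of the key or anywhere in it is an assumption here.

```python
import re
from typing import Dict


def find_prompts(misc: Dict[str, str], prompt_misc_filter: str) -> Dict[str, str]:
    """Sketch of prompt_misc_filter: select MISC entries whose keys match the regex."""
    pattern = re.compile(prompt_misc_filter)
    # Keep only entries whose key name matches the configured pattern.
    return {key: value for key, value in misc.items() if pattern.match(key)}
```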
-
position_config: Optional[PositionConfig]#
Config for positions of prompts and messages in history.
-
max_size: int#
Maximum size of any message in chat, in symbols. If a message exceeds the limit, it will not be sent to the LLM and a warning will be produced.
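The documented max_size behavior (skip the oversized message and warn) can be sketched like this; drop_oversized is a hypothetical helper, not chatsky's internal function.

```python
import logging
from typing import List

logger = logging.getLogger(__name__)


def drop_oversized(messages: List[str], max_size: int) -> List[str]:
    """Sketch of max_size handling: messages longer than max_size symbols
    are not sent to the LLM, and a warning is logged for each."""
    kept = []
    for message in messages:
        if len(message) > max_size:
            logger.warning(
                "Message exceeds max_size (%d symbols); not sent to the LLM.", max_size
            )
            continue
        kept.append(message)
    return kept
```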
- async _get_langchain_context(ctx)[source]#
Convert Context to langchain messages using get_langchain_context(). Arguments to the function are passed from attributes of this class and from the LLM_API model stored in the pipeline:
Model is retrieved from the pipeline using llm_model_name;
Model's system_prompt is executed and passed to get_langchain_context() as system_prompt;
If position_config is None, the model's position_config is used instead;
The rest of the arguments are passed as is.
- Parameters:
ctx (Context) – Context object.
- Return type:
list[BaseMessage]
- Returns:
A list of LangChain messages.
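The position_config fallback described in the steps above can be sketched as a one-liner; resolve_position_config is a hypothetical helper name, and plain dicts stand in for PositionConfig objects.

```python
from typing import Optional


def resolve_position_config(
    own_config: Optional[dict], model_config: Optional[dict]
) -> Optional[dict]:
    """Sketch of the documented fallback: the script function's own
    position_config wins; if it is None, the model's position_config is used."""
    return own_config if own_config is not None else model_config
```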
-
history: