LLM responses.
Wrapper around langchain.
- class LLM_API(model, system_prompt='', position_config=None)
  Bases: object
  This class acts as a wrapper for all LLMs from langchain and handles message exchange between the remote model and Chatsky classes.
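A minimal, self-contained sketch of the wrapper pattern this class implements: hold a model and a system prompt, and prepend the prompt before delegating to the wrapped model. The `EchoModel` and `MiniLLMWrapper` names are illustrative stand-ins, not chatsky's or langchain's real API.

```python
class EchoModel:
    """Stand-in for a langchain chat model: returns the last message."""
    def invoke(self, messages):
        return messages[-1]


class MiniLLMWrapper:
    """Illustrative analogue of LLM_API: wraps a model plus a system prompt."""
    def __init__(self, model, system_prompt=""):
        self.model = model
        self.system_prompt = system_prompt

    def call(self, history):
        # Prepend the system prompt (if any), then delegate to the model.
        messages = ([self.system_prompt] if self.system_prompt else []) + history
        return self.model.invoke(messages)


wrapper = MiniLLMWrapper(EchoModel(), system_prompt="Be brief.")
```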
- async respond(history, message_schema=None)
  Process and structure the model's response based on the provided schema.
- Parameters:
  - history (list[BaseMessage]) – List of previous messages in the conversation
  - message_schema (Union[None, Type[Message], Type[BaseModel]]) – Schema for structuring the output, defaults to None
- Return type:
- Returns:
  Processed model response
- Raises:
  ValueError – If message_schema is not None, Message, or BaseModel
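A self-contained sketch of the schema dispatch that `respond` is documented to perform: no schema returns a plain message, a `Message` subclass structures the output as that message type, a `BaseModel` subclass yields a structured object, and anything else raises `ValueError`. The `Message` and `BaseModel` classes here are simplified stand-ins for the real chatsky and pydantic types, and the parsing logic is an assumption for illustration only.

```python
from typing import Type, Union


class Message:
    """Stand-in for chatsky's Message."""
    def __init__(self, text: str):
        self.text = text


class BaseModel:
    """Stand-in for pydantic's BaseModel."""
    def __init__(self, **fields):
        self.__dict__.update(fields)


def respond_sketch(raw_text: str,
                   message_schema: Union[None, Type[Message], Type[BaseModel]] = None):
    # No schema: wrap the raw model output in a plain Message.
    if message_schema is None:
        return Message(raw_text)
    # Message schema: parse the output into that Message subclass.
    if isinstance(message_schema, type) and issubclass(message_schema, Message):
        return message_schema(raw_text)
    # BaseModel schema: produce a structured object (trivially filled here).
    if isinstance(message_schema, type) and issubclass(message_schema, BaseModel):
        return message_schema(text=raw_text)
    raise ValueError("message_schema must be None, Message, or BaseModel")
```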
- async condition(history, method)
  Execute a conditional method on the conversation history.
- Parameters:
  - history (list[BaseMessage]) – List of previous messages in the conversation
  - method (BaseMethod) – Method to evaluate the condition
- Return type:
  bool
- Returns:
  Boolean result of the condition evaluation
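A sketch of the `condition` contract: apply an evaluation method to the history and coerce the verdict to a strict bool. The callable signature used here is an assumption standing in for chatsky's `BaseMethod` interface, and the refund predicate is purely illustrative.

```python
from typing import Callable, List


def condition_sketch(history: List[str], method: Callable[[List[str]], bool]) -> bool:
    # Delegate the evaluation to the method and normalize the result to bool.
    return bool(method(history))


# Illustrative predicate: did any message mention "refund"?
def mentions_refund(history: List[str]) -> bool:
    return any("refund" in msg for msg in history)
```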