openai_assistant
langroid/agent/openai_assistant.py
OpenAIAssistant(config)
Bases: ChatAgent
A ChatAgent powered by the OpenAI Assistant API. The main difference is in the
llm_response method: rather than maintaining conversation state locally, we let
the Assistant API do it for us. This class also handles persistent storage of
Assistants and Threads: it stores their ids (for a given user, org) in a cache,
and reuses them based on config.use_cached_assistant and config.use_cached_thread.
This class can be used as a drop-in replacement for ChatAgent.
Source code in langroid/agent/openai_assistant.py
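The id-caching behavior described above can be illustrated with a minimal sketch. This is not the actual langroid implementation: the cache class, key scheme, and create_fn hook below are assumptions made for illustration only.

```python
from typing import Callable, Dict, Tuple


class AssistantCache:
    """Toy in-memory cache mapping (user, org) to assistant ids,
    mimicking how cached ids can be reused across sessions."""

    def __init__(self) -> None:
        self._assistants: Dict[Tuple[str, str], str] = {}

    def get_or_create_assistant(
        self, user: str, org: str, use_cached: bool, create_fn: Callable[[], str]
    ) -> str:
        key = (user, org)
        if use_cached and key in self._assistants:
            return self._assistants[key]  # reuse the stored id
        new_id = create_fn()  # stand-in for an actual OpenAI API call
        self._assistants[key] = new_id
        return new_id


# Usage: the first call creates an assistant; later calls reuse its id,
# unless use_cached is False, in which case a fresh one is created.
cache = AssistantCache()
counter = {"n": 0}

def fake_create() -> str:
    counter["n"] += 1
    return f"asst_{counter['n']}"

a1 = cache.get_or_create_assistant("alice", "acme", use_cached=True, create_fn=fake_create)
a2 = cache.get_or_create_assistant("alice", "acme", use_cached=True, create_fn=fake_create)
a3 = cache.get_or_create_assistant("alice", "acme", use_cached=False, create_fn=fake_create)
```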
add_assistant_files(files)
Add the given files to the assistant, registering their file_ids.
add_assistant_tools(tools)
Add the given tools to the assistant.
enable_message(message_class, use=True, handle=True, force=False, require_recipient=False, include_defaults=True)
Override ChatAgent's method to extract the function-related args; see that method
for details. Note on the include_defaults arg: the OpenAI completion API normally
ignores default values in function schemas, but the Assistant function-calling
appears to pay attention to them; set include_defaults=False if this is not desired.
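The effect of include_defaults=False can be pictured as stripping "default" entries out of the function schema sent to the model. The helper below is a hypothetical sketch, not langroid's actual code:

```python
def strip_defaults(schema: dict) -> dict:
    """Recursively remove 'default' entries from a JSON-schema-like dict,
    illustrating what include_defaults=False effectively achieves."""
    out = {}
    for k, v in schema.items():
        if k == "default":
            continue  # drop the default; keep everything else
        out[k] = strip_defaults(v) if isinstance(v, dict) else v
    return out


# A function spec in the JSON-schema style used for fn-calling
fn_spec = {
    "name": "get_weather",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string"},
            "units": {"type": "string", "default": "celsius"},
        },
    },
}

cleaned = strip_defaults(fn_spec)
```

With include_defaults=True the "default": "celsius" entry would be passed through unchanged, and the Assistant fn-calling may treat it as a hint.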
thread_msg_to_llm_msg(msg)
staticmethod
Convert an OpenAI thread Message to an LLMMessage.
set_system_message(msg)
Override ChatAgent's method. The Task may use this method to set the system message of the chat assistant.
process_citations(thread_msg)
Process citations in the thread message. Modifies the thread message in-place.
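The in-place citation processing can be sketched as replacing each annotated span with a numbered marker and appending footnotes. This is a simplified stand-in: the real Assistant API annotation objects are richer, and the dict shape below (with "text" and "file_name" keys) is an assumption for illustration:

```python
from typing import Dict, List


def process_citations(content: str, annotations: List[Dict[str, str]]) -> str:
    """Replace each annotated substring with a numbered citation marker
    and append a footnote line per citation (a simplified sketch)."""
    for i, ann in enumerate(annotations):
        content = content.replace(ann["text"], f"[{i}]")
    footnotes = "\n".join(
        f"[{i}] {ann['file_name']}" for i, ann in enumerate(annotations)
    )
    if footnotes:
        content = content + "\n" + footnotes
    return content


# Usage: one annotation pointing at a (hypothetical) source file
result = process_citations(
    "Paris is the capital of France [doc1]",
    [{"text": "[doc1]", "file_name": "facts.txt"}],
)
```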
llm_response(message=None)
Override ChatAgent's method: this is the main LLM response method.
In ChatAgent, this method updates self.message_history and then calls
self.llm_response_messages; but since we rely on the Assistant API to
maintain conversation state, this method is simpler: it starts a run
on the message thread and waits for it to complete.
Parameters:

Name | Type | Description | Default
---|---|---|---
message | Optional[str \| ChatDocument] | message to respond to (if absent, the LLM response will be based on the instructions in the system_message). Defaults to None. | None
Returns: Optional[ChatDocument]: LLM response
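The "start a run and wait for it to complete" flow reduces to a polling loop. The helper below is a generic sketch using a stubbed status source rather than real OpenAI Assistants API calls; the terminal-status set and timeout handling are assumptions:

```python
import time
from typing import Callable

# Terminal run states in the Assistants API run lifecycle (assumed set)
TERMINAL = {"completed", "failed", "cancelled", "expired"}


def wait_for_run(
    get_status: Callable[[], str],
    interval: float = 0.0,
    timeout: float = 5.0,
) -> str:
    """Poll the run status until it reaches a terminal state, or time out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status in TERMINAL:
            return status
        time.sleep(interval)  # back off between polls
    raise TimeoutError("run did not complete in time")


# Usage with a stub that completes on the third poll
statuses = iter(["queued", "in_progress", "completed"])
result = wait_for_run(lambda: next(statuses))
```

In the real method, get_status would retrieve the run from the thread via the OpenAI client, and the completed run's messages would be turned into the ChatDocument that is returned.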
llm_response_async(message=None)
async
Async version of llm_response.