chainlit
langroid/agent/callbacks/chainlit.py
Callbacks for Chainlit integration.
ChainlitAgentCallbacks(agent, config=ChainlitCallbackConfig())
Inject Chainlit callbacks into a Langroid Agent
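A minimal usage sketch (hedged: the agent config, model settings, and handler names below are illustrative, not part of this module):

```python
import chainlit as cl
import langroid as lr
from langroid.agent.callbacks.chainlit import ChainlitAgentCallbacks

@cl.on_chat_start
async def on_chat_start():
    # Illustrative agent; adjust the config to your model/setup
    agent = lr.ChatAgent(
        lr.ChatAgentConfig(
            name="Assistant",
            llm=lr.language_models.OpenAIGPTConfig(),
        )
    )
    # Attach the Chainlit callbacks so LLM/agent responses render in the UI
    ChainlitAgentCallbacks(agent)
    cl.user_session.set("agent", agent)

@cl.on_message
async def on_message(message: cl.Message):
    agent = cl.user_session.get("agent")
    await agent.llm_response_async(message.content)
```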
start_llm_stream()
Returns a streaming fn that can be passed to the LLM class
start_llm_stream_async() (async)
Returns a streaming fn that can be passed to the LLM class
cancel_llm_stream()
finish_llm_stream(content, is_tool=False)
Update the stream, and display entire response in the right language.
show_llm_response(content, is_tool=False, cached=False, language=None)
Show non-streaming LLM response.
show_error_message(error)
Show error message.
show_agent_response(content, language='text')
Show message from agent (typically tool handler).
show_start_response(entity)
When there's a potentially long-running process, start a step, so that the UI displays a spinner while the process is running.
get_user_response(prompt)
get_user_response_async(prompt) (async)
ask_user(prompt, timeout=USER_TIMEOUT, suppress_values=['c']) (async)
Ask user for input.
Parameters:

Name | Type | Description | Default
---|---|---|---
prompt | str | Prompt to display to user | required
timeout | int | Timeout in seconds | USER_TIMEOUT
suppress_values | List[str] | List of values to suppress from display (e.g. "c" for continue) | ['c']

Returns:

Name | Type | Description
---|---|---
str | str | User response
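For illustration only, a hedged sketch of awaiting this from a ChainlitAgentCallbacks instance (the wrapper function and prompt text are hypothetical):

```python
from langroid.agent.callbacks.chainlit import ChainlitAgentCallbacks

async def get_query(callbacks: ChainlitAgentCallbacks) -> str:
    # Hypothetical wrapper: wait up to 60s for input, and don't display
    # a bare "c" (continue) back in the chat.
    return await callbacks.ask_user(
        prompt="Enter your query (or 'c' to continue):",
        timeout=60,
        suppress_values=["c"],
    )
```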
ChainlitTaskCallbacks(task, config=ChainlitCallbackConfig())
Bases: ChainlitAgentCallbacks
Recursively inject ChainlitAgentCallbacks into a Langroid Task's agent and agents of sub-tasks.
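A hedged sketch of attaching callbacks to a task tree (the two-agent setup below is illustrative):

```python
import chainlit as cl
import langroid as lr
from langroid.agent.callbacks.chainlit import ChainlitTaskCallbacks

@cl.on_chat_start
async def on_chat_start():
    main_task = lr.Task(lr.ChatAgent(lr.ChatAgentConfig(name="Main")), interactive=True)
    helper_task = lr.Task(lr.ChatAgent(lr.ChatAgentConfig(name="Helper")), interactive=False)
    main_task.add_sub_task(helper_task)
    # Injects ChainlitAgentCallbacks into main_task's agent and, recursively,
    # into the agents of all sub-tasks (here: helper_task)
    ChainlitTaskCallbacks(main_task)
    await main_task.run_async()
```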
show_subtask_response(task, content, is_tool=False)
Show sub-task response as a step, nested at the right level.
setup_llm() (async)
From the session llm_settings, create new LLMConfig and LLM objects, and save them in session state.
update_llm(new_settings) (async)
Update LLMConfig and LLM from settings, and save in session state.
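A hedged sketch of wiring this into Chainlit's settings-update hook, assuming update_llm is importable as a module-level helper as listed above; the handler body is illustrative:

```python
import chainlit as cl
from langroid.agent.callbacks.chainlit import update_llm

@cl.on_settings_update
async def on_settings_update(settings: dict):
    # Rebuild the LLMConfig and LLM from the new widget values and stash
    # them in the Chainlit user session for subsequent turns.
    await update_llm(settings)
```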
get_text_files(message, extensions=['.txt', '.pdf', '.doc', '.docx']) (async)
Get dict (file_name -> file_path) from files uploaded in chat msg
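For illustration, a sketch of calling this helper from a Chainlit message handler (the echo reply is illustrative):

```python
import chainlit as cl
from langroid.agent.callbacks.chainlit import get_text_files

@cl.on_message
async def on_message(message: cl.Message):
    # Map of file_name -> local file_path for attached .txt/.pdf/.doc/.docx files
    name2path = await get_text_files(message)
    for name, path in name2path.items():
        await cl.Message(content=f"Received {name} at {path}").send()
```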
wrap_text_preserving_structure(text, width=90)
Wrap text preserving paragraph breaks. Typically used to format an agent_response output, which may have long lines with no newlines or paragraph breaks.
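A small usage sketch based only on the signature and description above (the sample text is illustrative):

```python
from langroid.agent.callbacks.chainlit import wrap_text_preserving_structure

raw = (
    "This is a long agent response with no manual line breaks, so it would "
    "otherwise render as a single very wide line in the UI.\n\n"
    "A second paragraph follows after a blank line and should stay separate."
)
print(wrap_text_preserving_structure(raw, width=60))
```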