chainlit
langroid/agent/callbacks/chainlit.py
Callbacks for Chainlit integration.
ChainlitAgentCallbacks(agent, msg=None, config=ChainlitCallbackConfig())
Inject Chainlit callbacks into a Langroid Agent.
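For orientation, a minimal sketch of attaching these callbacks to a plain ChatAgent inside a Chainlit app; the agent configuration and session key are illustrative, not part of this module.

```python
import chainlit as cl
import langroid as lr
from langroid.agent.callbacks.chainlit import ChainlitAgentCallbacks

@cl.on_chat_start
async def on_chat_start():
    agent = lr.ChatAgent(lr.ChatAgentConfig(name="Assistant"))
    ChainlitAgentCallbacks(agent)  # the constructor injects the callbacks into the agent
    cl.user_session.set("agent", agent)

@cl.on_message
async def on_message(message: cl.Message):
    agent = cl.user_session.get("agent")
    # The injected callbacks render the (streamed) LLM response in the Chainlit UI.
    await agent.llm_response_async(message.content)
```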
start_llm_stream()
Returns a streaming function that can be passed to the LLM class.
start_llm_stream_async() async
Returns a streaming function that can be passed to the LLM class.
cancel_llm_stream()
finish_llm_stream(content, is_tool=False)
Update the stream and display the entire response in the appropriate language.
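As a rough sketch of how these streaming hooks fit together (assuming the returned function accepts one text chunk per call; the hard-coded chunk list stands in for a real LLM stream):

```python
from langroid.agent.callbacks.chainlit import ChainlitAgentCallbacks

def demo_stream(callbacks: ChainlitAgentCallbacks) -> None:
    # Start a streamed LLM step; per the docs, this returns a streaming fn
    # that the LLM class would normally call as chunks are generated.
    stream = callbacks.start_llm_stream()
    chunks = ["Hello", ", ", "world", "!"]  # stand-in for real LLM output
    for chunk in chunks:
        stream(chunk)  # assumption: the fn accepts one text chunk per call
    # Finalize the step and display the full response.
    callbacks.finish_llm_stream("".join(chunks))
```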
show_llm_response(content, is_tool=False, cached=False, language=None)
Show non-streaming LLM response.
show_error_message(error)
Show error message as a step.
show_agent_response(content, language='text')
Show a message from the agent (typically a tool handler). An agent response can be considered a "step" between the LLM response and the user response.
show_start_response(entity)
When there's a potentially long-running process, start a step, so that the UI displays a spinner while the process is running.
get_user_response(prompt)
Ask for a user response, wait for it, and return it as a cl.Step rather than a cl.Message, so it can be nested under the parent step.
show_user_response(message)
Show user response as a step.
show_first_user_message(msg)
Show first user message as a step.
ask_user_step(prompt, timeout=USER_TIMEOUT, suppress_values=['c']) async
Ask the user for input, as a step nested under parent_id. Rather than relying entirely on AskUserMessage (which doesn't let us nest the question and answer under a step), we instead create fake steps for the question and answer, and rely on AskUserMessage with an empty prompt only to await the user response.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| prompt | str | Prompt to display to user | required |
| timeout | int | Timeout in seconds | USER_TIMEOUT |
| suppress_values | List[str] | List of values to suppress from display (e.g. "c" for continue) | ['c'] |

Returns:

| Name | Type | Description |
|---|---|---|
| str | str | User response |
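A hedged usage sketch, assuming access to a ChainlitAgentCallbacks instance (here named callbacks) inside an async Chainlit context; the prompt text and helper name are illustrative:

```python
async def confirm_search(callbacks) -> str:
    # Ask the user a question as a nested step; a bare "c" reply is suppressed
    # from the display, per suppress_values.
    answer = await callbacks.ask_user_step(
        prompt="Proceed with the web search? (hit 'c' to continue)",
        timeout=60,
        suppress_values=["c"],
    )
    return answer
```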
ChainlitTaskCallbacks(task, msg=None, config=ChainlitCallbackConfig())
Bases: ChainlitAgentCallbacks
Recursively inject ChainlitAgentCallbacks into a Langroid Task's agent and the agents of its sub-tasks.
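A minimal sketch of wiring a Langroid Task into a Chainlit app with these callbacks; the task setup and session key are illustrative.

```python
import chainlit as cl
import langroid as lr
from langroid.agent.callbacks.chainlit import ChainlitTaskCallbacks

@cl.on_chat_start
async def start():
    agent = lr.ChatAgent(lr.ChatAgentConfig(name="Main"))
    task = lr.Task(agent, interactive=True)
    ChainlitTaskCallbacks(task)  # recursively attaches callbacks to the task's agent and sub-task agents
    cl.user_session.set("task", task)

@cl.on_message
async def on_message(message: cl.Message):
    task = cl.user_session.get("task")
    await task.run_async(message.content)
```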
show_subtask_response(task, content, is_tool=False)
Show sub-task response as a step, nested at the right level.
setup_llm() async
From the session llm_settings, create new LLMConfig and LLM objects and save them in the session state.
update_llm(new_settings) async
Update the LLMConfig and LLM from the new settings, and save them in the session state.
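Assuming setup_llm and update_llm are module-level helpers (as this listing suggests), a sketch of hooking them to Chainlit's settings-update event:

```python
import chainlit as cl
from langroid.agent.callbacks.chainlit import update_llm

@cl.on_settings_update
async def on_settings_update(settings: dict):
    # Rebuild the LLMConfig/LLM from the new settings and stash them in the session;
    # setup_llm() would similarly read the saved "llm_settings" at chat start.
    await update_llm(settings)
```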
get_text_files(message, extensions=['.txt', '.pdf', '.doc', '.docx']) async
Get a dict (file_name -> file_path) of the files uploaded in the chat message.
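A sketch of collecting uploaded attachments inside a message handler; the handler name and the downstream processing are illustrative:

```python
import chainlit as cl
from langroid.agent.callbacks.chainlit import get_text_files

@cl.on_message
async def handle(message: cl.Message):
    # Maps file name -> local path for any .txt/.pdf/.doc/.docx attachments.
    files = await get_text_files(message)
    for name, path in files.items():
        print(f"received {name} at {path}")
```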
wrap_text_preserving_structure(text, width=90)
Wrap text while preserving paragraph breaks. Typically used to format agent_response output, which may have long lines with no newlines or paragraph breaks.
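The library's exact implementation isn't shown here; a rough equivalent of the behavior described, using only the standard library, might look like:

```python
import textwrap

def wrap_text_preserving_structure(text: str, width: int = 90) -> str:
    # Wrap each blank-line-separated paragraph on its own, so paragraph breaks survive.
    paragraphs = text.split("\n\n")
    wrapped = ["\n".join(textwrap.wrap(p, width=width)) for p in paragraphs]
    return "\n\n".join(wrapped)
```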