chat

Chat module for conversing with a RAPTOR RAG-based LLM in the terminal.

Chat

Chat(
    retriever: RetrieverLike,
    qa_model: QAModelLike,
    console: Console,
    history_file: str = ".bookacle-chat-history.txt",
    user_avatar: str = "👀",
)

A terminal-based chat interface for interacting with a RAPTOR RAG-based LLM.

Attributes:

  • retriever (RetrieverLike) –

    Retriever to use for retrieving relevant context.

  • qa_model (QAModelLike) –

    QA model to use for answering questions.

  • console (Console) –

    Rich Console to use for displaying messages.

  • history_file (str) –

    File to store chat history.

  • user_avatar (str) –

    Avatar to use for the user in the chat UI.

Parameters:

  • retriever (RetrieverLike) –

    Retriever to use for retrieving relevant context.

  • qa_model (QAModelLike) –

    QA model to use for answering questions.

  • console (Console) –

    Rich Console to use for displaying messages.

  • history_file (str, default: '.bookacle-chat-history.txt' ) –

    File to store chat history. The file is created in the home directory.

  • user_avatar (str, default: '👀' ) –

    Avatar to use for the user in the chat UI.
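The retriever/QA-model split above is duck-typed: `Chat` only needs objects that behave like a `RetrieverLike` and a `QAModelLike`. A minimal sketch of that division of labor, using hypothetical stand-ins (none of the class names below are the real bookacle implementations):

```python
from dataclasses import dataclass

# Hypothetical stand-ins illustrating the RetrieverLike / QAModelLike
# split -- not the real bookacle classes.

@dataclass
class Message:
    role: str
    content: str

class KeywordRetriever:
    """Toy retriever: returns stored passages sharing a word with the question."""

    def __init__(self, passages: list[str]):
        self.passages = passages

    def retrieve(self, question: str) -> list[str]:
        words = set(question.lower().split())
        return [p for p in self.passages if words & set(p.lower().split())]

class TemplateQAModel:
    """Toy QA model: stitches the retrieved context into a canned reply."""

    def answer(self, question: str, context: list[str]) -> Message:
        return Message("assistant", f"Based on {len(context)} passage(s): ...")

retriever = KeywordRetriever(["raptor builds a tree of summaries"])
qa = TemplateQAModel()
ctx = retriever.retrieve("what does raptor build")
reply = qa.answer("what does raptor build", ctx)
```

In the real class, the `console` handles rendering and the `history_file` persists the transcript; the stubs above only show how a question flows retriever-first, then into the QA model.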

display_ai_msg_stream

display_ai_msg_stream(messages: Iterator[Message]) -> str

Display an AI message stream in the chat UI.

Parameters:

  • messages (Iterator[Message]) –

    The AI message stream to display.

Returns:

  • str –

    The complete message as a string.
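The return value suggests the method accumulates the streamed chunks while rendering them, then hands back the full text. A minimal sketch of that accumulation step (rendering omitted; `Message` is a hypothetical stand-in):

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Message:
    role: str
    content: str

def join_stream(messages: Iterator[Message]) -> str:
    """Accumulate streamed chunks into the complete reply string --
    a sketch of what display_ai_msg_stream returns, without the UI."""
    return "".join(m.content for m in messages)

chunks = iter([Message("assistant", "Hel"), Message("assistant", "lo")])
full = join_stream(chunks)
```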

invoke_qa_model

invoke_qa_model(
    tree: Tree,
    question: str,
    history: list[Message] | None = None,
    stream: bool = True,
    *args,
    **kwargs
) -> Message

Invoke the QA model to answer a question.

Parameters:

  • tree (Tree) –

    RAPTOR tree that should be used for RAG.

  • question (str) –

    The question to answer.

  • history (list[Message] | None, default: None ) –

    Chat history.

  • stream (bool, default: True ) –

    Whether to stream the AI response.

  • *args (tuple[Any], default: () ) –

    Additional positional arguments to pass to the retriever.

  • **kwargs (dict[str, Any], default: {} ) –

    Additional keyword arguments to pass to the retriever.

Returns:

  • Message –

    The response from the QA model.
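Note that the extra `*args`/`**kwargs` are forwarded to the retriever, not the QA model. A runnable sketch of that call shape, with toy callables standing in for the real components (all names are hypothetical):

```python
def invoke_qa_model(retriever, qa_model, tree, question,
                    history=None, *args, **kwargs):
    # Extra positional/keyword arguments go to the retriever, per the docs.
    context = retriever(tree, question, *args, **kwargs)
    return qa_model(question, context, history or [])

# Toy stand-ins: the "tree" is just a list of nodes here.
fake_retriever = lambda tree, q, top_k=1: tree[:top_k]
fake_qa = lambda q, ctx, hist: f"answer using {len(ctx)} node(s)"

tree = ["node-a", "node-b", "node-c"]
out = invoke_qa_model(fake_retriever, fake_qa, tree, "why?", top_k=2)
```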

run

run(
    tree: Tree,
    initial_chat_message: str = "",
    system_prompt: str = "",
    stream: bool = True,
    *args,
    **kwargs
) -> None

Run the chat interface.

Parameters:

  • tree (Tree) –

    RAPTOR tree that should be used for RAG.

  • initial_chat_message (str, default: '' ) –

    Initial message to display in the chat.

  • system_prompt (str, default: '' ) –

    System prompt that should be used for the QA model.

  • stream (bool, default: True ) –

    Whether to stream the AI response.

  • *args (tuple[Any], default: () ) –

    Additional positional arguments to pass to the retriever.

  • **kwargs (dict[str, Any], default: {} ) –

    Additional keyword arguments to pass to the retriever.
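The overall shape of `run` is a read-ask-display loop that threads chat history through each turn. A minimal, testable sketch of that loop (input is injected instead of read from the terminal; `ask` and the exit words are assumptions, not the real bookacle behavior):

```python
def run_chat(ask, inputs, history=None):
    """REPL-style loop sketch: read a question, get an answer,
    append both turns to the history, stop on an exit word."""
    history = history if history is not None else []
    for question in inputs:
        if question.strip().lower() in {"exit", "quit"}:
            break
        answer = ask(question, history)
        history.append(("user", question))
        history.append(("assistant", answer))
    return history

log = run_chat(lambda q, h: q.upper(), ["hi there", "exit"])
```

Injecting `inputs` and `ask` keeps the loop unit-testable; the real method instead reads from the console and delegates each turn to `invoke_qa_model`.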