chat
Chat module for conversing with a RAPTOR RAG-based LLM in the terminal.
Chat
Chat(
retriever: RetrieverLike,
qa_model: QAModelLike,
console: Console,
history_file: str = ".bookacle-chat-history.txt",
user_avatar: str = "👤",
)
A terminal-based chat interface for interacting with a RAPTOR RAG-based LLM.
Attributes:
- retriever (RetrieverLike) – Retriever to use for retrieving relevant context.
- qa_model (QAModelLike) – QA model to use for answering questions.
- console (Console) – Rich Console to use for displaying messages.
- history_file (str) – File to store chat history.
- user_avatar (str) – Avatar to use for the user in the chat UI.
Parameters:
- retriever (RetrieverLike) – Retriever to use for retrieving relevant context.
- qa_model (QAModelLike) – QA model to use for answering questions.
- console (Console) – Rich Console to use for displaying messages.
- history_file (str, default: '.bookacle-chat-history.txt') – File to store chat history. The file is created in the home directory.
- user_avatar (str, default: '👤') – Avatar to use for the user in the chat UI.
display_ai_msg_stream
invoke_qa_model
invoke_qa_model(
tree: Tree,
question: str,
history: list[Message] | None = None,
stream: bool = True,
*args,
**kwargs
) -> Message
Invoke the QA model to answer a question.
Parameters:
- tree (Tree) – RAPTOR tree that should be used for RAG.
- question (str) – The question to answer.
- history (list[Message] | None, default: None) – Chat history.
- stream (bool, default: True) – Whether to stream the AI response.
- *args (tuple[Any], default: ()) – Additional positional arguments to pass to the retriever.
- **kwargs (dict[str, Any], default: {}) – Additional keyword arguments to pass to the retriever.
Returns:
- Message – The response from the QA model.
run
run(
tree: Tree,
initial_chat_message: str = "",
system_prompt: str = "",
stream: bool = True,
*args,
**kwargs
) -> None
Run the chat interface.
Parameters:
- tree (Tree) – RAPTOR tree that should be used for RAG.
- initial_chat_message (str, default: '') – Initial message to display in the chat.
- system_prompt (str, default: '') – System prompt that should be used for the QA model.
- stream (bool, default: True) – Whether to stream the AI response.
- *args (tuple[Any], default: ()) – Additional positional arguments to pass to the retriever.
- **kwargs (dict[str, Any], default: {}) – Additional keyword arguments to pass to the retriever.
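The `run` loop presumably seeds the history with the system prompt, then repeatedly retrieves context and queries the QA model for each user turn. The following non-interactive sketch illustrates that turn-taking with toy callables; the loop structure and history format (role/content dicts) are assumptions for illustration, not the library's actual internals.

```python
from typing import Any, Callable


def run_sketch(
    tree: Any,
    questions: list[str],
    retriever: Callable[[Any, str], str],
    qa_model: Callable[[str, str, list], str],
    system_prompt: str = "",
) -> list[dict]:
    """Sketch of the run() loop: each turn retrieves context, queries the
    QA model, and appends both sides of the exchange to the history."""
    history: list[dict] = []
    if system_prompt:
        history.append({"role": "system", "content": system_prompt})
    for question in questions:
        context = retriever(tree, question)
        answer = qa_model(question, context, history)
        history.append({"role": "user", "content": question})
        history.append({"role": "assistant", "content": answer})
    return history


tree = {"doc": "RAPTOR summaries"}
history = run_sketch(
    tree,
    ["What is RAPTOR?", "How does retrieval work?"],
    retriever=lambda tree, q: tree["doc"],
    qa_model=lambda q, ctx, hist: f"Using: {ctx}",
    system_prompt="You are a helpful book assistant.",
)
assert len(history) == 5  # one system message plus two user/assistant pairs
```

In the real interface the questions would come from terminal input and the answers would be rendered via the Rich `Console`, with the history persisted to `history_file`.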