Chat Interactions
QueryMT provides comprehensive support for chat-based interactions with Large Language Models, enabling you to build conversational AI applications.
Key Components
- `querymt::chat::ChatMessage`: Represents a single message in a conversation. Key attributes include:
  - `role`: Indicates who sent the message (`querymt::chat::ChatRole::User` or `querymt::chat::ChatRole::Assistant`).
  - `message_type`: Specifies the nature of the content (e.g., `querymt::chat::MessageType::Text`, `Image`, `ToolUse`, `ToolResult`).
  - `content`: The primary text content of the message.
  - Source: `crates/querymt/src/chat/mod.rs`
- `querymt::chat::ChatResponse`: A trait representing the LLM's response to a chat request. It provides methods to access:
  - `text()`: The textual content of the LLM's reply.
  - `tool_calls()`: A list of `querymt::ToolCall` objects if the LLM decided to use one or more tools.
  - `thinking()`: Optional "thoughts" or reasoning steps from the model, if supported and enabled.
  - `usage()`: Optional token usage information (`querymt::Usage`).
  - Source: `crates/querymt/src/chat/mod.rs`
- `querymt::chat::BasicChatProvider`: A trait that LLM providers implement to support fundamental chat functionality. It has a single method:
  - `chat(&self, messages: &[ChatMessage])`: Sends a list of messages to the LLM and returns a `ChatResponse`.
  - Source: `crates/querymt/src/chat/mod.rs`
- `querymt::chat::ToolChatProvider`: Extends `BasicChatProvider` to include support for tools (function calling). It has one primary method:
  - `chat_with_tools(&self, messages: &[ChatMessage], tools: Option<&[Tool]>)`: Sends messages along with a list of available tools the LLM can use.
  - Source: `crates/querymt/src/chat/mod.rs`
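To make the relationship between these pieces concrete, here is a minimal, self-contained sketch of how a provider trait and a response trait fit together. All types below (`ChatRole`, `ChatMessage`, `SimpleResponse`, `EchoProvider`) are simplified stand-ins for illustration only; the real definitions in `crates/querymt/src/chat/mod.rs` may differ (e.g. async methods, error handling, richer message types).

```rust
// Simplified stand-ins for querymt's chat types -- illustrative only.
#[derive(Debug, Clone)]
enum ChatRole { User, Assistant }

#[derive(Debug, Clone)]
struct ChatMessage { role: ChatRole, content: String }

// A trimmed-down analogue of the ChatResponse trait: just text access.
trait ChatResponse {
    fn text(&self) -> Option<String>;
}

struct SimpleResponse { reply: String }
impl ChatResponse for SimpleResponse {
    fn text(&self) -> Option<String> { Some(self.reply.clone()) }
}

// Analogue of BasicChatProvider: one method turning a message history
// into a response.
trait BasicChatProvider {
    fn chat(&self, messages: &[ChatMessage]) -> Box<dyn ChatResponse>;
}

// A toy provider that echoes the last message, standing in for an LLM.
struct EchoProvider;
impl BasicChatProvider for EchoProvider {
    fn chat(&self, messages: &[ChatMessage]) -> Box<dyn ChatResponse> {
        let last = messages.last().map(|m| m.content.clone()).unwrap_or_default();
        Box::new(SimpleResponse { reply: format!("echo: {last}") })
    }
}

fn main() {
    let history = vec![ChatMessage { role: ChatRole::User, content: "hello".into() }];
    let resp = EchoProvider.chat(&history);
    println!("{}", resp.text().unwrap()); // prints "echo: hello"
}
```

The design point this illustrates is that callers program against the traits, so a mock provider like `EchoProvider` and a real HTTP-backed provider are interchangeable.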
How It Works
1. **Construct Messages**: Your application assembles a sequence of `ChatMessage` objects representing the conversation history. Use `ChatMessage::user()` for user messages and `ChatMessage::assistant()` for assistant messages. A conversation typically starts with alternating `User` and `Assistant` messages.
2. **Initiate Chat**: You call the `chat` or `chat_with_tools` method on an `LLMProvider` instance, passing the message history and, optionally, a list of available tools.
3. **Provider Interaction**: The `LLMProvider` (or its underlying implementation, such as `HTTPLLMProvider`) formats the request according to the specific LLM's API, sends it, and receives the raw response.
4. **Parse Response**: The provider parses the raw response into an object implementing `ChatResponse`.
5. **Handle Response**: Your application processes the `ChatResponse`:
   - If `text()` is present, it is the LLM's textual reply.
   - If `tool_calls()` is present, the LLM wants to execute one or more functions. Your application needs to:
     - Execute these functions.
     - Send the results back to the LLM as new `ChatMessage`s (typically with `MessageType::ToolResult`).
     - Continue the chat loop.
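Step 5 is the only branching point, so it is worth sketching in code. The snippet below shows the text-vs-tool-calls decision with simplified stand-in types; `ToolCall`, `Response`, `Message`, and `execute_tool` are hypothetical names for illustration, not the actual querymt API.

```rust
// Simplified stand-ins -- the real ToolCall / ChatResponse / MessageType
// live in crates/querymt and may differ in shape.
#[derive(Debug, Clone)]
struct ToolCall { name: String, arguments: String }

struct Response { text: Option<String>, tool_calls: Vec<ToolCall> }

#[derive(Debug)]
enum Message { User(String), Assistant(String), ToolResult(String) }

// Pretend tool executor: dispatches a tool call to a local function.
fn execute_tool(call: &ToolCall) -> String {
    match call.name.as_str() {
        "get_time" => "12:00".to_string(),
        _ => format!("unknown tool: {}", call.name),
    }
}

// Process one LLM response: either surface its text, or run each requested
// tool and queue the result as a new ToolResult message for the next turn.
fn handle_response(resp: &Response, history: &mut Vec<Message>) -> Option<String> {
    if !resp.tool_calls.is_empty() {
        for call in &resp.tool_calls {
            let result = execute_tool(call);
            history.push(Message::ToolResult(result));
        }
        None // caller should continue the chat loop with the updated history
    } else {
        resp.text.clone()
    }
}

fn main() {
    let mut history = vec![Message::User("what time is it?".into())];
    let resp = Response {
        text: None,
        tool_calls: vec![ToolCall { name: "get_time".into(), arguments: "{}".into() }],
    };
    assert!(handle_response(&resp, &mut history).is_none());
    println!("{:?}", history.last().unwrap()); // the queued ToolResult
}
```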
Example Flow (Conceptual)
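The full flow can be sketched end to end with a scripted mock provider standing in for a real LLM backend: the first turn requests a tool, the application executes it and feeds the result back, and the second turn produces the final text answer. Every type and function here (`Msg`, `Reply`, `MockProvider`, `run_tool`) is an illustrative simplification, not the actual querymt API.

```rust
// Simplified, illustrative types -- not the real querymt API.
#[derive(Debug, Clone)]
enum Msg { User(String), Assistant(String), ToolResult(String) }

enum Reply { Text(String), ToolCall(String) } // trimmed-down ChatResponse

// Scripted mock provider: asks for a tool on the first turn,
// answers once a tool result appears in the history.
struct MockProvider;
impl MockProvider {
    fn chat(&self, history: &[Msg]) -> Reply {
        let has_tool_result = history.iter().any(|m| matches!(m, Msg::ToolResult(_)));
        if has_tool_result {
            Reply::Text("It is sunny today.".to_string())
        } else {
            Reply::ToolCall("get_weather".to_string())
        }
    }
}

fn run_tool(name: &str) -> String {
    match name {
        "get_weather" => "sunny".to_string(),
        _ => "n/a".to_string(),
    }
}

fn main() {
    let provider = MockProvider;
    let mut history = vec![Msg::User("What's the weather?".into())];

    // Chat loop: keep calling the provider until it produces a text reply.
    let answer = loop {
        match provider.chat(&history) {
            Reply::Text(t) => break t,
            Reply::ToolCall(name) => {
                let result = run_tool(&name);          // 1. execute the tool
                history.push(Msg::ToolResult(result)); // 2. feed the result back
                // 3. the loop continues the chat with the updated history
            }
        }
    };
    history.push(Msg::Assistant(answer.clone()));
    println!("{answer}"); // prints "It is sunny today."
}
```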
QueryMT's chat system is designed to be flexible, supporting simple Q&A, complex multi-turn dialogues, and sophisticated interactions involving external tools. The `Tool` and `ToolChoice` mechanisms provide fine-grained control over how LLMs can utilize functions.