diff --git a/docs/observability/how_to_guides/monitoring/static/convo.png b/docs/observability/how_to_guides/monitoring/static/convo.png
index e26f695c..aecd7e9f 100644
Binary files a/docs/observability/how_to_guides/monitoring/static/convo.png and b/docs/observability/how_to_guides/monitoring/static/convo.png differ
diff --git a/docs/observability/how_to_guides/monitoring/static/convo_tab.png b/docs/observability/how_to_guides/monitoring/static/convo_tab.png
index ae629443..cb77b924 100644
Binary files a/docs/observability/how_to_guides/monitoring/static/convo_tab.png and b/docs/observability/how_to_guides/monitoring/static/convo_tab.png differ
diff --git a/docs/observability/how_to_guides/monitoring/threads.mdx b/docs/observability/how_to_guides/monitoring/threads.mdx
index 83ba2aa1..fb28408c 100644
--- a/docs/observability/how_to_guides/monitoring/threads.mdx
+++ b/docs/observability/how_to_guides/monitoring/threads.mdx
@@ -1,11 +1,16 @@
+import {
+  CodeTabs,
+  PythonBlock,
+  TypeScriptBlock,
+} from "@site/src/components/InstructionsWithCode";
+
 # Set up threads
 
 :::tip Recommended Reading
 Before diving into this content, it might be helpful to read the following:
 
 - [Add metadata and tags to traces](../tracing/add_metadata_tags)
-
-:::
+  :::
 
 Many LLM applications have a chatbot-like interface in which the user and the LLM application engage in a multi-turn conversation. In order to track these conversations, you can use the `Threads` feature in LangSmith.
 
@@ -22,11 +27,183 @@
 The key name should be one of:
 
 - `thread_id`
 - `conversation_id`.
 
-The value should be a UUID, such as `f47ac10b-58cc-4372-a567-0e02b2c3d479`.
+The value can be any string you want, but we recommend using UUIDs, such as `f47ac10b-58cc-4372-a567-0e02b2c3d479`.
+
+### Code example
+
+This example demonstrates how to log and retrieve conversation history from LangSmith to maintain long-running chats.
+
+You can [add metadata to your traces](../tracing/add_metadata_tags) in LangSmith in a variety of ways; this code shows how to do so dynamically, but read the
+previously linked guide to learn about all the ways you can add thread identifier metadata to your traces.
+
+      new Date(b.start_time).getTime() - new Date(a.start_time).getTime()
+  );
+
+  // The current state of the conversation
+  return [
+    ...sortedRuns[0].inputs.messages,
+    sortedRuns[0].outputs.choices[0].message
+  ];
+}
+
+const chatPipeline = traceable(
+  async (
+    question: string,
+    options: {
+      getChatHistory?: boolean;
+    } = {}
+  ) => {
+    const {
+      getChatHistory = false,
+    } = options;
+
+    let messages = [];
+    // Whether to continue an existing thread or start a new one
+    if (getChatHistory) {
+      const runTree = await getCurrentRunTree();
+      const historicalMessages = await getThreadHistory(
+        runTree.extra.metadata.session_id,
+        runTree.project_name
+      );
+      messages = [
+        ...historicalMessages,
+        { role: "user", content: question }
+      ];
+    } else {
+      messages = [{ role: "user", content: question }];
+    }
+
+    // Invoke the model
+    const chatCompletion = await client.chat.completions.create({
+      model: "gpt-4o-mini",
+      messages: messages
+    });
+    return chatCompletion.choices[0].message.content;
+  },
+  {
+    name: "Chat Bot",
+    project_name: langsmithProject,
+    metadata: { session_id: threadId }
+  }
+);
+
+// Start the conversation
+await chatPipeline("Hi, my name is Bob");`),
+  ]}
+  groupId="client-language"
+/>
+
+After waiting a few seconds, you can make the following calls to continue the conversation. By passing `getChatHistory: true`,
+you can continue the conversation from where it left off. This means that the LLM will receive the entire message history and respond to it,
+instead of just responding to the latest message.
+
 ## View threads
 
-You can view threads by clicking on the `Threads` tad in any project details page. You will then see a list of all threads, sorted by the most recent activity.
+You can view threads by clicking on the `Threads` tab in any project details page. You will then see a list of all threads, sorted by the most recent activity.
 
 ![Thread Tab](./static/convo_tab.png)
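The thread-grouping convention added above (a `session_id`, `thread_id`, or `conversation_id` metadata key whose value can be any string, ideally a UUID) can be sketched as a small helper. The helper name and shape here are illustrative, not part of the LangSmith SDK:

```typescript
import { randomUUID } from "node:crypto";

// LangSmith groups traces into a thread when their metadata share one of
// these keys; the value may be any string, but a UUID avoids collisions.
type ThreadKey = "session_id" | "thread_id" | "conversation_id";

// Hypothetical helper (not part of the SDK): build the metadata object to
// attach when tracing, generating a fresh UUID when no id is supplied.
function threadMetadata(
  key: ThreadKey = "session_id",
  id?: string
): Record<string, string> {
  return { [key]: id ?? randomUUID() };
}

// Attach to a traced run, e.g. traceable(fn, { metadata: threadMetadata() });
// every call traced with the same metadata lands in the same thread.
const metadata = threadMetadata("thread_id");
```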
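The history-retrieval logic in the new example sorts a thread's runs newest-first and rebuilds the conversation from the latest run: its recorded input messages plus the assistant reply it produced. A minimal, self-contained sketch of that step, with a deliberately simplified run shape (real LangSmith runs carry many more fields):

```typescript
// Simplified run shape, assumed for illustration only.
interface Run {
  start_time: string;
  inputs: { messages: Array<{ role: string; content: string }> };
  outputs: { choices: Array<{ message: { role: string; content: string } }> };
}

// Sort runs newest-first by start_time, then return the latest run's
// input messages followed by the assistant message it generated.
function latestConversationState(runs: Run[]) {
  const sorted = [...runs].sort(
    (a, b) =>
      new Date(b.start_time).getTime() - new Date(a.start_time).getTime()
  );
  return [...sorted[0].inputs.messages, sorted[0].outputs.choices[0].message];
}
```

Feeding this reconstructed history back into the model, as the `getChatHistory: true` branch does, is what lets the LLM answer with the full conversation in context rather than just the latest message.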