Add observability quick start #632
Merged

---
sidebar_label: Quick Start
sidebar_position: 0
table_of_contents: true
---

import {
  CodeTabs,
  python,
  typescript,
  ShellBlock,
} from "@site/src/components/InstructionsWithCode";
import { RegionalUrl } from "@site/src/components/RegionalUrls";

# Observability Quick Start

This tutorial will get you up and running with our observability SDK by showing you how to
trace your application to LangSmith.

If you're already familiar with the observability SDK, or are interested in tracing more than just
LLM calls, you can skip to the [next steps section](#next-steps)
or check out the [how-to guides](../observability/how_to_guides).

:::tip Trace LangChain or LangGraph Applications
If you are using [LangChain](https://python.langchain.com/docs/introduction/) or [LangGraph](https://langchain-ai.github.io/langgraph/), which both integrate seamlessly with LangSmith,
you can get started by reading the guides for tracing with [LangChain](./observability/how_to_guides/tracing/trace_with_langchain) or tracing with [LangGraph](./observability/how_to_guides/tracing/trace_with_langgraph).
:::

## 1. Install Dependencies

<CodeTabs
  tabs={[
    {
      value: "python",
      label: "Python",
      language: "bash",
      content: `pip install -U langsmith openai`,
    },
    {
      value: "typescript",
      label: "TypeScript",
      language: "bash",
      content: `yarn add langsmith openai`,
    },
  ]}
  groupId="client-language"
/>

## 2. Create an API key

To create an API key, head to the <RegionalUrl text='LangSmith settings page' suffix='/settings' />. Then click **Create API Key.**

## 3. Set up your environment

<CodeTabs
  tabs={[
    ShellBlock(`export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
# This example uses OpenAI; this key isn't necessary if your code uses another LLM provider
export OPENAI_API_KEY="<your-openai-api-key>"`),
  ]}
  groupId="client-language"
/>
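
If you're working in a Python script or notebook rather than a shell, you can set the same variables in-process before your application code runs. A minimal sketch, with placeholder values you'd replace with real keys:

```python
import os

# Equivalent in-process setup, e.g. at the top of a notebook or script.
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "<your-langsmith-api-key>"
# Only needed if you're using OpenAI as your LLM provider.
os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"
```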

## 4. Define your application

We will instrument a simple [RAG](https://www.mckinsey.com/featured-insights/mckinsey-explainers/what-is-retrieval-augmented-generation-rag)
application for this tutorial, but feel free to use your own code if you'd like; just make sure
it has an LLM call!

<details>
<summary>Application Code</summary>
<CodeTabs
  groupId="client-language"
  tabs={[
    python({ label: "Python" })`
from openai import OpenAI

openai_client = OpenAI()

# This is the retriever we will use in RAG
# This is mocked out, but it could be anything we want
def retriever(query: str):
    results = ["Harrison worked at Kensho"]
    return results

# This is the end-to-end RAG chain.
# It does a retrieval step then calls OpenAI
def rag(question):
    docs = retriever(question)
    system_message = """Answer the user's question using only the provided information below:

    {docs}""".format(docs="\\n".join(docs))

    return openai_client.chat.completions.create(
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": question},
        ],
        model="gpt-4o-mini",
    )
`,
    typescript({ label: "TypeScript" })`
import { OpenAI } from "openai";

const openAIClient = new OpenAI();

// This is the retriever we will use in RAG
// This is mocked out, but it could be anything we want
async function retriever(query: string) {
  return ["This is a document"];
}

// This is the end-to-end RAG chain.
// It does a retrieval step then calls OpenAI
async function rag(question: string) {
  const docs = await retriever(question);

  const systemMessage =
    "Answer the user's question using only the provided information below:\\n\\n" +
    docs.join("\\n");

  return await openAIClient.chat.completions.create({
    messages: [
      { role: "system", content: systemMessage },
      { role: "user", content: question },
    ],
    model: "gpt-4o-mini",
  });
}
`,
  ]}
/>
</details>
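
Before adding any tracing, you can sanity-check the application with a quick call. A minimal sketch using the Python version above:

```python
# Quick sanity check of the untraced app defined above.
response = rag("where did harrison work")

# chat.completions.create returns a ChatCompletion; print the answer text.
print(response.choices[0].message.content)
```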

## 5. Trace OpenAI calls

The first thing you might want to trace is all your OpenAI calls. LangSmith makes this easy with the [`wrap_openai`](https://docs.smith.langchain.com/reference/python/wrappers/langsmith.wrappers._openai.wrap_openai_) (Python) or [`wrapOpenAI`](https://docs.smith.langchain.com/reference/js/functions/wrappers_openai.wrapOpenAI) (TypeScript) wrappers.
All you have to do is modify your code to use the wrapped client instead of using the `OpenAI` client directly.

<CodeTabs
  groupId="client-language"
  tabs={[
    python({ label: "Python" })`
from openai import OpenAI
# highlight-next-line
from langsmith.wrappers import wrap_openai

# highlight-next-line
openai_client = wrap_openai(OpenAI())

# This is the retriever we will use in RAG
# This is mocked out, but it could be anything we want
def retriever(query: str):
    results = ["Harrison worked at Kensho"]
    return results

# This is the end-to-end RAG chain.
# It does a retrieval step then calls OpenAI
def rag(question):
    docs = retriever(question)
    system_message = """Answer the user's question using only the provided information below:

    {docs}""".format(docs="\\n".join(docs))

    return openai_client.chat.completions.create(
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": question},
        ],
        model="gpt-4o-mini",
    )
`,
    typescript({ label: "TypeScript" })`
import { OpenAI } from "openai";
// highlight-next-line
import { wrapOpenAI } from "langsmith/wrappers";

// highlight-next-line
const openAIClient = wrapOpenAI(new OpenAI());

// This is the retriever we will use in RAG
// This is mocked out, but it could be anything we want
async function retriever(query: string) {
  return ["This is a document"];
}

// This is the end-to-end RAG chain.
// It does a retrieval step then calls OpenAI
async function rag(question: string) {
  const docs = await retriever(question);

  const systemMessage =
    "Answer the user's question using only the provided information below:\\n\\n" +
    docs.join("\\n");

  return await openAIClient.chat.completions.create({
    messages: [
      { role: "system", content: systemMessage },
      { role: "user", content: question },
    ],
    model: "gpt-4o-mini",
  });
}
`,
  ]}
/>

Now when you call your application as follows:

```python
rag("where did harrison work")
```

This will produce a trace of just the OpenAI call in LangSmith's default tracing project. It should look something like [this](https://smith.langchain.com/public/e7b7d256-10fe-4d49-a8d5-36ca8e5af0d2/r).

![](./tutorials/static/tracing_tutorial_openai.png)

## 6. Trace entire application

You can also use the `traceable` decorator ([Python](https://docs.smith.langchain.com/reference/python/run_helpers/langsmith.run_helpers.traceable) or [TypeScript](https://docs.smith.langchain.com/reference/js/functions/traceable.traceable)) to trace your entire application instead of just the LLM calls.

<CodeTabs
  groupId="client-language"
  tabs={[
    python({ label: "Python" })`
from openai import OpenAI
# highlight-next-line
from langsmith import traceable
from langsmith.wrappers import wrap_openai

openai_client = wrap_openai(OpenAI())

def retriever(query: str):
    results = ["Harrison worked at Kensho"]
    return results

# highlight-next-line
@traceable
def rag(question):
    docs = retriever(question)
    system_message = """Answer the user's question using only the provided information below:

    {docs}""".format(docs="\\n".join(docs))

    return openai_client.chat.completions.create(
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": question},
        ],
        model="gpt-4o-mini",
    )
`,
    typescript({ label: "TypeScript" })`
import { OpenAI } from "openai";
// highlight-next-line
import { traceable } from "langsmith/traceable";
import { wrapOpenAI } from "langsmith/wrappers";

const openAIClient = wrapOpenAI(new OpenAI());

async function retriever(query: string) {
  return ["This is a document"];
}

// highlight-next-line
const rag = traceable(async function rag(question: string) {
  const docs = await retriever(question);

  const systemMessage =
    "Answer the user's question using only the provided information below:\\n\\n" +
    docs.join("\\n");

  return await openAIClient.chat.completions.create({
    messages: [
      { role: "system", content: systemMessage },
      { role: "user", content: question },
    ],
    model: "gpt-4o-mini",
  });
});
`,
  ]}
/>

Now if you call your application as follows:

```python
rag("where did harrison work")
```

This will produce a trace of the entire pipeline (with the OpenAI call as a child run). It should look something like [this](https://smith.langchain.com/public/2174f4e9-48ab-4f9e-a8c4-470372d976f1/r).

![](./tutorials/static/tracing_tutorial_chain.png)
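
If you want more control over how the run appears in LangSmith, `traceable` also accepts configuration arguments. A minimal sketch, assuming the Python SDK's `name` and `run_type` parameters (see the `traceable` reference linked above for the full signature):

```python
from langsmith import traceable

# Give the run a custom name and mark it as a chain, so it's
# easier to find when browsing or filtering traces.
@traceable(name="rag-pipeline", run_type="chain")
def rag(question):
    ...
```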

## Next steps

Congratulations! If you've made it this far, you're well on your way to being an expert in observability with LangSmith.
Here are some topics you might want to explore next:

- [Trace multiturn conversations](./observability/how_to_guides/monitoring/threads)
- [Send traces to a specific project](./observability/how_to_guides/tracing/log_traces_to_project)
- [Filter traces in a project](./observability/how_to_guides/monitoring/filter_traces_in_application)

Or you can visit the [how-to guides page](./observability/how_to_guides) to find out about all the things you can do with LangSmith observability.

We should have 3 top-level sections here:

- Get Started with LangSmith if you're using LangChain: LangChain integrates seamlessly with LangSmith, with no extra instrumentation needed. Learn how to start tracing with LangChain.
- Get Started with LangSmith if you're using LangGraph: LangGraph integrates seamlessly with LangSmith, with no extra instrumentation needed. Learn how to start tracing with LangGraph.
- Get Started instrumenting your application with LangSmith

Having 3 headers with only a sentence under each one does not render well. I added a note instead to achieve the same purpose.

sg!