Commit 3063016

fix small syntax errors in prompt engineering how-to (#641)

2 parents 3e5b9e2 + c1dc8a2

File tree

1 file changed (+10 −6 lines)


docs/prompt_engineering/how_to_guides/prompts/manage_prompts_programatically.mdx

```diff
@@ -59,7 +59,7 @@ To create a new prompt or update an existing prompt, you can use the `push promp

 <CodeTabs
   tabs={[
-    PythonBlock(`from langsmith import client
+    PythonBlock(`from langsmith import Client
 from langchain_core.prompts import ChatPromptTemplate\n
 client = Client()\n
 prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
@@ -77,7 +77,9 @@ print(url)
     TypeScriptBlock(`import * as hub from "langchain/hub";
 import { ChatPromptTemplate } from "@langchain/core/prompts";\n
 const prompt = ChatPromptTemplate.fromTemplate("tell me a joke about {topic}");
-const url = hub.push("joke-generator", chain);
+const url = hub.push("joke-generator", {
+  object: prompt,
+});
 // url is a link to the prompt in the UI
 console.log(url);
 `),
@@ -91,7 +93,7 @@ The provider must be supported by the LangSmith playground. (see settings here:

 <CodeTabs
   tabs={[
-    PythonBlock(`from langsmith import client
+    PythonBlock(`from langsmith import Client
 from langchain_core.prompts import ChatPromptTemplate
 from langchain_openai import ChatOpenAI\n
 client = Client()
@@ -116,7 +118,9 @@ import { ChatOpenAI } from "@langchain/openai";\n
 const model = new ChatOpenAI({ model: "gpt-4o-mini" });\n
 const prompt = ChatPromptTemplate.fromTemplate("tell me a joke about {topic}");
 const chain = prompt.pipe(model);\n
-await hub.push("joke-generator-with-model", chain);`),
+await hub.push("joke-generator-with-model", {
+  object: chain
+});`),
 ]}
 groupId="client-language"
 />
@@ -131,7 +135,7 @@ To pull a **public prompt** from the LangChain Hub, you need to specify the hand

 <CodeTabs
   tabs={[
-    PythonBlock(`from langsmith import client
+    PythonBlock(`from langsmith import Client
 from langchain_openai import ChatOpenAI\n
 client = Client()\n
 prompt = client.pull_prompt("joke-generator")
@@ -161,7 +165,7 @@ Make sure you have the proper environment variables set for the model you are us

 <CodeTabs
   tabs={[
-    PythonBlock(`from langsmith import client\n
+    PythonBlock(`from langsmith import Client\n
 client = Client()
 chain = client.pull_prompt("joke-generator-with-model", include_model=True)
 chain.invoke({"topic": "cats"})`),
```
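The Python side of this fix is a name-casing bug: the langsmith package exports the class `Client` (capital C), and Python imports are case-sensitive, so `from langsmith import client` raises an ImportError before any code runs. A minimal, dependency-free sketch of why the old spelling failed, using a hypothetical stand-in module named `fake_langsmith` rather than the real package:

```python
import sys
import types

# Hypothetical stand-in module mirroring the real package's export:
# the class is `Client`, with a capital C.
fake = types.ModuleType("fake_langsmith")

class Client:
    """Placeholder for langsmith.Client."""

fake.Client = Client
sys.modules["fake_langsmith"] = fake

# Correct import: the name matches the exported class exactly.
from fake_langsmith import Client  # ok

# The pre-fix spelling fails, because Python imports are case-sensitive
# and the module defines no lowercase `client`.
try:
    from fake_langsmith import client
    import_error_raised = False
except ImportError:
    import_error_raised = True

print("lowercase import failed:", import_error_raised)
```

The TypeScript side of the diff is a different kind of fix: per the updated docs, `hub.push` takes an options object (`{ object: prompt }`) rather than the prompt as a bare second argument.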

0 commit comments
