Add Phoenix tutorials #352

Closed
wants to merge 2 commits into from
4 changes: 2 additions & 2 deletions README.md
@@ -47,8 +47,8 @@ Haystack 2.0
| [Pipelines](./tutorials/11_Pipelines.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/11_Pipelines.ipynb) | [[OUTDATED] Simplifying Pipeline Inputs with Multiplexer](./tutorials/37_Simplifying_Pipeline_Inputs_with_Multiplexer.ipynb)| [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/37_Simplifying_Pipeline_Inputs_with_Multiplexer.ipynb)|
| [[OUTDATED] Seq2SeqGenerator](./tutorials/12_LFQA.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/12_LFQA.ipynb) | [Embedding Metadata for Improved Retrieval](./tutorials/39_Embedding_Metadata_for_Improved_Retrieval.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/39_Embedding_Metadata_for_Improved_Retrieval.ipynb)|
| [Question Generation](./tutorials/13_Question_generation.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/13_Question_generation.ipynb) | [Building a Chat Application with Function Calling](./tutorials/40_Building_Chat_Application_with_Function_Calling.ipynb)| [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/40_Building_Chat_Application_with_Function_Calling.ipynb)|
| [Query Classifier](./tutorials/14_Query_Classifier.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/14_Query_Classifier.ipynb) | | |
| [Table QA](./tutorials/15_TableQA.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/15_TableQA.ipynb) | | |
| [Query Classifier](./tutorials/14_Query_Classifier.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/14_Query_Classifier.ipynb)| [Tracing a Haystack Application with Arize Phoenix](./tutorials/41_Tracing_with_Arize_Phoenix.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/41_Tracing_with_Arize_Phoenix.ipynb)
| [Table QA](./tutorials/15_TableQA.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/15_TableQA.ipynb)| [Evaluating RAG with Arize Phoenix](./tutorials/42_Evaluate_RAG_with_Arize_Phoenix.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/42_Evaluate_RAG_with_Arize_Phoenix.ipynb)
| [Document Classifier at Index Time](./tutorials/16_Document_Classifier_at_Index_Time.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/16_Document_Classifier_at_Index_Time.ipynb) | | |
| [Make Your QA Pipelines Talk!](./tutorials/17_Audio.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/17_Audio.ipynb) | | |
| [Generative Pseudo Labeling](./tutorials/18_GPL.ipynb) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deepset-ai/haystack-tutorials/blob/main/tutorials/18_GPL.ipynb) | | |
193 changes: 193 additions & 0 deletions tutorials/41_Tracing_with_Arize_Phoenix.ipynb
@@ -0,0 +1,193 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<center>\n",
" <p style=\"text-align:center\">\n",
" <img alt=\"phoenix logo\" src=\"https://storage.googleapis.com/arize-phoenix-assets/assets/phoenix-logo-light.svg\" width=\"200\"/>\n",
" <br>\n",
" <a href=\"https://docs.arize.com/phoenix/\">Docs</a>\n",
" |\n",
" <a href=\"https://github.com/Arize-ai/phoenix\">GitHub</a>\n",
" |\n",
" <a href=\"https://join.slack.com/t/arize-ai/shared_invite/zt-1px8dcmlf-fmThhDFD_V_48oU7ALan4Q\">Community</a>\n",
" </p>\n",
"</center>\n",
"<h1 align=\"center\">Tracing and Evaluating a Haystack Application</h1>\n",
" \n",
"Phoenix makes your Haystack applications *observable* by visualizing the underlying structure of each call to your Haystack Pipelines and surfacing problematic spans of execution based on latency, token count, or other evaluation metrics.\n",
"\n",
"ℹ️ This notebook requires an OpenAI API key.\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Install Dependencies & set OpenAI API key"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install arize-phoenix openinference-instrumentation-haystack haystack-ai"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from getpass import getpass\n",
"import os\n",
"\n",
"if not (openai_api_key := os.getenv(\"OPENAI_API_KEY\")):\n",
" openai_api_key = getpass(\"🔑 Enter your OpenAI API key: \")\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = openai_api_key"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Phoenix\n",
"The command below initializes a local version of Phoenix that will run in the notebook. Phoenix also provides self-hosted and cloud deployment options."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import phoenix as px\n",
"session = px.launch_app()"
]
},
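{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sketch, not needed for this tutorial: instead of the in-notebook app,\n",
"# traces can be sent to a self-hosted or Phoenix Cloud deployment by pointing the\n",
"# collector endpoint at it before instrumenting. The URL below is a placeholder,\n",
"# not a real deployment.\n",
"# os.environ[\"PHOENIX_COLLECTOR_ENDPOINT\"] = \"https://my-phoenix.example.com\""
]
},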
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Connect Phoenix to Haystack and Instrument\n",
"The command below connects Phoenix to your Haystack application and instruments the Haystack library. Any calls to Haystack pipelines from this point forward will be traced and logged to the Phoenix UI."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from openinference.instrumentation.haystack import HaystackInstrumentor\n",
"from phoenix.otel import register\n",
"\n",
"tracer_provider = register()\n",
"HaystackInstrumentor().instrument(tracer_provider=tracer_provider)"
]
},
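{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sketch: register() also accepts a project name, which groups this\n",
"# notebook's traces under their own project in the Phoenix UI. The name\n",
"# \"haystack-tutorial\" is an arbitrary example; uncomment these lines in place of\n",
"# the cell above if you want a dedicated project.\n",
"# tracer_provider = register(project_name=\"haystack-tutorial\")\n",
"# HaystackInstrumentor().instrument(tracer_provider=tracer_provider)"
]
},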
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Build a Haystack Pipeline\n",
"The command below builds a simple Haystack pipeline that retrieves documents from an in-memory document store and uses an LLM to answer a question."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from haystack import Document, Pipeline\n",
"from haystack.components.builders.prompt_builder import PromptBuilder\n",
"from haystack.components.generators import OpenAIGenerator\n",
"from haystack.components.retrievers.in_memory import InMemoryBM25Retriever\n",
"from haystack.document_stores.in_memory import InMemoryDocumentStore\n",
"\n",
"document_store = InMemoryDocumentStore()\n",
"document_store.write_documents(\n",
" [\n",
" Document(content=\"My name is Jean and I live in Paris.\"),\n",
" Document(content=\"My name is Mark and I live in Berlin.\"),\n",
" Document(content=\"My name is Giorgio and I live in Rome.\"),\n",
" ]\n",
")\n",
"\n",
"prompt_template = \"\"\"\n",
"Given these documents, answer the question.\n",
"Documents:\n",
"{% for doc in documents %}\n",
" {{ doc.content }}\n",
"{% endfor %}\n",
"Question: {{question}}\n",
"Answer:\n",
"\"\"\"\n",
"\n",
"retriever = InMemoryBM25Retriever(document_store=document_store)\n",
"prompt_builder = PromptBuilder(template=prompt_template)\n",
"llm = OpenAIGenerator()\n",
"\n",
"rag_pipeline = Pipeline()\n",
"rag_pipeline.add_component(\"retriever\", retriever)\n",
"rag_pipeline.add_component(\"prompt_builder\", prompt_builder)\n",
"rag_pipeline.add_component(\"llm\", llm)\n",
"rag_pipeline.connect(\"retriever\", \"prompt_builder.documents\")\n",
"rag_pipeline.connect(\"prompt_builder\", \"llm\")\n",
"\n",
"question = \"Who lives in Paris?\"\n",
"results = rag_pipeline.run(\n",
" {\n",
" \"retriever\": {\"query\": question},\n",
" \"prompt_builder\": {\"question\": question},\n",
" }\n",
")"
]
},
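{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Quick sanity check before looking at the traces: the OpenAIGenerator's answers\n",
"# are returned under the \"replies\" key of the \"llm\" component's output.\n",
"print(results[\"llm\"][\"replies\"][0])"
]
},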
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## View the Pipeline in Phoenix\n",
"You should now see traces in Phoenix!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(f\"Phoenix is currently running on {session.url}\")"
]
}
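,
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sketch: the logged spans can also be pulled into a pandas DataFrame for\n",
"# offline inspection. This assumes the local Phoenix session launched above.\n",
"spans_df = px.Client().get_spans_dataframe()\n",
"spans_df.head()"
]
}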
],
"metadata": {
"kernelspec": {
"display_name": "phoenix",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.9"
}
},
"nbformat": 4,
"nbformat_minor": 2
}