
Commit 688732a

done with all implementation. Tested file search. But I have to comment out now since file upload has outage. Need deployment test
1 parent 0f2f56c commit 688732a

20 files changed: +1223 −1 lines changed

.gitignore

+10

@@ -0,0 +1,10 @@
__pycache__/

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
README.md

+69 −1

@@ -1 +1,69 @@

-# azure-ai-projects-file-search

# Sample Application using Agents from Azure AI Projects and File Search tool (Python)

This sample includes a simple Python [Quart](https://quart.palletsprojects.com/en/latest/) app that streams responses from an OpenAI Assistant to an HTML/JS frontend using Server-Sent Events (SSEs). The application is configured to upload two documents under the `files` folder for use with the OpenAI Assistant's File Search tool.

The sample is designed for use with [Docker containers](https://www.docker.com/), both for local development and Azure deployment. For Azure deployment to [Azure Container Apps](https://learn.microsoft.com/azure/container-apps/overview), please use this [template](https://github.com/Azure-Samples/openai-chat-app-quickstart) and replace the `src` folder content with this application.

## Application Flow

This application utilizes agents from the azure-ai-projects SDK to interact with the Azure ML agents API. The following sequence diagram describes the interaction between each component in the system. More comprehensive logic related to thread management will be discussed in the next section:

```mermaid
sequenceDiagram
    participant User
    participant Browser
    participant WebServer
    participant APIServer

    WebServer->>APIServer: create_agent (post assistant API)
    APIServer-->>WebServer: return agent

    User->>Browser: Open 'http://localhost:50505'
    Browser->>WebServer: /index
    WebServer-->>Browser: return HTML, JavaScript, CSS

    User->>Browser: Type message and hit enter
    Browser->>WebServer: /chat
    WebServer->>APIServer: create_thread (post thread API)
    APIServer-->>WebServer: return thread

    WebServer->>APIServer: create_message (post message API)
    APIServer-->>WebServer: return message

    WebServer->>APIServer: create_stream (post run API)
    APIServer-->>WebServer: return chunk
    WebServer-->>Browser: return chunk (thread_id, agent_id in cookie)
```
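
To make the diagram concrete, the sketch below shows roughly what these calls look like against the azure-ai-projects agents client. It is an illustrative sketch under assumptions, not this repository's code: the model deployment name, agent name, instructions, and sample question are placeholders, and the streaming call follows the preview SDK's `create_stream` pattern.

```python
# Minimal sketch of the calls in the diagram, assuming the preview
# azure-ai-projects agents API. Illustrative only; not the committed app code.
import os

from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import FilePurpose, FileSearchTool
from azure.identity import DefaultAzureCredential

project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# Upload the two sample documents and build a File Search tool.
# (Per the commit message, this part is currently commented out in the app
# because of a file upload outage.)
file_ids = [
    project_client.agents.upload_file_and_poll(
        file_path=path, purpose=FilePurpose.AGENTS
    ).id
    for path in ("files/product_info_1.md", "files/product_info_2.md")
]
vector_store = project_client.agents.create_vector_store_and_poll(
    file_ids=file_ids, name="product-info"
)
file_search = FileSearchTool(vector_store_ids=[vector_store.id])

# create_agent (post assistant API)
agent = project_client.agents.create_agent(
    model="gpt-4o-mini",  # placeholder deployment name
    name="file-search-agent",
    instructions="Answer questions using the uploaded product files.",
    tools=file_search.definitions,
    tool_resources=file_search.resources,
)

# create_thread / create_message (one thread per conversation)
thread = project_client.agents.create_thread()
project_client.agents.create_message(
    thread_id=thread.id, role="user", content="Tell me about item_number 1."
)

# create_stream (post run API); in the web app each event would be relayed
# to the browser as a Server-Sent Event chunk.
with project_client.agents.create_stream(
    thread_id=thread.id, agent_id=agent.id
) as stream:
    stream.until_done()
```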

## Application Users and Thread Management

As a web application, it is designed to serve multiple users on multiple browsers. This application uses cookies to ensure that the same thread is reused for conversations across multiple tabs in the same browser. If the browser is restarted, the old thread will continue to serve the user. However, if the application has a new agent after a server restart, or if a thread is deleted, a new thread will be created without requiring a browser refresh or any signaling to the user.

To achieve this, when users submit a message to the web server, the web server will create an agent and a thread, and stream back a reply. The response contains `agent_id` and `thread_id` in cookies. As a result, each subsequent message sent to the web server will also contain these IDs. As long as the same agent is being used in the system and the thread can be retrieved from the cookie, the same thread will be used to serve the users.
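
A hedged sketch of that cookie handling in a Quart route might look like the following; the blueprint, route, and configuration keys are assumptions for illustration and not the `chat.py` committed here.

```python
# Illustrative sketch of cookie-based thread reuse (assumed names; the
# committed chat blueprint may differ).
from quart import Blueprint, current_app, make_response, request

bp = Blueprint("chat", __name__)


def resolve_thread_id(agents_client, current_agent_id: str) -> str:
    """Reuse the thread from the cookies only if it belongs to the current agent;
    otherwise create a new thread."""
    thread_id = request.cookies.get("thread_id")
    agent_id = request.cookies.get("agent_id")
    if thread_id and agent_id == current_agent_id:
        return thread_id
    return agents_client.create_thread().id


@bp.post("/chat")
async def chat():
    # Assume the agents client and agent were attached to the app at startup.
    agents_client = current_app.config["agents_client"]
    agent = current_app.config["agent"]

    thread_id = resolve_thread_id(agents_client, agent.id)

    # The real app streams run chunks back as Server-Sent Events; a plain
    # response stands in for that here.
    response = await make_response("ok")
    response.set_cookie("thread_id", thread_id)
    response.set_cookie("agent_id", agent.id)
    return response
```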

## Local development with Docker

This sample includes a `docker-compose.yaml` for local development which creates a volume for the app code. That allows you to make changes to the code and see them instantly.

1. Install [Docker Desktop](https://www.docker.com/products/docker-desktop/). If you opened this inside Github Codespaces or a Dev Container in VS Code, installation is not needed. ⚠️ If you're on an Apple M1/M2, you won't be able to run `docker` commands inside a Dev Container; either use Codespaces or do not open the Dev Container.

2. Make sure that the `.env` file exists.

3. Store the key and endpoint information (Azure) for the OpenAI resource in the `.env` file. The key should be stored in the `.env` file as `PROJECT_CONNECTION_STRING`. This is necessary because Docker containers don't have access to your user Azure credentials. (A short sketch of how the app can read this value follows the steps below.)

4. Start the services with this command:

   ```shell
   docker-compose up --build
   ```

5. Open 'http://localhost:50505' in the browser to run the application.
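
For reference from step 3, the sketch below shows one way the app could pick up this value inside the container; it is illustrative, though `gunicorn.conf.py` in this commit does call `load_dotenv()`.

```python
# Illustrative only: reading PROJECT_CONNECTION_STRING inside the container.
import os

from dotenv import load_dotenv

load_dotenv()  # picks up .env (docker-compose also injects it via env_file)

conn_str = os.getenv("PROJECT_CONNECTION_STRING")
if not conn_str:
    raise RuntimeError("PROJECT_CONNECTION_STRING is not set; check your .env file")
```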

## Example run

![File-Search-screenshot](assets/FileSearchAssistant.png)

## Deployment to Azure

As mentioned earlier, please integrate this app using the [template](https://github.com/Azure-Samples/openai-chat-app-quickstart) and follow the Azure Container Apps deployment steps there.

assets/FileSearchAssistant.png

243 KB (binary image)

docker-compose.yaml

+10

@@ -0,0 +1,10 @@
services:
  app:
    build:
      context: ./src
    env_file:
      - .env
    ports:
      - 50505:50505
    volumes:
      - ./src:/code

src/.dockerignore

+3

@@ -0,0 +1,3 @@
.git*
.venv/
**/*.pyc

src/Dockerfile

+43

@@ -0,0 +1,43 @@
# ------------------- Stage 0: Base Stage ------------------------------
FROM python:3.11-alpine AS base

WORKDIR /code

# Install tini, a tiny init for containers
RUN apk add --update --no-cache tini

# Install required packages for cryptography package
# https://cryptography.io/en/latest/installation/#building-cryptography-on-linux
RUN apk add gcc musl-dev python3-dev libffi-dev openssl-dev cargo pkgconfig


# ------------------- Stage 1: Build Stage ------------------------------
FROM base AS build

COPY requirements.txt .

ADD packages packages

RUN pip3 install -r requirements.txt

COPY . .

# ------------------- Stage 2: Final Stage ------------------------------
FROM base AS final

RUN addgroup -S app && adduser -S app -G app

COPY --from=build --chown=app:app /usr/local/lib/python3.11 /usr/local/lib/python3.11
COPY --from=build --chown=app:app /usr/local/bin /usr/local/bin
COPY --from=build --chown=app:app /code /code

# Copy the files directory
COPY --chown=app:app files /code/files

# Set environment variables for Azure service principal

USER app

EXPOSE 50505

ENTRYPOINT ["tini", "gunicorn", "quartapp:create_app()"]

src/__init__.py

Whitespace-only changes.

src/files/product_info_1.md

+51

@@ -0,0 +1,51 @@
# Information about product item_number: 1

## Brand
Contoso Galaxy Innovations

## Category
Smart Eyewear

## Features
- Augmented Reality interface
- Voice-controlled AI assistant
- HD video recording with 3D audio
- UV protection and blue light filtering
- Wireless charging with extended battery life

## User Guide

### 1. Introduction
Introduction to your new SmartView Glasses

### 2. Product Overview
Overview of features and controls

### 3. Sizing and Fit
Finding your perfect fit and style adjustments

### 4. Proper Care and Maintenance
Cleaning and caring for your SmartView Glasses

### 5. Break-in Period
Adjusting to the augmented reality experience

### 6. Safety Tips
Safety guidelines for public and private spaces

### 7. Troubleshooting
Quick fixes for common issues

## Warranty Information
Two-year limited warranty on all electronic components

## Contact Information
Customer Support at [email protected]

## Return Policy
30-day return policy with no questions asked

## FAQ
- How to sync your SmartView Glasses with your devices
- Troubleshooting connection issues
- Customizing your augmented reality environment

src/files/product_info_2.md

+51

@@ -0,0 +1,51 @@
# Information about product item_number: 2

## Brand
Contoso Quantum Comfort

## Category
Self-Warming Blanket

## Features
- Nano-fiber heating elements for even warmth distribution
- Intelligent temperature control with machine learning preferences
- Eco-friendly and energy-efficient design
- Wireless and portable with up to 12 hours of battery life
- Waterproof and machine washable material

## User Guide

### 1. Introduction
Getting to know your new Self-Warming Blanket

### 2. Product Overview
How to set up and operate your blanket

### 3. Sizing and Fit
Selecting the ideal warmth setting for comfort

### 4. Proper Care and Maintenance
Care instructions to maintain warmth and softness

### 5. Break-in Period
What to expect during the first use

### 6. Safety Tips
Best practices for safe use

### 7. Troubleshooting
Common questions and solutions

## Warranty Information
Three-year warranty with free technical support

## Contact Information
Quantum Comfort Support at [email protected]

## Return Policy
45-day satisfaction guarantee with full refund

## FAQ
- How to pair the blanket with your smart home devices
- Optimizing battery life for longer use
- Adjusting blanket settings for different climates

src/gunicorn.conf.py

+20

@@ -0,0 +1,20 @@
import multiprocessing
import os

from dotenv import load_dotenv

load_dotenv()

max_requests = 1000
max_requests_jitter = 50
log_file = "-"
bind = "0.0.0.0:50505"

if not os.getenv("RUNNING_IN_PRODUCTION"):
    reload = True

num_cpus = multiprocessing.cpu_count()
workers = 1  # (num_cpus * 2) + 1
worker_class = "uvicorn.workers.UvicornWorker"

timeout = 120

Binary file not shown.

src/pyproject.toml

+20

@@ -0,0 +1,20 @@
[project]
name = "quartapp"
version = "1.0.0"
description = "Create a simple chat app using Quart and OpenAI"
dependencies = [
    "quart",
    "werkzeug",
    "gunicorn",
    "uvicorn[standard]",
    "openai",
    "azure-identity",
    "aiohttp",
    "python-dotenv",
    "pyyaml",
    "azure-ai-projects"
]

[build-system]
requires = ["flit_core<4"]
build-backend = "flit_core.buildapi"

src/quartapp/__init__.py

+20

@@ -0,0 +1,20 @@
# Copyright (c) Microsoft. All rights reserved.
# Licensed under the MIT license. See LICENSE.md file in the project root for full license information.

import logging
import os
from quart import Quart


def create_app():
    if os.getenv("RUNNING_IN_PRODUCTION"):
        logging.basicConfig(level=logging.INFO)
    else:
        logging.basicConfig(level=logging.DEBUG)

    app = Quart(__name__)

    from . import chat  # noqa

    app.register_blueprint(chat.bp)
    return app
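
Since the Dockerfile entrypoint hands this factory to gunicorn as `quartapp:create_app()`, the app can also be exercised directly with Quart's test client. The sketch below is illustrative only and assumes the `chat` blueprint (not shown in this excerpt) serves an index route at `/`.

```python
# Illustrative sketch: exercising the factory without gunicorn
# (assumes the chat blueprint serves an index route at "/").
import asyncio

from quartapp import create_app


async def main() -> None:
    app = create_app()
    client = app.test_client()
    response = await client.get("/")
    print(response.status_code)


asyncio.run(main())
```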
