Commit 067ce3f

committed: initial commit - ollama and groq connections
0 parents · commit 067ce3f

15 files changed: +443 −0 lines changed

Diff for: .gitignore

+12
@@ -0,0 +1,12 @@
+.conda
+.env
+
+# Byte-compiled / optimized / DLL files
+__pycache__/
+*.py[cod]
+*$py.class
+
+
+# Logs and databases
+*.log
+*.sqlite3

Diff for: LICENSE

+21
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2024 Vaskin Kissoyan
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.

Diff for: README.md

+109
@@ -0,0 +1,109 @@
+# 🧠 Semantic Kernel Python Tutorial 101 🐍
+
+Welcome to the Semantic Kernel Python Tutorial 101! This repository provides a hands-on introduction to using Semantic Kernel with Python, demonstrating integration with various AI services.
+
+## 🚀 Getting Started
+
+### Prerequisites
+
+- Python 3.10+
+- pip (Python package manager)
+
+### Installation
+
+1. Clone this repository:
+```
+git clone https://github.com/killerapp/semantic-kernel-python-tutorial.git
+cd semantic-kernel-python-tutorial
+```
+
+2. Install the required packages:
+```
+pip install -r requirements.txt
+```
+
+## 🔧 Configuration
+
+Before running the examples, you need to set up your environment variables. We provide sample `.env` files for different services:
+
+- [.env.ollama.example](sk-ollama/.env.ollama.example) - For Ollama
+- [.env.groq.example](sk-groq/.env.groq.example) - For Groq
+
+Copy the appropriate `.env.*.example` file to `.env` and fill in your API keys and other required information.
+
+## 📚 Examples
+
+### Ollama Example
+
+Run the Ollama example with:
+
+```
+cd sk-ollama
+python sk-ollama.py
+```
+
+This script demonstrates how to use Semantic Kernel with an Ollama server. Key features:
+
+
+```19:31:sk-ollama.py
+execution_settings = OllamaChatPromptExecutionSettings()
+
+kernel = Kernel()
+
+# Alternative using Ollama:
+service_id = "ollama"
+kernel.add_service(
+    OllamaChatCompletion(
+        service_id=service_id
+    )
+)
+```
+
+
+### Groq Example
+
+Run the Groq example with:
+
+```
+cd sk-groq
+python sk-groq.py
+```
+
+This script shows how to integrate Semantic Kernel with Groq. Notable sections:
+
+
+```23:32:sk-groq.py
+# Use Groq:
+service_id = "groq"
+service = OpenAIChatCompletion(
+    service_id=service_id,
+    api_key=os.getenv("GROQ_API_KEY"),
+    org_id=os.getenv("GROQ_ORG_ID"),
+    ai_model_id=os.getenv("GROQ_MODEL")
+)
+service.client.base_url = os.getenv("GROQ_BASE_URL")
+kernel.add_service(service=service)
+```
+
+
+## 🧩 Plugins
+
+This tutorial includes one sample plugin to demonstrate Semantic Kernel's capabilities:
+
+- FunPlugin: Generates jokes, limericks, and creative excuses.
+
+You can find plugins in the `plugins/` directory.
+
+## 📝 License
+
+This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
+
+## 🤝 Contributing
+
+Contributions are welcome! Please feel free to submit a Pull Request.
+
+## 📬 Contact
+
+If you have any questions or feedback, please open an issue in this repository.
+
+Happy coding with Semantic Kernel! 🎉

Diff for: plugins/FunPlugin/Excuses/config.json

+13
@@ -0,0 +1,13 @@
+{
+  "schema": 1,
+  "description": "Turn a scenario into a creative or humorous excuse to send your boss",
+  "execution_settings": {
+    "default": {
+      "max_tokens": 200,
+      "temperature": 0.5,
+      "top_p": 0.0,
+      "presence_penalty": 0.0,
+      "frequency_penalty": 0.0
+    }
+  }
+}
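The config above follows Semantic Kernel's prompt-function layout (a `config.json` alongside `skprompt.txt`). As a minimal stdlib sketch of what consuming it involves — the JSON is inlined here for illustration rather than read from the repo path:

```python
import json

# Inline copy of plugins/FunPlugin/Excuses/config.json, for illustration only.
CONFIG_TEXT = """
{
  "schema": 1,
  "description": "Turn a scenario into a creative or humorous excuse to send your boss",
  "execution_settings": {
    "default": {
      "max_tokens": 200,
      "temperature": 0.5,
      "top_p": 0.0,
      "presence_penalty": 0.0,
      "frequency_penalty": 0.0
    }
  }
}
"""

config = json.loads(CONFIG_TEXT)
# The "default" block holds the execution settings applied when no
# service-specific settings are configured.
settings = config["execution_settings"]["default"]
print(settings["max_tokens"], settings["temperature"])
```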

Diff for: plugins/FunPlugin/Excuses/skprompt.txt

+7
@@ -0,0 +1,7 @@
+Generate a creative reason or excuse for the given event. Be creative and be funny. Let your imagination run wild.
+Don't output anything else, just the excuse, as if you're talking to the boss.
+
+Event:I am running late.
+Excuse:I was being held ransom by giraffe gangsters.
+
+Event:{{$input}}
Diff for: plugins/FunPlugin/Joke/config.json

+25
@@ -0,0 +1,25 @@
+{
+  "schema": 1,
+  "description": "Generate a funny joke",
+  "execution_settings": {
+    "default": {
+      "max_tokens": 1000,
+      "temperature": 0.9,
+      "top_p": 0.0,
+      "presence_penalty": 0.0,
+      "frequency_penalty": 0.0
+    }
+  },
+  "input_variables": [
+    {
+      "name": "input",
+      "description": "Joke subject",
+      "default": ""
+    },
+    {
+      "name": "style",
+      "description": "Give a hint about the desired joke style",
+      "default": ""
+    }
+  ]
+}

Diff for: plugins/FunPlugin/Joke/skprompt.txt

+13
@@ -0,0 +1,13 @@
+WRITE EXACTLY ONE JOKE or HUMOROUS STORY ABOUT THE TOPIC BELOW
+
+JOKE MUST BE:
+- G RATED
+- WORKPLACE/FAMILY SAFE
+NO SEXISM, RACISM OR OTHER BIAS/BIGOTRY
+
+BE CREATIVE AND FUNNY. I WANT TO LAUGH.
+Incorporate the style suggestion, if provided: {{$style}}
++++++
+
+{{$input}}
++++++

Diff for: plugins/FunPlugin/Limerick/config.json

+25
@@ -0,0 +1,25 @@
+{
+  "schema": 1,
+  "description": "Generate a funny limerick about a person",
+  "execution_settings": {
+    "default": {
+      "max_tokens": 100,
+      "temperature": 0.7,
+      "top_p": 0,
+      "presence_penalty": 0,
+      "frequency_penalty": 0
+    }
+  },
+  "input_variables": [
+    {
+      "name": "name",
+      "description": "",
+      "default": "Bob"
+    },
+    {
+      "name": "input",
+      "description": "",
+      "default": "Dogs"
+    }
+  ]
+}
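Note the `input_variables` defaults ("Bob", "Dogs"): when a caller omits an argument, the declared default fills in. A small stdlib sketch of that resolution — `resolve_arguments` is a hypothetical helper for illustration, not a Semantic Kernel API:

```python
# Defaults as declared in the Limerick config.json above.
INPUT_VARIABLES = [
    {"name": "name", "description": "", "default": "Bob"},
    {"name": "input", "description": "", "default": "Dogs"},
]

def resolve_arguments(declared: list, provided: dict) -> dict:
    # Start from the declared defaults, then overlay whatever the caller passed.
    args = {v["name"]: v["default"] for v in declared}
    args.update(provided)
    return args

print(resolve_arguments(INPUT_VARIABLES, {"name": "Ada"}))
# → {'name': 'Ada', 'input': 'Dogs'}
```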

Diff for: plugins/FunPlugin/Limerick/skprompt.txt

+27
@@ -0,0 +1,27 @@
+There was a young woman named Bright,
+Whose speed was much faster than light.
+She set out one day,
+In a relative way,
+And returned on the previous night.
+
+There was an odd fellow named Gus,
+When traveling he made such a fuss.
+He was banned from the train,
+Not allowed on a plane,
+And now travels only by bus.
+
+There once was a man from Tibet,
+Who couldn't find a cigarette
+So he smoked all his socks,
+and got chicken-pox,
+and had to go to the vet.
+
+There once was a boy named Dan,
+who wanted to fry in a pan.
+He tried and he tried,
+and eventually died,
+that weird little boy named Dan.
+
+Now write a very funny limerick about {{$name}}.
+{{$input}}
+Invent new facts about their life. Must be funny.

Diff for: sk-groq/.env.groq.example

+5
@@ -0,0 +1,5 @@
+# Groq for Semantic Kernel Shim via OpenAI API
+GROQ_API_KEY=<your_api_key>
+GROQ_MODEL=llama-3.1-70b-versatile
+GROQ_ORG_ID=<your_org_id>
+GROQ_BASE_URL="https://api.groq.com/openai/v1"
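sk-groq.py reads these values via python-dotenv's `load_dotenv()`. A simplified stdlib sketch of the parsing that performs (real python-dotenv also handles `export` prefixes, multiline values, and escapes):

```python
def parse_env(text: str) -> dict:
    # Minimal KEY=VALUE parser: skips blank lines and '#' comments,
    # strips surrounding double quotes from values.
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

sample = (
    "# Groq for Semantic Kernel Shim via OpenAI API\n"
    "GROQ_MODEL=llama-3.1-70b-versatile\n"
    'GROQ_BASE_URL="https://api.groq.com/openai/v1"\n'
)
print(parse_env(sample)["GROQ_MODEL"])
# → llama-3.1-70b-versatile
```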

Diff for: sk-groq/requirements.txt

Whitespace-only changes.

Diff for: sk-groq/sk-groq.py

+84
@@ -0,0 +1,84 @@
+# Semantic Kernel Labs 101 - Python
+# Example for using Semantic Kernel with Groq
+from dotenv import load_dotenv
+load_dotenv()
+
+import asyncio
+import os
+
+# Semantic Kernel packages
+from semantic_kernel import Kernel
+from semantic_kernel.functions import KernelArguments
+from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
+from semantic_kernel.connectors.ai.open_ai.prompt_execution_settings.open_ai_prompt_execution_settings import OpenAIChatPromptExecutionSettings
+from semantic_kernel.prompt_template import PromptTemplateConfig
+# Semantic Kernel packages - end
+
+execution_settings = OpenAIChatPromptExecutionSettings()
+
+# Kernel Setup
+kernel = Kernel()
+# Use Groq:
+service_id = "groq"
+service = OpenAIChatCompletion(
+    service_id=service_id,
+    api_key=os.getenv("GROQ_API_KEY"),
+    org_id=os.getenv("GROQ_ORG_ID"),
+    ai_model_id=os.getenv("GROQ_MODEL")
+)
+service.client.base_url = os.getenv("GROQ_BASE_URL")
+kernel.add_service(service=service)
+
+# Prompt based example
+prompt = """
+1) A robot may not injure a human being or, through inaction,
+allow a human being to come to harm.
+
+2) A robot must obey orders given it by human beings except where
+such orders would conflict with the First Law.
+
+3) A robot must protect its own existence as long as such protection
+does not conflict with the First or Second Law.
+
+Give me the TLDR in exactly 8 words."""
+
+prompt_template_config = PromptTemplateConfig(
+    template=prompt,
+    name="tldr",
+    template_format="semantic-kernel",
+)
+
+sk_function = kernel.add_function(
+    function_name="tldr_function",
+    plugin_name="tldr_plugin",
+    prompt_template_config=prompt_template_config,
+    prompt_execution_settings=execution_settings
+)
+
+# Plugin based implementation - see the 'plugins' directory
+plugins_directory = "../plugins"
+plugin = kernel.add_plugin(parent_directory=plugins_directory, plugin_name="FunPlugin")
+arguments = KernelArguments(input="time travel to dinosaur age", style="super silly")
+
+# Run your prompt
+# Note: functions are run asynchronously
+async def main():
+    print("\n=== Prompt Template Example ===")
+    print("Prompt: Give me the TLDR of Asimov's Three Laws of Robotics in exactly 8 words.")
+    result = await kernel.invoke(sk_function)
+    print("Result:", result)
+
+    print("\n=== Plugin Example ===")
+    print("Using the FunPlugin to generate an excuse...")
+    input_scenario = "time travel to dinosaur age"
+    style = "super silly"
+    print(f"Scenario: {input_scenario}")
+    print(f"Style: {style}")
+
+    arguments = KernelArguments(input=input_scenario, style=style)
+    result = await kernel.invoke(plugin["Excuses"], arguments)
+    print("Generated Excuse:")
+    print(result)
+
+if __name__ == "__main__":
+    asyncio.run(main())
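The key move in sk-groq.py is reusing the OpenAI connector against Groq by overriding `base_url`: Groq exposes an OpenAI-compatible chat-completions endpoint. A stdlib sketch of the request shape that ultimately goes over the wire (`build_chat_request` is illustrative, not a Semantic Kernel or Groq API; no network call is made):

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    # Assemble the OpenAI-style chat-completions call that the
    # OpenAIChatCompletion connector issues once base_url points at Groq.
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request(
    "https://api.groq.com/openai/v1",
    "<your_api_key>",
    "llama-3.1-70b-versatile",
    "Give me the TLDR in exactly 8 words.",
)
print(url)
# → https://api.groq.com/openai/v1/chat/completions
```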

Diff for: sk-ollama/.env.ollama.example

+8
@@ -0,0 +1,8 @@
+# Semantic Kernel Configuration
+GLOBAL_LLM_SERVICE="OpenAI"
+# OLLAMA_HOST=https://ollama.example.com/
+OLLAMA_MODEL=llama3.1
+
+# Google Connector reads from GOOGLE_API_KEY for the Custom Search API
+GOOGLE_API_KEY=<key>
+GOOGLE_SEARCH_ENGINE_ID=<id>

Diff for: sk-ollama/requirements.txt

+3
@@ -0,0 +1,3 @@
+python-dotenv
+semantic-kernel
+ollama

0 commit comments
