This repository has been archived by the owner on May 13, 2024. It is now read-only.

docs: improve readme
Chris Lemke committed Apr 8, 2023
1 parent a03e52c commit ad5126b
Showing 3 changed files with 76 additions and 18 deletions.
5 changes: 3 additions & 2 deletions README.md
@@ -55,9 +55,10 @@ Return this Python function including the Google style Python docstrings.
The response should be in plain text and should only contain the function
itself. Don't put the code in a code block.
```
Now we can use Alfred's [Text Action](https://www.alfredapp.com/universal-actions/) and the [text transformation](#text-transformation-%EF%B8%8F) feature to let ChatGPT automatically add a docstring to a Python function:
Now we can use Alfred's [Text Action](https://www.alfredapp.com/universal-actions/) and the [text transformation](#text-transformation-%EF%B8%8F) feature (<kbd>fn</kbd> option) to let ChatGPT automatically add a docstring to a Python function:

![Screenshot](chatgpt_text_transformation.gif)
Check out [this Python script](https://github.com/chrislemke/ChatFred/blob/main/workflow/src/text_chat.py). All docstrings were automatically added by ChatGPT.

#### **Text transformation** ⚙️
This feature allows you to easily let ChatGPT transform your text using a pre-defined prompt. Just replace the default *ChatGPT transformation prompt* in the workflow's configuration with your own prompt. Then either use the [Send to ChatGPT 💬 Universal Action](#universal-action--combined-prompts-%EF%B8%8F) (option: <kbd>⇧</kbd>) to pass the highlighted text to ChatGPT using your transformation prompt, or configure a hotkey to use the clipboard content.
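Conceptually, the transformation feature just pairs your configured prompt with the selected (or clipboard) text before the request is sent to ChatGPT. A minimal sketch of that message layout — the function name and exact structure here are illustrative, not the workflow's actual code:

```python
def build_transformation_messages(transformation_prompt: str, text: str) -> list:
    """Pair the user-defined transformation prompt with the text to transform.

    `transformation_prompt` stands in for the *ChatGPT transformation prompt*
    from the workflow's configuration; `text` is the highlighted or clipboard
    content passed in by Alfred.
    """
    return [
        {"role": "system", "content": transformation_prompt},
        {"role": "user", "content": text},
    ]


# Example: ask for docstrings to be added to a small function.
messages = build_transformation_messages(
    "Add Google style docstrings to this Python function.",
    "def add(a, b):\n    return a + b",
)
```

The resulting list is the `messages` payload shape that OpenAI's chat endpoint expects, which is why a single configuration string is enough to drive arbitrary transformations.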
@@ -221,7 +222,7 @@ You can tweak the workflow to your liking. The following parameters are availabl
- **Always read out reply**: If enabled, ChatFred will read out all replies automatically. Default: `off`.
- **Always save conversation to file**: If enabled, all your requests and ChatFred's replies will automatically be saved to a file (`{File directory}/ChatFred.txt`). Only available for InstructGPT. Default: `off`.
- **File directory**: Custom directory where the 'ChatFred.txt' should be stored. Defaults to the user's home directory (`~/`).
- **Paste response to frontmost app**: If enabled, the response will be pasted to the frontmost app. If this feature is switched on, the response will not be shown in [Large Type](https://www.alfredapp.com/help/features/large-type/). Default: `off`.
- **Paste response to frontmost app**: If enabled, the response will be pasted into the frontmost app. When this feature is switched on, the response will not be shown in [Large Type](https://www.alfredapp.com/help/features/large-type/). Alternatively, you can use the <kbd>⌘</kbd> <kbd>⌥</kbd> option when sending the request to ChatGPT. Default: `off`.
- **Always copy to clipboard**: If enabled, all of ChatFred's replies will be copied to the clipboard automatically. Default: `on`.
- **Image size**: The size of the image generated by DALL·E 2. Default: `512x512`.
- **Show notifications**: Shows all notifications provided by the workflow. For this to work, system notifications must be activated for Alfred. Default: `on`.
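Alfred hands workflow configuration values like these to the workflow's scripts as environment variables. A hedged sketch of how such defaults might be read — the variable names here are illustrative and may not match the workflow's actual keys:

```python
import os


def read_workflow_config() -> dict:
    """Read workflow settings from environment variables, falling back to
    the documented defaults when a variable is unset."""
    return {
        # "Always copy to clipboard" defaults to on.
        "always_copy_to_clipboard": os.getenv("always_copy_to_clipboard", "1") == "1",
        # "Image size" defaults to 512x512.
        "image_size": os.getenv("image_size", "512x512"),
        # "File directory" defaults to the user's home directory.
        "file_directory": os.path.expanduser(os.getenv("file_directory", "~/")),
        # "Show notifications" defaults to on.
        "show_notifications": os.getenv("show_notifications", "1") == "1",
    }
```

Reading everything through `os.getenv` with an explicit fallback keeps the documented defaults in one place and makes each toggle a plain boolean for the rest of the script.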
9 changes: 5 additions & 4 deletions info.plist
@@ -2856,11 +2856,12 @@ or use ChatFred as a fallback search in Alfred:
![Screenshot](assets/images/screenshot8.png)
Or paste ChatGPT's response directly into the frontmost app by switching on *Paste response to frontmost app* in the [workflow's configuration](https://www.alfredapp.com/help/workflows/user-configuration/) or by using the &lt;kbd&gt;⌘&lt;/kbd&gt; &lt;kbd&gt;⌥&lt;/kbd&gt; option. This is especially useful in combination with the Text transformation feature.
![Screenshot](assets/images/screenshot18.png)
The results will always be shown in [Large Type](https://www.alfredapp.com/help/features/large-type/). Check out the workflow's configuration for more options (e.g. *Always copy reply to clipboard*).
ChatFred can also automatically paste ChatGPT's response directly into the frontmost app. Just switch on *Paste response to frontmost app* in the [workflow's configuration](https://www.alfredapp.com/help/workflows/user-configuration/) or use the &lt;kbd&gt;⌘&lt;/kbd&gt; &lt;kbd&gt;⌥&lt;/kbd&gt; option.
The results will always be shown in [Large Type](https://www.alfredapp.com/help/features/large-type/). Check out the [workflow's configuration](https://www.alfredapp.com/help/workflows/user-configuration/) for more options (e.g. *Always copy reply to clipboard* or *Paste response to frontmost app*).
![Screenshot](assets/images/screenshot18.png)
#### **Text transformation** ⚙️
This feature allows you to easily let ChatGPT transform your text using a pre-defined prompt. Just replace the default *ChatGPT transformation prompt* in the workflow's configuration with your own prompt. Then either use the Send to ChatGPT 💬 Universal Action (option: &lt;kbd&gt;⇧&lt;/kbd&gt;) to pass the highlighted text to ChatGPT using your transformation prompt, or configure a hotkey to use the clipboard content.
@@ -2989,7 +2990,7 @@ You can tweak the workflow to your liking. The following parameters are availabl
- **File directory**: Custom directory where the 'ChatFred.txt' should be stored. Defaults to the user's home directory (`~/`).
- **Always copy to clipboard**: If enabled, all of ChatFred's replies will be copied to the clipboard automatically. Default: `on`.
- **Image size**: The size of the image generated by DALL·E 2. Default: `512x512`.
- **Paste response to frontmost app**: If enabled, the response will be pasted to the frontmost app. If this feature is switched on, the response will not be shown in [Large Type](https://www.alfredapp.com/help/features/large-type/). Default: `off`.
- **Paste response to frontmost app**: If enabled, the response will be pasted into the frontmost app. When this feature is switched on, the response will not be shown in [Large Type](https://www.alfredapp.com/help/features/large-type/). Alternatively, you can use the &lt;kbd&gt;⌘&lt;/kbd&gt; &lt;kbd&gt;⌥&lt;/kbd&gt; option when sending the request to ChatGPT. Default: `off`.
- **Show notifications**: Shows all notifications provided by the workflow. For this to work, system notifications must be activated for Alfred. Default: `on`.
- **Show ChatGPT is thinking message**: Shows the message: "💭 Stay tuned... ChatGPT is thinking" while OpenAI is processing your request. Default: `on`.
80 changes: 68 additions & 12 deletions workflow/src/text_chat.py
@@ -1,12 +1,12 @@
"""This module contains the chatGPT API."""
"""This module contains the ChatGPT API."""

import csv
import functools
import os
import sys
import time
import uuid
from typing import List, Optional, Tuple
from typing import Dict, List, Optional, Tuple

from aliases_manager import prompt_for_alias
from caching_manager import read_from_cache, write_to_cache
@@ -39,6 +39,15 @@


def time_it(func):
"""A decorator function that times the execution of a given function.
Args:
func: The function to be timed.
Returns:
A wrapper function that times the execution of the input function.
"""

@functools.wraps(func)
def timeit_wrapper(*args):
start_time = time.perf_counter()
Expand All @@ -55,18 +64,36 @@ def timeit_wrapper(*args):


def get_query() -> str:
"""Join the arguments into a query string."""
"""Returns a string of all command line arguments passed to the script.
Returns:
str: A string of all command line arguments passed to the script.
"""
return " ".join(sys.argv[1:])


def stdout_write(output_string: str) -> None:
"""Writes the response to stdout."""
"""Writes the given string to the standard output.
Args:
output_string (str): The string to be written to the standard output.
Returns:
None
"""
output_string = "..." if output_string == "" else output_string
sys.stdout.write(output_string)


def exit_on_error() -> None:
"""Checks the environment variables for invalid values."""
"""Exits the program and shows some message if there is an error.
Args:
None
Returns:
None
"""
error = env_value_error_if_needed(
__temperature,
__model,
@@ -81,6 +108,13 @@ def exit_on_error() -> None:

@time_it
def read_from_log() -> List[Tuple[str, str]]:
"""Reads the last __history_length entries from the log file.
Returns:
List[Tuple[str, str]]: A list of tuples containing the last __history_length entries from the log file.
Each tuple contains two strings: the first is the timestamp and the second is the log message.
If the log file does not exist, returns a list with one empty tuple.
"""
if os.path.isfile(__log_file_path) is False:
return [("", "")]

@@ -99,11 +133,18 @@ def read_from_log() -> List[Tuple[str, str]]:
def write_to_log(
user_input: str, assistant_output: str, jailbreak_prompt: Optional[str] = None
) -> None:
"""Writes the user input and the assistant output to the log file."""
"""Writes user input and assistant output to a log file.
Args:
user_input (str): The user input to be logged.
assistant_output (str): The assistant output to be logged.
jailbreak_prompt (Optional[str]): A prompt to be logged if the user attempts to jailbreak the system. Defaults to None.
Returns:
None
"""
if not os.path.exists(__workflow_data_path):
os.makedirs(__workflow_data_path)

with open(__log_file_path, "a+") as csv_file:
csv.register_dialect("custom", delimiter=" ", skipinitialspace=True)
writer = csv.writer(csv_file, dialect="custom")
@@ -120,8 +161,15 @@ def remove_log_file() -> None:
os.remove(__log_file_path)


def intercept_custom_prompts(prompt: str):
"""Intercepts custom queries."""
def intercept_custom_prompts(prompt: str) -> None:
"""Intercepts custom queries.
Args:
prompt (str): The prompt to intercept.
Returns:
None
"""
last_request_successful = read_from_cache("last_chat_request_successful")
if prompt in error_prompts and not last_request_successful:
stdout_write(
@@ -137,8 +185,16 @@ def intercept_custom_prompts(prompt: str):


@time_it
def create_message(prompt: str):
"""Creates the messages for the OpenAI API request."""
def create_message(prompt: str) -> List[Dict[str, str]]:
"""Creates a message to be sent to the model.
Args:
prompt (str): The prompt to be included in the message.
Returns:
List[Dict[str, str]]: A list of dictionaries representing the message,
with each dictionary containing the role and content of the message.
"""
transformation_pre_prompt = """You are a helpful assistant who interprets every input as raw
text unless instructed otherwise. Your answers do not include a description unless prompted to do so.
Also drop any "`" characters from your response."""
@@ -176,7 +232,7 @@ def make_chat_request(
frequency_penalty: float,
presence_penalty: float,
) -> Tuple[str, str]:
"""Sends a chat request to OpenAI's GPT-3 model and returns the prompt and
"""Sends a chat request to OpenAI's GTP-3 model and returns the prompt and
response as a tuple.
Args:
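The `@time_it` decorator added in this commit follows the standard `functools.wraps` timing pattern. A self-contained sketch of the same idea — the real decorator's reporting of the elapsed time presumably differs; printing here is purely illustrative:

```python
import functools
import time


def time_it(func):
    """A decorator that times the execution of a given function.

    Args:
        func: The function to be timed.

    Returns:
        A wrapper that runs ``func`` and reports the elapsed seconds.
    """

    @functools.wraps(func)
    def timeit_wrapper(*args):
        start_time = time.perf_counter()
        result = func(*args)
        elapsed = time.perf_counter() - start_time
        # Illustrative reporting; the workflow may log this elsewhere.
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result

    return timeit_wrapper


@time_it
def slow_sum(n: int) -> int:
    return sum(range(n))


slow_sum(1_000)
```

Because of `functools.wraps`, the decorated function keeps its original `__name__` and docstring, which keeps the timing message (and any debugging) readable.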
