Use system_prompt in solver configuration (breaking change) #19
Conversation
Walkthrough

This update deprecates the `initial_prompt` configuration key in favor of `system_prompt`, removes the dedicated OpenAIPersonaSolver class in favor of OpenAIChatCompletionsSolver, and bumps the major version to signal the breaking change.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Plugin
    participant OpenAICompletionsSolver
    participant OpenAIChatCompletionsSolver
    participant API as OpenAI API
    User->>Plugin: Sends query
    Plugin->>OpenAIChatCompletionsSolver: Passes query and config (with system_prompt)
    OpenAIChatCompletionsSolver->>OpenAICompletionsSolver: Ensures system_prompt is present
    OpenAICompletionsSolver->>API: Sends request with system_prompt and query
    API-->>OpenAICompletionsSolver: Returns response
    OpenAICompletionsSolver-->>OpenAIChatCompletionsSolver: Forwards response
    OpenAIChatCompletionsSolver-->>Plugin: Returns processed answer
    Plugin-->>User: Delivers answer
```
Actionable comments posted: 0
🧹 Nitpick comments (1)
ovos_solver_openai_persona/__init__.py (1)
37-42: Consider adding a docstring for the new method.

The new `stream_chat_utterances` method implementation looks good, but it's missing documentation. Add a docstring to improve code maintainability:

```python
def stream_chat_utterances(self, messages: List[Dict[str, str]],
                           lang: Optional[str] = None,
                           units: Optional[str] = None) -> Iterable[str]:
    """Stream chat utterances with system prompt prepended.

    Args:
        messages (List[Dict[str, str]]): List of message dictionaries with 'role' and 'content'.
        lang (Optional[str], optional): Language code. Defaults to None.
        units (Optional[str], optional): Units for the query. Defaults to None.

    Yields:
        str: Generated utterances from the chat model.
    """
    messages = [{"role": "system", "content": self.system_prompt}] + messages
    answer = super().stream_chat_utterances(messages, lang, units)
    yield from answer
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- README.md (2 hunks)
- ovos_solver_openai_persona/__init__.py (3 hunks)
- ovos_solver_openai_persona/engines.py (2 hunks)
- ovos_solver_openai_persona/version.py (1 hunks)
🔥 Files not summarized due to errors (1)
- ovos_solver_openai_persona/engines.py: Error: Disallowed special token found: <|im_end|>
✅ Files skipped from review due to trivial changes (1)
- ovos_solver_openai_persona/version.py
🔇 Additional comments (4)
ovos_solver_openai_persona/__init__.py (2)
12-12: LGTM! Good default system prompt.

The default system prompt is well-crafted, providing clear personality traits that align with a helpful assistant.

15-15: LGTM! Clean persona parameter handling.

The method correctly forwards the system prompt to the parent class, maintaining backward compatibility by keeping the unused `persona` parameter.

README.md (2)

16-16: LGTM! Clear example of system prompt usage.

The example demonstrates the new parameter name with a clear and descriptive prompt value.

36-36: LGTM! Configuration example updated correctly.

The configuration example has been updated to use the new parameter name with a simple, clear default value.
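For users updating an existing configuration, the rename can be sketched as below. This is a hedged illustration: only the `system_prompt`/`initial_prompt` keys come from this PR, and the `migrate` helper is hypothetical, not part of the plugin.

```python
# Hypothetical before/after configs; only the prompt keys are from this PR.
new_config = {
    "model": "gpt-4o-mini",  # new default model per this PR
    "system_prompt": "You are a helpful assistant.",
}

old_config = {
    "model": "gpt-4o-mini",
    "initial_prompt": "You are a helpful assistant.",  # deprecated key
}

def migrate(config: dict) -> dict:
    """Rename the deprecated 'initial_prompt' key to 'system_prompt'."""
    config = dict(config)  # avoid mutating the caller's dict
    if "system_prompt" not in config and "initial_prompt" in config:
        config["system_prompt"] = config.pop("initial_prompt")
    return config
```

An explicit `system_prompt`, if present, always wins over the legacy key.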
Actionable comments posted: 0
🧹 Nitpick comments (1)
ovos_solver_openai_persona/__init__.py (1)
26-26: Consider updating example comments to reflect the new class.

The bot instantiation correctly uses OpenAIChatCompletionsSolver. However, the example output comments below might benefit from being regenerated using the new class to ensure they accurately reflect current behavior.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- ovos_solver_openai_persona/__init__.py (2 hunks)
- ovos_solver_openai_persona/engines.py (5 hunks)
🔥 Files not summarized due to errors (1)
- ovos_solver_openai_persona/engines.py: Error: Disallowed special token found: <|im_end|>
🔇 Additional comments (2)
ovos_solver_openai_persona/__init__.py (2)
1-1: LGTM!

The warnings import is correctly added to support the deprecation warning.

5-11: Implementation matches requested changes.

The deprecation warning has been implemented exactly as requested by JarbasAl, with proper warning message, deprecation type, and stack level.
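As a rough illustration of that pattern, here is a minimal sketch of a deprecated alias that warns and delegates. The class body is a stub so the sketch runs on its own; it is not the plugin's actual implementation.

```python
import warnings

class OpenAIChatCompletionsSolver:
    """Stand-in for the real solver, only here to make the sketch runnable."""
    def __init__(self, config=None):
        self.config = config or {}

def OpenAIPersonaSolver(config=None):
    """Deprecated entry point: emit a DeprecationWarning, then delegate."""
    warnings.warn(
        "OpenAIPersonaSolver is deprecated; use OpenAIChatCompletionsSolver "
        "with 'system_prompt' in the config instead.",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller, not this wrapper
    )
    return OpenAIChatCompletionsSolver(config)
```

`stacklevel=2` makes the warning point at the user's call site, which is what the review's "stack level" remark refers to.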
- Cleaned system_prompt saving in init method
- Changed references to initial_prompt to system_prompt to unify
- Bumped major version to signify breaking change
- Removed specific OpenAIPersonaSolver class to use OpenAIChatCompletionsSolver
- Moved specifics for system_prompt to OpenAIChatCompletionsSolver
- Updated docs to reflect changes
- …rent class Prepend message system prompt to continue_chat method when not streaming utterances from OpenAI API
Actionable comments posted: 0
🧹 Nitpick comments (1)
ovos_solver_openai_persona/engines.py (1)
110-115: Good deprecation pattern for the system_prompt implementation

The code properly handles the transition from 'initial_prompt' to 'system_prompt' with:
- Clear deprecation warning
- Fallback to initial_prompt if system_prompt is not set
- Default value if neither is provided
Consider using LOG.warning instead of LOG.error for the default message since this is a fallback behavior rather than a true error condition.
```diff
- LOG.error(f"system prompt not set in config! defaulting to '{self.system_prompt}'")
+ LOG.warning(f"system prompt not set in config! defaulting to '{self.system_prompt}'")
```
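The fallback chain the comment describes could be sketched as a stand-alone helper. The names `resolve_system_prompt` and `DEFAULT_SYSTEM_PROMPT` are illustrative, not taken from the codebase.

```python
import logging
import warnings

LOG = logging.getLogger(__name__)
DEFAULT_SYSTEM_PROMPT = "You are a helpful assistant."  # assumed default

def resolve_system_prompt(config: dict) -> str:
    """Prefer 'system_prompt', fall back to the deprecated 'initial_prompt',
    and finally to a default, signalling each fallback step."""
    if "system_prompt" in config:
        return config["system_prompt"]
    if "initial_prompt" in config:
        warnings.warn(
            "'initial_prompt' is deprecated, use 'system_prompt' instead",
            DeprecationWarning,
            stacklevel=2,
        )
        return config["initial_prompt"]
    LOG.warning("system prompt not set in config! defaulting to %r",
                DEFAULT_SYSTEM_PROMPT)
    return DEFAULT_SYSTEM_PROMPT
```

The default-value branch uses `LOG.warning`, matching the review's suggestion that a fallback is not a true error condition.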
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- ovos_solver_openai_persona/__init__.py (1 hunks)
- ovos_solver_openai_persona/dialog_transformers.py (1 hunks)
- ovos_solver_openai_persona/engines.py (6 hunks)
- setup.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
- ovos_solver_openai_persona/dialog_transformers.py
- setup.py
- ovos_solver_openai_persona/__init__.py
🔇 Additional comments (5)
ovos_solver_openai_persona/engines.py (5)
27-27: Model upgrade is a breaking change, but well-documented

Changing the default model from "text-davinci-002" to "gpt-4o-mini" is a significant update that might affect response behavior, token usage, and costs for users. This change aligns with the PR title indicating a breaking change, which is good for transparency.

187-190: Parameter update for get_chat_history is consistent

The method now accepts an optional system_prompt parameter and uses the same fallback logic as the class initialization. The fallback chain (parameter -> self.system_prompt -> default) is clear and maintainable.

198-200: Simplified get_messages implementation

The updated method correctly forwards the system_prompt parameter to get_chat_history, maintaining the function's purpose while supporting the new system_prompt configuration.

217-219: Ensure system messages are present in continue_chat

This is a good defensive check to ensure the first message is always a system message, which is required for consistent behavior with the OpenAI API. The implementation automatically prepends the system prompt if needed, providing backward compatibility.

242-244: Ensure system messages are present in stream_chat_utterances

Similar to the continue_chat method, this change ensures the conversation always starts with a system message. Maintaining this consistency across both methods is important for reliable behavior.
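The defensive check used by both methods can be sketched as a small stand-alone helper (a minimal illustration; in the plugin this logic lives inside the solver methods, and the helper name is assumed):

```python
from typing import Dict, List

def ensure_system_message(messages: List[Dict[str, str]],
                          system_prompt: str) -> List[Dict[str, str]]:
    """Prepend a system message unless the conversation already starts with one."""
    if not messages or messages[0].get("role") != "system":
        return [{"role": "system", "content": system_prompt}] + messages
    return messages
```

Because the check inspects only the first message, applying it twice is a no-op, which is what makes it safe to run in both continue_chat and stream_chat_utterances.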
Docstrings generation was requested by @JarbasAl.
* #19 (comment)

The following files were modified:
* `ovos_solver_openai_persona/__init__.py`
* `ovos_solver_openai_persona/dialog_transformers.py`
* `ovos_solver_openai_persona/engines.py`
Note: Generated docstrings for this pull request at #26
* 📝 Add docstrings to `patch-1`
  * Docstrings generation was requested by @JarbasAl.
  * #19 (comment)
  * The following files were modified:
    * `ovos_solver_openai_persona/__init__.py`
    * `ovos_solver_openai_persona/dialog_transformers.py`
    * `ovos_solver_openai_persona/engines.py`
* Update engines.py

---------
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: JarbasAI <[email protected]>
* Update downstream dependencies for ovos-openai-plugin
* Update downstream dependencies for ovos-openai-plugin
* Update downstream dependencies for ovos-openai-plugin
* Update downstream dependencies for ovos-openai-plugin
* Update downstream dependencies for ovos-openai-plugin
* Merge pull request #19 from Delfshkrimm/patch-1
  Use system_prompt in solver configuration (breaking change)
* 📝 Add docstrings to `patch-1` (#26)
  * 📝 Add docstrings to `patch-1`
    Docstrings generation was requested by @JarbasAl.
  * #19 (comment)
    The following files were modified:
    * `ovos_solver_openai_persona/__init__.py`
    * `ovos_solver_openai_persona/dialog_transformers.py`
    * `ovos_solver_openai_persona/engines.py`
  * Update engines.py
  ---------
  Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
  Co-authored-by: JarbasAI <[email protected]>
* Increment Version to 2.0.2a1
* Update Changelog
* Update README.md
* Update downstream dependencies for ovos-openai-plugin
* Update README.md
* Update README.md
* Update README.md
* Update release_workflow.yml
* Increment Version to 2.0.2a2
* fix: setup.py (#29)
* Increment Version to 2.0.2a3
* Update Changelog
* Update __init__.py

---------
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Delfshkrimm <[email protected]>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: JarbasAI <[email protected]>
Co-authored-by: JarbasAl <[email protected]>