Use system_prompt in solver configuration (breaking change) #19
base: dev
Conversation
- Cleaned system_prompt saving in init method
- Changed references to initial_prompt to system_prompt to unify
- Bumped major version to signify breaking change
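Since this is a breaking rename of a config key, a small migration shim can keep old configs working. The sketch below is illustrative (the `migrate_config` helper and key names besides `initial_prompt`/`system_prompt` are assumptions, not part of the plugin):

```python
# Hypothetical sketch of handling the breaking rename: the solver now reads
# "system_prompt" where it previously read "initial_prompt".
OLD_CONFIG = {"key": "sk-xxx", "initial_prompt": "You are a helpful assistant."}

def migrate_config(config: dict) -> dict:
    """Rename the deprecated 'initial_prompt' key to 'system_prompt'."""
    migrated = dict(config)
    if "initial_prompt" in migrated and "system_prompt" not in migrated:
        migrated["system_prompt"] = migrated.pop("initial_prompt")
    return migrated

NEW_CONFIG = migrate_config(OLD_CONFIG)
print(NEW_CONFIG["system_prompt"])  # the prompt survives under the new key
```

A shim like this would let downstream users upgrade without editing their configuration files by hand.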
Walkthrough

This pull request updates the configuration parameters and internal class structure. It replaces the `initial_prompt` parameter with `system_prompt` throughout the solver configuration.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant U as User
    participant OCS as OpenAIChatCompletionsSolver
    participant API as OpenAI Completions API
    U->>OCS: Call get_spoken_answer(query)
    OCS->>API: Send query with system_prompt
    API-->>OCS: Return spoken answer
    OCS-->>U: Deliver answer
```
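The flow in the diagram can be sketched as plain Python. This is a toy model of the interaction only; the class and method names mirror the PR, but the bodies and the `FakeCompletionsAPI` stand-in are illustrative, not the plugin's actual implementation:

```python
# Toy sketch of the sequence: the solver prepends its system_prompt
# before forwarding the user's query to the completions API.
class FakeCompletionsAPI:
    """Stand-in for the OpenAI Completions API (illustrative only)."""
    def complete(self, messages):
        # Echo the last user message; a real API would generate a reply.
        return f"answer to: {messages[-1]['content']}"

class ChatCompletionsSolver:
    def __init__(self, system_prompt, api=None):
        self.system_prompt = system_prompt
        self.api = api or FakeCompletionsAPI()

    def get_spoken_answer(self, query):
        # System prompt goes first, then the user's query.
        messages = [{"role": "system", "content": self.system_prompt},
                    {"role": "user", "content": query}]
        return self.api.complete(messages)

solver = ChatCompletionsSolver("You are a helpful assistant.")
print(solver.get_spoken_answer("what is the speed of light?"))
# prints: answer to: what is the speed of light?
```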
Actionable comments posted: 0
🧹 Nitpick comments (1)
ovos_solver_openai_persona/__init__.py (1)

37-42: Consider adding a docstring for the new method.

The new `stream_chat_utterances` method implementation looks good, but it's missing documentation. Add a docstring to improve code maintainability:

```diff
 def stream_chat_utterances(self, messages: List[Dict[str, str]], lang: Optional[str] = None, units: Optional[str] = None) -> Iterable[str]:
+    """Stream chat utterances with system prompt prepended.
+
+    Args:
+        messages (List[Dict[str, str]]): List of message dictionaries with 'role' and 'content'.
+        lang (Optional[str], optional): Language code. Defaults to None.
+        units (Optional[str], optional): Units for the query. Defaults to None.
+
+    Yields:
+        str: Generated utterances from the chat model.
+    """
     messages = [{"role": "system", "content": self.system_prompt}] + messages
     answer = super().stream_chat_utterances(messages, lang, units)
     yield from answer
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)

- README.md (2 hunks)
- ovos_solver_openai_persona/__init__.py (3 hunks)
- ovos_solver_openai_persona/engines.py (2 hunks)
- ovos_solver_openai_persona/version.py (1 hunks)
🔥 Files not summarized due to errors (1)
- ovos_solver_openai_persona/engines.py: Error: Disallowed special token found: <|im_end|>
✅ Files skipped from review due to trivial changes (1)
- ovos_solver_openai_persona/version.py
🔇 Additional comments (4)
ovos_solver_openai_persona/__init__.py (2)
12-12: LGTM! Good default system prompt.

The default system prompt is well-crafted, providing clear personality traits that align with a helpful assistant.

15-15: LGTM! Clean persona parameter handling.

The method correctly forwards the system prompt to the parent class, maintaining backward compatibility by keeping the unused `persona` parameter.

README.md (2)

16-16: LGTM! Clear example of system prompt usage.

The example demonstrates the new parameter name with a clear and descriptive prompt value.

36-36: LGTM! Configuration example updated correctly.

The configuration example has been updated to use the new parameter name with a simple, clear default value.
- Removed specific OpenAIPersonaSolver class to use OpenAIChatCompletionsSolver
- Moved specifics for system_prompt to OpenAIChatCompletionsSolver
- Updated docs to reflect changes
- …rent class
- Prepend message system prompt to continue_chat method when not streaming utterances from OpenAI API
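The two code paths that commit touches can be sketched as follows. This is a simplified model, not the plugin's exact code: `_fake_stream` stands in for the streamed OpenAI API, and the function signatures are illustrative:

```python
# Sketch of the streaming vs non-streaming paths: both prepend the
# system prompt; continue_chat simply collects the streamed utterances.
from typing import Dict, Iterable, List

def _fake_stream(messages: List[Dict[str, str]]) -> Iterable[str]:
    # Stand-in for the streaming API: yields one chunk per word.
    yield from messages[-1]["content"].split()

def stream_chat_utterances(messages: List[Dict[str, str]],
                           system_prompt: str) -> Iterable[str]:
    # Prepend the system prompt, then yield chunks as they arrive.
    messages = [{"role": "system", "content": system_prompt}] + messages
    yield from _fake_stream(messages)

def continue_chat(messages: List[Dict[str, str]], system_prompt: str) -> str:
    # Non-streaming path: join all utterances into a single answer.
    return " ".join(stream_chat_utterances(messages, system_prompt))

print(continue_chat([{"role": "user", "content": "hello there"}], "be brief"))
# prints: hello there
```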
Actionable comments posted: 0
🧹 Nitpick comments (1)
ovos_solver_openai_persona/__init__.py (1)

26-26: Consider updating example comments to reflect new class.

The bot instantiation correctly uses OpenAIChatCompletionsSolver. However, the example output comments below might benefit from being regenerated using the new class to ensure they accurately reflect current behavior.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)

- ovos_solver_openai_persona/__init__.py (2 hunks)
- ovos_solver_openai_persona/engines.py (5 hunks)
🔥 Files not summarized due to errors (1)
- ovos_solver_openai_persona/engines.py: Error: Disallowed special token found: <|im_end|>
🔇 Additional comments (2)

ovos_solver_openai_persona/__init__.py (2)

1-1: LGTM!

The warnings import is correctly added to support the deprecation warning.

5-11: Implementation matches requested changes.

The deprecation warning has been implemented exactly as requested by JarbasAl, with proper warning message, deprecation type, and stack level.