Teacherenv #416
base: main
Changes from 40 commits
```diff
@@ -210,7 +210,7 @@ def resolve_openai_configs(
             f"Error creating final OpenAI configuration from merged settings: {e}\n"
             f"Merged Dict: {openai_config_dict}"
         ) from e
-        server_configs = final_openai_config
+        server_configs = [final_openai_config]
     elif isinstance(default_server_configs, ServerBaseline):
         # Pure ServerBaseline (not APIServerConfig) - no CLI overrides possible
         logger.info("Using ServerBaseline configuration.")
```

```diff
@@ -231,7 +231,7 @@ def resolve_openai_configs(
             ) from e

         if isinstance(default_server_configs, APIServerConfig):
-            server_configs = final_openai_config
+            server_configs = [final_openai_config]
```
Collaborator (Author): If you pass a list of configs here, it uses the configs directly. But if you pass a single, non-list config object, it goes into "template mode" and auto-generates server URLs/ports.

Collaborator: I mean, you're not supposed to pass this in like that.

Collaborator (Author): Agreed, the issue was the wrong config shape here. I fixed it so this path now returns
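The list-vs-single behavior the thread describes can be sketched roughly as follows. This is an illustrative reconstruction, not the PR's actual code: the `normalize_server_configs` helper, the port numbers, and the replica count are all hypothetical; only the general contract (a list is used directly, a bare config triggers template-mode URL/port auto-generation) comes from the discussion.

```python
from dataclasses import dataclass


@dataclass
class APIServerConfig:
    base_url: str


def normalize_server_configs(configs):
    """Hypothetical sketch of the dispatch the reviewers describe.

    A list of configs is used as-is (one server per entry); a bare,
    non-list config object falls into "template mode", where server
    URLs/ports are auto-generated and any explicit base_url is ignored.
    Wrapping a single config in a list therefore preserves its URL.
    """
    if isinstance(configs, list):
        return configs  # configs are used directly
    # Hypothetical template mode: fan the single config out across
    # auto-generated local ports (replica count chosen for illustration).
    return [
        APIServerConfig(base_url=f"http://localhost:{9000 + i}")
        for i in range(4)
    ]


cfg = APIServerConfig(base_url="http://localhost:8000")
# Wrapped in a list: the explicit base_url survives.
print([c.base_url for c in normalize_server_configs([cfg])])
# Passed bare: URLs are auto-generated, the explicit one is lost.
print([c.base_url for c in normalize_server_configs(cfg)])
```

This is why the fix wraps `final_openai_config` in a list before assigning it to `server_configs`: the explicit configuration is then honored instead of being treated as a template.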
```diff
         elif isinstance(default_server_configs, list):
             server_configs = [final_openai_config]
         else:
```
```diff
@@ -241,4 +241,17 @@ def resolve_openai_configs(
         )
         server_configs = [final_openai_config]

+    if isinstance(server_configs, list):
+        logger.warning(
+            "resolve_openai_configs: returning list of %s config(s), URLs: %s",
+            len(server_configs),
+            [c.base_url for c in server_configs],
+        )
+    else:
+        logger.warning(
+            "resolve_openai_configs: returning single %s (base_url=%s) — "
+            "ServerManager will use template mode!",
+            type(server_configs).__name__,
+            getattr(server_configs, "base_url", "N/A"),
+        )
     return server_configs
```
Collaborator: revert

Collaborator: I think this may need to be reverted?