Changes from 40 commits
Commits
57 commits
f44eb81
teacher env init
J-SUPHA Mar 6, 2026
530fed2
testing set up
J-SUPHA Mar 6, 2026
d5ca760
command change
J-SUPHA Mar 7, 2026
ad364ac
increase timeout cause vllm is super slow all of a sudden
J-SUPHA Mar 8, 2026
985311e
trial
J-SUPHA Mar 8, 2026
e563352
quicker training
J-SUPHA Mar 8, 2026
81f90a6
forgot something easy
J-SUPHA Mar 8, 2026
4f33ab8
apparently not so easy
J-SUPHA Mar 9, 2026
bb2736d
next
J-SUPHA Mar 10, 2026
64794e7
sneaky bug
J-SUPHA Mar 10, 2026
09ad401
sneaky bug logging
J-SUPHA Mar 10, 2026
d1fd89f
non blocking test
J-SUPHA Mar 10, 2026
057c9fe
shorten worker timeout
J-SUPHA Mar 10, 2026
e84686b
remove enforce eager
J-SUPHA Mar 10, 2026
e79af5f
testing config
J-SUPHA Mar 11, 2026
abba562
testing config
J-SUPHA Mar 11, 2026
82be871
testing config
J-SUPHA Mar 11, 2026
98a5d3b
testing config
J-SUPHA Mar 11, 2026
78c0a6d
tokenizer bug
J-SUPHA Mar 11, 2026
f1cfc13
tokenizer bug
J-SUPHA Mar 11, 2026
c275687
tokenizer bug
J-SUPHA Mar 11, 2026
3a440f8
tokenizer bug
J-SUPHA Mar 11, 2026
b457a67
tokenizer bug
J-SUPHA Mar 11, 2026
2f371e0
tokenizer bug
J-SUPHA Mar 11, 2026
8a348be
tokenizer bug
J-SUPHA Mar 11, 2026
34a3936
tokenizer bug
J-SUPHA Mar 12, 2026
fd5b426
tokenizer bug
J-SUPHA Mar 12, 2026
c37516b
tokenizer bug
J-SUPHA Mar 12, 2026
a54dfe7
tokenizer bug
J-SUPHA Mar 12, 2026
62ef2fc
training kernel
J-SUPHA Mar 12, 2026
c26432b
training kernel
J-SUPHA Mar 12, 2026
7ec622a
training ideas
J-SUPHA Mar 12, 2026
a43b0b7
training kernel
J-SUPHA Mar 12, 2026
690e670
investigating weird training issue
J-SUPHA Mar 12, 2026
3df0e45
investigating weird training issue
J-SUPHA Mar 13, 2026
d8857eb
investigating weird training issue
J-SUPHA Mar 13, 2026
d1b0dee
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 13, 2026
600c54f
clean log
J-SUPHA Mar 13, 2026
862cd36
clean logging
J-SUPHA Mar 13, 2026
148a4fd
remove training code
J-SUPHA Mar 13, 2026
a1b545c
remove cross tokenization and fix location of configs
J-SUPHA Mar 13, 2026
994e9c2
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 13, 2026
322e7e6
remove comments
J-SUPHA Mar 13, 2026
a8cdb53
address problems
J-SUPHA Mar 13, 2026
82964b6
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 13, 2026
697c594
changes
J-SUPHA Mar 13, 2026
6c56479
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 13, 2026
1b8ff07
adding tests
J-SUPHA Mar 13, 2026
12ba3cc
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 13, 2026
a171358
structural changes
J-SUPHA Mar 13, 2026
3a85ede
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 13, 2026
9bd299b
better logging for devex
J-SUPHA Mar 14, 2026
f053c77
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 14, 2026
805a0c0
revert to similar structure
J-SUPHA Mar 14, 2026
7aba0d3
fresh eyes check
J-SUPHA Mar 14, 2026
79baac1
clean
J-SUPHA Mar 17, 2026
41947e9
clean
J-SUPHA Mar 17, 2026
17 changes: 15 additions & 2 deletions atroposlib/envs/server_handling/openai_server.py
Collaborator:
I think this may need to be reverted?

@@ -210,7 +210,7 @@ def resolve_openai_configs(
f"Error creating final OpenAI configuration from merged settings: {e}\n"
f"Merged Dict: {openai_config_dict}"
) from e
server_configs = final_openai_config
server_configs = [final_openai_config]
elif isinstance(default_server_configs, ServerBaseline):
# Pure ServerBaseline (not APIServerConfig) - no CLI overrides possible
logger.info("Using ServerBaseline configuration.")
@@ -231,7 +231,7 @@
) from e

if isinstance(default_server_configs, APIServerConfig):
server_configs = final_openai_config
server_configs = [final_openai_config]
Collaborator (author):
If you pass a list of configs here, it uses those configs directly. But if you pass a single non-list config object, it goes into "template mode" and auto-generates server URLs/ports.
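A minimal sketch of the dispatch behavior described above, assuming a simplified stand-in for `APIServerConfig` and illustrative port numbers (this is not the actual `ServerManager` API, just the shape of the list-vs-template-mode branching):

```python
from dataclasses import dataclass
from typing import List, Union


@dataclass
class APIServerConfig:
    # Stand-in for the real config class; only the field relevant here.
    base_url: str


def resolve(
    configs: Union[APIServerConfig, List[APIServerConfig]],
    num_servers: int = 2,
) -> List[APIServerConfig]:
    if isinstance(configs, list):
        # A list is treated as explicit per-server configs and used directly.
        return configs
    # A bare config object triggers "template mode": the single config is
    # cloned with auto-generated URLs/ports, one per managed server.
    return [
        APIServerConfig(base_url=f"http://localhost:{9000 + i}/v1")
        for i in range(num_servers)
    ]


# Explicit list: returned unchanged.
explicit = resolve([APIServerConfig("http://localhost:8000/v1")])
# Bare config: expanded into num_servers generated entries.
templated = resolve(APIServerConfig("http://localhost:8000/v1"))
```

This is why wrapping the result as `[final_openai_config]` matters: returning the bare object silently flips the caller into template mode.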

Collaborator (author):
Agreed — the issue was the wrong config shape here. I fixed it so this path now returns [final_openai_config] instead of a bare APIServerConfig.

elif isinstance(default_server_configs, list):
server_configs = [final_openai_config]
else:
@@ -241,4 +241,17 @@ def resolve_openai_configs(
)
server_configs = [final_openai_config]

if isinstance(server_configs, list):
logger.warning(
"resolve_openai_configs: returning list of %s config(s), URLs: %s",
len(server_configs),
[c.base_url for c in server_configs],
)
else:
logger.warning(
"resolve_openai_configs: returning single %s (base_url=%s) — "
"ServerManager will use template mode!",
type(server_configs).__name__,
getattr(server_configs, "base_url", "N/A"),
)
return server_configs
4 changes: 0 additions & 4 deletions atroposlib/envs/server_handling/vllm_server.py
Collaborator:
revert

@@ -424,10 +424,6 @@ def resolve_openai_configs(
elif isinstance(default_server_configs, list):
server_configs = [final_openai_config]
else:
logger.warning(
f"Unexpected type for default_server_configs: {type(default_server_configs)}. "
f"Proceeding with single OpenAI server configuration based on merged settings."
)
server_configs = [final_openai_config]

return server_configs