
Conversation

@cdoern (Collaborator) commented Nov 19, 2025

What does this PR do?

The build.yaml is only used in the following ways:

  1. list-deps
  2. distribution code-gen

Since `llama stack build` no longer exists, I found myself asking "why do we need two different files for list-deps and run?"

Removing the BuildConfig and altering the usage of the DistributionTemplate in `llama stack list-deps` is the first step in removing the build.yaml entirely.

Removing the BuildConfig and build.yaml cuts the files users need to maintain in half, and allows us to focus on the stability of just the run.yaml.

The build.yaml made sense when we were managing the build process for the user and actually producing a run.yaml from the build.yaml, but now that we simply get the provider registry and list the deps, switching to run.yaml greatly simplifies the scope here.
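
To make the intent concrete, here is a minimal sketch of the direction (the function name, the registry shape, and the run.yaml field names are illustrative assumptions, not the actual llama-stack implementation): list-deps would read the run.yaml, walk its providers section, and collect pip packages from the provider registry.

```python
# Illustrative sketch only: names and config/registry shapes are assumptions,
# not the real llama-stack implementation.
import yaml


def list_deps_from_run_config(run_yaml_path: str, provider_registry: dict) -> set[str]:
    """Collect pip dependencies for every provider referenced in a run.yaml."""
    with open(run_yaml_path) as f:
        run_config = yaml.safe_load(f)

    deps: set[str] = set()
    # Assume run.yaml maps each API (e.g. "inference") to a list of provider entries,
    # and the registry maps API -> provider_type -> a spec carrying pip_packages.
    for api, providers in run_config.get("providers", {}).items():
        for provider in providers:
            spec = provider_registry.get(api, {}).get(provider["provider_type"])
            if spec:
                deps.update(spec.get("pip_packages", []))
    return deps
```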

Test Plan

Existing list-deps usage should continue to work in the tests.

@cdoern (Collaborator, Author) commented Nov 19, 2025

Hmm, need to fix CI. Seems the env variables are causing issues.

@raghotham (Contributor) commented:

Is there a corresponding change to documentation?

@cdoern (Collaborator, Author) commented Nov 19, 2025

@raghotham, I believe the current documentation only shows `llama stack list-deps DISTRO_NAME` rather than using a specific yaml config. I will do a sweep of the docs to make sure, though!

@cdoern force-pushed the rm-build branch 3 times, most recently from 3557032 to ed55bcc on November 19, 2025 at 20:42
All of the additional pip packages are already in `llama-stack`'s pyproject except for psycopg2-binary (which I added), so they are unnecessary. This also allows me to get rid of the additional_pip_packages field.

Signed-off-by: Charlie Doern <[email protected]>
@cdoern requested a review from ashwinb on November 20, 2025 at 17:58
@ashwinb (Contributor) commented Nov 20, 2025

> Removing the BuildConfig and DistributionTemplate

Why do we want to remove DistributionTemplate? That is fundamentally how we specify what a distro contains.

@cdoern (Collaborator, Author) commented Nov 20, 2025

@ashwinb you're right, that was an oversight on my part. Instead, DistributionTemplate would probably change a bit down the line to use the same datatypes as the Providers do in the StackRunConfig. I noticed things like BuildProvider and such in DistributionTemplate.
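
Roughly what that convergence could look like, as a hypothetical sketch (these class definitions are simplified stand-ins, not the existing llama-stack types): the distribution template would describe its providers with the same Provider model that StackRunConfig uses, instead of a separate BuildProvider.

```python
# Hypothetical sketch of sharing one Provider datatype; simplified stand-ins,
# not the real llama-stack classes.
from pydantic import BaseModel


class Provider(BaseModel):
    provider_id: str
    provider_type: str
    config: dict = {}


class StackRunConfig(BaseModel):
    # API name -> providers configured for that API
    providers: dict[str, list[Provider]]


class DistributionTemplate(BaseModel):
    name: str
    # Reuses the same Provider type rather than a separate BuildProvider
    providers: dict[str, list[Provider]]
```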

@leseb (Collaborator) left a comment:

How about renaming StackRunConfig to StackConfig and using config.yaml instead of run.yaml?

"asyncpg", # for metadata store
"sqlalchemy[asyncio]>=2.0.41", # server - for conversations
"starlette>=0.49.1",
"psycopg2-binary",
A collaborator replied:
👍🏻


Labels: CLA Signed