
add related docs for intel cpu/xpu/hpu container #550

Merged · 8 commits into huggingface:main · Apr 8, 2025

Conversation

kaixuanliu (Contributor):

No description provided.

@kaixuanliu kaixuanliu changed the title add docker build doc for intel cpu/xpu/hpu add related docs for intel cpu/xpu/hpu container Apr 1, 2025
kaixuanliu (Contributor, Author):

@regisss @Narsil @baptistecolle Hi, please help review, thanks!

baptistecolle (Collaborator) left a comment:

I don’t think modifying the custom_container.md or quick_tour.md files is the best approach for adding documentation on Intel CPU/XPU/HPU.

It would be more effective to create a new page, local_intel.md, with a title like Using TEI on Intel Hardware, where all the relevant information about CPU/XPU/HPU can be included. You can refer to local_metal.md, local_cpu.md, and local_gpu.md for inspiration and adapt them for Intel. I’d suggest having three sections—one for each device—detailing how to start the container. Additionally, it makes sense to also include a link to the Docker image, so users don’t have to necessarily manually build it but can pull the image directly from GHCR.
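With the GHCR link suggested above, the new page could show users how to pull and run a prebuilt image instead of building one locally. A minimal sketch follows; the image tags are illustrative assumptions (the actual Intel CPU/XPU/HPU tags are defined by the TEI release workflow, so check GHCR for the current ones):

```shell
# Sketch only: the tag "cpu-latest" is an assumed placeholder, not a
# confirmed published tag — look up the real Intel CPU/XPU/HPU tags on GHCR.
model=BAAI/bge-large-en-v1.5
volume=$PWD/data

# Pull and run the prebuilt CPU image from GHCR instead of building it
docker run -p 8080:80 -v $volume:/data \
    ghcr.io/huggingface/text-embeddings-inference:cpu-latest \
    --model-id $model
```

The XPU and HPU sections would follow the same pattern, adding the device flags those runtimes require.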

kaixuanliu (Contributor, Author):


Good advice! I have moved the content into a separate doc file as you suggested.

Signed-off-by: Liu, Kaixuan <[email protected]>
baptistecolle (Collaborator) left a comment:

The new page looks great! The only thing left is to link it in the doc_tree, so it becomes visible: https://github.com/huggingface/text-embeddings-inference/blob/main/docs/source/en/_toctree.yml
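For reference, a `_toctree.yml` entry for the new page might look like the fragment below. This is a sketch: the `local_intel` filename and title come from the earlier review comment, but its exact position among the other entries is an assumption.

```yaml
# docs/source/en/_toctree.yml (sketch; placement among the other
# local_* entries is an assumption)
- local: local_intel
  title: Using TEI on Intel Hardware
```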

kaixuanliu (Contributor, Author):

@baptistecolle Is it OK?

@kaixuanliu kaixuanliu requested a review from baptistecolle April 7, 2025 01:58
baptistecolle (Collaborator) left a comment:

LGTM! Thanks for the changes.

Narsil (Collaborator) left a comment:

Perfect! Let's merge.

@baptistecolle feel free to merge when everything is green btw.

@Narsil Narsil merged commit f6842e8 into huggingface:main Apr 8, 2025
@kaixuanliu kaixuanliu deleted the intel-doc branch April 8, 2025 09:04