Conversation
@HIT-cwh HIT-cwh commented Sep 10, 2025

No description provided.

@HIT-cwh HIT-cwh requested a review from Copilot September 10, 2025 07:22
@Copilot Copilot AI left a comment
Pull Request Overview

This PR adds FP8 (8-bit floating-point) training documentation to XTuner. It fixes function-call parameters in the FP8 modules and adds comprehensive Chinese documentation on FP8 training.

Key changes:

  • Fix function call parameters by using positional arguments instead of keyword arguments
  • Add comprehensive FP8 training documentation in Chinese
  • Include tutorial links and navigation updates
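The parameter fix above likely reflects a callee that only accepts positional arguments (for example, `torch.autograd.Function.apply` rejects keyword arguments). A minimal pure-Python sketch of the pattern, where `tile_wise_quant` and its body are hypothetical stand-ins rather than XTuner's actual API:

```python
# Hypothetical stand-in for the real callee in float8_linear_tile_wise.py.
# The "/" marks both parameters as positional-only, mirroring a callee
# (such as torch.autograd.Function.apply) that rejects keyword arguments.
def tile_wise_quant(tensor, group_size, /):
    """Split `tensor` (a flat list here) into tiles of `group_size` elements."""
    return [tensor[i:i + group_size] for i in range(0, len(tensor), group_size)]

data = list(range(8))

# Before the fix: passing group_size as a keyword raises TypeError.
try:
    tile_wise_quant(data, group_size=4)
except TypeError:
    pass  # "got some positional-only arguments passed as keyword arguments"

# After the fix: the positional call works.
groups = tile_wise_quant(data, 4)
print(groups)  # [[0, 1, 2, 3], [4, 5, 6, 7]]
```

The same call-site change applies in `float8_gmm_tile_wise.py`, which additionally handles input shapes before the call.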

Reviewed Changes

Copilot reviewed 5 out of 8 changed files in this pull request and generated no comments.

| File | Description |
| --- | --- |
| xtuner/v1/float8/float8_linear_tile_wise.py | Remove keyword argument `group_size` in function call |
| xtuner/v1/float8/float8_gmm_tile_wise.py | Remove keyword argument `group_size` and add input-shape handling |
| docs/zh_cn/pretrain_sft/tutorial/llm_trainer.md | Add reference labels for cross-linking |
| docs/zh_cn/pretrain_sft/advanced_tutorial/index.rst | Add FP8 documentation to the table of contents |
| docs/zh_cn/pretrain_sft/advanced_tutorial/float8.md | Add comprehensive FP8 training documentation |


@HAOCHENYE HAOCHENYE changed the base branch from main to 20250911 September 11, 2025 06:37