@dimapihtar (Contributor)

What does this PR do?

This PR refactors parallel group management to remove dependencies on the global `parallel_state.xxx` APIs, passing explicit group parameters (`tp_group`, `pp_group`, `dp_cp_group`) instead and falling back to the existing global state when they are not provided.

Key Changes

1. Explicit Group Parameters

  • Added tp_group, pp_group, dp_cp_group parameters to key functions in:
    • megatron/training/checkpointing.py
    • megatron/training/utils.py
    • megatron/core/utils.py
  • Functions now accept `Optional[torch.distributed.ProcessGroup]` parameters defaulting to `None`
  • When a group is `None`, the code falls back to the existing `mpu.get_xxx_group()` APIs for backward compatibility, as sketched below
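
A minimal sketch of the fallback pattern, using `get_rng_state` as the host function; the `mpu` accessor names are the existing Megatron-Core APIs, but the body shown here is illustrative, not the actual diff:

```python
from typing import Optional

import torch
from megatron.core import parallel_state as mpu


def get_rng_state(
    use_dist_ckpt: bool,
    tp_group: Optional[torch.distributed.ProcessGroup] = None,
    pp_group: Optional[torch.distributed.ProcessGroup] = None,
):
    # Prefer explicitly passed groups; touch the global parallel_state
    # accessors only when no group was provided.
    if tp_group is None:
        tp_group = mpu.get_tensor_model_parallel_group()
    if pp_group is None:
        pp_group = mpu.get_pipeline_model_parallel_group()
    ...
```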

2. Enhanced Metadata Handling

  • Extended `_build_sharded_state_dict_metadata()` to include `dp_cp_group` in the metadata
  • Updated sharded state dict generation to properly propagate group information
  • `dp_cp_group` is now consistently sourced from metadata across checkpoint operations, as sketched below
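
A hypothetical sketch of the metadata extension; the function's real parameters and other metadata fields are elided, and the `dp_cp_group` key name is an assumption based on this description:

```python
from megatron.core import parallel_state as mpu


def _build_sharded_state_dict_metadata() -> dict:
    metadata = {}
    # ...existing metadata fields elided...
    # Store the data-parallel + context-parallel group in the metadata so
    # that save and load resolve the same group.
    metadata['dp_cp_group'] = mpu.get_data_parallel_group(with_context_parallel=True)
    return metadata
```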

3. Improved Group Sourcing Strategy

  • Tensor/Pipeline Groups: sourced directly from `module.tp_group` and `module.pp_group`
  • Data Parallel + Context Parallel Group: sourced from metadata to ensure consistency across save/load operations
  • Uses the `get_pg_size()` and `get_pg_rank()` utilities for group introspection (see the sketch below)
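
A sketch of this sourcing strategy; `resolve_groups`, `model_module`, and the metadata key are illustrative names rather than the PR's code, while `get_pg_size`/`get_pg_rank` are the Megatron-Core utilities named above:

```python
from megatron.core.utils import get_pg_rank, get_pg_size


def resolve_groups(model_module, metadata):
    # Tensor/pipeline groups come straight off the module, while the
    # dp+cp group travels in checkpoint metadata so save and load agree.
    tp_group = model_module.tp_group
    pp_group = model_module.pp_group
    dp_cp_group = metadata['dp_cp_group']

    # Introspect size/rank without consulting global parallel_state.
    tp_size, tp_rank = get_pg_size(tp_group), get_pg_rank(tp_group)
    return tp_group, pp_group, dp_cp_group, tp_size, tp_rank
```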

4. Function Signature Updates

Key functions updated with explicit group parameters (signature sketch after the list):

  • save_checkpoint()
  • load_checkpoint()
  • get_rng_state()
  • _build_sharded_state_dict_metadata()
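
Approximate shape of the updated signatures; existing parameters are elided, and only the new optional group arguments described above are shown:

```python
from typing import Optional

import torch

ProcessGroup = torch.distributed.ProcessGroup


def save_checkpoint(
    *existing_args,
    tp_group: Optional[ProcessGroup] = None,
    pp_group: Optional[ProcessGroup] = None,
    dp_cp_group: Optional[ProcessGroup] = None,
    **existing_kwargs,
):
    # Sketch only: load_checkpoint, get_rng_state, and
    # _build_sharded_state_dict_metadata gain analogous parameters.
    ...
```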

Contribution process

```mermaid
flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]
```

Pre-checks

  • I want this PR in a versioned release and have added the appropriate Milestone (e.g., Core 0.8)
  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code (see the Typing guidelines)
  • I have added relevant documentation
  • I have run `autoformatter.sh` on my PR

Code review

The following process is enforced via the CODEOWNERS file for changes into megatron/core. For changes outside of megatron/core, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch

(Step 1): Add PR label Expert Review

(Step 2): Collect the expert reviewers' reviews

  1. Attach the Expert Review label when your PR is ready for review.
  2. GitHub auto-assigns expert reviewers based on your changes. They will get notified and pick up your PR soon.

⚠️ Only proceed to the next step once all reviewers have approved, merge conflicts are resolved, and the CI is passing.
Final Review might get declined if these requirements are not fulfilled.

(Step 3): Final Review

  1. Add Final Review label
  2. GitHub auto-assigns final reviewers based on your changes. They will get notified and pick up your PR soon.

(Optional Step 4): Cherry-pick into release branch

If this PR also needs to be merged into core_r* release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch

The proposed review process for the `dev` branch is under active discussion.

MRs are mergeable after one approval by a member of either core-adlr or core-nemo.

Merging your PR

Any member of core-adlr and core-nemo will be able to merge your PR.

@yanring (Contributor) left a comment:

LGTM for the MoE part
