
Conversation

@legitnull (Collaborator) commented Nov 7, 2025

PR Category

Inference

PR Types

New Features

PR Description

  1. Add TorchCompileTransformation
    1.1 Apply torch.compile to the target module
    1.2 Add a backend wrapper to support custom passes
  2. Add a simple pass (TimestepEmbeddingFlipSineCosinePass) to exercise the custom-pass functionality
  3. Add preflight checks for transformations in the engine
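
To make the description concrete, here is a minimal sketch of what a transformation like this could look like. The class name matches the PR title, but the structure, method names (`apply`, `_backend`), and pass protocol are assumptions for illustration, not the PR's actual code:

```python
import torch

class TorchCompileTransformation:
    """Hypothetical sketch: apply torch.compile to a target module, routing
    compilation through a backend wrapper so custom FX passes can be injected.
    Names and structure are assumptions, not the PR's code."""

    def __init__(self, passes=None):
        self.passes = passes or []

    def _backend(self, gm: torch.fx.GraphModule, example_inputs):
        # Run each custom pass over the captured FX graph before handing
        # the module back; a real backend would then delegate to inductor.
        for p in self.passes:
            p(gm.graph)
        gm.recompile()
        return gm.forward  # eager fallback for illustration only

    def apply(self, module: torch.nn.Module) -> torch.nn.Module:
        return torch.compile(module, backend=self._backend)
```

Returning `gm.forward` keeps the sketch backend eager; the real transformation presumably delegates to inductor so that post-grad passes run during actual compilation.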

@legitnull legitnull changed the title add TorchCompileTransformation [Diffusion] add TorchCompileTransformation Nov 10, 2025
@legitnull legitnull requested review from aoyulong and ceci3 November 10, 2025 12:35
@legitnull legitnull marked this pull request as ready for review November 10, 2025 12:37
@legitnull legitnull requested review from a team and zhaoyinglia as code owners November 10, 2025 12:37
```python
        -1,
        _users=2,  # The cat feeds both slices
    )
    slice_hi = CallFunction(aten.slice.Tensor, inner_cat, 1, split_idx, 9223372036854775807)
```


What does 9223372036854775807 mean?

@legitnull (Author) replied:

-1


I still don't quite understand why it has to be written this way. Why not just write -1 directly?

@legitnull (Author) replied:

I didn't dig into it. Maybe it has something to do with how aten.slice represents the end of a dimension.
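
For what it's worth, a quick sanity check (assuming standard ATen slice semantics) suggests the two are not interchangeable as `end` — the INT64_MAX sentinel takes the full tail of the dimension, while `-1` follows Python slicing and drops the last element:

```python
import torch

x = torch.arange(12).reshape(3, 4)
INT64_MAX = 2**63 - 1  # 9223372036854775807

# end = INT64_MAX is the "to the end of the dimension" sentinel that
# inductor emits when lowering x[:, start:].
a = torch.ops.aten.slice.Tensor(x, 1, 2, INT64_MAX)
assert torch.equal(a, x[:, 2:])

# end = -1 follows Python slicing semantics and excludes the last element,
# so it is NOT equivalent to the sentinel.
b = torch.ops.aten.slice.Tensor(x, 1, 2, -1)
assert torch.equal(b, x[:, 2:-1])
assert not torch.equal(a, b)
```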

```python
previous = torch._inductor.config.post_grad_custom_post_pass

if not pass_manager.empty():
    torch._inductor.config.post_grad_custom_post_pass = pass_manager
```


We should consider the four situations: pre_forward, post_forward, pre_backward, and post_backward, rather than just post_backward alone.

@legitnull (Author) replied:

OK, let me check the torch API.

CLAassistant commented Nov 18, 2025

CLA assistant check
All committers have signed the CLA.
