
[rewriter | torchlib] respect ops order in torchscript graph #2134

Merged

Conversation

@titaiwangms (Contributor) commented Mar 25, 2025


codecov bot commented Mar 25, 2025

Codecov Report

Attention: Patch coverage is 25.00000% with 6 lines in your changes missing coverage. Please review.

Project coverage is 72.86%. Comparing base (a1c9380) to head (2d0c78b).
Report is 2 commits behind head on main.

Files with missing lines                       Patch %   Lines
onnxscript/function_libs/torch_lib/ops/nn.py     0.00%   4 Missing ⚠️
onnxscript/rewriter/ort_fusions/gelu_test.py     0.00%   2 Missing ⚠️
Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2134   +/-   ##
=======================================
  Coverage   72.86%   72.86%           
=======================================
  Files         222      222           
  Lines       29287    29287           
  Branches     3452     3452           
=======================================
  Hits        21340    21340           
  Misses       6798     6798           
  Partials     1149     1149           

☔ View full report in Codecov by Sentry.

@shubhambhokare1 (Contributor) left a comment

I see that, based on https://github.com/microsoft/onnxruntime/blob/main/onnxruntime/python/tools/transformers/fusion_fastgelu.py, the previous order matches the fuse_4 pattern and the new order matches the fuse_1 pattern.

Just a general question: might it be a good idea to support both of these patterns, maybe via the check function / a control parameter?
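
For reference, here is a minimal numpy sketch (the function names are illustrative, not taken from torchlib or onnxruntime) of the tanh-approximation GELU written in two algebraically equivalent but structurally different op orderings; a syntactic fusion pattern keyed to one ordering will not match a graph built with the other.

```python
# Illustrative sketch: the same tanh-approximation GELU, decomposed two ways.
# The results are identical, but the resulting graphs differ (explicit Pow vs.
# chained Muls, and the 0.5 multiplied in at different points), so a fusion
# pattern written for one decomposition will not match the other.
import numpy as np

SQRT_2_OVER_PI = np.sqrt(2.0 / np.pi)

def gelu_tanh_with_pow(x):
    # x**3 as an explicit Pow; 0.5 multiplied in last.
    inner = SQRT_2_OVER_PI * (x + 0.044715 * np.power(x, 3))
    return 0.5 * (x * (1.0 + np.tanh(inner)))

def gelu_tanh_with_muls(x):
    # Same math, but x**3 built from Muls and 0.5 folded into x first.
    inner = SQRT_2_OVER_PI * (x + 0.044715 * (x * x * x))
    return (0.5 * x) * (1.0 + np.tanh(inner))

x = np.linspace(-3.0, 3.0, 7)
assert np.allclose(gelu_tanh_with_pow(x), gelu_tanh_with_muls(x))
```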

@titaiwangms (Contributor, Author) commented Mar 25, 2025

> I see that, based on https://github.com/microsoft/onnxruntime/blob/main/onnxruntime/python/tools/transformers/fusion_fastgelu.py, the previous order matches the fuse_4 pattern and the new order matches the fuse_1 pattern. Just a general question: might it be a good idea to support both of these patterns, maybe via the check function / a control parameter?

Hmm, you are right. I missed the 4th pattern. I wonder how it failed to match, then...

No, the previous pattern does not match anything: it has Pow, but fuse_4 does not.

@titaiwangms added the module: rewriter and module: torchlib (Related to the torch/aten function lib in development) labels on Mar 25, 2025
@shubhambhokare1 (Contributor)

>> I see that, based on https://github.com/microsoft/onnxruntime/blob/main/onnxruntime/python/tools/transformers/fusion_fastgelu.py, the previous order matches the fuse_4 pattern and the new order matches the fuse_1 pattern. Just a general question: might it be a good idea to support both of these patterns, maybe via the check function / a control parameter?
>
> Hmm, you are right. I missed the 4th pattern. I wonder how it failed to match, then...
>
> No, the previous pattern does not match anything: it has Pow, but fuse_4 does not.

You're right. Although I wonder about
https://github.com/microsoft/onnxruntime/blob/9dbfee91ca9c2ba2074d19805bb6dedccedbcfe3/onnxruntime/python/tools/transformers/fusion_fastgelu.py#L31
https://github.com/microsoft/onnxruntime/blob/9dbfee91ca9c2ba2074d19805bb6dedccedbcfe3/onnxruntime/python/tools/transformers/fusion_fastgelu.py#L139

and whether we would need to cover the second pattern as well.

@titaiwangms (Contributor, Author)

[Screenshot attached: Screenshot 2025-03-25 165915]

The second pattern is still different from our op implementation. I guess we just need to follow the TorchScript ops order.
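
As a quick sanity check (a hedged sketch, not a test from this PR), reordering the ops only changes the graph structure that the fusion sees, not the numerics; the decomposition still agrees with PyTorch's own tanh-approximation GELU:

```python
# Hedged sketch: however the ops are ordered, the tanh-approximation
# decomposition is numerically the same as PyTorch's reference
# gelu(..., approximate="tanh"), so matching the TorchScript-emitted op
# order only affects pattern matching, not the computed values.
import torch
import torch.nn.functional as F

x = torch.randn(64)
reference = F.gelu(x, approximate="tanh")
decomposed = 0.5 * x * (1.0 + torch.tanh(0.7978845608028654 * (x + 0.044715 * x**3)))
assert torch.allclose(reference, decomposed, atol=1e-6)
```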

@justinchuby (Collaborator) left a comment

Thanks!

@shubhambhokare1 merged commit 6774192 into microsoft:main on Mar 26, 2025
24 of 29 checks passed
Labels: module: rewriter, module: torchlib (Related to the torch/aten function lib in development)
4 participants