
Support Gemma3 with Clip fused attention #24187

Open
wants to merge 9 commits into main
Conversation

@titaiwangms (Contributor) commented on Mar 26, 2025

Description

Essentially, the vision model is traced differently (this time without an attention mask), and the input indices of op.Add and op.MatMul can differ. In addition, fp16 and fp32 produce different traced patterns (an extra op.Cast).

  1. Add another traced pattern to CLIP attention fusion to cover the no-attention_mask case
  2. Accept different input indices on op.Add and op.MatMul (more general matching)
  3. fp16 and fp32 show different patterns (op.Cast after op.Softmax)
  4. Refactor test_fastgelu.py to cover torch.onnx.export(..., dynamo=True) (see the sketch after this list)
  5. Add a Gemma3 vision attention (SigLIP) test to cover both fp16 and fp32
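
For item 4, a minimal sketch of exporting through both paths; FastGeluLike is a hypothetical stand-in for the module under test, not the PR's actual test code:

import torch

class FastGeluLike(torch.nn.Module):
    # Hypothetical stand-in for the fused-gelu module exercised in test_fastgelu.py.
    def forward(self, x):
        return torch.nn.functional.gelu(x, approximate="tanh")

x = torch.randn(2, 8)
# Legacy TorchScript-based exporter:
torch.onnx.export(FastGeluLike(), (x,), "fastgelu_ts.onnx")
# Dynamo-based exporter, which the refactored test additionally covers:
torch.onnx.export(FastGeluLike(), (x,), "fastgelu_dynamo.onnx", dynamo=True)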

Motivation and Context

These changes are needed to optimize the Gemma3 multi-modal model: https://huggingface.co/google/gemma-3-4b-it

@titaiwangms titaiwangms marked this pull request as ready for review April 1, 2025 17:55
qk_nodes = self.model.match_parent_path(
    matmul_qkv, ["Cast", "Cast", "Softmax", "Add", "Mul", "MatMul"], [0, 0, 0, 0, 0, 0]
)
# If the attention mask is not used, we can still match the qk path.
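
The fallback that the comment refers to is not shown in this hunk; a plausible sketch, with illustrative ops and indices (the exact path list in the PR may differ):

if qk_nodes is None:
    # No attention mask: the Add node is absent from the traced graph,
    # so try the shorter path. Ops and indices here are illustrative.
    qk_nodes = self.model.match_parent_path(
        matmul_qkv, ["Cast", "Cast", "Softmax", "Mul", "MatMul"], [0, 0, 0, 0, 0]
    )
if qk_nodes is None:
    return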
Contributor
Suggest changing the condition so that the layout is more readable.
Before:

if qk_nodes is None:
    ...
else:
    add_mask = qk_nodes[1]

After:

if qk_nodes is not None:
    add_mask = qk_nodes[1]
else:
    ...

Another possible change is to use match_parent_paths.
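
For reference, a minimal sketch of the match_parent_paths variant; only the helper itself (OnnxModel.match_parent_paths, which returns the index of the matched candidate) is the existing API, and the candidate paths, indices, and the output_name_to_node mapping are assumed from the usual fusion context:

paths = [
    (["Cast", "Cast", "Softmax", "Add", "Mul", "MatMul"], [0, 0, 0, 0, 0, 0]),  # with attention mask
    (["Cast", "Cast", "Softmax", "Mul", "MatMul"], [0, 0, 0, 0, 0]),            # without attention mask
]
match_index, qk_nodes, _ = self.model.match_parent_paths(matmul_qkv, paths, output_name_to_node)
if qk_nodes is None:
    return
add_mask = qk_nodes[3] if match_index == 0 else None  # Add is the 4th node in the masked path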

@titaiwangms (Contributor, Author)

The CI failures appear to be Docker image authorization issues.
