Conversation

Collaborator

@zzzzwwjj zzzzwwjj commented Oct 23, 2025

What this PR does / why we need it?

vanilla_chunked_prefill_mla and vanilla_decode_mla are unused, so remove them.

Does this PR introduce any user-facing change?

How was this patch tested?

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request removes two unused functions, vanilla_chunked_prefill_mla and vanilla_decode_mla, from vllm_ascend/ops/attention.py. Based on the provided files, these functions are indeed not referenced anywhere else in the codebase. This is a good cleanup that improves code maintainability by removing dead code. The change is straightforward and correct.
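The "not referenced anywhere else" claim can be spot-checked mechanically before deleting a function. A minimal sketch (the helper name `find_references` and the `ast`-based approach are illustrative, not part of this PR or the vllm-ascend tooling):

```python
import ast
from pathlib import Path

def find_references(root: Path, name: str) -> list[tuple[str, int]]:
    """Return (file, line) pairs where `name` appears as a Name node.

    A function's own `def` line produces a FunctionDef node (not a Name),
    so the definition itself is not counted as a reference.
    """
    hits = []
    for path in root.rglob("*.py"):
        tree = ast.parse(path.read_text(encoding="utf-8"))
        for node in ast.walk(tree):
            # Name nodes cover calls, assignments, and bare mentions.
            if isinstance(node, ast.Name) and node.id == name:
                hits.append((str(path), node.lineno))
    return hits
```

Running this over the repository root for `vanilla_chunked_prefill_mla` and `vanilla_decode_mla` and getting an empty list would confirm the functions are dead code (note this simple sketch does not catch string-based lookups such as `getattr` or registry dictionaries).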

@github-actions

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write a commit message that fulfills the PR description, to help reviewers and future developers understand the change.

If CI fails, you can run linting and testing checks locally according to Contributing and Testing.

Signed-off-by: zzzzwwjj <[email protected]>
@wangxiyuan wangxiyuan merged commit 6be321b into vllm-project:main Oct 24, 2025
25 checks passed
@zzzzwwjj zzzzwwjj deleted the del_ops_attn branch October 24, 2025 08:48
