forked from vllm-project/vllm
Issues: EmbeddedLLM/vllm
#43 [Feature]: Enhance the usage of rocm_aiter_fused_experts in fused_moe.py (enhancement). Opened Apr 17, 2025 by tjtanaa. 1 task done.
#42 [Bug]: Bugfix LoRA for ROCm due to incompatible Triton arguments being passed (bug). Opened Apr 17, 2025 by tjtanaa. 1 task done.
#40 [Feature]: Add AITER TKW1 support to enable llama4 FP8 V1 Eager and V0 (enhancement). Opened Apr 16, 2025 by tjtanaa. 1 task done.
#39 [Bug]: Bugfix AITER RMSNORM (bug). Opened Apr 16, 2025 by tjtanaa. 1 task done.
#29 [Feature]: Roadmap for 2nd Quarter 2025 (enhancement). Opened Apr 15, 2025 by tjtanaa. 1 of 5 tasks done.