Commit b6802b3: cudnn flash attention requires even sequence length with attention bias
[tx] Add GLM4.7 #989 (Open)
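
Below is a minimal sketch of the kind of workaround the commit message suggests, assuming the tx backend reaches cuDNN flash attention through jax.nn.dot_product_attention with implementation="cudnn". The idea: if an attention bias is supplied and the sequence length is odd, pad the sequence dimension to an even length before the kernel call and slice the padding off afterwards. The attention_with_bias helper, the BTNH tensor layout, and the bias shape are assumptions for illustration, not the actual change in this commit.

```python
import jax
import jax.numpy as jnp


def attention_with_bias(q, k, v, bias):
    """Hypothetical helper: pad to an even sequence length for cuDNN.

    Assumes q/k/v are laid out as (batch, seq, heads, head_dim) and bias
    as (batch, heads, q_len, kv_len). cuDNN flash attention is assumed to
    reject odd sequence lengths when a bias is present, so we pad one
    extra (masked-out) position when needed and strip it from the output.
    """
    seq_len = q.shape[1]
    pad = seq_len % 2  # 1 if the sequence length is odd, 0 otherwise
    if pad:
        q = jnp.pad(q, ((0, 0), (0, pad), (0, 0), (0, 0)))
        k = jnp.pad(k, ((0, 0), (0, pad), (0, 0), (0, 0)))
        v = jnp.pad(v, ((0, 0), (0, pad), (0, 0), (0, 0)))
        # Pad the bias with a large negative value so the extra key/value
        # position receives (effectively) zero attention weight.
        bias = jnp.pad(
            bias,
            ((0, 0), (0, 0), (0, pad), (0, pad)),
            constant_values=-1e9,
        )
    out = jax.nn.dot_product_attention(
        q, k, v, bias=bias, implementation="cudnn"
    )
    # Drop the padded query position so callers see the original length.
    return out[:, :seq_len]
```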

Annotations

1 error

skyrl_tx_gpu_tests
failed on Jan 29, 2026 in 30s