
Conversation

@vanbasten23 vanbasten23 commented Oct 23, 2025

Description

Problem: When I run my CI test https://github.com/vllm-project/tpu-inference/blob/main/tests/lora/test_lora.py on the v6e-8 CI machine, the first test succeeds and the second one fails with the error jaxlib._jax.XlaRuntimeError: UNKNOWN: TPU initialization failed: open(/dev/vfio/1): Device or resource busy: Device or resource busy; Couldn't open iommu group /dev/vfio/1. It appears that the first test does not release the TPU device after it finishes, and the failure persists for the subsequent tests. In my local environment I can reproduce the issue roughly 70% of the time, but I could not reproduce it on the v6e-1 machine in the CI.

This PR intends to fix the above problem.
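
For reviewers, this class of error typically means the process that initialized the TPU is still holding the /dev/vfio device files when the next test tries to initialize it. As a rough illustration only (a minimal sketch of one common mitigation, not necessarily what this PR does; the helpers run_isolated and _child_entry are hypothetical names), each test body can be executed in its own process so the device handles are released when that process exits:

```python
# Hypothetical sketch: run each TPU test body in its own process so that the
# process exit releases /dev/vfio/* before the next test initializes the TPU.
import multiprocessing as mp
import traceback


def _child_entry(queue, fn, args, kwargs):
    # Report success (None) or a formatted traceback back to the parent.
    try:
        fn(*args, **kwargs)
        queue.put(None)
    except Exception:
        queue.put(traceback.format_exc())


def run_isolated(fn, *args, **kwargs):
    # "spawn" starts a fresh interpreter, so no TPU state is inherited from the
    # parent pytest process; fn must be picklable (e.g. a module-level function).
    ctx = mp.get_context("spawn")
    queue = ctx.Queue()
    proc = ctx.Process(target=_child_entry, args=(queue, fn, args, kwargs))
    proc.start()
    error = queue.get()  # blocks until the child reports a result
    proc.join()
    if error is not None:
        raise RuntimeError(f"isolated test failed:\n{error}")
```

Per-test process isolation via a plugin such as pytest-forked has a similar effect; the actual change in this PR may take a different approach entirely.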

Tests

CI

Checklist

Before submitting this PR, please make sure:

  • I have performed a self-review of my code.
  • I have added necessary comments to my code, particularly in hard-to-understand areas.
  • I have made or will make corresponding changes to any relevant documentation.

@github-actions

Description

Start with a short description of what the PR does and how this is a change from
the past.

The rest of the description includes relevant details and context, examples:

  • why is this change being made,
  • the problem being solved and any relevant context,
  • why this is a good solution,
  • some information about the specific implementation,
  • shortcomings of the solution and possible future improvements.

If the change fixes a bug or a GitHub issue, please include a link, e.g.:
FIXES: b/123456
FIXES: #123456

Tests

Please describe how you tested this change, and include any instructions and/or
commands to reproduce.

Checklist

Before submitting this PR, please make sure:

  • I have performed a self-review of my code.
  • I have added necessary comments to my code, particularly in hard-to-understand areas.
  • I have made or will make corresponding changes to any relevant documentation.

@vanbasten23 vanbasten23 changed the title from "[Do not review yet] Fix issues when running multiple tests on the v6e-8 machine." to "Fix issues when running multiple LoRA tests on the v6e-8 machine." on Oct 27, 2025
@vanbasten23 vanbasten23 marked this pull request as ready for review October 27, 2025 16:46