Describe the bug
This bug took me a while to realize: when generating images with Flux Schnell, if the random generator is on the cuda device, the pipeline produces weird, blurry, noisy images.
Reproduction
"""
Reproduce: same seed, same model, same prompt — cpu vs cuda generator produces different images.
"""
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "1"
import sys
sys.path.insert(0, "Normalized-Attention-Guidance")
import torch
from diffusers import FluxPipeline
SEED = 134
PROMPT = "A vintage red bicycle leaning against a brick wall, its pedals leading back to a solitary rear gear."
print("Loading FluxPipeline...")
pipe = FluxPipeline.from_pretrained(
"black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
).to("cuda")
kwargs = dict(
guidance_scale=0.0,
num_inference_steps=4,
max_sequence_length=256,
)
img_cpu = pipe(PROMPT, generator=torch.Generator("cpu").manual_seed(SEED), **kwargs).images[0]
img_cpu.save("gen_cpu.png")
print("Saved gen_cpu.png")
img_cuda = pipe(PROMPT, generator=torch.Generator("cuda").manual_seed(SEED), **kwargs).images[0]
img_cuda.save("gen_cuda.png")
print("Saved gen_cuda.png")
# Check if they're identical
import numpy as np
arr_cpu = np.array(img_cpu)
arr_cuda = np.array(img_cuda)
print(f"Images identical: {np.array_equal(arr_cpu, arr_cuda)}")
print(f"Max pixel diff: {np.abs(arr_cpu.astype(int) - arr_cuda.astype(int)).max()}")
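For context (my understanding, not something I've confirmed in the diffusers source): PyTorch's CPU and CUDA generators use different underlying RNG algorithms, so the same seed produces different initial latent noise depending on the generator's device. A minimal sketch of that divergence, independent of the pipeline:

```python
import torch

# Same seed, same device: fully reproducible.
g1 = torch.Generator("cpu").manual_seed(134)
g2 = torch.Generator("cpu").manual_seed(134)
a = torch.randn(4, generator=g1)
b = torch.randn(4, generator=g2)
print(torch.equal(a, b))  # True

# Same seed, different device: the CPU and CUDA generators implement
# different algorithms, so the sampled noise differs.
if torch.cuda.is_available():
    g3 = torch.Generator("cuda").manual_seed(134)
    c = torch.randn(4, generator=g3, device="cuda")
    print(torch.equal(a, c.cpu()))  # typically False
```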
Very interesting and cool bug, but it took me a while to track down because I assumed my own code had broken something. Interestingly, only specific seed/prompt combinations trigger it. I understand this may be expected behavior, but I think it would be good to emit a warning.
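As a rough illustration of the warning I have in mind (purely hypothetical — the helper name and placement are not part of the diffusers API):

```python
import warnings

import torch


def warn_on_non_cpu_generator(generator: "torch.Generator | None") -> None:
    """Hypothetical helper: warn when a generator lives on a non-CPU device,
    since the same seed then yields different noise than a CPU generator."""
    if generator is not None and generator.device.type != "cpu":
        warnings.warn(
            f"Generator is on '{generator.device.type}'. CPU and CUDA RNGs "
            "differ, so results will not match a CPU generator with the same seed.",
            UserWarning,
        )


warn_on_non_cpu_generator(torch.Generator("cpu").manual_seed(134))  # no warning
```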
Logs
System Info
- 🤗 Diffusers version: 0.36.0
- Platform: Linux-6.8.0-94-generic-x86_64-with-glibc2.35
- Running on Google Colab?: No
- Python version: 3.11.14
- PyTorch version (GPU?): 2.8.0+cu128 (True)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Huggingface_hub version: 0.36.2
- Transformers version: 4.57.0
- Accelerate version: 1.12.0
- PEFT version: 0.18.1
- Bitsandbytes version: 0.49.2
- Safetensors version: 0.7.0
- xFormers version: not installed
- Accelerator: NVIDIA RTX A6000, 49140 MiB
  NVIDIA RTX A6000, 49140 MiB
  NVIDIA RTX A6000, 49140 MiB
- Using GPU in script?:
- Using distributed or parallel set-up in script?:
Who can help?