GPU isn't used on macOS #640

Open
alryks opened this issue Mar 29, 2025 · 0 comments

alryks commented Mar 29, 2025

I've got an M2 Pro chip running macOS Sequoia 15.1. I built sd with the -DSD_METAL=ON flag and am using the FLUX.1-dev-Q4_K.gguf model.
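
For reference, the build looked roughly like this (the standard CMake workflow from the stable-diffusion.cpp README; the exact commands may differ slightly from what I actually ran):

```sh
# Clone with submodules (ggml is pulled in as a submodule)
git clone --recursive https://github.com/leejet/stable-diffusion.cpp
cd stable-diffusion.cpp

# Configure with the Metal backend enabled, then build
mkdir build && cd build
cmake .. -DSD_METAL=ON
cmake --build . --config Release
```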

When I start generation with `sd --diffusion-model model.gguf --vae ae.safetensors --clip_l clip_l.safetensors --t5xxl t5xxl.safetensors --cfg-scale 1.0 --sampling-method euler -o ~/Downloads/img.png -p "a lovely cat holding a sign says 'flux.cpp'"`, the generation runs and the resulting image is what I expect, but it is extremely slow.

I started wondering whether it was running on the GPU at all, so I opened GPU History in Activity Monitor and noticed that the GPU is barely used during generation.

[Screenshot: GPU History in Activity Monitor showing near-zero GPU usage during generation]

I also get a huge number of errors; the full output is attached:

run.log

Why is that? Isn't this the right way to build and run stable-diffusion.cpp on macOS?
