Issues: Blaizzy/mlx-vlm

Batch Processing Feature (#40, opened Jun 11, 2024 by Blaizzy) · Open · 6
ChatUI improvements (#45, opened Jun 23, 2024 by Blaizzy) · Open
Models to port to MLX-VLM (#39, opened Jun 11, 2024 by Blaizzy) · Open · 42
Issues list

Negative Prompt Support (#271, opened Mar 22, 2025 by yukiarimo)
Remove scipy dependency (#253, opened Mar 18, 2025 by r-bit-rry)
OpenAI API (#247, opened Mar 15, 2025 by chigkim)
Gemma3 fine-tuning (#245, opened Mar 14, 2025 by regnio)
Gemma 3 models do not see the image when the prompt is too long [bug] (#242, opened Mar 12, 2025 by asmeurer)
Add FastAPI server [enhancement] [good first issue] (#241, opened Mar 12, 2025 by Blaizzy)
Add support for Gemma 3? (#237, opened Mar 12, 2025 by alexgusevski)
Support Phi-4-multimodal (#225, opened Mar 5, 2025 by kinfey)
Running Siglip/Siglip2 on MLX? (#219, opened Feb 27, 2025 by maxlund)
Negative padding [bug] (#214, opened Feb 23, 2025 by pavelgur) · 2
KeyError: 'image_token_index' [bug] (#213, opened Feb 23, 2025 by pavelgur)
Add support for Ovis 2? [enhancement] (#212, opened Feb 22, 2025 by alexgusevski)
When will Janus-Pro be supported? (#204, opened Feb 18, 2025 by fackweb)