### Plugin System

The first thing I want to add is a plugin system (à la ComfyUI custom nodes) where you can clone a GitHub repo into the plugins folder. It would also allow others to develop, prototype, and share functionality more easily. The first step is to design how the plugin system would work.

**Installation:** you should be able to copy anyone's git repo into the plugins folder, e.g.:

```sh
git clone https://github.com/username/repo.git ~/.llms/plugins/repo
```

So to remove a plugin, they'd only need to delete the folder.

Requirements:

- A convention for storing user data outside the plugins folder
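For illustration, here's a minimal sketch of how loading could work under these conventions. It treats each folder in `~/.llms/plugins` as a plugin; the `register(app)` hook is a hypothetical convention, not an existing API:

```python
import importlib.util
import sys
from pathlib import Path

PLUGINS_DIR = Path.home() / ".llms" / "plugins"

def load_plugins(app):
    """Import each cloned repo under ~/.llms/plugins and call its
    register() hook (hypothetical convention) with the host app."""
    loaded = {}
    if not PLUGINS_DIR.exists():
        return loaded
    for repo in sorted(PLUGINS_DIR.iterdir()):
        init = repo / "__init__.py"
        if not repo.is_dir() or not init.exists():
            continue  # skip stray files; a plugin is just a folder
        spec = importlib.util.spec_from_file_location(repo.name, init)
        module = importlib.util.module_from_spec(spec)
        sys.modules[repo.name] = module
        spec.loader.exec_module(module)
        if hasattr(module, "register"):
            module.register(app)  # plugin wires itself into the host here
        loaded[repo.name] = module
    return loaded
```

With no registry to clean up, deleting the folder removes the plugin on the next startup.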
### File Storage

Add a file/SHA storage feature for attachments (which you raised earlier) so we can keep image previews / original attachments in the UI.

### Streaming

It would be nice to show streaming of responses from providers that support it.

### llama.cpp / llama-swap

Explicit support for llama.cpp and auto-discovery of llama-swap models. Maybe vLLM as well if there's demand for it, although I've never used it personally.

### Static Website

Adding docs for every feature to one long README isn't feasible, so we'll eventually need a static docs website.

### Tool call support

Nice to have, as we don't have any. I have no idea what that would look like; copying someone who does this well would be a good start.

### Gemini RAG

Other than that, the only feature I'd like to build personally at the moment (after implementing a plugin system) is support for File Search in Gemini, where I can select a folder, upload all its files to Gemini, and query my knowledge base. No idea how that would look/work yet, but it's a feature I'd like to have. It's aggressively priced, so I expect it to be very popular for querying knowledge bases.
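To make the Gemini idea concrete, here's a rough sketch using the `google-genai` SDK's general Files API as a stand-in; the dedicated File Search feature may expose a different API, and the folder path and model id here are placeholders:

```python
from pathlib import Path
from google import genai  # pip install google-genai

client = genai.Client()  # reads GEMINI_API_KEY from the environment

def upload_folder(folder: str) -> list:
    """Upload every file in a folder so it can be referenced in prompts."""
    return [
        client.files.upload(file=str(p))
        for p in Path(folder).expanduser().iterdir()
        if p.is_file()
    ]

# Query the uploaded files as a makeshift knowledge base
files = upload_folder("~/knowledge-base")
response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents=files + ["Summarise what these notes say about plugins."],
)
print(response.text)
```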
With the export options: given we have optional GitHub OAuth support, I think publishing to a gist makes the most sense, but what to export is unclear.
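As a sketch of the gist route, assuming we already hold an OAuth token with the `gist` scope and picking a Markdown export of the current chat as a placeholder for "what to export":

```python
import json
import urllib.request

def publish_gist(token: str, filename: str, content: str,
                 description: str = "llms export") -> str:
    """Create a secret gist via the GitHub REST API and return its URL."""
    req = urllib.request.Request(
        "https://api.github.com/gists",
        data=json.dumps({
            "description": description,
            "public": False,  # shareable by URL without being publicly listed
            "files": {filename: {"content": content}},
        }).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["html_url"]
```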
Not sure about this. It currently uses user-defined model ids so requests can be cascaded to other configured providers if they fail. There's no universal API to discover models/pricing, so this would be unique and specific to each provider. As such, I think this would be good to add as an OpenRouter plugin, which we could bundle by default.
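Such a plugin could be a thin wrapper over OpenRouter's public model catalog. A sketch, with field names per their `/api/v1/models` response at the time of writing (these could change):

```python
import json
import urllib.request

def fetch_openrouter_models() -> dict:
    """Fetch model ids and per-token pricing from OpenRouter's public catalog."""
    with urllib.request.urlopen("https://openrouter.ai/api/v1/models") as resp:
        models = json.load(resp)["data"]
    return {
        m["id"]: {
            "prompt": m["pricing"]["prompt"],          # USD per input token, as a string
            "completion": m["pricing"]["completion"],  # USD per output token, as a string
        }
        for m in models
    }
```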
This is available via the `--verbose` flag:

```sh
llms --serve 8000 --verbose
```
Maybe something that could fit under Tool call support? Not currently on my radar.
Similar to the feature I'd like to build with Gemini File Search, though this should be a plugin. A lot of people are going to want an all-local/privacy-focused solution.
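A minimal sketch of what an all-local plugin could look like, assuming a llama.cpp `llama-server` running locally with `--embeddings` (URL and response shape per its OpenAI-compatible route):

```python
import json
import math
import urllib.request

LLAMA_URL = "http://localhost:8080/v1/embeddings"  # llama-server --embeddings

def embed(text: str) -> list[float]:
    """Get an embedding from a local llama.cpp server (OpenAI-compatible route)."""
    req = urllib.request.Request(
        LLAMA_URL,
        data=json.dumps({"input": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"][0]["embedding"]

def top_k(query: str, docs: list[str], k: int = 3) -> list[str]:
    """Rank documents by cosine similarity to the query, entirely locally.
    (A real plugin would cache doc embeddings instead of re-embedding per query.)"""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))
    q = embed(query)
    return sorted(docs, key=lambda d: cos(q, embed(d)), reverse=True)[:k]
```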
I don't spend any time thinking about anything I don't use, and I'm not interested in chasing features others are building that I don't use personally. I'm happy to copy good ideas if I find them useful, but I definitely don't want to rely on anything that's VC-funded, which eventually leads to enshittification; as such, I'm happy to maintain my own tool that I have complete control of.

I primarily want a good AI assistant experience that I can use with local/cloud LLMs in the same UI. I'd like to keep the core functionality in a single file with minimal deps that's easy to drop into any Python project (e.g. a ComfyUI Custom Node) to access AI features. Any other features should be delivered via plugins that users can easily enable/disable.
---
I'm considering contributing to the project. I was wondering if you could provide some insights on where you see it going. Do you have particular features or improvements in mind? I have some ideas, but they're fairly specific to my workflows and quirks. I imagine there are a lot of use cases I'm unaware of.
Some things I've been looking for in particular:
I'd also be interested in your thoughts around the directions/features of Open Web UI and LLM Gateway (and any other projects you think of) compared to this project.