[Script request] Ollama #2637
Replies: 2 comments
-
Ollama is already present and also available via Open WebUI; the standalone Ollama script just isn't listed on the website. It can be run from the Proxmox host with: `bash -c "$(wget -qLO - https://github.com/community-scripts/ProxmoxVE/raw/main/ct/ollama.sh)"`
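The one-liner is meant to be run from the Proxmox VE host shell, not inside a container. A quick smoke test afterwards, inside the new container (a sketch only, assuming the script installs the standard ollama CLI and that a model such as llama2 is wanted):

```bash
# Inside the Ollama container, after the script completes:
ollama --version   # confirm the CLI is installed
ollama pull llama2 # download a model (example name)
ollama run llama2  # chat interactively against the local model
```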
-
I'd like to add that it would be great if this LXC script also set up GPU sharing/passthrough the way the Plex script does (a sketch of that kind of config follows).
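For reference, the Plex-style approach is passing host GPU devices into the container. A minimal sketch for an Intel/AMD iGPU, not the script's exact output; device majors and paths vary by host, and NVIDIA cards need different entries:

```
# /etc/pve/lxc/<CTID>.conf
lxc.cgroup2.devices.allow: c 226:* rwm                            # DRM render/card devices
lxc.mount.entry: /dev/dri dev/dri none bind,optional,create=dir   # expose /dev/dri to the CT
```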
-
Application Name
Ollama
Website
https://github.com/ollama/ollama
Description
Ollama is an open-source tool that lets you run large language models (LLMs) locally on your computer. You can use Ollama to:
- Run LLMs: run models such as Llama 2 on your computer's CPU or GPU
- Create models: create and customize your own models
- Integrate models: integrate models into your own projects using languages such as Python or JavaScript (see the API sketch after this list)
- Summarize text: summarize key points from multiple sources, such as meeting notes
- Draft documents: draft emails, professional correspondence, and to-do lists
- Set goals: assist with personal goal setting and planning
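On the integration point: Ollama serves a local REST API (default port 11434), which is what the Python and JavaScript client libraries wrap. A minimal sketch, assuming the service is running and a llama2 model has already been pulled:

```bash
# One-shot completion against a local Ollama instance.
# Assumes `ollama pull llama2` has been run beforehand.
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Summarize the key points from these meeting notes: ...",
  "stream": false
}'
```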
Benefits of Ollama
- Local control: host models locally, so your data stays private and secure
- Offline access: run AI models without an internet connection
- Cost savings: avoid recurring costs for cloud storage, data transfer, and usage fees
- Customization: tweak models to fit specific project requirements
It would be great if a community script for Ollama could be developed.
It could also be helpful if the script optionally installed companion tools, such as Open WebUI, during LXC setup (a sketch of such a prompt follows).
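As a sketch of what that optional install might look like inside the install script: the prompt below assumes a whiptail-style dialog like other scripts in this repo use, and the Open WebUI commands are its documented pip-based install; everything else is illustrative:

```bash
# Hypothetical optional-component prompt during LXC setup.
if whiptail --title "Ollama LXC" --yesno "Also install Open WebUI?" 10 58; then
  pip install open-webui   # Open WebUI's documented pip install
  open-webui serve &       # web UI; detects Ollama on localhost:11434 by default
fi
```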
Due Diligence