Ollama

References

Ollama.com

Installation

curl -fsSL https://ollama.com/install.sh | sh

Useful Commands

sudo usermod -aG ollama $USER   # add the current user to the ollama group
ollama pull llama3
ollama pull llama2-uncensored
ollama pull codegemma
ollama pull gemma
ollama pull dolphin-mistral
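Once a model is pulled, the running service can be queried over HTTP. A minimal sketch, assuming the server is up on the default port 11434 and llama3 has been pulled; the curl command is printed (dry run) rather than executed so it can be inspected first:

```shell
# Build a request against Ollama's /api/generate endpoint (default port 11434).
# Dry run: the curl command is echoed, not executed.
payload='{"model":"llama3","prompt":"Say hello","stream":false}'
echo "curl -s http://127.0.0.1:11434/api/generate -d '$payload'"
```

With `"stream":false` the server returns a single JSON object instead of a stream of chunks.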

Service Configuration

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_HOST=0.0.0.0"

[Install]
WantedBy=default.target
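With the unit saved (typically as /etc/systemd/system/ollama.service), systemd needs a reload before the service can be enabled. A sketch of the usual sequence, printed as a dry run since the real commands need root:

```shell
# Dry run of the systemd workflow; remove the echo (and run with sudo) to apply.
for cmd in "systemctl daemon-reload" "systemctl enable --now ollama" "systemctl status ollama"; do
  echo "sudo $cmd"
done
```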

Useful Plugins

  • Obsidian: Local GPT
  • Open WebUI

Misc Information

  • The service runs on port 11434 by default
  • By default the service only listens on localhost (127.0.0.1); setting OLLAMA_HOST=0.0.0.0, as in the unit above, exposes it on all interfaces
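The same OLLAMA_HOST variable also steers the client side: when the service listens on 0.0.0.0, a remote machine can point the ollama CLI at it. A sketch, with a placeholder address (not a real host):

```shell
# OLLAMA_HOST is read by the client too; the address below is a placeholder.
export OLLAMA_HOST="192.168.1.50:11434"
echo "ollama list would now talk to $OLLAMA_HOST"
```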