Deploy Ollama LLM server on ringtail #277
Summary
- `models.txt` + sidecar sync script (mirrors the kiwix torrent pattern)
- Models: `qwen2.5:14b`, `deepseek-r1:14b`, `phi4:14b`, `gemma3:12b`
- `/mnt/storage1/ollama` for fast local model storage (200Gi)
- `ollama.ops.eblu.me` for API access from the tailnet
- Time-slicing (`replicas: 2`) on nvidia-device-plugin so Frigate and Ollama share the RTX 4080

Deployment and Testing
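For context, the `replicas: 2` time-slicing lives in the nvidia-device-plugin config, not in the Ollama manifests. A sketch of what that config section could look like (file layout and names here are assumptions, not taken from this PR):

```yaml
# Hypothetical nvidia-device-plugin config: advertise the single RTX 4080
# as nvidia.com/gpu: 2 so Frigate and Ollama can each schedule one GPU.
version: v1
sharing:
  timeSlicing:
    resources:
      - name: nvidia.com/gpu
        replicas: 2
```

Note that time-slicing only multiplies the advertised resource count; both pods still share the same physical GPU and its VRAM with no isolation between them.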
1. `argocd app sync nvidia-device-plugin`, then confirm `kubectl describe node ringtail --context=k3s-ringtail` shows `nvidia.com/gpu: 2`
2. Sync the `apps` app with `--revision feature/ollama-ringtail`, then `argocd app set ollama --revision feature/ollama-ringtail && argocd app sync ollama`
3. Watch the model download: `kubectl logs -n ollama deploy/ollama -c model-sync --context=k3s-ringtail`
4. Verify the API: `curl https://ollama.ops.eblu.me/api/tags`
5. Smoke-test generation: `curl https://ollama.ops.eblu.me/api/generate -d '{"model":"qwen2.5:14b","prompt":"Hello"}'`
6. After merge: `argocd app set ollama --revision main && argocd app sync ollama`

```diff
@@ -0,0 +81,4 @@
+      - name: sync-script
+        configMap:
+          name: ollama-sync-script
+          defaultMode: 493
```

This... can't be right. There must be a way to use octal or `u=,g=,o=` or something. Try a string?
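On the `defaultMode` question: the Kubernetes API stores this field as a plain decimal integer, and 493 is exactly octal `0755` (`rwxr-xr-x`), so the value is correct even though it looks odd. Many YAML parsers in this stack also accept an octal literal (`defaultMode: 0755`), which is worth trying for readability; a quoted string likely won't pass validation, since the field is an integer. The conversion itself:

```shell
# defaultMode: 493 is just 0755 (rwxr-xr-x) written in decimal
printf '%o\n' 493    # decimal -> octal: prints 755
echo $(( 8#755 ))    # octal -> decimal (bash base-N literal): prints 493
```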