From 5f1d03349bf4bd4c69843451c1b2ce34c4d7667a Mon Sep 17 00:00:00 2001
From: ducoterra
Date: Mon, 25 Nov 2024 10:54:00 -0500
Subject: [PATCH] add TOC to ollama

---
 podman/graduated/ollama/README.md | 20 ++++++++++++++++++++
 1 file changed, 20 insertions(+)

diff --git a/podman/graduated/ollama/README.md b/podman/graduated/ollama/README.md
index 3fecc5a..d77eb18 100644
--- a/podman/graduated/ollama/README.md
+++ b/podman/graduated/ollama/README.md
@@ -1,5 +1,16 @@
 # Ollama
 
+- [Ollama](#ollama)
+  - [Run natively with GPU support](#run-natively-with-gpu-support)
+  - [Unsticking models stuck in "Stopping"](#unsticking-models-stuck-in-stopping)
+  - [Run Anything LLM Interface](#run-anything-llm-interface)
+    - [Anything LLM Quadlet with Podlet](#anything-llm-quadlet-with-podlet)
+    - [Now with Nginx and Certbot](#now-with-nginx-and-certbot)
+  - [Custom Models](#custom-models)
+    - [From Existing Model](#from-existing-model)
+    - [From Scratch](#from-scratch)
+    - [Converting to gguf](#converting-to-gguf)
+
 ## Run natively with GPU support
 
@@ -43,6 +54,13 @@ ollama pull nomic-embed-text:137m-v1.5-fp16
 
 Note your ollama instance will be available to podman containers via `http://host.containers.internal:11434`
 
+## Unsticking models stuck in "Stopping"
+
+```bash
+ollama ps | grep -i stopping
+pgrep ollama | xargs -I '%' sh -c 'kill %'
+```
+
 ## Run Anything LLM Interface
 
@@ -267,6 +285,8 @@ will be automatically renewed daily.
 
+
+
 ### From Existing Model
 
 ```bash