add TOC to ollama
@@ -1,5 +1,16 @@
# Ollama

- [Ollama](#ollama)
  - [Run natively with GPU support](#run-natively-with-gpu-support)
  - [Unsticking models stuck in "Stopping"](#unsticking-models-stuck-in-stopping)
  - [Run Anything LLM Interface](#run-anything-llm-interface)
    - [Anything LLM Quadlet with Podlet](#anything-llm-quadlet-with-podlet)
    - [Now with Nginx and Certbot](#now-with-nginx-and-certbot)
  - [Custom Models](#custom-models)
    - [From Existing Model](#from-existing-model)
    - [From Scratch](#from-scratch)
    - [Converting to gguf](#converting-to-gguf)

<https://github.com/ollama/ollama>

## Run natively with GPU support
@@ -43,6 +54,13 @@ ollama pull nomic-embed-text:137m-v1.5-fp16
Note: your Ollama instance will be reachable from Podman containers at `http://host.containers.internal:11434`.
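To confirm the host instance is reachable from inside a container, you can query Ollama's model-listing endpoint (a minimal sketch; `/api/tags` is the standard Ollama API route for listing local models):

```shell
# Run from inside a podman container: list the models the host's
# Ollama instance has pulled, via podman's host gateway alias.
curl http://host.containers.internal:11434/api/tags
```

If this returns JSON with a `models` array, the container-to-host route is working.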
## Unsticking models stuck in "Stopping"
```bash
# Check whether any models are stuck in the "Stopping" state
ollama ps | grep -i stopping
# Kill every running ollama process to clear them
pgrep ollama | xargs -I '%' sh -c 'kill %'
```
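The two steps above can be wrapped in a small helper (the function name is an assumption, not from the original; `xargs -r` is a GNU extension that skips `kill` when `pgrep` finds nothing):

```shell
# Hypothetical helper: report stuck models, then clear all ollama processes.
unstick_ollama() {
  ollama ps | grep -i stopping
  pgrep ollama | xargs -r kill
}
```

`kill` sends SIGTERM by default, so this gives processes a chance to exit cleanly before you resort to anything stronger.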
## Run Anything LLM Interface
```bash
@@ -267,6 +285,8 @@ will be automatically renewed daily.
<https://www.gpu-mart.com/blog/import-models-from-huggingface-to-ollama>
<https://www.hostinger.com/tutorials/ollama-cli-tutorial#Setting_up_Ollama_in_the_CLI>
### From Existing Model
```bash