SQLite
---------
Install SQLite
$ wget https://www.sqlite.org/src/tarball/sqlite.tar.gz
$ tar zxvf sqlite.tar.gz
$ cd sqlite
$ ./configure --prefix=/usr/local
$ make
$ sudo make install
$ /usr/local/bin/sqlite3 --version
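SQLite is rebuilt here because Chroma (installed in the next section) requires sqlite3 >= 3.35.0. A quick sketch to check which SQLite version the Python interpreter actually links:

```python
# Check whether Python's sqlite3 module links a SQLite library
# new enough for Chroma (>= 3.35.0).
import sqlite3

def meets_chroma_requirement(version_info=None):
    """Return True when the linked SQLite is at least 3.35.0."""
    if version_info is None:
        version_info = sqlite3.sqlite_version_info
    return version_info >= (3, 35, 0)

print(sqlite3.sqlite_version, meets_chroma_requirement())
```

Note that `sqlite3.sqlite_version` reports the library Python was compiled against, which may differ from the newly built `/usr/local/bin/sqlite3`.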
Chroma
----------
Install Chroma
$ python -m pip install chromadb
In .local/lib/python3.12/site-packages/chromadb/__init__.py, change the condition on line 74 to:
if IN_COLAB or not is_client:
This fixes the RuntimeError: Chroma requires sqlite3 >= 3.35.0 raised when open-webui starts.
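The edited branch makes Chroma substitute a newer SQLite module for the standard-library one. Roughly, the substitution looks like this (a sketch, not the exact chromadb code; the `pysqlite3` module comes from the `pysqlite3-binary` package):

```python
# Sketch of the sqlite3 substitution Chroma performs when the
# system library is too old (approximate, not the exact chromadb code).
import sys
import sqlite3

def sqlite_too_old(version_info=sqlite3.sqlite_version_info):
    # Chroma refuses to start below 3.35.0.
    return version_info < (3, 35, 0)

def ensure_modern_sqlite():
    """Swap in pysqlite3 so `import sqlite3` resolves to a newer library."""
    if not sqlite_too_old():
        return
    __import__("pysqlite3")  # provided by the pysqlite3-binary package
    sys.modules["sqlite3"] = sys.modules.pop("pysqlite3")
```

The swap must run before chromadb imports sqlite3, which is why it lives at the top of `__init__.py`.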
ffmpeg
-----------
Unzip ffmpeg into .local/bin
$ wget https://github.com/ffbinaries/ffbinaries-prebuilt/releases/download/v6.1/ffmpeg-6.1-linux-64.zip
$ unzip ffmpeg-6.1-linux-64.zip -d .local/bin
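The extracted binary is only useful if it can be found at run time. A small sketch to check that ffmpeg is discoverable, assuming the zip was extracted to ~/.local/bin as above:

```python
# Verify that the ffmpeg binary extracted above is discoverable.
import os
import shutil

def find_ffmpeg(extra_dir=os.path.expanduser("~/.local/bin")):
    """Look up ffmpeg on PATH, also considering ~/.local/bin."""
    search = os.environ.get("PATH", "") + os.pathsep + extra_dir
    return shutil.which("ffmpeg", path=search)

print(find_ffmpeg())  # full path to the binary, or None if not found
```

If this prints None even after extraction, make sure the file is executable (`chmod +x .local/bin/ffmpeg`).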
Ollama
-----------
Download and Install Ollama
$ curl -fsSL https://ollama.com/install.sh | sh
The Ollama API is now available at http://127.0.0.1:11434
Check the service status, and start it if it is not already running:
$ sudo systemctl status ollama
$ sudo systemctl start ollama
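A small sketch to confirm the API is reachable, using Ollama's GET /api/version endpoint:

```python
# Probe the local Ollama server via its GET /api/version endpoint.
import json
import urllib.error
import urllib.request

def ollama_version(base_url="http://127.0.0.1:11434"):
    """Return the server's version string, or None if unreachable."""
    try:
        with urllib.request.urlopen(base_url + "/api/version", timeout=3) as resp:
            return json.load(resp).get("version")
    except (urllib.error.URLError, OSError):
        return None

print(ollama_version())  # a version string, or None when the service is down
```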
Open WebUI
-----------------
Install Open WebUI
$ python -m pip install open-webui
Launch the Server
$ open-webui serve &
To open the Web UI, go to http://127.0.0.1:8080
Upgrade locally installed packages
$ python -m pip install -U open-webui
Visit https://openwebui.com/u/haervwe to access the collection of tools.
Locate the desired tool or function on the hub page.
Click the "Get" button.
Enter the Open WebUI URL: http://127.0.0.1:8080
Click "Import to WebUI".
Save the tool.
Admin Panel > Settings > Connections > Manage Ollama API Connections > click the Configure icon.
Edit the Connection URL: http://localhost:11434
Save.
Click the Manage icon.
Browse the model names available for download on Ollama.com.
Filter the models as needed.
Copy a model name and paste it to pull the model from Ollama.com, for example:
deepseek-r1:1.5b
llama3.2:1b
gemma2:2b
Click on Models to see the downloaded models.
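The same list is also available programmatically from Ollama's GET /api/tags endpoint; a sketch:

```python
# List locally downloaded models via Ollama's GET /api/tags endpoint.
import json
import urllib.error
import urllib.request

def list_models(base_url="http://127.0.0.1:11434"):
    """Return names of locally available models ([] if the server is down)."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []

print(list_models())  # e.g. a list such as ['llama3.2:1b', 'gemma2:2b']
```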
Stop the systemd service so Ollama can be run manually:
$ sudo systemctl stop ollama
Interactive Chat Commands
-------------------------
$ export OLLAMA_FLASH_ATTENTION=1
Set OLLAMA_KV_CACHE_TYPE to one of f16, q8_0, or q4_0, e.g.:
$ export OLLAMA_KV_CACHE_TYPE=q8_0
$ ollama serve
$ ollama pull llama3.2:1b
$ ollama run llama3.2:1b
$ ollama stop llama3.2:1b
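Beyond the interactive CLI, a pulled model can also be queried over HTTP through the POST /api/generate endpoint; a minimal non-streaming sketch:

```python
# Ask a pulled model a question through Ollama's POST /api/generate.
import json
import urllib.error
import urllib.request

def generate(prompt, model="llama3.2:1b", base_url="http://127.0.0.1:11434"):
    """Return the model's response text, or None if the server is unreachable."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    req = urllib.request.Request(
        base_url + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            return json.load(resp).get("response")
    except (urllib.error.URLError, OSError):
        return None

print(generate("Why is the sky blue?"))
```

With "stream": False the server returns one JSON object; omit it to receive a stream of newline-delimited JSON chunks instead.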
Download a GGUF model from Hugging Face:
$ ollama pull hf.co/city96/HunyuanVideo-gguf:latest