This repository provides a set of simple shell scripts to install, manage, and run Ollama and OpenWebUI on a Linux system with systemd.
These scripts provide a user-friendly way to:

- 📦 Install/update Ollama
- 🌐 Configure network access (localhost vs. network)
- ▶️ ⏸️ Manage the OpenWebUI service (start/stop/update)
- 🛠️ Check system status and troubleshoot issues

The repository is organized as follows:
```
├── shared.sh               # 🛠️ Common utility functions, colors, and error handling
├── ollama-helpers.sh       # 🤖 Helper functions specific to Ollama (API, service checks)
├── install-ollama.sh       # 📦 Installs or updates Ollama with version checking
├── config-ollama.sh        # ⚙️ Unified script to configure network and advanced settings
├── restart-ollama.sh       # 🔄 Restarts Ollama service after system wake/sleep issues
├── manage-models.sh        # ✏️ Interactively pull, delete, and manage models
├── run-model.sh            # ▶️ Interactively select and run a local model
├── stop-ollama.sh          # 🛑 Stops the Ollama service cleanly
├── logs-ollama.sh          # 📜 View Ollama service logs via journalctl
├── check-status.sh         # 📊 Checks status of services and lists installed models
├── test-all.sh             # 🧪 Runs all script self-tests
├── diagnose.sh             # 🩺 Generates diagnostic report for troubleshooting
└── openwebui/              # 🌐 OpenWebUI management scripts and configuration files
    ├── start-openwebui.sh  # ⚡ Starts the OpenWebUI service
    ├── stop-openwebui.sh   # 🛑 Stops the OpenWebUI service
    ├── update-openwebui.sh # ⬆️ Updates OpenWebUI container images
    └── docker-compose.yaml # 📝 Docker Compose configuration for OpenWebUI
```
- 📡 `curl`: Required by the Ollama installer.
- 🐳 Docker: Required for running OpenWebUI. See the Docker installation docs for instructions.
- 🧩 Docker Compose: Required for running the OpenWebUI containers.
- 🧰 systemd: The Ollama management scripts (`stop-ollama.sh`, `restart-ollama.sh`) are designed for Linux systems using systemd. They will not work on systems without it (e.g., macOS or WSL without systemd enabled).
- 📦 `jq`: Required for parsing JSON model data in `manage-models.sh` and `check-status.sh`.
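To verify these prerequisites before running any of the scripts, a quick check like the following can help. This is an illustrative snippet, not part of the repository:

```shell
#!/bin/sh
# Illustrative prerequisite check: confirm each required tool is on PATH
# and that systemd is the running init system.
for tool in curl docker jq; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "missing: $tool"
  fi
done
# On systemd-managed systems, /run/systemd/system exists while systemd is PID 1.
if [ -d /run/systemd/system ]; then
  echo "ok: systemd"
else
  echo "missing: systemd (the Ollama service scripts will not work)"
fi
```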
> **Note**
> Most of these scripts require `sudo` to manage system services or write to protected directories. They will automatically prompt for a password if needed.

> **Warning**
> The Ollama management scripts will not work on systems without systemd (e.g., standard macOS or WSL distributions).
To get up and running quickly:

> **Important**
> During installation, the script will prompt you to configure Ollama for network access. This is required for OpenWebUI (in Docker) to connect to it. Please choose Yes (y).

> **Important**
> If a firewall blocks Ollama's port, allow it, e.g. `sudo ufw allow 11434` on Ubuntu.

```shell
./install-ollama.sh
./openwebui/start-openwebui.sh
```

After it starts, open the link provided (usually http://localhost:3000) and follow the on-screen instructions to create an account.
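Once the installer finishes, you can confirm the Ollama server is reachable by querying its HTTP API directly (assuming the default port 11434):

```shell
# Ask the local Ollama server for its version; a healthy install
# returns a small JSON object such as {"version":"0.10.1"}.
curl -s http://localhost:11434/api/version
```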
| Script | Description |
|---|---|
| `./diagnose.sh` | 🩺 Generates a diagnostic report of the system, services, and configurations to help with troubleshooting. |
| `./check-status.sh` | 📊 Checks the status of Ollama and OpenWebUI. Can also list installed models (`--models`), watch currently loaded models in real time (`--watch`), or run self-tests (`--test`). |
| Script | Description |
|---|---|
| `./install-ollama.sh` | 📦 Installs or updates Ollama. Can also be run with `--version` to check for updates without installing. |
| `./run-model.sh` | ▶️ Interactively select and run a local model. |
| `./manage-models.sh` | ✏️ An interactive script to list, pull, update, and delete local Ollama models. |
| `./logs-ollama.sh` | 📜 A convenient wrapper to view the Ollama service logs using `journalctl`. |
| `./restart-ollama.sh` | 🔄 Sometimes on wake from sleep, the `ollama` service will go into an inconsistent state. This script stops, resets GPU state (if applicable), and restarts the Ollama service using systemd. |
| `./stop-ollama.sh` | 🛑 Stops the Ollama service. |
| `./config-ollama.sh` | ⚙️ A unified, interactive script to configure network access, KV cache, models directory, and other advanced Ollama settings. Can also be run non-interactively with flags. |
| Script | Description |
|---|---|
| `./openwebui/start-openwebui.sh` | ⚡ Starts the OpenWebUI Docker containers in detached mode. |
| `./openwebui/stop-openwebui.sh` | 🛑 Stops the OpenWebUI Docker containers. |
| `./openwebui/update-openwebui.sh` | ⬆️ Pulls the latest Docker images for OpenWebUI and prompts for a restart if an update is found. |
The `install-ollama.sh` script handles both initial installation and updates.

```shell
./install-ollama.sh
```

The script will:

- 🧪 Check prerequisites
- 📦 Install Ollama (if not already installed)
- 🔄 Check the current version and update if a newer one is available
- 🔍 Verify that the `ollama` service is running
- 🌐 Prompt you to configure Ollama for network access. This is required for OpenWebUI (in Docker) to connect to it. Please say Yes (y) when asked.
| Flag | Alias | Description |
|---|---|---|
| `--version` | `-v` | Displays the currently installed version and checks for the latest available version on GitHub without running the full installer. |
| `--test` | `-t` | Runs internal self-tests for script functions. |
| `--help` | `-h` | Shows the help message. |
Example:

```shell
$ ./install-ollama.sh --version
-------------------------------------------------------------------------------
 Ollama Version
-------------------------------------------------------------------------------
[i] Installed: 0.9.6
[i] Latest:    0.10.1
```
To quickly run any of your installed models from the command line, use the `run-model.sh` script.

```shell
./run-model.sh
```

This will show a list of your local models. Choose one to start a chat session directly in your terminal.
This script provides a user-friendly interactive menu to manage your local Ollama models. You can list, pull (add), and delete models without needing to remember the specific `ollama` commands.

Run the script without any arguments to launch the interactive menu.

```shell
./manage-models.sh
```

The menu allows you to perform actions with single keypresses (`R` for Refresh, `P` for Pull, etc.), making model management easier.

The script can also be used non-interactively with flags, making it suitable for scripting and automation.
| Flag | Alias | Description |
|---|---|---|
| `--list` | `-l` | Lists all installed models. |
| `--pull <model>` | `-p <model>` | Pulls a new model from the registry. |
| `--update <model>` | `-u <model>` | Updates a specific local model. |
| `--update-all` | `-ua` | Updates all existing local models. |
| `--delete <model>` | `-d <model>` | Deletes a local model. |
| `--help` | `-h` | Shows the help message. |
| `--test` | `-t` | Runs internal self-tests for script functions. |
Examples:

```shell
# Update the 'llama3' model
./manage-models.sh --update llama3

# Update all local models
./manage-models.sh --update-all

# Pull the 'llama3.1' model
./manage-models.sh --pull llama3.1

# Delete the 'gemma' model
./manage-models.sh --delete gemma
```
By default, Ollama only listens for requests from the local machine (`localhost`). For OpenWebUI (running in a Docker container) to connect to Ollama, the Ollama service must be configured to listen on all network interfaces (`0.0.0.0`).

The `install-ollama.sh` script will detect if this is needed and prompt you to apply this configuration automatically. This is the recommended way to set it up.
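For reference, exposing Ollama on all interfaces is typically done by setting `OLLAMA_HOST` through a systemd drop-in. The scripts may implement this differently; the sketch below only generates the drop-in content and shows the install commands in comments:

```shell
#!/bin/sh
# Illustrative: the drop-in file that makes the Ollama service listen on
# all interfaces. It would be installed roughly like this:
#   sudo mkdir -p /etc/systemd/system/ollama.service.d
#   sudo cp override.conf /etc/systemd/system/ollama.service.d/
#   sudo systemctl daemon-reload && sudo systemctl restart ollama
cat > override.conf <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
EOF
cat override.conf
```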
If you need to change any setting after the initial installation, you can use the unified configuration script. It provides an interactive menu to manage all settings in one place.

```shell
./config-ollama.sh
```

It can also be run non-interactively with flags like `--expose`, `--restrict`, `--kv-cache`, and `--models-dir`. Run `./config-ollama.sh --help` for a full list of options.

The unified `./config-ollama.sh` script handles all advanced settings. Run it without flags for an interactive menu, or use flags like `--kv-cache q8_0` for non-interactive changes.
If Ollama becomes unresponsive or encounters issues, you can use the restart script.

```shell
./restart-ollama.sh
```

This script will:

- 🛑 Stop the Ollama service.
- 🧼 Reset NVIDIA UVM (if a GPU is detected) to clear potential hardware state issues.
- 🔄 Restart the Ollama service using systemd.
- 📊 Check the status of the service after restarting.
| Flag | Alias | Description |
|---|---|---|
| `--help` | `-h` | Shows the help message. |
To stop the Ollama service:

```shell
./stop-ollama.sh
```

| Flag | Alias | Description |
|---|---|---|
| `--help` | `-h` | Shows the help message. |
> **Note**
> The `restart-ollama.sh` and `stop-ollama.sh` scripts require `sudo` to manage the systemd service. They will automatically attempt to re-run themselves with `sudo` if needed.
To view and search the logs for the `ollama` service, you can use the `logs-ollama.sh` script. This is particularly useful for debugging.

This script is a simple wrapper that passes arguments directly to `journalctl -u ollama.service`, and therefore requires a systemd-based system.

- 📜 View all logs (searchable):

  ```shell
  ./logs-ollama.sh
  ```

- 📜 Show logs since a specific time:

  ```shell
  ./logs-ollama.sh --since "1 hour ago"
  ```

Press `Ctrl+C` to stop following the logs.
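The pass-through core of such a wrapper can be sketched in a few lines. This is an assumption about the implementation, not the repository's actual script:

```shell
#!/bin/sh
# Sketch of a journalctl pass-through wrapper for the ollama unit.
# All arguments (e.g. -f, --since "1 hour ago") are forwarded unchanged.
logs_ollama() {
  journalctl -u ollama.service "$@"
}
# Usage: logs_ollama --since "1 hour ago"
```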
| Flag | Alias | Description |
|---|---|---|
| `[journalctl_options]` | | Any valid options for `journalctl` (e.g., `-f`, `-n 100`). |
| `--help` | `-h` | Shows the help message. |
| `--test` | `-t` | Runs a self-test to check for dependencies. |
This script uses `docker-compose` to run the OpenWebUI container in detached mode and prints a link (usually http://localhost:3000) to the UI once it's ready.

```shell
./openwebui/start-openwebui.sh
```
| Flag | Alias | Description |
|---|---|---|
| `--help` | `-h` | Shows the help message. |
On your first visit, OpenWebUI will prompt you to create an administrator account. The connection to your local Ollama instance is configured automatically.

This works because the included Docker Compose file tells OpenWebUI to connect to `http://host.docker.internal:11434`, and the `install-ollama.sh` script helps configure the host's Ollama service to accept these connections.
To stop the OpenWebUI containers:

```shell
./openwebui/stop-openwebui.sh
```

| Flag | Alias | Description |
|---|---|---|
| `--help` | `-h` | Shows the help message. |
To update OpenWebUI to the latest version, you can pull the newest container images.

```shell
./openwebui/update-openwebui.sh
```

This script runs `docker compose pull` to download the latest images. It should detect if new images were pulled and offer to restart the containers automatically.

If you prefer to start the containers again manually, re-run the start script:

```shell
./openwebui/start-openwebui.sh
```
| Flag | Alias | Description |
|---|---|---|
| `--test` | `-t` | Runs internal self-tests for script functions. |
| `--help` | `-h` | Shows the help message. |
To customize ports, create a `.env` file in the root of the project directory. The scripts will automatically load it.

Example `.env` file:

```shell
# Sets the host port for the OpenWebUI interface.
# The default is 3000.
OPEN_WEBUI_PORT=8080

# Sets the port your Ollama instance is running on.
# This is used to connect OpenWebUI to Ollama.
# The default is 11434.
OLLAMA_PORT=11434
```

The Docker Compose file (`docker-compose.yaml`) is pre-configured to use these environment variables.
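Loading such a file from a POSIX shell is commonly done by exporting everything it defines. A sketch of the mechanism (the scripts' actual loader may differ; `demo.env` is a throwaway file for illustration):

```shell
#!/bin/sh
# Write a sample env file, then source it with set -a so every assignment
# is exported into the environment of this shell and its children.
printf 'OPEN_WEBUI_PORT=8080\nOLLAMA_PORT=11434\n' > demo.env
set -a
. ./demo.env
set +a
echo "OpenWebUI port: ${OPEN_WEBUI_PORT:-3000}"   # prints 8080, taken from demo.env
```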
If you encounter issues and need to ask for help, the `diagnose.sh` script is the best way to gather all the relevant information. It collects details about your system, dependencies, service status, logs, and configurations into a single report.

```shell
./diagnose.sh
```

This will print a detailed report to your terminal. For sharing, it's best to save it to a file:

```shell
./diagnose.sh -o report.txt
```

You can then share the contents of `report.txt` when creating a bug report or asking for help.

| Flag | Alias | Description |
|---|---|---|
| `--output <file>` | `-o <file>` | Saves the report to the specified file instead of printing it to the terminal. |
| `--help` | `-h` | Shows the help message. |
- Container Not Starting:
  - Check Docker logs for errors. In the `openwebui/` directory, run: `docker compose logs --tail 50`
  - Ensure Docker has sufficient resources (CPU, memory).
- OpenWebUI Can't Access Ollama Models:
  - This usually means the Ollama service on your host machine is not accessible from inside the Docker container.
  - Solution: Run the configuration script (`./config-ollama.sh`) and choose to expose Ollama to the network. It can configure Ollama to listen on all network interfaces, which is required for Docker to connect.
  - Alternatively, re-running the installer (`./install-ollama.sh`) will also detect this and prompt you to fix it.
  - Also check: Ensure your firewall is not blocking traffic on port `11434` (or your custom `OLLAMA_PORT`). For example, on Ubuntu, you might run `sudo ufw allow 11434`.
- DEBUG OLLAMA:
  - To easily view the `ollama` service logs in real time, use the dedicated script: `./logs-ollama.sh`
  - This is helpful for debugging issues with the service itself. See the script's documentation above for more options, like viewing a specific number of past lines.
- DEBUG OPENWEBUI:
  - To follow the logs generated by Docker Compose in real time, use the `-f` flag: `docker compose logs -f`. This can be helpful for debugging issues.
I'm open to and encourage contributions of bug fixes, improvements, and documentation!
The project includes a testing utility to ensure script quality and prevent regressions. Several scripts contain internal self-tests that can be run with a `--test` or `-t` flag.

To run all available tests across the project, use the `test-all.sh` script:

```shell
./test-all.sh
```

This script will automatically discover and execute the self-tests for all testable scripts in the repository and provide a summary of the results. This is a great way to verify your changes before submitting a contribution.

This project is licensed under the MIT License. See the `LICENSE` file for details.
Let me know if you have any questions. I can be reached at @IAmDanielV or @iamdanielv.bsky.social.