Pre-work

These are the required applications and general installation notes for this workshop.

Required Software and Models

  • uv - Provides Python, packages, and virtual environments.
  • Ollama - Allows you to locally host LLM models on your computer.
  • Models - Pull models to run with Ollama.
  • Open WebUI - A UI that works with Ollama models.

Install uv

We will be using uv as our Python package and environment manager. If you're unfamiliar with uv, refer to the uv installation guide. uv is a fast, modern alternative to pip and virtualenv, and is fully compatible with both.

macOS/Linux

curl -LsSf https://astral.sh/uv/install.sh | sh

Windows

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
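Either installer should put uv on your PATH. To confirm the install worked, open a new terminal and check the version:

uv --version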

Install Ollama

Most users can simply download from the Ollama website.
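After installing, you can confirm the ollama CLI is available from your terminal:

ollama --version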

Pull models with Ollama

Please pull the models used in the workshop before you arrive!

ollama pull ibm/granite4:micro-h
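The pull may take a few minutes depending on your connection. You can confirm the model is available locally with:

ollama list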

Chat with the model

For a quick test, you can use the ollama CLI to ask the model a question.

ollama run ibm/granite4:micro-h "what model am I chatting with and who created you?"
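You can also start an interactive chat session by running the model without a prompt; type /bye to exit the session:

ollama run ibm/granite4:micro-h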

Install Open WebUI

Once uv is installed, use uvx to run Open WebUI with Python 3.11 (recommended for Open WebUI).

macOS/Linux

DATA_DIR=~/.open-webui uvx --python 3.11 open-webui@latest serve

Windows

$env:DATA_DIR="C:\open-webui\data"; uvx --python 3.11 open-webui@latest serve

Once the download and install finish and the server starts, you will see the Open WebUI "Get started" page at http://localhost:8080/. You can then stop the server with Control-C back in the terminal. We'll do the rest of the setup during the workshop.
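Open WebUI talks to Ollama through its local API, which listens on port 11434 by default. If you want to confirm Ollama is running and can see the model you pulled (assuming the default port), you can query the API directly:

curl http://localhost:11434/api/tags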