Prework
These are the required applications and general installation notes for this workshop.
Required Software and Models¶
- uv - Provides Python, packages, and virtual environments.
- Ollama - Allows you to locally host LLM models on your computer.
- Models - Pull models to run with Ollama.
- Workshop Code - Clone the workshop repository with Git.
Install uv¶
We will be using uv as the Python package and environment manager. If you’re unfamiliar with uv, refer to the uv installation guide. uv is a fast, modern alternative to pip and virtualenv, and is fully compatible with both.
macOS/Linux¶
curl -LsSf https://astral.sh/uv/install.sh | sh
Windows¶
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
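After installing on either platform, you can confirm uv is available. A quick check, with a fallback message in case your shell has not picked up the new PATH yet:

```shell
# Print the installed uv version; if it is not found, restart your shell
# so the installer's PATH changes take effect.
command -v uv >/dev/null 2>&1 && uv --version || echo "uv not found - restart your shell or check your PATH"
```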
Install Ollama¶
Most users can simply download the installer from the Ollama website.
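Once installed, you can confirm the `ollama` CLI is on your PATH (the exact version number will vary):

```shell
# Print the installed Ollama version; a failure here usually means the app
# is not installed yet or is not on your PATH.
command -v ollama >/dev/null 2>&1 && ollama --version || echo "ollama not found - install it from the Ollama website"
```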
Pull models with Ollama¶
Please pull the models used in the workshop before you arrive!
ollama pull ibm/granite4:micro-h
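To confirm the download completed, list your local models; the model pulled above should appear in the output (guarded here in case the Ollama server is not running yet):

```shell
# List locally available models; the granite model pulled above should be listed.
ollama list 2>/dev/null || echo "Could not reach Ollama - make sure the app/server is running"
```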
Chat with the model¶
For a quick test, you can use the ollama CLI to ask the model a question.
ollama run ibm/granite4:micro-h "what model am I chatting with and who created you?"
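Ollama also exposes a local REST API (port 11434 by default), so you can exercise the same model with curl if you prefer. The prompt below is just an example; `"stream": false` returns a single JSON object instead of a token stream:

```shell
# Ask the model a question via Ollama's /api/generate endpoint.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "ibm/granite4:micro-h", "prompt": "Who created you?", "stream": false}' \
  || echo "Could not reach Ollama - is it running?"
```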
Get the workshop code¶
Option A: Clone with Git (recommended):
git clone https://github.com/IBM/beeai-workshop.git
Option B: Download ZIP:
If you're not comfortable with Git, download the ZIP file and extract it to your desired location.
Then change into the workshop directory:
cd beeai-workshop/opentech
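From there, you can let uv create the virtual environment and install dependencies. This sketch assumes the workshop directory ships a uv project file (`pyproject.toml`); if the workshop instructions say otherwise, follow those instead:

```shell
# Create .venv and install the project's dependencies from pyproject.toml / uv.lock.
[ -f pyproject.toml ] && uv sync || echo "No pyproject.toml here - check that you are in the workshop directory"
```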