Prework
These are the required applications and general installation notes for this lab.
Required Software and Models
- Ollama - Lets you host large language models (LLMs) locally on your computer.
- Models - Pull models to run with Ollama.
- uv - Provides Python, packages, and virtual environments.
Install Ollama
Most users can simply download from the Ollama website.
Pull models with Ollama
Please pull the models used in the workshop before you arrive!
ollama pull ibm/granite4:micro-h
Chat with the model
For a quick test, you can use the ollama CLI to ask the model a question.
ollama run ibm/granite4:micro-h "what model am I chatting with and who created you?"
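If you prefer to test from Python rather than the CLI, you can call Ollama's local REST API directly. The sketch below assumes a default Ollama install listening on port 11434; the `build_payload` and `ask` helper names are our own, not part of any library.

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumption: default install, default port)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks Ollama for one complete JSON reply instead of streamed chunks
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the locally running Ollama server and return its reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the Ollama server running and the model already pulled):
# print(ask("ibm/granite4:micro-h", "What model am I chatting with?"))
```

This is only a convenience check; the `ollama run` command above is all the workshop requires.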
Install uv
We will be using uv as the Python package and environment manager. If you're unfamiliar with uv, refer to the uv installation guide. uv is a fast, modern alternative to pip and virtualenv, and is fully compatible with both.
macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
Install project Python dependencies
Preinstall all the packages used in the demo into the correct virtual environment using uv.
- Navigate to the specific demo folder for this workshop:
cd beeai-workshop/opentech
- Install all required Python dependencies for the project:
uv sync --directory beeaiframework
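After syncing, you can sanity-check that the environment resolves. The snippet below is a generic module probe using only the standard library; the module name `beeai_framework` is our assumption for how the BeeAI Framework package imports, so adjust it if the workshop materials say otherwise.

```python
import importlib.util

def has_module(name: str) -> bool:
    # Returns True if the module can be found on the current Python path
    return importlib.util.find_spec(name) is not None

# Run inside the synced environment, e.g.: uv run --directory beeaiframework python check_env.py
# (assumption: the BeeAI Framework package imports as `beeai_framework`)
if has_module("beeai_framework"):
    print("Environment ready")
else:
    print("beeai_framework not found; try re-running `uv sync`")
```

If the check fails, re-run the `uv sync` command above from the demo folder before the workshop starts.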