Lab 0. Workshop Pre-Work
These are the required applications and general installation notes for this workshop.
Ollama and Python are required for this workshop, but you can choose an IDE and GUI interface from the options provided. If you don't know what to select, just go with the recommended options!
Remember, you can always ask the teacher for help if you get stuck on any step!
Required Software
- Python
- Ollama - Lets you host an LLM locally on your computer.
- Visual Studio Code (Recommended) or any Jetbrains IDE. This workshop uses VSCode.
- AnythingLLM (Recommended) or Open WebUI. AnythingLLM is a desktop app while Open WebUI is browser-based.
- Continue - An IDE extension for AI code assistants.
Installing Python
There are multiple ways to install Python; you can follow the official beginner's guide for your operating system.
Using Homebrew (Mac)
Install Homebrew using the following command:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
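Before continuing, you can confirm Homebrew installed correctly by checking its version:
brew --version    # should print something like "Homebrew 4.x"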
Then, install Python via brew:
brew install python@3.11
Please confirm that python --version reports at least 3.11 for the best experience.
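For example, you can verify that the interpreter and pip are available with:
python3 --version        # should report Python 3.11 or newer
python3 -m pip --version # confirms pip is installed alongside it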
Installing Ollama
Most users can simply download Ollama from its website.
Using Homebrew (Mac)
Install Homebrew using the following command:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
Then, install Ollama via brew:
brew install ollama
Note
You can save time by starting the model download used for the lab in the background: run ollama pull granite3.1-dense:8b in its own terminal. You might have to run ollama serve first, depending on how you installed Ollama.
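A minimal sequence (assuming ollama is already on your PATH) might look like this:
ollama serve                    # run in a separate terminal if the server isn't already running
ollama pull granite3.1-dense:8b # download the model used in the labs
ollama list                     # the model should appear here once the download completes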
Installing Visual Studio Code
You can download and install VSCode from their website for your operating system.
Note
You only need one of VSCode or Jetbrains for this lab.
Installing Jetbrains
Download and install the IDE of your choice here.
If you'll be using python (like this workshop does), pick PyCharm.
Installing Continue
Choose your IDE on their website and install the extension.
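If you prefer the command line, VSCode can also install extensions directly. This is a sketch that assumes the Continue extension ID is Continue.continue; double-check the ID on the Marketplace page before running it:
code --install-extension Continue.continue    # installs the Continue extension into VSCode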
Installing AnythingLLM
Download and install it from their website based on your operating system. We'll configure it later in the workshop.
Note
You only need one of AnythingLLM or Open-WebUI for this lab.
Installing Open-WebUI
Assuming you've set up Python above, use the following commands to install Open-WebUI:
cd ~
mkdir openweb-ui                         # create a working directory for Open-WebUI
cd openweb-ui
python3.11 -m venv --upgrade-deps venv   # create an isolated virtual environment
source venv/bin/activate                 # activate it for this terminal session
pip install open-webui                   # install Open-WebUI into the venv
open-webui serve                         # start the server
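Once the server reports that it's running, Open-WebUI should be reachable in your browser, by default on port 8080 (adjust the URL if you configured a different port). A quick check from another terminal:
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080   # a 200 response means the UI is up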
Conclusion
Now that you have all of the tools you need, head over to Lab 1 if you're using AnythingLLM, or Lab 1.5 if you're using Open-WebUI.