
Getting Started

What is agentics?

Agentics is a lightweight, Python-native framework for building structured, agentic workflows over tabular or JSON-based data using Pydantic types and transduction logic. Designed to work seamlessly with large language models (LLMs), Agentics enables users to define input and output schemas as structured types and apply declarative, composable transformations, called transductions, across data collections. Inspired by a low-code design philosophy, Agentics is ideal for rapidly prototyping intelligent systems that require structured reasoning and interpretable outputs over both structured and unstructured data.

Installation

  • Clone the repository
  git clone git@github.com:IBM/agentics.git
  cd agentics
  • Install uv (skip if available)
curl -LsSf https://astral.sh/uv/install.sh | sh

Other installation options are described in the Installation details section below.

  • Install the dependencies
uv sync
# Source the environment (optional; you can skip this and prefix later commands with uv run)
source .venv/bin/activate # bash/zsh 🐚
source .venv/bin/activate.fish # fish 🐟

🎯 Set Environment Variables

Create a .env file in the root directory with your environment variables. See .env.sample for an example.
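To confirm the variables are actually picked up, you can load the .env file from Python and check the keys you plan to use. This is only a sanity-check sketch: it assumes the python-dotenv package is importable in your environment (add it with uv add python-dotenv if it is not), and the variable names shown are just the OpenAI ones described below; adjust them for your provider.

import os

from dotenv import load_dotenv  # assumption: python-dotenv is available

load_dotenv()  # reads .env from the current working directory

# Replace with the variables of the provider you configure below.
for var in ("OPENAI_API_KEY", "OPENAI_MODEL_ID"):
    print(f"{var}: {'set' if os.getenv(var) else 'MISSING'}")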

Set up an LLM provider. Choose one of the following:

OpenAI

  • Obtain API key from OpenAI
  • OPENAI_API_KEY - Your OpenAI API key
  • OPENAI_MODEL_ID - Your favorite model; defaults to openai/gpt-4

Ollama (local)

  • Download and install Ollama
  • Download a model. Use one that supports reasoning and fits on your GPU, so smaller models are preferred.
    ollama pull deepseek-r1:latest

  • OLLAMA_MODEL_ID - ollama/gpt-oss:latest (better quality) or ollama/deepseek-r1:latest (smaller)

IBM WatsonX:

  • WATSONX_APIKEY - WatsonX API key

  • MODEL - watsonx/meta-llama/llama-3-3-70b-instruct (or an alternative model that supports function calling)

Google Gemini (offers a free API key)

  • Obtain a free API key from Google AI Studio

  • Set the Gemini-related variables listed in .env.sample

vLLM (requires a dedicated GPU server):

  • Set up your local vLLM instance
  • VLLM_URL - http://base_url:PORT/v1
  • VLLM_MODEL_ID - Your model id (e.g. "hosted_vllm/meta-llama/Llama-3.3-70B-Instruct" )
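
Whichever provider you choose, you can sanity-check the credentials with a single completion call before running the examples. The sketch below calls LiteLLM directly; this is an assumption on my part (the model-id prefixes above such as openai/, ollama/, watsonx/ and hosted_vllm/ follow LiteLLM naming conventions), not an Agentics API, so treat it purely as an optional smoke test.

import os

from litellm import completion  # assumption: litellm is available in the environment

# Use whichever model id you configured in .env (OpenAI shown here as an example).
model_id = os.getenv("OPENAI_MODEL_ID", "openai/gpt-4")

response = completion(
    model=model_id,
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.choices[0].message.content)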

Test Installation

Test the installation with the example scripts (LLM credentials must be set up first):

python examples/hello_world.py
python examples/self_transduction.py
python examples/agentics_web_search_report.py

Hello World

from typing import Optional
from pydantic import BaseModel, Field

from agentics.core.transducible_functions import Transduce, transducible


class Movie(BaseModel):
    movie_name: Optional[str] = None
    description: Optional[str] = None
    year: Optional[int] = None


class Genre(BaseModel):
    genre: Optional[str] = Field(None, description="e.g., comedy, drama, action")

movie = Movie(movie_name="The Godfather")

# Transduce the Movie instance into a Genre: the LLM fills in the target schema.
# Transduction is asynchronous, so this line must be awaited inside an async
# function or an async REPL such as ipython; see examples/hello_world.py for a
# runnable version.
genre = await (Genre << Movie)(movie)
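
To label more than one movie, the same transduction can be reused per instance. The sketch below is not one of the shipped examples; it only relies on the (Genre << Movie) callable shown above and assumes each call can be awaited independently (if concurrent calls are not supported, awaiting them one by one in a loop works as well).

import asyncio


async def classify(movies: list[Movie]) -> list[Genre]:
    to_genre = Genre << Movie  # build the transduction once, reuse it per movie
    return await asyncio.gather(*(to_genre(m) for m in movies))


movies = [
    Movie(movie_name="The Godfather"),
    Movie(movie_name="Alien"),
    Movie(movie_name="Airplane!"),
]

genres = asyncio.run(classify(movies))
for m, g in zip(movies, genres):
    print(m.movie_name, "->", g.genre)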

Installation details

Install poetry (skip if available)

curl -sSL https://install.python-poetry.org | python3 -

Clone and install agentics

poetry install
source $(poetry env info --path)/bin/activate 

Ensure you have Python 3.11+ 🚨.

python --version
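The same check can be done from inside Python with the standard library only:

import sys

# The requirement above: Python 3.11 or newer.
assert sys.version_info >= (3, 11), f"Python 3.11+ required, found {sys.version}"
print("Python version OK:", sys.version.split()[0])
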
  • Create a virtual environment with Python's built-in venv module. On Linux, this module may need to be installed with your operating system's package manager.

    python -m venv .venv
    

  • Activate the virtual environment

Bash/Zsh

source .venv/bin/activate

Fish

source .venv/bin/activate.fish

VSCode

Press F1 to open the command palette, start typing Python: Select Interpreter, and choose the interpreter from the .venv you created.

  • Install the package
    python -m pip install ./agentics
    
  • Ensure uv is installed.
    command -v uv >/dev/null || curl -LsSf https://astral.sh/uv/install.sh | sh
    # It's recommended to restart the shell afterwards
    exec $SHELL
    
  • Create the virtual environment: uv venv --python 3.11
  • Install the package: uv pip install ./agentics, or uv add ./agentics (recommended)

This is a way to run agentics temporarily or for quick tests:

  • Ensure uv is installed.
    command -v uv >/dev/null || curl -LsSf https://astral.sh/uv/install.sh | sh
    # It's recommended to restart the shell afterwards
    exec $SHELL
    
  • uvx --verbose --from ./agentics ipython
  1. Create a conda environment:

    conda create -n agentics python=3.11
    
    In this example the name of the environment is agentics, but you can change it to your personal preference.

  2. Activate the environment

    conda activate agentics
    

  3. Install agentics from a folder or git reference
    pip install ./agentics
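
Whichever installation route you used, a quick way to verify it is to import the package and print its installed version. A minimal sketch, assuming the distribution is registered under the name agentics (adjust the name if your checkout uses a different one):

import importlib.metadata

import agentics  # raises ImportError if the install did not succeed

print("agentics version:", importlib.metadata.version("agentics"))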
    

Documentation

This documentation is written with MkDocs. You can start a local server to browse it interactively.

mkdocs serve
Once started, the documentation will be available at http://127.0.0.1:8000/