# Agentics

Agentics is a lightweight, Python-native framework for building structured and massively parallel agentic workflows using Pydantic models and transducible functions.
## Documentation Overview

- **Getting Started**: Install Agentics, set up your environment, and run your first transducible function over a small dataset.
- **Core Concepts**: The mental model behind Agentics: Pydantic types, transducible functions, typed state containers, Logical Transduction Algebra (LTA), and Map-Reduce.
- **Transducible Functions**: How to define, configure, and invoke transducible functions; specifying instructions; controlling temperature, retries, and structured decoding.
- **Agentics**: Defining Pydantic models for inputs/outputs, working with `AG` containers, loading data from JSON/CSV/DataFrames, and preserving type information across the pipeline.
- **Logical Transduction Algebra**: Chaining transducible functions, branching, fan-in/fan-out patterns, and building reusable pipeline components.
- **Async Map-Reduce Execution**: Using `amap` and `areduce` for large-scale runs, batching strategies, handling failures, and performance considerations.
- **Examples & Use Cases**: End-to-end examples including text-to-SQL, data extraction and enrichment, classification, document workflows, evaluation pipelines, and more.
## Transducible Functions

A transducible function is an LLM-powered, type-safe transformation between Pydantic models. Agentics lets you:
- Define these transformations declaratively
- Compose them into pipelines
- Execute them at scale using an asynchronous Map-Reduce execution engine
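The typed-transformation-and-composition idea can be sketched with plain Python. This is an illustrative mock, not the Agentics API: stdlib dataclasses stand in for Pydantic models, and the `classify`/`summarize` functions replace real LLM calls with trivial rules so the sketch runs anywhere.

```python
from dataclasses import dataclass

# Stand-ins for Pydantic input/output models.
@dataclass
class Review:
    text: str

@dataclass
class Sentiment:
    label: str

@dataclass
class Summary:
    verdict: str

# A transducible function is conceptually input_type -> output_type.
# Here the "LLM call" is mocked with a trivial rule.
def classify(review: Review) -> Sentiment:
    label = "positive" if "good" in review.text else "negative"
    return Sentiment(label=label)

def summarize(sentiment: Sentiment) -> Summary:
    return Summary(verdict=f"overall {sentiment.label}")

# Composition: the output type of one step must match
# the input type of the next, so pipelines type-check end to end.
result = summarize(classify(Review(text="good product")))
```

Because every step declares its input and output schema, a pipeline is just ordinary function composition over typed values.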
Under the hood, Agentics is grounded in Logical Transduction Algebra (LTA), a logico-mathematical formalism that guarantees:
- Composability
- Explainability
- Stability of LLM-based transformations
The result is a way to build agentic systems that are:
- **Typed**: every step has explicit input/output schemas
- **Composable**: pipelines are built from reusable transducible functions
- **Traceable**: outputs carry evidence back to input fields
- **Scalable**: async `amap`/`areduce` primitives support large workloads
- **Minimal**: no heavy orchestrators, just types, functions, and data

Agentics code is simple, predictable, and robust, and it is easy to embed into modern ecosystems (LangFlow, LangChain, CrewAI, MCP, etc.).
## Key Features

### Transducible Functions (Core Abstraction)

Define LLM-powered transformations as first-class functions:
- Typed input and output via Pydantic models
- Automatic schema validation and type-constrained generation
- Composable into higher-level workflows and chains
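Schema validation at both boundaries can be illustrated with a small decorator. This is a hypothetical sketch, not the Agentics decorator: it checks input and output types at runtime, which is the behavior type-constrained generation gives you automatically.

```python
from dataclasses import dataclass

@dataclass
class Question:
    text: str

@dataclass
class Answer:
    text: str
    confidence: float

def transducible(in_type, out_type):
    """Hypothetical decorator: enforce the declared input and output
    schemas at the function boundary, as a transducible function would."""
    def wrap(fn):
        def inner(value):
            if not isinstance(value, in_type):
                raise TypeError(f"expected {in_type.__name__}")
            out = fn(value)
            if not isinstance(out, out_type):
                raise TypeError(f"expected {out_type.__name__}")
            return out
        return inner
    return wrap

@transducible(Question, Answer)
def answer(q: Question) -> Answer:
    # Stand-in for an LLM call constrained to the Answer schema.
    return Answer(text=f"echo: {q.text}", confidence=0.9)
```

A malformed input fails fast with a `TypeError` instead of propagating bad data downstream.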
### Typed State Containers, a.k.a. Agentics (`AG`)

Wrap data into typed state collections so that every row or document carries a concrete Pydantic type:
- Safe, batch-level operations
- Clear semantics over datasets and intermediate states
- Input/output from databases, CSV, and JSON
- Ideal for representing tabular/structured data
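The container idea can be sketched with a small generic class. This is an illustrative stand-in, not the `AG` API: every row is validated against one concrete type at load time, so bad records fail immediately rather than deep inside a pipeline.

```python
from dataclasses import dataclass
from typing import Generic, List, Type, TypeVar

T = TypeVar("T")

@dataclass
class City:
    name: str
    population: int

class TypedCollection(Generic[T]):
    """Illustrative stand-in for an AG container: a batch of rows
    that all share one concrete type."""
    def __init__(self, item_type: Type[T], rows: List[dict]):
        self.item_type = item_type
        # Constructing the type per row raises on missing/extra fields.
        self.items = [item_type(**row) for row in rows]

    def __len__(self) -> int:
        return len(self.items)

cities = TypedCollection(City, [
    {"name": "Rome", "population": 2_800_000},
    {"name": "Oslo", "population": 700_000},
])
```

With Pydantic models the per-row construction would additionally coerce and validate field values, not just field names.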
### Async Map-Reduce Execution

Run transducible functions over large collections using:
- `amap` for massively parallel application
- `areduce` for aggregations and global summaries

Designed to scale on multi-core or distributed execution backends.
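The shape of the map and reduce primitives can be sketched with `asyncio`. These are simplified stand-ins, not the Agentics implementations: the map step fans out one concurrent task per item, and the reduce step aggregates all mapped results in one pass.

```python
import asyncio
from typing import Awaitable, Callable, Iterable, List, TypeVar

A = TypeVar("A")
B = TypeVar("B")

async def amap(fn: Callable[[A], Awaitable[B]], items: Iterable[A]) -> List[B]:
    # Fan out one task per item; asyncio.gather preserves input order.
    return list(await asyncio.gather(*(fn(x) for x in items)))

async def areduce(fn: Callable[[List[B]], Awaitable[A]], items: List[B]) -> A:
    # A single global aggregation over all mapped results.
    return await fn(items)

async def score(text: str) -> int:
    await asyncio.sleep(0)  # stand-in for an async LLM call
    return len(text)

async def total(scores: List[int]) -> int:
    return sum(scores)

async def main() -> int:
    mapped = await amap(score, ["a", "bb", "ccc"])
    return await areduce(total, mapped)

result = asyncio.run(main())
```

In practice a real map step would also need batching and per-item failure handling so one bad record does not abort the whole run.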
### Dynamic Type & Function Composition

Create new workflows on the fly:
- Merge or refine types dynamically
- Compose transducible functions declaratively
- Build polymorphic or adaptive pipelines driven by data and instructions
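Runtime pipeline assembly can be illustrated with a small `compose` helper. This is a generic sketch, not Agentics code: the individual steps here are hypothetical stand-ins for transducible functions, and the point is that the pipeline itself is a value you can build from data or instructions at runtime.

```python
from functools import reduce
from typing import Callable

def compose(*steps: Callable) -> Callable:
    """Chain transformations left to right, so a pipeline can be
    assembled dynamically from a list of steps."""
    return lambda x: reduce(lambda acc, fn: fn(acc), steps, x)

# Hypothetical steps standing in for transducible functions.
normalize = str.lower
tokenize = str.split
count = len

pipeline = compose(normalize, tokenize, count)
```

Because each step is typed, a real framework can also check at composition time that adjacent input/output schemas line up.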
### Explainable & Traceable Inference

Each generated attribute can be traced back to:
- Specific input fields
- The specific transducible function or step that produced it

This enables auditable, debuggable LLM reasoning across the pipeline.
### End-to-End Type Safety

Pydantic models are enforced at every boundary:
- Validation on input loading
- Validation after each transducible function
- Predictable runtime behavior and clear failure modes
### Tool Integration

Agentics is fully compatible with the Model Context Protocol (MCP) and can expose external tools and knowledge to transducible functions:
- Web / search tools
- Databases & vector stores
- Code execution backends
- MCP-based tools
### Minimalistic, Pythonic API

The framework is intentionally small:
- No custom DSL to learn
- Just Python functions, Pydantic models, and a few core primitives
- Easy to embed into existing stacks (LangFlow nodes, CrewAI agents, MCPs, etc.)