Code Reference Documentation
Architecture Overview
The codebase implements a flexible framework for integrating and orchestrating Large Language Models (LLMs) with a focus on:
- Vendor-agnostic LLM integration
- Extensible tool framework
- Streaming capabilities
- Pattern detection
- Database integration for RAG applications
Core Components
🤖 Agent Layer
The agent layer manages conversations and orchestrates interactions between LLMs and tools. It handles:
- Stream management for real-time responses
- Tool execution coordination
- Context management
- Error handling
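As a rough illustration of these responsibilities, the sketch below shows one way such an agent loop could be wired together. The class, method, and role names (`Agent`, `execute_tool`, the `history` list) are illustrative assumptions, not the repository's actual API.

```python
# Rough sketch of an agent loop (hypothetical names, not this repo's actual classes).
from typing import Callable, Iterator

LLMFn = Callable[[list], Iterator[str]]   # streams text chunks for a message history
ToolFn = Callable[[str], str]             # maps a tool argument string to a result


class Agent:
    def __init__(self, llm: LLMFn, tools: dict):
        self.llm = llm
        self.tools = tools
        self.history: list[dict] = []              # context management

    def run(self, user_message: str) -> Iterator[str]:
        self.history.append({"role": "user", "content": user_message})
        reply = ""
        try:
            for chunk in self.llm(self.history):   # stream management
                reply += chunk
                yield chunk
        except Exception as exc:                   # error handling
            yield f"[agent error: {exc}]"
            return
        self.history.append({"role": "assistant", "content": reply})

    def execute_tool(self, name: str, argument: str) -> str:
        tool = self.tools.get(name)                # tool execution coordination
        if tool is None:
            return f"unknown tool: {name}"
        result = tool(argument)
        self.history.append({"role": "tool", "content": result})
        return result
```

In this sketch the agent owns the conversation history and the tool table, so the LLM adapter and the tools stay independent of each other.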
🔮 LLM Integration
The LLM integration layer provides a unified interface to multiple LLM providers:
- OpenAI
- Anthropic
- MistralAI
- WatsonX (with support for multiple models)
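A unified interface of this kind is often expressed as an abstract adapter base class. The sketch below is a minimal, hypothetical version: `BaseLLMAdapter`, `generate`, and `generate_stream` are assumed names, and real adapters would wrap the vendor SDKs rather than echo their input.

```python
# Sketch of a vendor-agnostic adapter contract (class and method names are assumptions).
from abc import ABC, abstractmethod
from typing import Iterator


class BaseLLMAdapter(ABC):
    """Interface each provider-specific adapter is expected to implement."""

    @abstractmethod
    def generate(self, prompt: str, **params) -> str:
        """Return the full completion for a prompt."""

    @abstractmethod
    def generate_stream(self, prompt: str, **params) -> Iterator[str]:
        """Yield completion chunks as they arrive."""


class EchoAdapter(BaseLLMAdapter):
    """Trivial stand-in provider, useful for tests; real adapters wrap vendor SDKs."""

    def generate(self, prompt: str, **params) -> str:
        return prompt

    def generate_stream(self, prompt: str, **params) -> Iterator[str]:
        yield from prompt.split()
```

Because callers depend only on the base class, swapping OpenAI for Anthropic, MistralAI, or WatsonX becomes a configuration change rather than a code change.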
🛠️ Tools Framework
The extensible tools framework enables:
- Custom tool implementation
- JSON and non-JSON response parsing
- REST-based tool integration
- Built-in tools for common use cases (RAG, Weather)
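For a sense of what "custom tool implementation" might involve, here is a minimal, hypothetical tool; `BaseTool`, its attributes, and the `run` signature are assumptions rather than the framework's actual contract.

```python
# Hypothetical custom tool built on an assumed BaseTool contract (names are guesses).
from abc import ABC, abstractmethod


class BaseTool(ABC):
    name: str = "base"
    description: str = ""

    @abstractmethod
    def run(self, **kwargs) -> str:
        """Execute the tool and return a plain-text result."""


class WeatherTool(BaseTool):
    name = "weather"
    description = "Returns a short weather summary for a city."

    def run(self, city: str = "unknown", **kwargs) -> str:
        # A real tool would issue a REST call and parse the JSON response here;
        # the codebase's BaseRESTTool presumably wraps that HTTP plumbing.
        return f"(stub) weather for {city}"
```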
💾 Database Integration
Database support for vector storage and retrieval:
- Elasticsearch integration
- Milvus support
- Query building
- Adapter pattern for database abstraction
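The adapter pattern mentioned above could look roughly like the following sketch, assuming an Elasticsearch 8.x Python client and a `dense_vector` field named `embedding`; the class names, method signatures, and index mapping are illustrative, not taken from the codebase.

```python
# Sketch of the adapter pattern for vector stores (illustrative names only).
from abc import ABC, abstractmethod


class BaseVectorDBAdapter(ABC):
    @abstractmethod
    def search(self, embedding: list[float], top_k: int = 5) -> list[dict]:
        """Return the top_k most similar documents."""


class ElasticsearchAdapter(BaseVectorDBAdapter):
    def __init__(self, client, index: str):
        self.client = client   # an elasticsearch.Elasticsearch instance
        self.index = index

    def search(self, embedding: list[float], top_k: int = 5) -> list[dict]:
        # Query building: a kNN search against the dense_vector field.
        response = self.client.search(
            index=self.index,
            knn={
                "field": "embedding",
                "query_vector": embedding,
                "k": top_k,
                "num_candidates": 10 * top_k,
            },
        )
        return [hit["_source"] for hit in response["hits"]["hits"]]
```

A Milvus adapter would implement the same `search` contract, so RAG code can stay ignorant of which store is configured.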
Component Relationships
Key Features
LLM Provider Support
- Vendor-agnostic interface
- Provider-specific prompt builders
- Auth token management
- Error handling and retry logic
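As an example of provider-specific prompt building, the sketch below shows two hypothetical builders behind a shared base class: one for chat-style APIs that accept role-tagged messages and one for completion-style models that expect a single string. The names and formats are assumptions, not the real builders.

```python
# Illustrative prompt builders (names and formats are assumptions).
from abc import ABC, abstractmethod


class BasePromptBuilder(ABC):
    @abstractmethod
    def build(self, system: str, user: str):
        """Return a provider-ready prompt."""


class ChatMessagesBuilder(BasePromptBuilder):
    """For providers that accept role-tagged message lists (OpenAI-style chat APIs)."""

    def build(self, system: str, user: str) -> list[dict]:
        return [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ]


class SingleStringBuilder(BasePromptBuilder):
    """For completion-style models that expect one flat prompt string."""

    def build(self, system: str, user: str) -> str:
        return f"{system}\n\n{user}"
```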
Tool Framework
- Base classes for rapid tool development
- REST integration support
- Parser framework for response handling
- Registry for tool management
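The registry mentioned above might be as simple as a name-to-instance mapping; the sketch below is a guess at its shape, not the actual class.

```python
# Possible shape of a tool registry (hypothetical).
class ToolRegistry:
    def __init__(self):
        self._tools: dict[str, object] = {}

    def register(self, tool) -> None:
        # Assumes every tool exposes a unique .name attribute.
        self._tools[tool.name] = tool

    def get(self, name: str):
        try:
            return self._tools[name]
        except KeyError:
            raise KeyError(f"tool not registered: {name}") from None

    def names(self) -> list[str]:
        return sorted(self._tools)
```

An agent can then resolve tool calls through the registry instead of importing tools directly, e.g. `registry.get("weather").run(city="Oslo")` with the earlier `WeatherTool` sketch.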
Pattern Detection
- Real-time text pattern matching
- Aho-Corasick algorithm implementation
- Buffered processing for streaming
- Detection strategy framework
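To illustrate the buffering strategy for streamed text, here is a deliberately simplified detector in which a naive substring scan stands in for the Aho-Corasick automaton; the class name and API are assumptions.

```python
# Simplified sketch of buffered pattern detection over streamed chunks.
# A naive substring scan replaces the real Aho-Corasick matcher; only the
# buffering strategy for chunk boundaries is illustrated.
from typing import Iterator


class StreamPatternDetector:
    def __init__(self, patterns: list[str]):
        self.patterns = patterns
        self.buffer = ""
        # Keep enough trailing text to catch a pattern split across chunks.
        self.overlap = max(len(p) for p in patterns) - 1

    def feed(self, chunk: str) -> Iterator[str]:
        self.buffer += chunk
        consumed = 0
        for pattern in self.patterns:
            start = self.buffer.find(pattern)
            while start != -1:
                yield pattern
                consumed = max(consumed, start + len(pattern))
                start = self.buffer.find(pattern, start + 1)
        # Drop text we no longer need, keeping an overlap window for split matches.
        keep_from = max(consumed, len(self.buffer) - self.overlap)
        self.buffer = self.buffer[keep_from:]


# Usage: watch for a stop marker while streaming LLM output.
detector = StreamPatternDetector(["</final>", "TOOL_CALL"])
for chunk in ["partial </fin", "al> more text"]:
    for hit in detector.feed(chunk):
        print("matched:", hit)
```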
Database Integration
- Vector storage support
- Connection pooling
- Error handling and retries
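Of these, the retry behaviour is the easiest to sketch in isolation. The helper below is a generic exponential-backoff wrapper and only an assumption about how the codebase handles transient database errors.

```python
# Generic retry helper; the actual retry policy in the codebase may differ.
import time
from typing import Callable, TypeVar

T = TypeVar("T")


def with_retries(fn: Callable[[], T], attempts: int = 3, backoff: float = 0.5) -> T:
    """Call fn, retrying with exponential backoff until it succeeds or attempts run out."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(backoff * 2 ** (attempt - 1))


# Usage: wrap a flaky vector-store query.
# result = with_retries(lambda: adapter.search(embedding, top_k=5))
```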
Getting Started
Development Workflow
1. Understanding the Architecture
   - Review the Agent documentation
   - Understand LLM Integration patterns
   - Review Data Models
2. Implementing New Features
   - Adding a new tool? Start with Tools
   - New LLM provider? See LLM Adapters
   - Custom prompt handling? Check Prompt Builders
3. Database Integration
   - Review Database documentation
Common Use Cases
1. Adding a New LLM Provider
   1. Create adapter
   2. Add prompt builder
   3. Update factory (a factory sketch follows this list)
   See LLM documentation for details
2. Implementing a Custom Tool
   1. Extend BaseTool or BaseRESTTool
   2. Register in ToolRegistry
   3. Add to configuration
   See Tools documentation for details
3. Database Integration
   1. Choose adapter
   2. Configure connection
   3. Build queries
   See Database documentation for details
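For the "update factory" step in the first use case above, a provider factory is often just a name-to-class mapping. The sketch below is hypothetical: `create_llm_adapter`, the `ADAPTERS` dict, and the adapter class names are not taken from the codebase.

```python
# Sketch of the "update factory" step (names are assumptions).
class OpenAIAdapter: ...
class AnthropicAdapter: ...
class MyCloudAdapter: ...       # the newly added provider's adapter


ADAPTERS = {
    "openai": OpenAIAdapter,
    "anthropic": AnthropicAdapter,
    "mycloud": MyCloudAdapter,  # register the new provider here
}


def create_llm_adapter(provider: str, **kwargs):
    try:
        return ADAPTERS[provider](**kwargs)
    except KeyError:
        raise ValueError(f"unsupported LLM provider: {provider}") from None
```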
Quick Reference
Common Components
- Agent - Core orchestration
- LLM - Language model integration
- Tools - Tool framework
- Database - Storage integration
- API - REST endpoints
Utility Components
- Data Models - Shared data structures
- Utils - Common utilities
- Prompt Builders - LLM prompting