Agent Types
Understanding LLM and Custom agents in Fiberwise
📋 Overview
Fiberwise supports two primary agent types that can be defined in your app manifest. Each type serves different use cases and has distinct capabilities for processing user inputs and generating responses.
🧠 LLM Agents
agent_type_id: llm
Language model agents that use configured LLM providers (OpenAI, Anthropic, Google, etc.) to process natural language inputs and generate AI responses.
🛠️ Custom Agents
agent_type_id: custom
Python-based agents with user-defined code that can perform custom logic, data processing, API integrations, and specialized computations.
🧠 LLM Agents
LLM agents provide seamless integration with Large Language Model providers, handling the complexity of API communication, prompt formatting, and response processing.
🔧 Configuration
```yaml
agents:
  - name: chatAgent
    agent_type_id: llm
    version: 1.0.0
    description: AI chat assistant for customer support
    input_schema:
      type: object
      properties:
        prompt:
          type: string
          description: User message or question
        system_prompt:
          type: string
          description: Optional system instructions
      required: ["prompt"]
    output_schema:
      type: object
      properties:
        text:
          type: string
          description: AI-generated response
```
⚡ Key Features
- Automatic Provider Integration: Works with any configured LLM provider
- Context Handling: Maintains conversation context through activation metadata
- Flexible Input: Accepts prompts, system messages, and custom parameters
- Standardized Output: Returns structured text responses
- No Code Required: Pure configuration-based setup
🎯 Use Cases
💬 Chat Applications
Conversational AI for customer support, virtual assistants, and interactive help systems.
📝 Content Generation
Automated writing, summarization, translation, and content creation workflows.
🔍 Question Answering
Knowledge base queries, FAQ systems, and information retrieval applications.
📊 Text Analysis
Sentiment analysis, classification, and natural language understanding tasks.
💻 Activation Example
```javascript
// Activate LLM agent with provider specification
const response = await FIBER.agents.activate('chatAgent', {
  prompt: "What are the benefits of renewable energy?",
  system_prompt: "You are an environmental expert."
}, {
  context: { chat_id: "session-123" },
  llm_provider_id: "openai-gpt4"  // Specify LLM provider
});

console.log(response.output_data.text);
```
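The context handling described above can also be sketched in Python. Here `activate_fn` is a stand-in for the real activation call, and the `chat_id` key mirrors the example; this is a sketch of the pattern, not the platform API:

```python
# Sketch: reuse the same chat_id so every activation in a conversation
# shares the same context. `activate_fn` stands in for the real call.
def converse(activate_fn, chat_id, messages):
    """Send each user message under one chat context; return the replies."""
    replies = []
    for message in messages:
        result = activate_fn(
            "chatAgent",
            {"prompt": message},
            {"context": {"chat_id": chat_id}},  # same context every turn
        )
        replies.append(result["text"])
    return replies
```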
🛠️ Custom Agents
Custom agents allow you to implement specialized business logic using Python code, providing unlimited flexibility for complex processing tasks.
🔧 Configuration
```yaml
agents:
  - name: dataAnalyzer
    agent_type_id: custom
    version: 1.0.0
    description: Analyze sales data and generate insights
    input_schema:
      type: object
      properties:
        data:
          type: array
          description: Raw sales data
        analysis_type:
          type: string
          enum: ["summary", "trends", "forecasting"]
      required: ["data", "analysis_type"]
    output_schema:
      type: object
      properties:
        insights:
          type: object
          description: Analysis results
        recommendations:
          type: array
          description: Action recommendations
    implementation: |
      import pandas as pd
      from datetime import datetime

      def calculate_growth_rate(series):
          if len(series) < 2:
              return 0
          return ((series.iloc[-1] - series.iloc[0]) / series.iloc[0]) * 100

      async def run(input_data, context):
          """Custom agent implementation for data analysis."""
          data = input_data.get('data', [])
          analysis_type = input_data.get('analysis_type')

          # Convert to DataFrame for analysis
          df = pd.DataFrame(data)

          if analysis_type == "summary":
              insights = {
                  "total_sales": df['amount'].sum(),
                  "average_order": df['amount'].mean(),
                  "transaction_count": len(df)
              }
              recommendations = [
                  "Focus on high-value customers",
                  "Optimize product mix"
              ]
          elif analysis_type == "trends":
              # Time series analysis
              df['date'] = pd.to_datetime(df['date'])
              monthly_sales = df.groupby(df['date'].dt.month)['amount'].sum()
              insights = {
                  "monthly_trends": monthly_sales.to_dict(),
                  "growth_rate": calculate_growth_rate(monthly_sales)
              }
              recommendations = ["Increase marketing in low months"]
          else:
              # Fall back for analysis types not yet implemented (e.g. forecasting)
              insights = {"error": f"Unsupported analysis_type: {analysis_type}"}
              recommendations = []

          return {
              "insights": insights,
              "recommendations": recommendations,
              "processed_at": datetime.now().isoformat()
          }
```
⚡ Key Features
- Full Python Power: Access to entire Python ecosystem and libraries
- Custom Logic: Implement any business logic or processing workflow
- API Integrations: Connect to external services and databases
- Data Processing: Advanced analytics, transformations, and computations
- Structured I/O: Define exact input/output schemas for type safety
🎯 Use Cases
📊 Data Analysis
Statistical analysis, data transformations, reporting, and business intelligence.
🔗 API Integrations
Connect to external services, databases, webhooks, and third-party systems.
⚙️ Business Logic
Complex workflows, validation rules, approval processes, and decision engines.
🔄 Data Processing
ETL operations, file processing, format conversions, and data validation.
💻 Activation Example
```javascript
// Activate custom agent with structured data
const response = await FIBER.agents.activate('dataAnalyzer', {
  data: [
    { date: "2024-01-01", amount: 1500, customer_id: "A123" },
    { date: "2024-01-15", amount: 2300, customer_id: "B456" },
    { date: "2024-02-01", amount: 1800, customer_id: "A123" }
  ],
  analysis_type: "summary"
}, {
  context: { report_id: "monthly-2024-01" }
});

console.log("Insights:", response.output_data.insights);
console.log("Recommendations:", response.output_data.recommendations);
```
⚖️ Agent Type Comparison
| Feature | 🧠 LLM Agents | 🛠️ Custom Agents |
|---|---|---|
| Setup Complexity | ✅ Minimal - configuration only | 🔶 Moderate - Python code required |
| Natural Language | ✅ Excellent - native LLM capabilities | 🔶 Limited - requires LLM integration |
| Custom Logic | ❌ Limited - prompt engineering only | ✅ Unlimited - full Python power |
| External APIs | ❌ Not supported directly | ✅ Full integration capabilities |
| Data Processing | ❌ Text-based only | ✅ Advanced analytics and computation |
| Performance | 🔶 Depends on LLM provider latency | ✅ Fast - direct Python execution |
| Maintenance | ✅ Low - provider handles updates | 🔶 Medium - code maintenance required |
| Scalability | ✅ High - provider infrastructure | 🔶 Depends on implementation |
🎯 Choosing the Right Agent Type
🤔 Decision Framework
Choose LLM Agents When:
- Building conversational interfaces
- Processing natural language inputs
- Generating human-like text responses
- Quick prototyping and minimal setup
- Leveraging latest AI model capabilities
- No complex business logic required
Choose Custom Agents When:
- Implementing specific business rules
- Processing structured data
- Integrating with external systems
- Performing complex computations
- Requiring deterministic outputs
- Building specialized workflows
🔄 Hybrid Approaches
Many sophisticated applications use both agent types together:
- LLM for Interface: Use LLM agents for user interaction and natural language processing
- Custom for Logic: Use custom agents for business logic and data processing
- Chain Activations: Activate multiple agents in sequence for complex workflows
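The chained pattern can be sketched in Python. `activate` below stands in for the platform's activation call, and the helper name is illustrative; the point is that the custom agent's structured output becomes the LLM agent's prompt:

```python
# Sketch of chaining: custom agent for logic, LLM agent for the interface.
# `activate` is a stand-in for the real activation call.
def build_summary_prompt(insights: dict) -> str:
    """Turn structured insights from a custom agent into an LLM prompt."""
    lines = [f"- {key}: {value}" for key, value in sorted(insights.items())]
    return "Summarize these sales insights for a manager:\n" + "\n".join(lines)

def chain(activate, raw_data):
    # Step 1: custom agent computes structured insights.
    analysis = activate("dataAnalyzer", {"data": raw_data, "analysis_type": "summary"})
    # Step 2: LLM agent turns them into a readable narrative.
    prompt = build_summary_prompt(analysis["insights"])
    return activate("chatAgent", {"prompt": prompt})
```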
💡 Real-World Examples
Typical applications built with these agent types:
- 📞 Customer Support System
- 📊 Analytics Dashboard
- 🛒 E-commerce Assistant
✅ Best Practices
🧠 LLM Agent Best Practices
- Clear Schema: Define precise input/output schemas
- Provider Selection: Always specify `llm_provider_id` explicitly
- Context Management: Use activation context for conversation state
- Error Handling: Plan for LLM provider failures and timeouts
- Cost Optimization: Monitor token usage and optimize prompts
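For the error-handling point, a retry wrapper with exponential backoff is one common approach. This is a minimal sketch, not a platform feature: `activate_fn` stands in for the real activation call, and the retry count and delays are illustrative:

```python
import time

# Sketch: retry an activation on transient provider errors, backing off
# exponentially between attempts. `activate_fn` is a stand-in.
def activate_with_retry(activate_fn, agent_name, inputs, retries=3, base_delay=1.0):
    last_error = None
    for attempt in range(retries):
        try:
            return activate_fn(agent_name, inputs)
        except TimeoutError as err:  # provider timed out
            last_error = err
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    raise RuntimeError(f"{agent_name} failed after {retries} attempts") from last_error
```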
🛠️ Custom Agent Best Practices
- Schema Validation: Validate inputs rigorously
- Error Handling: Implement comprehensive try/except handling
- Async Operations: Use async/await for I/O operations
- Logging: Include detailed logging for debugging
- Testing: Unit test your agent logic thoroughly
- Dependencies: Minimize external dependencies
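Rigorous schema validation inside a custom agent's `run` function might look like the sketch below; the field names mirror the `dataAnalyzer` example above, and collecting errors into a list (rather than raising on the first failure) is one design choice, not a platform requirement:

```python
# Sketch: validate input_data against the declared input_schema before
# doing any work, so bad requests fail fast with actionable messages.
ALLOWED_ANALYSES = {"summary", "trends", "forecasting"}

def validate_input(input_data: dict) -> list:
    """Return a list of validation errors; an empty list means valid."""
    errors = []
    if not isinstance(input_data.get("data"), list):
        errors.append("'data' must be a list of records")
    if input_data.get("analysis_type") not in ALLOWED_ANALYSES:
        errors.append(f"'analysis_type' must be one of {sorted(ALLOWED_ANALYSES)}")
    return errors
```

Inside `run`, the agent can return the error list in its output when validation fails instead of raising into the platform.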
🎯 General Best Practices
- Versioning: Use semantic versioning for agent updates
- Documentation: Provide clear descriptions and examples
- Monitoring: Track activation success rates and performance
- Security: Never log sensitive data or API keys
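One way to enforce the security point is to redact payloads before they reach any log line. A minimal sketch; the set of sensitive key names is illustrative and should match your own secrets:

```python
# Sketch: mask sensitive values before logging an activation payload.
SENSITIVE_KEYS = {"api_key", "token", "password", "authorization"}

def redact(payload: dict) -> dict:
    """Return a copy of payload safe for logging, with sensitive values masked."""
    return {
        key: "***" if key.lower() in SENSITIVE_KEYS else value
        for key, value in payload.items()
    }
```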