Functions & Pipelines CLI Tutorial
Learn to build, execute, and manage functions and pipelines using Fiberwise's powerful command-line interface. Master advanced workflow automation and function orchestration.
New Feature Highlight
The Functions & Pipelines CLI provides direct access to Fiberwise's function system without requiring a running API server. Perfect for development, testing, and automation!
What You'll Learn
- Creating and managing functions using the CLI
- Executing functions with different input patterns
- Building multi-agent coordination workflows
- Creating and managing execution pipelines
- Advanced debugging and monitoring techniques
Key Skills Covered
CLI Mastery
- Function Management - Create, list, execute, and manage functions
- Pipeline Orchestration - Build complex multi-step workflows
- Multi-Agent Systems - Coordinate multiple agents working together
- Batch Processing - Automate bulk operations and data processing
Advanced Patterns
- Function dependency injection and service integration
- Real-time activation history and performance monitoring
- Error handling and debugging strategies
- Development vs production function deployment
Note for Self-Hosted Users
If you're using a self-hosted Fiberwise instance, add --to-instance your-instance-name to all CLI commands.
For example: fiber functions list --to-instance my-self-hosted
Prerequisites: Your Setup Checklist
Before you begin, you need a fully configured Fiberwise environment. This is the foundation for building any app on the platform.
Required Setup
All Set?
Once all boxes are checked, you are ready to proceed. If not, please complete the linked guides first.
Step 1: Verify CLI Setup
# Verify Functions CLI is available
fiber functions --help
# Should show: Functions management commands
# Check platform connection
fiber account list-configs
# Should show your configured platform instances
# Test function capabilities
fiber functions list
# Shows existing functions (may be empty initially)
Step 2: Create Your First Function
Create a Simple Data Processing Function
# Create a basic Python function file
cat > data_processor.py << 'EOF'
#!/usr/bin/env python3
"""
Data processing function for user information
"""
def run(input_data):
    """
    Process user data and return formatted output.

    Args:
        input_data (dict): Input containing user information

    Returns:
        Dictionary with processed results
    """
    from datetime import datetime

    # Extract user information
    name = input_data.get('name', 'Unknown')
    age = input_data.get('age', 0)
    email = input_data.get('email', '')

    # Validate required fields
    if not name or name == 'Unknown':
        return {'status': 'error', 'message': 'Name is required'}

    # Process the data
    processed_info = {
        'formatted_name': name.title(),
        'age_group': 'adult' if age >= 18 else 'minor' if age > 0 else 'unknown',
        'email_domain': email.split('@')[1] if '@' in email else 'no-email',
        'profile_complete': bool(name and age > 0 and email)
    }

    # Return successful result
    return {
        'status': 'success',
        'user_info': processed_info,
        'metadata': {
            'processed_at': str(datetime.now()),
            'processor_version': '1.0'
        }
    }
EOF
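Before registering the function, you can sanity-check its trickier expressions (the chained age-group conditional and the domain split) in a plain Python session. The helpers below mirror the logic in data_processor.py:

```python
def age_group(age: int) -> str:
    # Mirrors the chained conditional used in data_processor.py
    return 'adult' if age >= 18 else 'minor' if age > 0 else 'unknown'

def email_domain(email: str) -> str:
    # Everything after the '@', or a sentinel when no email was given
    return email.split('@')[1] if '@' in email else 'no-email'

print(age_group(25))    # adult
print(age_group(10))    # minor
print(age_group(0))     # unknown
print(email_domain('john@example.com'))  # example.com
print(email_domain(''))                  # no-email
```

Note that `age_group(0)` falls through to `'unknown'` because the default age is 0 when the field is absent.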
Register the Function in Fiberwise
# Create the function
fiber functions create data_processor \
--description "Processes and validates user data" \
--type transform \
--file ./data_processor.py \
--verbose
Creating function from file: ./data_processor.py
Reading implementation from: ./data_processor.py
Creating function: data_processor
Type: transform
Description: Processes and validates user data
[SUCCESS] Function created successfully!
----------------------------------------
Name: data_processor
ID: a1b2c3d4-e5f6-7g8h-9i0j-k1l2m3n4o5p6
Type: transform
Created at: 2024-01-01 12:00:00
You can now execute it with:
fiber functions execute a1b2c3d4-e5f6-7g8h-9i0j-k1l2m3n4o5p6 --input-data '{}'
Step 3: Execute and Test Functions
Test 1: Basic Execution
# Execute with valid user data
fiber functions execute data_processor \
--input-data '{"name": "john doe", "age": 25, "email": "[email protected]"}' \
--verbose
Executing function: data_processor
Input: {"name": "john doe", "age": 25, "email": "[email protected]"}
[SUCCESS] Function executed successfully!
----------------------------------------
Execution ID: b2c3d4e5-f6g7-8h9i-0j1k-l2m3n4o5p6q7
Status: completed
Started: 2024-01-01 12:01:00
Completed: 2024-01-01 12:01:01
Result:
{
"status": "success",
"user_info": {
"formatted_name": "John Doe",
"age_group": "adult",
"email_domain": "example.com",
"profile_complete": true
},
"metadata": {
"processed_at": "2024-01-01 12:01:01.123456",
"processor_version": "1.0"
}
}
Test 2: Error Handling
# Test error handling with missing name
fiber functions execute data_processor \
--input-data '{"age": 30, "email": "[email protected]"}' \
--verbose
[SUCCESS] Function executed successfully!
----------------------------------------
Result:
{
"status": "error",
"message": "Name is required"
}
Note: the execution itself still reports [SUCCESS] because the function ran without raising; the validation failure is carried in the function's own result payload.
Test 3: Minimal Data
# Test with minimal data
fiber functions execute data_processor \
--input-data '{"name": "alice"}' \
--verbose
Step 4: Function Management and Monitoring
List and Inspect Functions
# List all functions
fiber functions list
# Search for specific functions
fiber functions list --search "processor" --type transform --verbose
# Show detailed function information
fiber functions show data_processor --verbose
View Function Details and History
The show command provides comprehensive information, including execution history:
Function: data_processor
==================================================
ID: a1b2c3d4-e5f6-7g8h-9i0j-k1l2m3n4o5p6
Type: transform
System: No
Async: No
Description: Processes and validates user data
Input Schema:
{
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"email": {"type": "string"}
}
}
Recent Executions (Last 5):
------------------------------
[OK] b2c3d4e5... - completed - 2024-01-01 12:01:00
[OK] c3d4e5f6... - completed - 2024-01-01 12:02:00
[FAIL] d4e5f6g7... - failed - 2024-01-01 12:03:00
Step 5: Advanced Agent Integration
Create an AI-Powered Function
Let's create a more advanced function that uses Fiberwise's dependency injection system:
# Create an advanced function with AI integration
cat > chat_agent.py << 'EOF'
#!/usr/bin/env python3
"""
Chat agent function for intelligent conversations
"""
import logging
import time
from typing import Dict, Any

# Set up logger
logger = logging.getLogger(__name__)

class ChatAgent:
    """Chat agent for intelligent conversations."""

    async def run_agent(self, input_data: Dict[str, Any], fiber, llm_service) -> Dict[str, Any]:
        """
        Process chat messages with AI assistance.
        """
        message = input_data.get('message', '')
        chat_id = input_data.get('chat_id', f'chat_{int(time.time())}')
        system_prompt = input_data.get('system_prompt', 'You are a helpful AI assistant.')
        temperature = input_data.get('temperature', 0.7)

        if not message:
            return {'status': 'error', 'message': 'Message is required'}

        try:
            # Use LLM service to generate response
            response = await llm_service.complete(
                prompt=message,
                system_prompt=system_prompt,
                temperature=temperature,
                max_tokens=500
            )

            # Store conversation in fiber app data service
            conversation_entry = {
                'chat_id': chat_id,
                'user_message': message,
                'ai_response': response,
                'timestamp': time.time(),
                'system_prompt': system_prompt
            }

            # Save conversation if fiber data service is available
            if hasattr(fiber, 'data'):
                await fiber.data.create('conversations', conversation_entry)

            return {
                'status': 'success',
                'response': response,
                'chat_id': chat_id,
                'metadata': {
                    'response_length': len(response),
                    'processing_time': f'{time.time()}',
                    'temperature': temperature
                }
            }
        except Exception as e:
            logger.error(f"Chat agent failed: {str(e)}")
            return {
                'status': 'error',
                'message': f'Chat processing failed: {str(e)}'
            }

# Function entry point for CLI execution
async def run(input_data, fiber, llm_service):
    """
    Entry point for CLI function execution.
    """
    agent = ChatAgent()
    return await agent.run_agent(input_data, fiber, llm_service)
EOF
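Because run(input_data, fiber, llm_service) receives its services by injection, you can exercise the same contract locally with stand-ins before registering the function. The stub classes below are hypothetical test doubles, not the Fiberwise SDK:

```python
import asyncio
import time

class StubLLM:
    """Test double for the injected llm_service."""
    async def complete(self, prompt, system_prompt, temperature, max_tokens):
        return f"[stub reply to: {prompt}]"

class StubData:
    """Test double for fiber.data; records what would be stored."""
    def __init__(self):
        self.saved = []
    async def create(self, collection, entry):
        self.saved.append((collection, entry))

class StubFiber:
    def __init__(self):
        self.data = StubData()

async def run(input_data, fiber, llm_service):
    # Simplified stand-in with the same signature as chat_agent's run()
    message = input_data.get('message', '')
    if not message:
        return {'status': 'error', 'message': 'Message is required'}
    response = await llm_service.complete(message, 'You are helpful.', 0.7, 500)
    await fiber.data.create('conversations', {
        'chat_id': 'local-test',
        'user_message': message,
        'ai_response': response,
        'timestamp': time.time(),
    })
    return {'status': 'success', 'response': response}

fiber = StubFiber()
result = asyncio.run(run({'message': 'hello'}, fiber, StubLLM()))
print(result['status'])        # success
print(len(fiber.data.saved))   # 1
```

Swapping the stubs for the real injected services is then a no-op from the function's point of view, which is the main payoff of the dependency-injection pattern.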
Create and Test the Agent Function
# Create the chat agent function
fiber functions create chat_agent \
--description "Intelligent chat agent with LLM integration and conversation context" \
--type support_agent \
--file ./chat_agent.py \
--verbose
Test Agent Function with AI Integration
# Test chat agent function
fiber functions execute chat_agent \
--input-data '{"message": "What are the benefits of using command-line tools for development?", "chat_id": "dev-discussion"}' \
--verbose
# Test with custom system prompt
fiber functions execute chat_agent \
--input-data '{"message": "Explain functions and pipelines", "system_prompt": "You are a Fiberwise expert. Explain concepts clearly with practical examples.", "temperature": 0.3}' \
--verbose
Step 6: Pipeline Management
Explore Pipeline Features
# List available pipelines
fiber functions list-pipelines --verbose
# Execute a pipeline (if any exist)
# fiber functions execute-pipeline my-pipeline-id \
# --input-data '{"source": "cli", "data": {"key": "value"}}' \
# --verbose
# Check pipeline execution status
# fiber functions pipeline-status execution-uuid --verbose
Step 7: Multi-Agent Coordination
Fiberwise's CLI now supports powerful multi-agent workflows, allowing you to coordinate multiple agents or functions working together:
Multi-Function Execution
# Execute multiple functions in parallel
fiber functions execute-multi data_processor chat_agent \
--input-data '{"message": "Process this data and provide insights"}' \
--coordination-mode parallel \
--verbose
# Chain functions together (output of one feeds to next)
fiber functions execute-multi data_processor chat_agent summarizer \
--input-data '{"raw_data": "user input here"}' \
--coordination-mode chain \
--verbose
# Execute functions sequentially with same input
fiber functions execute-multi validator processor formatter \
--input-data '{"content": "text to process"}' \
--coordination-mode sequential \
--verbose
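Conceptually, chain mode threads each result into the next call, while parallel mode fans the same input out to every function at once. A toy illustration of the two modes (not the Fiberwise implementation):

```python
import asyncio

async def run_chain(fns, input_data):
    """Chain mode: each function's output becomes the next function's input."""
    data = input_data
    for fn in fns:
        data = await fn(data)
    return data

async def run_parallel(fns, input_data):
    """Parallel mode: every function receives the same input concurrently."""
    return await asyncio.gather(*(fn(input_data) for fn in fns))

# Two trivial stand-in functions
async def upper(d):
    return {'text': d['text'].upper()}

async def shout(d):
    return {'text': d['text'] + '!'}

chained = asyncio.run(run_chain([upper, shout], {'text': 'hi'}))
fanned = asyncio.run(run_parallel([upper, shout], {'text': 'hi'}))
print(chained)  # {'text': 'HI!'}
print(fanned)   # [{'text': 'HI'}, {'text': 'hi!'}]
```

Sequential mode sits between the two: functions run one after another, but each receives the original input rather than the previous result.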
Multi-Agent Activation
# Activate multiple agents in conversation mode
fiber functions activate-multi chatAgent testAgent \
--input-data '{"prompt": "Discuss the benefits of multi-agent systems"}' \
--context '{"chat_id": "multi-agent-session-1"}' \
--coordination-mode conversation \
--verbose
# Chain agent activations (output feeds to next agent)
fiber functions activate-multi analyzer reviewer summarizer \
--input-data '{"document": "content to analyze"}' \
--coordination-mode chain \
--verbose
# Parallel agent processing
fiber functions activate-multi sentiment_agent keyword_agent summary_agent \
--input-data '{"text": "analyze this content from multiple perspectives"}' \
--coordination-mode parallel \
--verbose
Monitor Multi-Agent Sessions
# View activation history by session
fiber functions activation-history \
--chat-id "multi-agent-session-1" \
--verbose
# Filter by specific agent
fiber functions activation-history \
--agent-id "chatAgent" \
--limit 10 \
--verbose
# View recent multi-agent activations
fiber functions activation-history --limit 20
Step 8: Automation and Batch Processing
Batch Function Execution
# Create a batch processing script
cat > batch_process.sh << 'EOF'
#!/bin/bash
echo "Starting batch user processing..."
# Array of user data to process
users=(
'{"name": "Alice Smith", "age": 28, "email": "[email protected]"}'
'{"name": "Bob Johnson", "age": 35, "email": "[email protected]"}'
'{"name": "Carol Wilson", "age": 42, "email": "[email protected]"}'
)
# Process each user
for i in "${!users[@]}"; do
  echo "Processing user $((i+1))/${#users[@]}..."
  result=$(fiber functions execute data_processor \
    --input-data "${users[i]}" \
    --format json)

  if echo "$result" | grep -q '"status": "success"'; then
    echo "[OK] User $((i+1)) processed successfully"
  else
    echo "[FAIL] User $((i+1)) processing failed"
  fi
done
echo "Batch processing complete!"
EOF
chmod +x batch_process.sh
./batch_process.sh
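When the payloads come from a file or another system rather than a hand-written array, it can be worth validating each JSON record locally before handing it to the CLI. A minimal pre-flight check (the required-field list matches what data_processor expects):

```python
import json

REQUIRED_FIELDS = ('name', 'age', 'email')

def validate_record(raw: str):
    """Return (ok, problem) for one JSON payload before it reaches the CLI."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError as exc:
        return False, f'invalid JSON: {exc}'
    missing = [field for field in REQUIRED_FIELDS if field not in record]
    if missing:
        return False, f'missing fields: {missing}'
    return True, ''

users = [
    '{"name": "Alice Smith", "age": 28, "email": "alice@company.com"}',
    '{"name": "Bob Johnson", "age": 35}',       # missing email
    '{"name": "Carol Wilson", "age": 42, ',     # truncated JSON
]
for raw in users:
    ok, problem = validate_record(raw)
    print('OK' if ok else f'SKIP ({problem})')
```

Catching malformed records here avoids burning a function execution on input that the function would reject anyway.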
Multi-Agent Analysis Workflow
# Create an advanced multi-agent analysis script
cat > multi_agent_analysis.sh << 'EOF'
#!/bin/bash
echo "Starting multi-agent content analysis..."
# Sample content to analyze
CONTENT='{"text": "Fiberwise is a revolutionary platform for AI agent coordination. It enables seamless integration of multiple agents working together to solve complex problems through innovative activation patterns and pipeline orchestration."}'
# Generate unique session ID for this analysis
SESSION_ID="analysis-$(date +%s)"
CONTEXT="{\"chat_id\": \"$SESSION_ID\", \"analysis_type\": \"multi_agent_content\"}"
echo "Session ID: $SESSION_ID"
# Run parallel analysis with different agents
echo "Step 1: Running parallel analysis agents..."
fiber functions activate-multi sentiment_agent keyword_agent summary_agent \
--input-data "$CONTENT" \
--context "$CONTEXT" \
--coordination-mode parallel \
--verbose
echo "Step 2: Running review chain..."
# Chain review agents for quality control
fiber functions activate-multi fact_checker editor reviewer \
--input-data "$CONTENT" \
--context "$CONTEXT" \
--coordination-mode chain \
--verbose
echo "Step 3: Generating final report..."
# Generate comprehensive report
fiber functions activate-multi report_generator \
--input-data "$CONTENT" \
--context "$CONTEXT" \
--verbose
echo "Analysis complete! View results with:"
echo "fiber functions activation-history --chat-id $SESSION_ID --verbose"
EOF
chmod +x multi_agent_analysis.sh
# ./multi_agent_analysis.sh
Step 9: Development vs Production
Environment Management
# Create development version of function
fiber functions create data_processor_dev \
--description "Development version with debug logging" \
--type transform \
--file ./data_processor_dev.py \
--environment development
# Create production version
fiber functions create data_processor_prod \
--description "Production optimized version" \
--type transform \
--file ./data_processor.py \
--environment production
# Execute in specific environment
fiber functions execute data_processor_dev \
--input-data '{"name": "test"}' \
--environment development \
--debug
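The development registration above references a data_processor_dev.py that hasn't been shown. A plausible sketch is the same contract with debug logging added; the specific logging detail here is an assumption, not something Fiberwise mandates:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger('data_processor_dev')

def run(input_data):
    """Development variant: same contract as data_processor, plus debug logging."""
    logger.debug('raw input: %r', input_data)

    name = input_data.get('name', 'Unknown')
    if not name or name == 'Unknown':
        logger.debug('validation failed: name is missing')
        return {'status': 'error', 'message': 'Name is required'}

    result = {'status': 'success', 'formatted_name': name.title()}
    logger.debug('returning: %r', result)
    return result

print(run({'name': 'test'})['status'])  # success
```

Keeping the input/output contract identical between the dev and production versions means callers can switch environments without changing their payloads.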
Function Versioning and Updates
# Update existing function
fiber functions update data_processor \
--file ./data_processor_v2.py \
--description "Updated with enhanced validation" \
--verbose
# View function versions
fiber functions versions data_processor
# Execute specific version
fiber functions execute data_processor \
--version "1.1" \
--input-data '{"name": "test"}'
Advanced Use Cases and Next Steps
Function Templates
# Create function from template
fiber functions create-from-template data_processor \
--template data_transformation \
--language python \
--verbose
# List available templates
fiber functions list-templates
Integration with External Systems
Functions can integrate with external APIs, databases, and services:
# Example function with external integration
import time

import aiohttp

async def run(input_data, fiber):
    """Function that integrates with an external API."""
    # Get configuration from Fiberwise
    api_key = await fiber.config.get('external_api_key')

    # Make external API call
    async with aiohttp.ClientSession() as session:
        async with session.get(
            'https://api.external-service.com/data',
            headers={'Authorization': f'Bearer {api_key}'}
        ) as response:
            data = await response.json()

    # Process and return results
    return {
        'status': 'success',
        'external_data': data,
        'processed_at': time.time()
    }
Next Steps
Congratulations!
You've successfully mastered the Functions & Pipelines CLI! You can now:
- Create and manage functions with the CLI
- Execute functions with various input patterns and configurations
- Build multi-agent coordination workflows
- Automate batch processing and complex analysis tasks
- Monitor and debug function executions
- Integrate external services and APIs
You're now equipped with advanced CLI skills for building sophisticated automation workflows!