LLM Provider Setup

Set up an AI language model provider to power Fiberwise agents. Most providers take a single CLI command or environment variable to configure.

โฑ๏ธ 2-5 minutes ๐Ÿ“š Beginner ๐Ÿ”ง Configuration

🤖 Why You Need an LLM Provider

Fiberwise agents use Large Language Models (LLMs) to process and generate intelligent responses. You'll need an API key from one of the supported providers:

🧠 OpenAI

GPT-4, GPT-3.5-turbo, and more

✅ Most popular ✅ Great for general use ✅ Fast responses

🎭 Anthropic

Claude 3 (Opus, Sonnet, Haiku) and Claude 2 models

✅ Excellent reasoning ✅ Safety-focused ✅ Long context

🌊 Cohere

Command and specialized models

✅ Enterprise-ready ✅ Multilingual ✅ Cost-effective

๐Ÿ  Local/Self-Hosted

Ollama, vLLM, and custom endpoints

โœ… Privacy โœ… No API costs โœ… Full control

⚡ Quick Setup Commands

Choose your provider and follow the setup steps below:

🧠 OpenAI Setup

🧠 Add OpenAI Provider
# Note: OpenAI provider management is handled through the web UI.
# Navigate to http://localhost:5757/settings/llm-providers,
# or manage providers via the Python SDK in your applications.

# Alternatively, configure the provider with an environment variable:
export OPENAI_API_KEY="your-api-key-here"

Get your API key: OpenAI API Keys

Models available: gpt-4, gpt-4-turbo, gpt-3.5-turbo
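Application code that talks to OpenAI directly can read the same environment variable at startup and fail fast when it is missing. A minimal sketch (the helper name is illustrative, not part of the Fiberwise SDK):

```python
import os

def load_openai_key() -> str:
    """Read the OpenAI API key from the environment; fail fast if it is unset."""
    key = os.environ.get("OPENAI_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it before starting your app."
        )
    return key
```

Failing at startup beats a confusing 401 error on the first agent call.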

🎭 Anthropic Setup

🎭 Add Anthropic Provider
# Configure Anthropic via the web UI or an environment variable
export ANTHROPIC_API_KEY="your-api-key-here"

# Then set Anthropic as your default provider:
fiber account set-default-provider anthropic

Get your API key: Anthropic Console

Models available: claude-3-opus, claude-3-sonnet, claude-3-haiku

🌊 Cohere Setup

🌊 Add Cohere Provider
# Add Cohere as your LLM provider
fiber account add-provider cohere --api-key YOUR_COHERE_API_KEY

# Optional: Set as default provider
fiber account set-default-provider cohere

Get your API key: Cohere Dashboard

Models available: command, command-light, command-r

๐Ÿ  Local/Self-Hosted Setup

๐Ÿ  Add Local Provider
# Add a local or self-hosted provider
fiber account add-provider custom \
  --name "My Local LLM" \
  --endpoint "http://localhost:11434/v1" \
  --api-key "optional-auth-token"

# For Ollama specifically
fiber account add-provider ollama \
  --endpoint "http://localhost:11434" \
  --model "llama2:7b"

Popular local options: Ollama, vLLM, LocalAI, LM Studio

No API key needed for most local setups
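Before registering a local endpoint, it helps to confirm the server is actually up. Here is a small reachability probe, assuming an OpenAI-compatible `/v1/models` route (Ollama exposes one alongside its native API on port 11434); the function name is illustrative:

```python
import urllib.error
import urllib.request

def local_llm_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an OpenAI-compatible server answers at base_url/v1/models."""
    url = f"{base_url.rstrip('/')}/v1/models"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Any 200 response means the server is up and serving the route.
            return resp.status == 200
    except (urllib.error.URLError, OSError, ValueError):
        # Covers connection refused, DNS failure, timeouts, and bad URLs.
        return False
```

With Ollama running, `local_llm_reachable("http://localhost:11434")` should return True; if it returns False, start the service before adding the provider.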

✅ Verify Your Setup

Test your provider connection to make sure everything is working:

# List all configured providers
fiber account list-providers

# Test your provider connection
fiber account test-provider openai

# View provider details
fiber account show-provider openai

✅ Expected Output

✅ Providers configured:
  • openai (gpt-4) - Default ✓
  • anthropic (claude-3-sonnet) - Available

🧪 Testing openai provider...
✅ Connection successful!
📊 Available models: gpt-4, gpt-4-turbo, gpt-3.5-turbo
💰 Usage: 1,234 tokens used this month

โš™๏ธ Advanced Configuration

๐ŸŽ›๏ธ Provider Settings

Customize your provider settings for optimal performance:

# Set provider-specific defaults
fiber account configure-provider openai \
  --default-model "gpt-4" \
  --max-tokens 2048 \
  --temperature 0.7

# Set rate limits
fiber account configure-provider openai \
  --rate-limit 100 \
  --rate-period "1m"

# Set cost controls
fiber account configure-provider openai \
  --monthly-budget 50.00 \
  --alert-threshold 80
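To make the cost controls concrete: with `--monthly-budget 50.00 --alert-threshold 80`, an alert fires once spend reaches $40. The check reduces to a single comparison (this helper is illustrative, not Fiberwise code):

```python
def should_alert(spent: float, monthly_budget: float, alert_threshold_pct: float) -> bool:
    """True once spend reaches the alert threshold, a percentage of the budget."""
    return spent >= monthly_budget * alert_threshold_pct / 100.0
```

A threshold of 80 gives you warning while a fifth of the budget is still left, which is usually enough time to react before requests are cut off.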

๐Ÿ” Environment Variables

You can also set up providers using environment variables:

# Set environment variables
export FIBERWISE_OPENAI_API_KEY="your-api-key"
export FIBERWISE_ANTHROPIC_API_KEY="your-api-key"
export FIBERWISE_DEFAULT_PROVIDER="openai"

# Then configure without exposing keys in command history
fiber account add-provider openai --api-key-from-env

๐Ÿข Team/Organization Setup

For team environments, you can share provider configurations:

# Export provider config for team sharing
fiber account export-providers > team-providers.json

# Import shared provider config
fiber account import-providers team-providers.json

# Or configure organization-wide defaults
fiber org set-default-providers --config org-providers.yaml
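Before importing a shared file, a quick structural check catches truncated or hand-edited exports. The schema assumed below (a top-level `providers` list of objects each carrying a `name`) is a guess for illustration; inspect your own `team-providers.json` for the actual shape:

```python
import json

def provider_names(config_text: str) -> list[str]:
    """Parse an exported provider config and return the provider names."""
    data = json.loads(config_text)
    providers = data.get("providers")
    if not isinstance(providers, list):
        raise ValueError('expected a top-level "providers" list')
    names = []
    for entry in providers:
        if not isinstance(entry, dict) or "name" not in entry:
            raise ValueError("every provider entry needs a name field")
        names.append(entry["name"])
    return names
```

Note that exported configs may contain API keys; treat the file like any other secret when sharing it with a team.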

🔧 Troubleshooting

❌ "Invalid API Key"

Solution:

  • Double-check your API key from the provider's dashboard
  • Ensure no extra spaces or characters
  • Try regenerating the API key
# Re-add with new key
fiber account update-provider openai --api-key NEW_KEY
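A surprising number of "invalid key" errors are copy-paste artifacts: stray whitespace, a truncated key, or the wrong value pasted entirely. A local sanity check catches the obvious cases before any network call; the `sk-` prefix applies to OpenAI keys specifically, and the length cutoff is a rough heuristic:

```python
def looks_like_openai_key(raw: str) -> bool:
    """Cheap local check for copy-paste damage; not a substitute for a real API call."""
    key = raw.strip()
    return (
        key.startswith("sk-")
        and len(key) > 20
        and not any(c.isspace() for c in key)
    )
```

If this returns False, re-copy the key from the provider dashboard before regenerating it.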

โŒ "Rate Limit Exceeded"

Solution:

  • Check your provider's usage dashboard
  • Upgrade your provider plan if needed
  • Configure rate limiting in Fiberwise
# Check current usage
fiber account usage-stats openai

โŒ "Connection Timeout"

Solution:

  • Check your internet connection
  • Verify firewall settings
  • For local providers, ensure the service is running
# Test with verbose output
fiber account test-provider openai --verbose

โŒ "Model Not Found"

Solution:

  • Check available models for your provider
  • Ensure you have access to the requested model
  • Update to a supported model name
# List available models
fiber account list-models openai

🚀 Next Steps

Now that your LLM provider is configured, you're ready to build AI applications!

📚 Start Tutorial

Build your first AI chat app with the Hello World tutorial

Start Building

🤖 Explore Agents

Learn about creating and managing AI agents

Learn Agents

💡 Example Apps

Browse pre-built applications you can deploy

Browse Examples

โš™๏ธ Advanced Config

Deep dive into provider configuration and optimization

Advanced Setup