Database Developer Guide
Build database-agnostic applications with automatic query translation, cross-database compatibility, and production-ready patterns.
Overview
FiberWise Core Web provides a database-agnostic development experience with automatic query translation. You can write SQL using any supported parameter style (PostgreSQL $1, SQLite ?, MySQL %s, or named :name parameters), and the query adapter automatically converts your queries to work with your target database.
🚀 Write once, run anywhere - switch databases by changing one environment variable
Supported Databases
SQLite
Perfect for development with zero configuration. Automatic file creation and async locks.
PostgreSQL
Production-ready with connection pooling, JSONB support, and full ACID compliance.
DuckDB
Analytics-focused with columnar storage and thread-local connections for data processing.
Quick Start
1. Choose Your Database
Set your database provider via environment variable:
# SQLite (Default - Perfect for development)
DATABASE_URL=sqlite:///./.fiberwise-core/fiberwise.db
# PostgreSQL (Production-ready)
DATABASE_URL=postgresql://user:password@localhost:5432/fiberwise
# DuckDB (Analytics-focused)
DATABASE_URL=duckdb:///./.fiberwise-core/analytics.duckdb
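Under the hood, the provider is typically chosen from the URL scheme. As a rough illustration (the select_provider helper below is hypothetical, not part of the FiberWise API):

```python
from urllib.parse import urlparse

def select_provider(database_url: str) -> str:
    """Map a DATABASE_URL scheme to a provider name (illustrative only)."""
    scheme = urlparse(database_url).scheme
    # postgresql:// URLs sometimes appear with the shorter postgres:// scheme
    mapping = {
        "sqlite": "sqlite",
        "postgresql": "postgresql",
        "postgres": "postgresql",
        "duckdb": "duckdb",
    }
    try:
        return mapping[scheme]
    except KeyError:
        raise ValueError(f"Unsupported database scheme: {scheme!r}")

print(select_provider("sqlite:///./.fiberwise-core/fiberwise.db"))  # sqlite
```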
2. Write SQL in Your Preferred Dialect
The query adapter automatically translates between database parameter styles:
PostgreSQL Style (Recommended)
# Write once, works everywhere
await db.fetch_one(
    "SELECT * FROM users WHERE email = $1 AND active = $2",
    "user@example.com", True
)
SQLite Style
# Also works everywhere
await db.fetch_one(
    "SELECT * FROM users WHERE email = ? AND active = ?",
    "user@example.com", True
)
Named Parameters
# Query adapter handles this too
await db.fetch_one(
    "SELECT * FROM users WHERE email = :email AND active = :active",
    {"email": "user@example.com", "active": True}
)
3. The Magic: Automatic Translation
Your Code (PostgreSQL) | Translated for SQLite | Translated for DuckDB
---|---|---
WHERE id = $1 | WHERE id = ? | WHERE id = ?
WHERE name = $1 AND age = $2 | WHERE name = ? AND age = ? | WHERE name = ? AND age = ?
NOW() | CURRENT_TIMESTAMP | CURRENT_TIMESTAMP
ILIKE | LIKE | LIKE
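Conceptually, the placeholder translation can be done with a single substitution pass. A minimal sketch of the idea (the real adapter also rewrites functions like NOW() and ILIKE; this is not its actual implementation):

```python
import re

def to_qmark(query: str, *args):
    """Translate $1, $2, ... placeholders to ? and reorder args to match.

    Illustrative sketch only - the real query adapter handles more cases.
    """
    order = []

    def repl(match):
        order.append(int(match.group(1)) - 1)  # $1 refers to the first argument
        return "?"

    translated = re.sub(r"\$(\d+)", repl, query)
    return translated, tuple(args[i] for i in order)

q, params = to_qmark(
    "SELECT * FROM users WHERE email = $1 AND active = $2",
    "user@example.com", True
)
print(q)       # SELECT * FROM users WHERE email = ? AND active = ?
print(params)  # ('user@example.com', True)
```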
Database Operations
Basic CRUD Operations
from api.core.dependencies import get_database
async def create_user(db, user_data):
    """Create a new user - works with any database"""
    user_id = await db.fetch_val(
        """
        INSERT INTO users (email, username, display_name)
        VALUES ($1, $2, $3)
        RETURNING id
        """,
        user_data.email,
        user_data.username,
        user_data.display_name
    )
    return user_id

async def get_user_by_email(db, email):
    """Fetch user by email - automatic query translation"""
    return await db.fetch_one(
        "SELECT * FROM users WHERE email = $1",
        email
    )

async def update_user(db, user_id, updates):
    """Update user data with dynamic fields"""
    # Column names cannot be parameterized - validate or whitelist the keys
    # in `updates` before interpolating them into the query.
    set_clause = ", ".join(f"{k} = ${i + 2}" for i, k in enumerate(updates))
    query = f"UPDATE users SET {set_clause} WHERE id = $1"
    await db.execute(query, user_id, *updates.values())

async def delete_user(db, user_id):
    """Delete user - supports all databases"""
    await db.execute("DELETE FROM users WHERE id = $1", user_id)
Advanced Queries
Complex Joins
async def get_user_apps(db, user_id):
    """Complex query with joins - works across all databases"""
    return await db.fetch_all(
        """
        SELECT
            a.app_id,
            a.name AS app_name,
            a.description,
            av.version,
            av.status
        FROM apps a
        JOIN app_versions av ON a.current_live_app_version_id = av.app_version_id
        WHERE a.creator_user_id = $1
        ORDER BY a.created_at DESC
        """,
        user_id
    )
JSON Operations
import json

async def update_user_metadata(db, user_id, metadata):
    """JSON operations work across databases"""
    await db.execute(
        """
        UPDATE users
        SET metadata = $1, updated_at = CURRENT_TIMESTAMP
        WHERE id = $2
        """,
        json.dumps(metadata),
        user_id
    )
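Reading the value back may require symmetric handling: SQLite returns the TEXT column as a string, while PostgreSQL JSONB drivers often return an already-decoded object. A small normalizing helper, sketched under that assumption about driver behavior:

```python
import json

def decode_metadata(raw):
    """Normalize a metadata column value to a dict.

    SQLite hands back the stored TEXT as a str; a JSONB-aware driver may
    hand back a dict directly. (Sketch - driver behavior is an assumption.)
    """
    if raw is None:
        return {}
    return json.loads(raw) if isinstance(raw, str) else raw

print(decode_metadata('{"theme": "dark"}'))  # {'theme': 'dark'}
print(decode_metadata({"theme": "dark"}))    # {'theme': 'dark'}
```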
Analytics Queries
async def get_app_analytics(db, date_range):
    """Analytics query - DuckDB will optimize, others work fine"""
    return await db.fetch_all(
        """
        SELECT
            DATE(created_at) AS date,
            COUNT(*) AS activations,
            COUNT(DISTINCT agent_id) AS unique_agents,
            AVG(duration_ms) AS avg_duration
        FROM agent_activations
        WHERE created_at >= $1 AND created_at <= $2
        GROUP BY DATE(created_at)
        ORDER BY date DESC
        """,
        date_range.start,
        date_range.end
    )
Transaction Management
async def transfer_app_ownership(db, app_id, old_owner_id, new_owner_id):
    """Atomic operations with transactions"""
    async with db.transaction():
        # Verify current ownership
        current_owner = await db.fetch_val(
            "SELECT creator_user_id FROM apps WHERE app_id = $1",
            app_id
        )
        if current_owner != old_owner_id:
            raise ValueError("Invalid current owner")

        # Transfer ownership
        await db.execute(
            "UPDATE apps SET creator_user_id = $1, updated_at = CURRENT_TIMESTAMP WHERE app_id = $2",
            new_owner_id, app_id
        )

        # Log the transfer
        await db.execute(
            """
            INSERT INTO app_transfer_log (app_id, old_owner_id, new_owner_id, transferred_at)
            VALUES ($1, $2, $3, CURRENT_TIMESTAMP)
            """,
            app_id, old_owner_id, new_owner_id
        )
Migration Development
Why Migrations Don't Use Query Adapter (Yet)
Migrations currently execute raw SQL for important reasons:
- Schema Reliability: DDL operations need predictable, database-specific syntax
- Historical Stability: Existing migrations must remain unchanged
- Rollback Safety: Direct SQL ensures exact control over schema changes
- Complex Schema Operations: Some DDL requires database-specific features
However, the query adapter interface is ready - we just need to integrate it safely into the migration system.
Writing Cross-Database Migrations
For maximum portability, write standard SQL that works across databases:
✅ Good: Cross-Database Compatible
-- migrations/001_add_user_preferences.sql
-- Works on SQLite, PostgreSQL, DuckDB
ALTER TABLE users ADD COLUMN preferences TEXT DEFAULT '{}';
CREATE INDEX idx_users_preferences ON users(preferences);
-- Use standard SQL functions
UPDATE users SET updated_at = CURRENT_TIMESTAMP WHERE preferences IS NULL;
✨ Better: Database-Specific Variants
SQLite Version
-- migrations/sqlite/001_add_user_preferences.sql
ALTER TABLE users ADD COLUMN preferences TEXT DEFAULT '{}';
CREATE INDEX idx_users_preferences ON users(preferences);
PostgreSQL Version
-- migrations/postgresql/001_add_user_preferences.sql
ALTER TABLE users ADD COLUMN preferences JSONB DEFAULT '{}';
CREATE INDEX idx_users_preferences ON users USING GIN (preferences);
DuckDB Version
-- migrations/duckdb/001_add_user_preferences.sql
ALTER TABLE users ADD COLUMN preferences JSON DEFAULT '{}';
CREATE INDEX idx_users_preferences ON users(preferences);
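With the per-database layout above, a migration runner needs to pick the right directory for the active provider. One way this could be wired up (the migration_dir helper is hypothetical, not the FiberWise API):

```python
from pathlib import Path

def migration_dir(provider: str, root: str = "migrations") -> Path:
    """Pick the provider-specific migrations directory, falling back to the
    shared cross-database directory. (Illustrative - not the FiberWise API.)
    """
    specific = Path(root) / provider
    return specific if specific.is_dir() else Path(root)

# With no provider-specific folder present, the shared root is used
print(migration_dir("postgresql", root="no_such_dir"))  # no_such_dir
```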
🚀 Future: Query Adapter for Migrations
Coming Soon: We're integrating the query adapter into the migration system. You'll then be able to write data migrations in any supported parameter style (PostgreSQL, SQLite, MySQL, or named parameters) and they'll automatically work across all databases.
Schema migrations (DDL) will likely remain database-specific for safety, but data migrations (DML) will gain full query adaptation support.
# Future: Write data migrations in any style
async def migrate_user_statuses(migration_manager):
    # PostgreSQL style - will work everywhere
    await migration_manager.execute_adapted(
        "UPDATE users SET status = $1 WHERE created_at < $2",
        "migrated", datetime(2024, 1, 1)
    )

    # Or SQLite style - also will work everywhere
    await migration_manager.execute_adapted(
        "UPDATE users SET last_login = ? WHERE last_login IS NULL",
        datetime.now()
    )
Development Workflow
1. Local Development (SQLite)
# Start with SQLite for rapid development
export DATABASE_URL="sqlite:///./.fiberwise-core/dev.db"
python main.py
2. Testing (PostgreSQL)
# Test with PostgreSQL to catch compatibility issues
export DATABASE_URL="postgresql://postgres:password@localhost:5432/fiberwise_test"
python -m pytest
3. Analytics Development (DuckDB)
# Use DuckDB for analytics feature development
export DATABASE_URL="duckdb:///./.fiberwise-core/analytics.duckdb"
python analytics_script.py
4. Production (PostgreSQL + Pooling)
# Production with optimized connection settings
export DATABASE_URL="postgresql://user:pass@prod-host:5432/fiberwise?pool_min_size=5&pool_max_size=20"
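The pool settings ride along in the URL's query string, so they can be extracted with the standard library. A sketch of how that parsing might look (pool_settings is a hypothetical helper; the pool_min_size/pool_max_size names come from the example above):

```python
from urllib.parse import urlparse, parse_qs

def pool_settings(database_url: str, default_min: int = 1, default_max: int = 10):
    """Extract pool sizing hints from a DATABASE_URL query string (sketch)."""
    params = parse_qs(urlparse(database_url).query)
    # parse_qs returns lists of values; take the first occurrence of each key
    min_size = int(params.get("pool_min_size", [default_min])[0])
    max_size = int(params.get("pool_max_size", [default_max])[0])
    return min_size, max_size

url = "postgresql://user:pass@prod-host:5432/fiberwise?pool_min_size=5&pool_max_size=20"
print(pool_settings(url))  # (5, 20)
```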
Best Practices
1. Always Use Parameters
✅ Good - Uses parameterized queries
await db.fetch_one("SELECT * FROM users WHERE email = $1", email)
❌ Bad - SQL injection risk
await db.fetch_one(f"SELECT * FROM users WHERE email = '{email}'")
2. Prefer PostgreSQL-Style Parameters
✅ Recommended - Clear parameter numbering
query = "SELECT * FROM users WHERE email = $1 AND status = $2"
✅ Also fine - Will be auto-translated
query = "SELECT * FROM users WHERE email = ? AND status = ?"
3. Use Transactions for Related Operations
✅ Good - Atomic operations
async with db.transaction():
    user_id = await db.execute("INSERT INTO users ...")
    await db.execute("INSERT INTO user_profiles ...", user_id)
❌ Risk - Operations might be partially applied
user_id = await db.execute("INSERT INTO users ...")
await db.execute("INSERT INTO user_profiles ...", user_id)
4. Handle Database-Specific Features Gracefully
async def get_user_search_results(db, query_text):
    """Handle database-specific search capabilities"""
    # Try PostgreSQL full-text search first
    try:
        if hasattr(db, 'provider') and db.provider == 'postgresql':
            return await db.fetch_all(
                "SELECT * FROM users WHERE to_tsvector(display_name) @@ plainto_tsquery($1)",
                query_text
            )
    except Exception:
        # Fall through to the portable search below
        pass

    # Fallback to basic LIKE search for all databases
    return await db.fetch_all(
        "SELECT * FROM users WHERE display_name LIKE $1",
        f"%{query_text}%"
    )
Troubleshooting
Common Issues
1. Query Parameter Mismatch
❌ Error: Wrong number of parameters
await db.fetch_one("SELECT * FROM users WHERE id = $1 AND email = $2", user_id)
✅ Fix: Provide all parameters
await db.fetch_one("SELECT * FROM users WHERE id = $1 AND email = $2", user_id, email)
2. Database-Specific Syntax
❌ Error: PostgreSQL-specific syntax in SQLite migration
CREATE TABLE users (
    id SERIAL PRIMARY KEY,  -- SERIAL doesn't exist in SQLite
    email VARCHAR(255) UNIQUE NOT NULL
);
✅ Fix: Use compatible syntax
CREATE TABLE users (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    email TEXT UNIQUE NOT NULL
);
3. Connection Issues
Check your DATABASE_URL and ensure the database server is running
# Debug: Test connection
from api.core.dependencies import get_database_manager
async def test_connection():
    db_manager = get_database_manager()
    is_healthy = await db_manager.health_check()
    print(f"Database healthy: {is_healthy}")
Getting Help
- Database Issues: Check logs in .fiberwise-core/logs/
- Migration Problems: Run migrations with DEBUG=true for detailed output
- Performance: Use database-specific query analyzers (EXPLAIN, EXPLAIN ANALYZE)
- Query Translation: The query adapter logs translations in debug mode
Remember: The goal is to write database-agnostic code that performs well across SQLite, PostgreSQL, and DuckDB. The query adapter handles the translation complexity for you! 🚀