
Stop building platform plumbing

Fabric handles identity, auth, job orchestration, and AI provider routing so you can focus on the product. One Rust binary. Postgres. That's it.
Without Fabric
  • Build auth middleware
  • Build multi-tenant org model
  • Build RBAC from scratch
  • Build job queue + retries
  • Build provider abstraction
  • Build SSE event system
  • Build cost tracking
  • Build webhook delivery
  • ~3 months before you ship a feature
With Fabric
just run
Identity
Orgs, teams, memberships, invitations, service accounts, API keys. Your app reads from Fabric — never owns user state.
Auth
Four roles (Owner/Admin/Member/Viewer), effective permission computation, batch checks. Route-level enforcement built in.
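A minimal sketch of what "effective permission computation" with a four-role hierarchy could look like. The role names come from the page; the grant sets and helper names are illustrative assumptions, not Fabric's actual API.

```python
# Sketch of a role hierarchy where each role inherits the grants of
# the roles below it. Permission names here are made up for illustration.
ROLE_ORDER = ["Viewer", "Member", "Admin", "Owner"]  # lowest -> highest

GRANTS = {
    "Viewer": {"read"},
    "Member": {"write"},
    "Admin":  {"invite", "manage_keys"},
    "Owner":  {"billing", "delete_org"},
}

def effective_permissions(role: str) -> set[str]:
    """A role's effective permissions: its own grants plus all lower roles'."""
    idx = ROLE_ORDER.index(role)
    perms: set[str] = set()
    for r in ROLE_ORDER[: idx + 1]:
        perms |= GRANTS[r]
    return perms

def batch_check(role: str, wanted: list[str]) -> dict[str, bool]:
    """One permission computation answers many checks at once."""
    perms = effective_permissions(role)
    return {p: p in perms for p in wanted}
```

Computing the effective set once and answering many checks against it is what makes batch checks cheap at the route level.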
Workflows
Python DAGs with .then(), .fork(), retries, timeouts, shared context. Write tasks as async functions, run them on N workers.
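A toy sketch of the chaining style described above. The `.then()` and `.fork()` names come from the page; everything else (the `Step` class, the shared-dict context) is an assumption for illustration, not the real `fabric_platform` DSL.

```python
# Toy DAG builder: tasks are async functions sharing one context dict.
import asyncio

class Step:
    def __init__(self, fn, children=None):
        self.fn, self.children = fn, children or []

    def then(self, fn):
        """Chain a single follow-up step; returns it for further chaining."""
        nxt = Step(fn)
        self.children.append(nxt)
        return nxt

    def fork(self, *fns):
        """Branch into several steps that run concurrently."""
        branches = [Step(f) for f in fns]
        self.children.extend(branches)
        return branches

    async def run(self, ctx):
        await self.fn(ctx)  # run this task
        await asyncio.gather(*(c.run(ctx) for c in self.children))

async def fetch(ctx):  ctx["text"] = "hello"
async def upper(ctx):  ctx["upper"] = ctx["text"].upper()
async def length(ctx): ctx["len"] = len(ctx["text"])

root = Step(fetch)
root.fork(upper, length)   # two branches share the same context
ctx = {}
asyncio.run(root.run(ctx))
```

In the real system the workers executing these steps would be separate processes; here everything runs in one event loop just to show the shape.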
AI Routing
OpenAI, Anthropic, Gemini, Ollama, Whisper, fal.ai, ComfyUI. Tier-based selection, cost-aware fallback, health checks, streaming.
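A sketch of what tier-based selection with cost-aware fallback might mean in practice. Provider names mirror the list above; the tiers, costs, and health flags are made-up illustration data, not Fabric's routing table.

```python
# Pick the cheapest healthy provider in the requested tier; if the
# whole tier is unhealthy, fall through to the next tier.
PROVIDERS = [
    {"name": "openai",    "tier": "premium",  "cost": 1.0, "healthy": True},
    {"name": "anthropic", "tier": "premium",  "cost": 0.9, "healthy": True},
    {"name": "gemini",    "tier": "standard", "cost": 0.4, "healthy": False},
    {"name": "ollama",    "tier": "standard", "cost": 0.0, "healthy": True},
]

def pick_provider(tier: str):
    tiers = ["premium", "standard"]
    for t in tiers[tiers.index(tier):]:
        healthy = [p for p in PROVIDERS if p["tier"] == t and p["healthy"]]
        if healthy:
            return min(healthy, key=lambda p: p["cost"])["name"]
    return None  # nothing healthy anywhere
```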
Events
SSE, WebSocket, and HMAC-signed webhooks. Every state transition. Per-run replay. No polling.
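On the consumer side, an HMAC-signed webhook is verified by recomputing the signature over the raw body. The header name and hex encoding below are assumptions (Fabric's actual signing scheme may differ), but the SHA-256 HMAC and constant-time comparison are the standard pattern.

```python
import hashlib
import hmac

def sign(secret: bytes, body: bytes) -> str:
    """Hex-encoded HMAC-SHA256 over the raw request body."""
    return hmac.new(secret, body, hashlib.sha256).hexdigest()

def verify(secret: bytes, body: bytes, signature: str) -> bool:
    """Constant-time comparison to avoid timing side channels."""
    return hmac.compare_digest(sign(secret, body), signature)

secret = b"whsec_example"            # assumed shared secret format
body = b'{"event":"run.completed"}'  # assumed payload shape
sig = sign(secret, body)
```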
Runtimes
Native (Rust), Tool (ffmpeg, yt-dlp), Provider (AI), Wasm (sandboxed plugins). Same semantic operation, four execution strategies.
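"Same semantic operation, four execution strategies" can be read as a dispatch table keyed by runtime kind. The runtime names come from the page; the dispatch mechanics below are an illustrative assumption.

```python
# One operation name, routed to whichever runtime is configured for it.
def run_native(op, args):   return f"native:{op}({args})"
def run_tool(op, args):     return f"tool:{op}({args})"      # e.g. ffmpeg
def run_provider(op, args): return f"provider:{op}({args})"  # e.g. an AI API
def run_wasm(op, args):     return f"wasm:{op}({args})"      # sandboxed plugin

RUNTIMES = {
    "native": run_native,
    "tool": run_tool,
    "provider": run_provider,
    "wasm": run_wasm,
}

def execute(op: str, args: str, runtime: str) -> str:
    return RUNTIMES[runtime](op, args)
```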

Two planes. The control plane owns all state — identity, tenancy, RBAC, workflows, providers, cost, audit. HTTP on :3001, gRPC on :3002, SSE and WebSocket for real-time.

The execution plane is stateless. Workers claim steps with SELECT … FOR UPDATE SKIP LOCKED, execute them, report results. Scale by adding more workers.

Everything goes through Postgres. No Redis, no message broker, no distributed consensus. The database is the queue.
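The claim pattern described above might look like the query below; the table and column names are assumed, not Fabric's real schema. The accompanying in-process simulation shows the guarantee `SKIP LOCKED` buys you: with many workers, each step is claimed exactly once.

```python
import threading

# Sketch of the Postgres claim query (assumed schema). FOR UPDATE
# SKIP LOCKED lets each worker grab a different unclaimed row without
# blocking on rows its peers have already locked.
CLAIM_SQL = """
UPDATE steps SET state = 'running', claimed_by = %(worker)s
WHERE id = (
    SELECT id FROM steps WHERE state = 'queued'
    ORDER BY id
    FOR UPDATE SKIP LOCKED
    LIMIT 1
)
RETURNING id;
"""

# In-process stand-in: a lock plays the role of row locking.
queue = list(range(20))
claimed: list[int] = []
lock = threading.Lock()

def worker():
    while True:
        with lock:              # claim under the "row lock"
            if not queue:
                return
            step = queue.pop(0)
        claimed.append(step)    # "execute" outside the lock

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
```

Because claiming and executing are separated, workers hold the lock only long enough to pick a step, which is what makes "scale by adding more workers" work.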

Control Plane · :3001 HTTP · :3002 gRPC · SSE/WS
        ↕ Postgres ↕
Workers (N) · Native · Tool · Provider · Wasm
brew install just protobuf
just infra-up && cp .env.example .env
just run # control plane
just executor # workers
from fabric_platform import FabricClient

fabric = FabricClient()

# topic in, production video out
run_id = fabric.run_workflow("global/ai-shorts", context={
    "topic": "Why sleep is a superpower",
})
result = fabric.wait_for_run(run_id)
print(result["output"]["final_video_path"])

Fabric ships workflows you can run immediately — or use as starting points for your own.