Stop building platform plumbing
Fabric handles identity, auth, job orchestration, and AI provider routing so you can focus on the product.
One Rust binary. Postgres. That's it.
Without Fabric
- Build auth middleware
- Build multi-tenant org model
- Build RBAC from scratch
- Build job queue + retries
- Build provider abstraction
- Build SSE event system
- Build cost tracking
- Build webhook delivery
- ~3 months before you ship a feature
With Fabric

```shell
just run
```

What's inside

Identity
Orgs, teams, memberships, invitations, service accounts, API keys. Your app reads from Fabric — never owns user state.
Auth
Four roles (Owner/Admin/Member/Viewer), effective permission computation, batch checks. Route-level enforcement built in.
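Effective permissions over an ordered role hierarchy like this usually reduce to a rank comparison. A toy sketch of the idea, including a batch check — the role order comes from the text above, but the permission names and functions are invented for illustration and are not Fabric's API:

```python
# Hypothetical sketch of effective-permission computation over Fabric's
# four roles. Role order is from the docs; permission names are invented.
ROLE_RANK = {"Viewer": 0, "Member": 1, "Admin": 2, "Owner": 3}

# Minimum role rank required for each (hypothetical) permission.
REQUIRED_RANK = {
    "read": 0,
    "write": 1,
    "manage_members": 2,
    "delete_org": 3,
}

def effective_permissions(role: str) -> set[str]:
    # Everything at or below the role's rank is granted.
    rank = ROLE_RANK[role]
    return {p for p, need in REQUIRED_RANK.items() if rank >= need}

def check_batch(role: str, permissions: list[str]) -> dict[str, bool]:
    # Batch check: one hierarchy computation, many answers.
    granted = effective_permissions(role)
    return {p: p in granted for p in permissions}
```

Computing the full effective set once and answering many checks against it is what makes batch checks cheap.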
Workflows
Python DAGs with .then(), .fork(), retries, timeouts, and shared context. Write tasks as async functions; run them on N workers.
AI Routing
OpenAI, Anthropic, Gemini, Ollama, Whisper, fal.ai, ComfyUI. Tier-based selection, cost-aware fallback, health checks, streaming.
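Cost-aware fallback generally means trying the cheapest healthy provider first and skipping ones that fail health checks. A toy sketch of that policy — the provider names come from the list above, but the prices, health flags, and selection logic are illustrative assumptions, not Fabric's routing implementation:

```python
# Toy cost-aware provider fallback. Provider names are from the text;
# the numbers and policy here are assumptions for illustration only.
PROVIDERS = [
    {"name": "ollama",    "cost_per_1k": 0.0,   "healthy": False},
    {"name": "gemini",    "cost_per_1k": 0.075, "healthy": True},
    {"name": "openai",    "cost_per_1k": 0.15,  "healthy": True},
    {"name": "anthropic", "cost_per_1k": 0.25,  "healthy": True},
]

def pick_provider(providers: list[dict]) -> str:
    # Cheapest healthy provider wins; unhealthy ones are skipped.
    for p in sorted(providers, key=lambda p: p["cost_per_1k"]):
        if p["healthy"]:
            return p["name"]
    raise RuntimeError("no healthy provider")

choice = pick_provider(PROVIDERS)
```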
Events
SSE, WebSocket, and HMAC-signed webhooks. Every state transition. Per-run replay. No polling.
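HMAC-signed webhooks are verified by recomputing the signature over the raw request body with a shared secret. A generic sketch of receiver-side verification — the hex-digest format and secret value are assumptions for illustration; check Fabric's webhook docs for the actual header name and encoding:

```python
import hashlib
import hmac

# Generic HMAC-SHA256 webhook verification. The signature encoding
# (hex digest of the raw body) is an assumption, not Fabric's spec.
def verify_webhook(secret: bytes, body: bytes, signature_hex: str) -> bool:
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking a timing side channel.
    return hmac.compare_digest(expected, signature_hex)

secret = b"whsec_example"  # hypothetical secret
body = b'{"kind":"run.completed","run_id":"r_123"}'
sig = hmac.new(secret, body, hashlib.sha256).hexdigest()
ok = verify_webhook(secret, body, sig)
```

Always verify against the raw bytes as received; re-serializing parsed JSON can change the byte sequence and break the signature.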
Runtimes
Native (Rust), Tool (ffmpeg, yt-dlp), Provider (AI), Wasm (sandboxed plugins). Same semantic operation, four execution strategies.
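The Workflows card above mentions chaining tasks with `.then()` and fanning out with `.fork()`. A minimal toy model of that builder style — this class is invented to illustrate the shape of the API, it is not Fabric's actual SDK:

```python
import asyncio

# Illustrative-only .then()/.fork() DAG builder; not Fabric's real SDK.
class Dag:
    def __init__(self):
        self.steps = []  # each entry is a group of tasks run in parallel

    def then(self, task):
        self.steps.append([task])       # sequential step
        return self

    def fork(self, *tasks):
        self.steps.append(list(tasks))  # parallel fan-out
        return self

    async def run(self, context: dict) -> dict:
        for group in self.steps:
            # Run the group concurrently; tasks share one context dict.
            await asyncio.gather(*(t(context) for t in group))
        return context

async def fetch(ctx):
    ctx["fetched"] = True

async def render_a(ctx):
    ctx.setdefault("renders", []).append("a")

async def render_b(ctx):
    ctx.setdefault("renders", []).append("b")

result = asyncio.run(Dag().then(fetch).fork(render_a, render_b).run({}))
```

In Fabric the groups would be claimed and executed by N workers rather than one event loop, but the sequential/fan-out structure is the same.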
How it’s built
Two planes. The control plane owns all state — identity, tenancy, RBAC, workflows, providers, cost, audit. HTTP on :3001, gRPC on :3002, SSE and WebSocket for real-time.
The execution plane is stateless. Workers claim steps with SELECT … FOR UPDATE SKIP LOCKED, execute them, report results. Scale by adding more workers.
Everything goes through Postgres. No Redis, no message broker, no distributed consensus. The database is the queue.
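The claim step described above fits in a single query. A sketch assuming a hypothetical `steps` table with `status`, `claimed_by`, and `created_at` columns — the schema and function are assumptions, but the `FOR UPDATE SKIP LOCKED` clause is the one named in the text, and it is what lets N workers poll the same table without blocking each other:

```python
# Hypothetical claim query for a Postgres-backed step queue. Table and
# column names are assumptions; the locking clause is from the text.
CLAIM_STEP_SQL = """
UPDATE steps
SET status = 'running', claimed_by = %(worker_id)s
WHERE id = (
    SELECT id FROM steps
    WHERE status = 'pending'
    ORDER BY created_at
    LIMIT 1
    FOR UPDATE SKIP LOCKED
)
RETURNING id;
"""

def claim_step(conn, worker_id: str):
    # conn: any DB-API connection to Postgres (e.g. psycopg).
    # Returns a claimed step id, or None if the queue is empty.
    with conn.cursor() as cur:
        cur.execute(CLAIM_STEP_SQL, {"worker_id": worker_id})
        row = cur.fetchone()
        return row[0] if row else None
```

SKIP LOCKED makes concurrent workers skip rows another transaction already holds, so each pending step is claimed exactly once.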
```
Control Plane    :3001 HTTP · :3002 gRPC · SSE/WS
      │
  Postgres
      │
Workers (N)      Native · Tool · Provider · Wasm
```
Try it
```shell
brew install just protobuf
just infra-up && cp .env.example .env
just run       # control plane
just executor  # workers
```

```python
from fabric_platform import FabricClient

fabric = FabricClient()

# topic in, production video out
run_id = fabric.run_workflow("global/ai-shorts", context={
    "topic": "Why sleep is a superpower",
})

result = fabric.wait_for_run(run_id)
print(result["output"]["final_video_path"])
```

```typescript
import { FabricClient } from "@fabric-platform/sdk";

const fabric = new FabricClient();

const run = await fabric.workflows.runs.submitRun({
  workflowSlug: "global/ai-shorts",
  input: { topic: "Why sleep is a superpower" },
});

for await (const event of fabric.workflows.runs.streamRunEvents(run.id)) {
  console.log(event.kind);
}
```

```shell
curl http://localhost:3001/healthz

curl -X POST http://localhost:3001/v1/workflows/global%2Fai-shorts/runs \
  -H 'content-type: application/json' \
  -d '{"context":{"topic":"Why sleep is a superpower"}}'
```

Built-in workflows
Fabric ships workflows you can run immediately — or use as starting points for your own.
AI Shorts
Topic in, 45-second video out. Script, actor, voiceover, b-roll, music, subtitles — 6-way parallel.
Problem Intelligence
Scrape Reddit/HN/YouTube/Twitter. Extract problems, cluster, score, generate startup ideas and specs.
Clip Generator
Drop a YouTube URL. Get vertical clips with face tracking, subtitles, hook overlays, and effects.
Research → Video
Deep research a topic, generate hooks, fan out N parallel video renders. One command.