# Quick Start
## Prerequisites

```shell
brew install just protobuf
```

You also need Rust stable (1.75+) and Docker installed.
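Before continuing, it can help to confirm each tool is actually on your `PATH`. A small sketch (the tool list mirrors the prerequisites above; adjust it to taste):

```shell
# Report which required tools are present on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

for tool in just protoc docker cargo; do
  check_tool "$tool"
done
```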
## Start Infrastructure

```shell
# Start Postgres + Ollama + Whisper
just infra-up

# Build Docker tool images (ffmpeg, whisper, yt-dlp, python-ml)
just docker-build-tools

# Configure environment
cp .env.example .env
```

## Run the Control Plane
```shell
just run
```

The server starts on `127.0.0.1:3001`.
## Start the Executor (separate terminal)

```shell
just executor
```

The executor connects to Postgres and begins claiming workflow steps for execution.
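A common way to let several executors claim steps from Postgres without double-processing is `SELECT ... FOR UPDATE SKIP LOCKED`. Whether Fabric's executor uses this exact query is an assumption, and the `workflow_steps` table and its columns below are hypothetical, but it illustrates the claim semantics:

```python
# Hypothetical claim query: atomically mark one pending step as running and
# return it, skipping rows that other executors currently hold locks on.
CLAIM_SQL = """
UPDATE workflow_steps
SET status = 'running', claimed_at = now()
WHERE id = (
    SELECT id
    FROM workflow_steps
    WHERE status = 'pending'
    ORDER BY created_at
    LIMIT 1
    FOR UPDATE SKIP LOCKED
)
RETURNING id, payload;
"""

def claim_step(conn):
    """Claim at most one pending step using any DB-API connection (e.g. psycopg).

    Returns the claimed row, or None when no unclaimed step is available.
    """
    with conn, conn.cursor() as cur:
        cur.execute(CLAIM_SQL)
        return cur.fetchone()
```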
## Verify It Works

TypeScript:

```typescript
import { FabricClient } from "@fabric-platform/sdk";

const fabric = new FabricClient();

// List available AI providers
const providers = await fabric.listProviders();
console.log(providers);
```

Python:

```python
from fabric_platform import FabricClient

fabric = FabricClient()

# List available AI providers
providers = fabric.list_providers()
print(providers)
```

Rust:

```rust
use fabric_sdk::FabricClient;

let client = FabricClient::new("http://localhost:3001", "")?;
let providers = client.list_providers().await?;
println!("{:?}", providers);
```

curl:

```shell
# Health check
curl http://localhost:3001/healthz

# List available AI providers
curl http://localhost:3001/v1/providers
```

## Create Your First Resources
### Create an Organization

TypeScript:

```typescript
const org = await fabric.createOrganization({
  slug: "acme",
  name: "Acme Corp",
});
```

Python:

```python
org = fabric.create_organization(slug="acme", name="Acme Corp")
```

Rust:

```rust
let org = client.create_organization("acme", "Acme Corp").await?;
```

curl:

```shell
curl -X POST http://localhost:3001/v1/organizations \
  -H 'content-type: application/json' \
  -d '{"slug":"acme","name":"Acme Corp"}'
```

### Submit a Job
TypeScript:

```typescript
const job = await fabric.createJob({
  modality: "text",
  input: { prompt: "Hello from Fabric" },
  organizationId: org.id,
});
console.log("Job:", job.id, job.status);
```

Python:

```python
job = fabric.create_job(
    modality="text",
    input={"prompt": "Hello from Fabric"},
    organization_id=org["id"],
)
print("Job:", job["id"], job["status"])
```

Rust:

```rust
let job = client.create_job(serde_json::json!({
    "modality": "text",
    "input": {"prompt": "Hello from Fabric"},
    "organization_id": org.id
})).await?;
println!("Job: {} {}", job.id, job.status);
```

curl:

```shell
curl -X POST http://localhost:3001/v1/jobs \
  -H 'content-type: application/json' \
  -d '{
    "modality": "text",
    "input": {"prompt": "Hello from Fabric"},
    "organization_id": "<org-id>"
  }'
```
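Job creation returns immediately with an initial status; the executor finishes the work asynchronously. A minimal Python sketch of polling until the job settles, assuming the SDK exposes a `get_job(job_id)` method and terminal statuses named `completed`/`failed` (both are assumptions; check the actual client):

```python
import time

# Assumed terminal statuses; the real platform may use different names.
TERMINAL_STATUSES = {"completed", "failed"}

def wait_for_job(client, job_id, timeout=60.0, interval=1.0):
    """Poll the (assumed) get_job method until the job reaches a terminal status."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = client.get_job(job_id)  # hypothetical accessor; name may differ
        if job["status"] in TERMINAL_STATUSES:
            return job
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish within {timeout}s")
```

For interactive use, the event stream below is usually a better fit than polling.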
### Stream Events

TypeScript:

```typescript
await fabric.streamEvents((event) => {
  console.log(event.kind, event.payload);
});
```

Python:

```python
for event in fabric.stream_events():
    print(event["kind"], event.get("payload"))
```

Rust:

```rust
let mut stream = client.stream_events().await?;
while let Some(event) = stream.next().await {
    println!("{} {:?}", event.kind, event.payload);
}
```

curl:

```shell
curl -N http://localhost:3001/v1/events/stream
```
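The `-N` flag disables curl's output buffering so events appear as they arrive. If the endpoint speaks Server-Sent Events with JSON payloads (an assumption; the SDK's `stream_events` hides the wire format), each frame can be decoded with a minimal parser:

```python
import json

def parse_sse(lines):
    """Yield one decoded JSON object per SSE event (a blank line ends an event)."""
    data = []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            data.append(line[len("data:"):].lstrip())
        elif line == "" and data:
            yield json.loads("\n".join(data))
            data = []
```

Feeding it the response lines from the `curl -N` command above would yield each event's payload; in practice the SDK clients do this for you.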
## What’s Next?

- Configuration — Learn about all CLI flags, environment variables, and Docker profiles
- Local Models — Set up Ollama, Whisper, and OpenAI-compatible servers
- API Examples — Explore the full API with curl examples
- Architecture Overview — Understand the control plane and execution plane