# Submitting Jobs
A job is a uniquely identifiable, client-facing unit of work. Every job has a canonical `job_id` and maps to a single-node workflow under the hood.
## Creating a Job

```typescript
import { FabricClient } from "@fabric-platform/sdk";

const fabric = new FabricClient({ apiKey: "fab_xxx" });

const job = await fabric.createJob({
  modality: "text",
  tier: "basic",
  input: { prompt: "Summarize this article" },
  params: { temperature: 0.5 },
  organizationId: "<org-id>",
});
console.log(job.id, job.status);
```

```python
from fabric_platform import FabricClient

fabric = FabricClient(api_key="fab_xxx")

job = fabric.create_job(
    modality="text",
    tier="basic",
    input={"prompt": "Summarize this article"},
    params={"temperature": 0.5},
    organization_id="<org-id>",
)
print(job["id"], job["status"])
```

```rust
use fabric_sdk::FabricClient;

let client = FabricClient::new("http://localhost:3001", api_key)?;
let job = client.create_job(serde_json::json!({
    "modality": "text",
    "tier": "basic",
    "input": {"prompt": "Summarize this article"},
    "params": {"temperature": 0.5},
    "organization_id": "<org-id>"
})).await?;
println!("{} {}", job.id, job.status);
```

```sh
curl -X POST http://localhost:3001/v1/jobs \
  -H 'Authorization: Bearer fab_xxx' \
  -H 'content-type: application/json' \
  -d '{
    "modality": "text",
    "tier": "basic",
    "input": {"prompt": "Summarize this article"},
    "params": {"temperature": 0.5},
    "organization_id": "<org-id>"
  }'
```

The response includes the canonical `job_id` you can use to track the job:

```json
{
  "data": {
    "job_id": "job_01HXYZ...",
    "status": "pending",
    "workflow_run_id": "run_01HXYZ..."
  }
}
```

## Job Lifecycle
Jobs progress through these states:
- `pending` — Created, waiting for execution
- `started` — Executor has claimed the job
- `completed` — Finished successfully with output
- `failed` — Execution failed after all retries
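The two terminal states are `completed` and `failed`; everything else is still in flight. A minimal polling sketch, assuming only the `getJob` call shown below in "Querying Jobs" (the `isTerminal` helper, the structural client type, and the poll interval are illustrative, not part of the SDK):

```typescript
// The two terminal job states; `pending` and `started` are still in flight.
const TERMINAL_STATES = new Set(["completed", "failed"]);

function isTerminal(status: string): boolean {
  return TERMINAL_STATES.has(status);
}

// Illustrative poll loop: fetch the job until it reaches a terminal state.
async function waitForJob(
  fabric: { getJob(id: string): Promise<{ status: string }> },
  jobId: string,
  intervalMs = 1000,
): Promise<{ status: string }> {
  for (;;) {
    const job = await fabric.getJob(jobId);
    if (isTerminal(job.status)) return job;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

For long-running jobs, the event stream described under "Job Events" avoids polling entirely.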
## Querying Jobs

```typescript
// Get a specific job
const job = await fabric.getJob("<job-id>");

// List all jobs
const jobs = await fabric.listJobs();

// Get job usage/cost
const usage = await fabric.getJobUsage("<job-id>");
```

```python
# Get a specific job
job = fabric.get_job("<job-id>")

# List all jobs
jobs = fabric.list_jobs()

# Get job usage/cost
usage = fabric.get_job_usage("<job-id>")
```

```rust
// Get a specific job
let job = client.get_job("<job-id>").await?;

// List all jobs
let jobs = client.list_jobs().await?;

// Get job usage/cost
let usage = client.get_job_usage("<job-id>").await?;
```

```sh
# Get a specific job
curl http://localhost:3001/v1/jobs/<job-id>

# List all jobs
curl http://localhost:3001/v1/jobs

# Get job usage/cost
curl http://localhost:3001/v1/jobs/<job-id>/usage
```

## Idempotency
Fabric supports idempotent job submission to prevent duplicate work from retries or network failures.

Include an `idempotency_key` in the request:
```typescript
const job = await fabric.createJob({
  modality: "text",
  input: { prompt: "Generate a summary" },
  organizationId: "<org-id>",
  idempotencyKey: "client-request-abc-123",
});
```

```python
job = fabric.create_job(
    modality="text",
    input={"prompt": "Generate a summary"},
    organization_id="<org-id>",
    idempotency_key="client-request-abc-123",
)
```

```rust
let job = client.create_job(serde_json::json!({
    "modality": "text",
    "input": {"prompt": "Generate a summary"},
    "organization_id": "<org-id>",
    "idempotency_key": "client-request-abc-123"
})).await?;
```

```sh
curl -X POST http://localhost:3001/v1/jobs \
  -H 'Authorization: Bearer fab_xxx' \
  -H 'content-type: application/json' \
  -d '{
    "modality": "text",
    "input": {"prompt": "Generate a summary"},
    "organization_id": "<org-id>",
    "idempotency_key": "client-request-abc-123"
  }'
```

Duplicate submissions with the same idempotency key return the original resource.
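One way to produce a stable key on the client side is to hash the request payload, so a retry of the same logical request automatically carries the same key. A sketch using Node's built-in `crypto` module (the `req-` key format and the hashing scheme are assumptions, not something the API mandates):

```typescript
import { createHash } from "node:crypto";

// Derive a deterministic idempotency key from the request payload so a
// retried request carries the same key as the original attempt.
// Note: JSON.stringify is order-sensitive, so build the payload with a
// stable property order.
function idempotencyKeyFor(payload: unknown): string {
  const digest = createHash("sha256")
    .update(JSON.stringify(payload))
    .digest("hex");
  return `req-${digest.slice(0, 32)}`;
}

const key = idempotencyKeyFor({
  modality: "text",
  input: { prompt: "Generate a summary" },
  organization_id: "<org-id>",
});
// Pass `key` as idempotencyKey when calling fabric.createJob(...).
```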
## Cost Estimation

Before submitting a job, you can estimate its cost:

```typescript
const estimate = await fabric.estimateCost({
  modality: "text",
  model: "gpt-4",
  input: { prompt: "Generate a summary" },
});
```

```python
estimate = fabric.estimate_cost(
    modality="text",
    model="gpt-4",
    input={"prompt": "Generate a summary"},
)
```

```rust
let estimate = client.estimate_cost(serde_json::json!({
    "modality": "text",
    "model": "gpt-4",
    "input": {"prompt": "Generate a summary"}
})).await?;
```

```sh
curl -X POST http://localhost:3001/v1/providers/estimate \
  -H 'content-type: application/json' \
  -d '{
    "modality": "text",
    "model": "gpt-4",
    "input": {"prompt": "Generate a summary"}
  }'
```

Local models (Ollama, Whisper) always return $0.00.
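One practical use of the estimate is picking the tier before submission: submit at `premium` only when the estimated cost fits a budget, and fall back to the free local `basic` tier otherwise. A sketch under stated assumptions (the numeric USD estimate, the budget threshold, and the fallback policy are illustrative, not SDK behavior):

```typescript
// Decide the tier for a job given an estimated cost in USD.
// Local "basic" providers estimate to $0.00, so the fallback is always free.
function chooseTier(
  estimatedCostUsd: number,
  budgetUsd: number,
): "basic" | "premium" {
  return estimatedCostUsd <= budgetUsd ? "premium" : "basic";
}
```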
## Job Events

Subscribe to real-time events for a specific job:

```typescript
await fabric.streamEvents("<job-id>", (event) => {
  console.log(event.kind, event.payload);
});
```

```python
for event in fabric.stream_job_events("<job-id>"):
    print(event["kind"], event["payload"])
```

```rust
let mut stream = client.stream_job_events("<job-id>").await?;
while let Some(event) = stream.next().await {
    println!("{} {:?}", event.kind, event.payload);
}
```

```sh
curl -N http://localhost:3001/v1/jobs/<job-id>/events
```

Job event types:
| Event | Description |
|---|---|
| `job.created` | Job was created |
| `job.started` | Executor began processing |
| `job.completed` | Job finished with output |
| `job.failed` | Job execution failed |
The per-job event stream replays all existing events on connect (catch-up), then streams live events as they occur.
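Because the stream replays history before going live, a consumer that reconnects can see the same event twice. Since each job emits each lifecycle kind at most once, the kind itself works as a deduplication key; a sketch (the `JobEvent` shape follows the stream examples above, and the wrapper function is illustrative, not part of the SDK):

```typescript
type JobEvent = { kind: string; payload: unknown };

// Wrap an event handler so lifecycle kinds already handled are skipped,
// making replayed catch-up events after a reconnect harmless.
function makeDedupingHandler(
  handle: (event: JobEvent) => void,
): (event: JobEvent) => void {
  const seen = new Set<string>();
  return (event) => {
    if (seen.has(event.kind)) return; // replayed event, already handled
    seen.add(event.kind);
    handle(event);
  };
}
```

A deployment whose events carry unique ids would key the `seen` set on those ids instead.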
## Modalities

Jobs support these modalities, routed to the appropriate provider:
| Modality | Description | Example Providers |
|---|---|---|
| `text` | Text generation and completion | OpenAI, Anthropic, Ollama |
| `image` | Image generation | OpenAI (DALL-E), ComfyUI |
| `audio` | Audio transcription | Whisper |
| `embedding` | Vector embeddings | OpenAI, Ollama (nomic-embed-text) |
## Tier Selection

Specify a tier preference to control provider selection:
- `basic` — Local/free providers (Ollama, Whisper, Echo)
- `premium` — Remote/paid providers (OpenAI, Anthropic)
If no tier is specified, the router selects by cost (cheapest first), then health status.
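The default rule can be sketched as a pure selection function, one reading being: order candidates by cost ascending, breaking ties in favor of healthy providers (the `Provider` shape with `costPerUnit` and `healthy` fields is illustrative; the router's internals are not part of the public API):

```typescript
type Provider = { name: string; costPerUnit: number; healthy: boolean };

// Default routing when no tier is given: cheapest first, with health
// status breaking ties. Field names here are illustrative.
function selectProvider(providers: Provider[]): Provider | undefined {
  return [...providers].sort(
    (a, b) =>
      a.costPerUnit - b.costPerUnit ||
      Number(b.healthy) - Number(a.healthy),
  )[0];
}
```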