Asset Lifecycle
Fabric is a stateless execution engine. Workflow-generated assets (videos, images, audio files) are held in temporary storage for a configurable window — 72 hours by default — then automatically deleted. Consumers are responsible for downloading and persisting any assets they need before the window closes.
Asset Kinds
Fabric distinguishes two categories of assets:
| Kind | Examples | Expires | Default TTL |
|---|---|---|---|
| Generated | Rendered videos, composed images, TTS audio, transcripts | Yes | 72 hours |
| Permanent | Org-level assets, voice clones, user uploads | No (expires_at = NULL) | Never |
Only generated assets — those produced by workflow runs — participate in the transient lifecycle described below. Permanent assets are unaffected by the reaper.
Lifecycle
```text
Workflow run                 Consumer                   Reaper
     │                           │                        │
     │ 1. Node produces          │                        │
     │    artifact               │                        │
     │                           │                        │
     │ 2. SDK uploads blob       │                        │
     │    with TTL ─────────►    │                        │
     │    (expires_at =          │                        │
     │     now + 72h)            │                        │
     │                           │                        │
     │ 3. Run completes ────────►│                        │
     │    (webhook / SSE)        │                        │
     │                           │ 4. GET download_url    │
     │                           │    (signed, 1h TTL)    │
     │                           │                        │
     │                           │ 5. Download blob       │
     │                           │    & persist locally   │
     │                           │                        │
     │                           │           ┌────────────┤
     │                           │           │ 6. Poll    │
     │                           │           │ every 5min │
     │                           │           │ expired?   │
     │                           │           │            │
     │                           │           │ 7. Delete  │
     │                           │           │ blob +     │
     │                           │           │ soft-delete│
     │                           │           │ row        │
     │                           │           └────────────┘
```

1. A workflow node produces an output artifact (video file, image, etc.).
2. The Python SDK uploads the blob to the object store with a `ttl_seconds` parameter. The API sets `expires_at = now + TTL` on the asset row.
3. The run completes and Fabric emits `workflow.run.completed` via SSE and webhooks. The event payload includes an `output_url` pointing to the run’s output endpoint.
4. The consumer GETs `output_url` to retrieve artifacts with signed download URLs (or uses the SDK convenience methods).
5. The consumer downloads the blob and stores it in their own system.
6. The reaper background task polls for expired assets every 5 minutes.
7. Expired assets are deleted from the object store and soft-deleted in the database.
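The TTL stamping in step 2 and the expiry check in step 6 can be sketched as follows. This is a minimal illustration; the interface, field, and function names here are assumptions, not Fabric's actual schema:

```typescript
// Sketch of the TTL bookkeeping: the API stamps expires_at at upload
// time, and the reaper later compares it against the current clock.
// Field names are illustrative, not Fabric's actual schema.
interface AssetRow {
  id: string;
  expiresAt: Date | null; // null = permanent asset, never reaped
  deletedAt: Date | null;
}

const DEFAULT_TTL_SECONDS = 72 * 3600; // 72-hour default window

function stampExpiry(
  id: string,
  ttlSeconds: number = DEFAULT_TTL_SECONDS,
  now: Date = new Date(),
): AssetRow {
  return {
    id,
    // A TTL of 0 means "never expires", matching the ASSET_TTL_HOURS=0 convention
    expiresAt: ttlSeconds === 0 ? null : new Date(now.getTime() + ttlSeconds * 1000),
    deletedAt: null,
  };
}

function isExpired(row: AssetRow, now: Date = new Date()): boolean {
  return row.expiresAt !== null && row.expiresAt.getTime() <= now.getTime();
}
```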
Download URLs
Download URLs are signed and short-lived — separate from the 72-hour storage TTL. A signed URL authorizes a single download window; the underlying blob remains available for the full storage TTL.
| Property | Default | Range | Description |
|---|---|---|---|
| Signed URL TTL | 1 hour | 1 second – 24 hours | How long the download link is valid |
| Storage TTL | 72 hours | 0 (never) – unlimited | How long the blob exists in the object store |
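The two TTLs above are independent knobs. As a sketch, resolving a requested `expires_in` against the documented 1 second – 24 hour range might look like this (whether Fabric clamps or rejects out-of-range values is an assumption here, as is the function name):

```typescript
// Illustrative resolution of the expires_in parameter against the
// documented signed-URL range. Clamping (rather than rejecting) out-of-range
// values is an assumption for this sketch.
const MIN_SIGNED_TTL = 1;          // 1 second
const MAX_SIGNED_TTL = 24 * 3600;  // 24 hours
const DEFAULT_SIGNED_TTL = 3600;   // 1 hour

function resolveSignedTtl(expiresIn?: number): number {
  if (expiresIn === undefined) return DEFAULT_SIGNED_TTL;
  return Math.min(MAX_SIGNED_TTL, Math.max(MIN_SIGNED_TTL, expiresIn));
}
```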
Request a longer download window with the `expires_in` query parameter:
```typescript
// Submit + wait + get output with signed URLs in one call
const result = await fabric.workflows.runs.submitAndGetOutput("video/ai-shorts", {
  input: { topic: "AI news" },
  expiresIn: 86400, // 24-hour download URLs
});
for (const a of result.artifacts) {
  console.log(a.filename, "→", a.download_url);
}

// Or download all artifact binaries directly
const files = await fabric.workflows.runs.downloadAllArtifacts(runId);
for (const f of files) {
  fs.writeFileSync(f.filename, Buffer.from(f.data));
}

// Or get output for an existing run
const output = await fabric.workflows.runs.getOutput(runId, {
  expiresIn: 86400,
});
```

```rust
// Submit, wait, and get output with signed URLs
let output = client.run_workflow_and_get_output(
    "video/ai-shorts",
    json!({ "topic": "AI news" }),
    Some(86400), // 24-hour download URLs
).await?;

// Download artifact binary content
if let Some(artifacts) = output["artifacts"].as_array() {
    for a in artifacts {
        if let Some(url) = a["download_url"].as_str() {
            let bytes = client.download_artifact(url).await?;
            std::fs::write(a["filename"].as_str().unwrap(), &bytes)?;
        }
    }
}
```

```sh
curl "https://gofabric.dev/v1/workflow-runs/$RUN_ID/output?expires_in=86400" \
  -H "Authorization: Bearer fab_xxx"
```

The response includes both the URL and its expiration timestamp:

```json
{
  "download_url": "https://...",
  "download_url_expires_at": "2026-04-23T12:00:00Z"
}
```

Asset Reaper
A background task runs inside the Fabric server process and automatically cleans up expired assets:
- Poll interval: every 5 minutes (configurable)
- Batch size: 100 assets per tick
- Process per asset:
- Delete the blob from the object store (S3 / GCS / local filesystem)
- Soft-delete the metadata row (`deleted_at = now()`)
- Remove dangling gallery references
- Failure handling: if blob deletion fails, the row is left intact and retried on the next tick
The reaper uses `FOR UPDATE SKIP LOCKED` to avoid contention with concurrent workers.
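One reaper tick can be sketched with in-memory stand-ins. The real reaper selects its batch in SQL with row locking; the names and shapes below are illustrative only:

```typescript
// In-memory sketch of one reaper tick: select a batch of expired,
// not-yet-deleted rows, delete each blob, and soft-delete the row only
// if the blob deletion succeeded. Failed rows stay intact for retry.
interface Asset {
  id: string;
  expiresAt: Date | null; // null = permanent, never reaped
  deletedAt: Date | null;
}

function reaperTick(
  assets: Asset[],
  deleteBlob: (id: string) => boolean, // returns false on failure
  now: Date,
  batchSize: number = 100, // documented default batch size per tick
): { reaped: string[]; retried: string[] } {
  const batch = assets
    .filter(a =>
      a.deletedAt === null &&
      a.expiresAt !== null &&
      a.expiresAt.getTime() <= now.getTime())
    .slice(0, batchSize);

  const reaped: string[] = [];
  const retried: string[] = [];
  for (const a of batch) {
    if (deleteBlob(a.id)) {
      a.deletedAt = now; // soft delete the metadata row
      reaped.push(a.id);
    } else {
      retried.push(a.id); // row left intact; next tick retries
    }
  }
  return { reaped, retried };
}
```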
Configuration
| Environment Variable | Default | Description |
|---|---|---|
| `ASSET_TTL_HOURS` | 72 | Storage TTL for generated assets. Set to 0 to disable expiration. |
| `ASSET_REAPER_POLL_SECS` | 300 | How often the reaper checks for expired assets (seconds) |
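A minimal sketch of reading these two settings with their documented defaults; Fabric's actual config loader is not shown here, and the function name is hypothetical:

```typescript
// Read the reaper settings with the documented defaults.
// ASSET_TTL_HOURS=0 disables expiration entirely.
function reaperConfig(env: Record<string, string | undefined>) {
  const ttlHours = Number(env.ASSET_TTL_HOURS ?? "72");
  const pollSecs = Number(env.ASSET_REAPER_POLL_SECS ?? "300");
  return {
    ttlHours,
    pollSecs,
    expirationEnabled: ttlHours > 0,
  };
}
```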
Best Practices
- Download on completion. Wire a webhook to `workflow.run.completed` and download artifacts immediately. Don’t rely on polling days later.
- Use webhooks for automation. For backend pipelines, webhooks are more reliable than polling — they fire as soon as the run finishes and retry on failure.
- Request longer signed URLs when needed. If your consumer needs more than 1 hour to download (large files, slow connections), pass `expires_in` up to `86400` (24 hours).
- Don’t treat Fabric as permanent storage. Fabric is a processing engine, not a CDN. Persist assets in your own object store (S3, GCS, etc.) after download.
- Monitor reaper health. If expired assets accumulate, check that the Fabric server process is running and the reaper is not failing. Blob deletion errors are retried automatically.
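The "download on completion" pattern starts from the `output_url` field in the webhook payload. The helper below is a sketch that widens the download window by appending `expires_in`; the event shape is an assumption based on the fields named on this page, and the function name is hypothetical:

```typescript
// On a workflow.run.completed webhook, build the output URL with a
// 24-hour signed-URL window before fetching artifacts for persistence.
interface RunCompletedEvent {
  type: "workflow.run.completed";
  output_url: string; // per this page, points to the run's output endpoint
}

function outputUrlWithWindow(event: RunCompletedEvent, expiresIn: number = 86400): string {
  const url = new URL(event.output_url);
  url.searchParams.set("expires_in", String(expiresIn));
  return url.toString();
}
```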